Prosecution Insights
Last updated: April 19, 2026
Application No. 18/355,540

METHOD, AERIAL VEHICLE AND SYSTEM FOR DETECTING A FEATURE OF AN OBJECT WITH A FIRST AND A SECOND RESOLUTION

Final Rejection (§102, §103)
Filed
Jul 20, 2023
Examiner
HUTCHINSON, ALAN D
Art Unit
3669
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Top Seven GmbH & Co. KG
OA Round
2 (Final)
Grant Probability: 78% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 8m
With Interview: 96%

Examiner Intelligence

Career Allow Rate: 78% (above average; 389 granted / 496 resolved; +26.4% vs TC avg)
Interview Lift: +17.2% (strong; resolved cases with vs. without interview)
Avg Prosecution: 2y 8m (typical timeline; 18 currently pending)
Total Applications: 514 (career history, across all art units)

Statute-Specific Performance

§101: 9.0% (-31.0% vs TC avg)
§103: 44.8% (+4.8% vs TC avg)
§102: 24.1% (-15.9% vs TC avg)
§112: 15.6% (-24.4% vs TC avg)
Baseline = Tech Center average estimate. Based on career data from 496 resolved cases.
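The four deltas can be cross-checked against the Tech Center baseline they imply. A minimal sketch (Python), assuming the convention delta = examiner allowance rate minus TC average; the variable names are illustrative, not from the tool:

```python
# Statute-specific allowance rates and deltas as displayed above (percent).
stats = {
    "101": (9.0, -31.0),
    "103": (44.8, +4.8),
    "102": (24.1, -15.9),
    "112": (15.6, -24.4),
}

# Assumed convention: delta = examiner rate - TC average,
# so TC average = rate - delta for each statute.
implied_tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(implied_tc_avg)  # every statute implies the same 40.0% baseline
```

Under that assumption, every statute points back to a single 40.0% baseline, consistent with one Tech Center average estimate being used across the chart.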

Office Action

§102, §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The Amendment filed 10 August 2025 has been entered. Claims 1-20 remain pending in the application.

Response to Arguments

Applicant's arguments filed 10 August 2025 have been fully considered but they are not persuasive.

In response to applicant's argument that the references fail to show certain features of the invention of Claim 1, it is noted that the features upon which applicant relies (i.e., "The higher resolution images are captured at the same time as the lower resolution images and not in response to a classification of the lower resolution images. The higher resolution images show the same area of a landscape as the lower resolution images and not selected areas of an object having damage already detected.") are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).

In response to applicant's argument that the references fail to show certain features of the invention of Claim 12, it is noted that the features upon which applicant relies (i.e., "Accordingly, images of first and second resolution overlap, but there is not a plurality of second, higher resolution images for a respective first, lower resolution image.") are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).
In response to applicant's argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., that "Scharf has higher and lower resolution images showing the exactly same area at the same point in time and not to have a lower resolution image of an area and a plurality of higher resolution images of the area, each higher resolution image showing a subsection of said area in more detail") are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).

In response to Applicant's argument that, under the "teachings of Scharf, areas are captured irrespective of any classification of lower resolution images. Accordingly, none of the disclosed aerial vehicles, such as airplanes or satellites are configured to receive information from the external computer that indicate the areas of an object to be optically detected by the second resolution or to perform such a classification by themselves," the Examiner respectfully disagrees. Specifically, the claim states that the UAV will "receive information from the external computer that indicate the areas of the object to be optically detected." Scharf discloses that a learning computer determines damage from the low resolution image and then uses those constraints to obtain a high resolution image of the targeted area (¶40: "An image selector 323, interfaced with the data repository 322, may be used to select appropriate images for further processing, e.g., to select a high-resolution image corresponding to some required geographical location/area and time constraints, and select the counterpart low-resolution image corresponding to the selected high-resolution image").

Claim Rejections - 35 USC § 102

The text of those sections of Title 35, U.S.
Code not included in this action can be found in a prior Office action.

Claims 1-7, 11-18, and 20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Scharf et al. (US Patent Publication 2024/0011917).

Regarding claim 1, Scharf discloses a method for detecting a damage of an object, comprising: (¶3) (a) flying along the object and optically detecting at least a part of the object by at least one capturing unit with the first resolution to generate a plurality of images, wherein each image represents an at least partly different area of the object, (¶4, 38) (b) evaluating the plurality of images to classify the generated images into images that do not comprise the damage and into images that comprise the damage, and (¶37) (c) optically detecting again those areas of the object whose allocated images comprise the damage with a second resolution that is higher than the first resolution. (¶55-57)

Regarding claim 2, Scharf further discloses wherein step (b) is performed after flying along the object, and step (c) comprises approaching those areas of the object whose allocated images comprise the damage. (¶55-57)

Regarding claim 3, Scharf further discloses wherein in step (a) and in step (c), the capturing unit generates one image each with the same focal length (inherent in satellite imagery), in step (a), the object is approached such that the first capturing unit has a first distance to the object when generating an image, and in step (c), the object is approached such that the capturing unit has a second distance to the object that is lower than the first distance when generating an image.
(¶55-57)

Regarding claim 4, Scharf further discloses wherein in step (a) and in step (c), the object is approached such that the capturing unit has the same or similar distance to the object when generating an image, in step (a), the capturing unit generates an image with a first focal length, and in step (c), the capturing unit generates an image with a second focal length that is greater than the first focal length. (¶55-57; when both the low resolution and high-resolution images come from satellite imagery, the second focal length must necessarily be greater than the first focal length)

Regarding claim 5, Scharf further discloses wherein optically detecting the area again in step (c) comprises generating a plurality of partial images of the area, each with the second resolution. (¶55)

Regarding claim 6, Scharf further discloses wherein position and/or location information of the capturing unit is allocated to each image generated in step (a), and in step (c), the areas of the object that are to be flown along are determined by using the position and/or location information of the images comprising the damage. (¶43)

Regarding claim 7, Scharf further discloses wherein an unmanned aerial vehicle, such as a drone comprising the capturing unit, flies along the object, and step (b) comprises transmitting the images generated in step (a) from the unmanned aerial vehicle to a computer, for example a laptop computer, and evaluating the images by the computer; and evaluating the images comprises evaluating the images in an automated manner.
(¶35)

Regarding claim 11, Scharf further discloses wherein steps (a) to (c) are performed during flying along the object such that in step (a), an image of an area is generated; in step (b), the image generated in step (a) is classified prior to generating an image for a further area; when the image is classified as comprising the damage in step (b), the area is optically detected again in step (c) before an image is generated for the further area; and when the image is classified as not comprising the damage in step (b), an image is generated for the further area. (¶55-57)

Regarding claim 15, Scharf further discloses wherein step (b) comprises AI or machine learning. (¶47)

Regarding claim 12, Scharf discloses a method for detecting a damage of an object, the method comprising: (¶3) (a) flying along the object and optically detecting at least a part of the object by at least one capturing unit to generate a plurality of images, wherein each image represents an at least partly different area of the object, and wherein, for one area, an image with the first resolution and a plurality of partial images, each with a second resolution that is higher than a first resolution, are generated, (¶4, 38) (b) evaluating the plurality of images to classify the generated images into images that do not comprise the damage and into images that comprise the damage, and (¶37) (c) providing the partial images of those areas of the object whose allocated images comprise the damage.
(¶55-57)

Regarding claim 13, Scharf further discloses wherein an unmanned aerial vehicle, e.g., a drone comprising the capturing unit, flies along the object autonomously; step (b) comprises transmitting the images and partial images generated in step (a) from the unmanned aerial vehicle to a computer, e.g., a laptop computer, and evaluating the images by the computer, wherein evaluating the images comprises evaluating the images in an automated manner; and step (c) comprises providing the partial images of the area allocated to the image by the computer. (¶35)

Regarding claim 14, Scharf further discloses wherein an unmanned aerial vehicle, e.g., a drone comprising the capturing unit, flies along the object autonomously; the unmanned aerial vehicle comprises a computer, wherein step (b) comprises evaluating the images and the partial images by the computer of the unmanned aerial vehicle, wherein evaluating the images and the partial images comprises evaluating the images and the partial images in an automated manner; and in step (c), the unmanned vehicle transmits the partial images to an evaluating unit, e.g., for classifying or cataloging the detected damages.
(¶55-57)

Regarding claim 16, Scharf discloses an unmanned aerial vehicle, e.g., a drone, for detecting a damage of an object, comprising: at least one capturing unit for generating images by optical detection, wherein the unmanned aerial vehicle can be controlled to (¶3)
• fly along the object and to optically detect at least part of the object by the capturing unit with a first resolution to generate a plurality of images, wherein each image represents an at least partly different area of the object, and
• optically detect again those areas of the object whose allocated images comprise the damage with a second resolution that is higher than a first resolution; (¶4, 38)
wherein the unmanned aerial vehicle is configured to
• transmit the plurality of images to an external computer, e.g., a laptop computer, that classifies the generated images into images that do not comprise the damage and into images that comprise the damage, and
• receive information from the external computer that indicate the areas of the object to be optically detected by the second resolution, or (¶35, 40)
wherein the unmanned aerial vehicle comprises a computer that is configured to evaluate the plurality of images to classify the generated images into the images that do not comprise the damage and into the images that comprise the damage. (¶55-57)

Regarding claim 17, Scharf discloses an unmanned aerial vehicle, e.g., a drone, for detecting a damage of an object comprising: at least one capturing unit for generating images by optical detection, wherein the unmanned aerial vehicle can be controlled to (¶3-4, 38)
• fly along the object and optically detect at least a part of the object by the capturing unit to generate a plurality of images, wherein each image represents an at least partly different area of the object, and (¶37)
• generate, for each area, an image with a first resolution and a plurality of partial images, each with a second resolution that is higher than the first resolution.
(¶55-57)

Regarding claim 18, Scharf discloses a system for detecting a damage of an object comprising: (¶3) an unmanned aerial vehicle, e.g., a drone, wherein the unmanned aerial vehicle can be controlled to fly along the object to optically detect at least a part of the object by at least one capturing unit with a first resolution to generate a plurality of images, wherein each image represents an at least partly different area of the object, (¶4, 35, 38) wherein the system is configured to evaluate the plurality of images to classify the generated images into images that do not comprise the damage and into images that comprise the damage, and (¶37) wherein the unmanned aerial vehicle can be controlled to optically detect again those areas of the object whose allocated images comprise the damage with a second resolution that is higher than a first resolution. (¶55-57)

Regarding claim 20, Scharf discloses a system for detecting a damage of an object, comprising: an unmanned aerial vehicle, e.g., a drone, wherein the unmanned aerial vehicle can be controlled to (¶3, 35)
• fly along the object and optically detect at least a part of the object by the capturing unit to generate a plurality of images, wherein each image represents an at least partly different area of the object, and (¶4, 38)
• generate, for each area, an image with a first resolution and the plurality of partial images, each with a second resolution that is higher than the first resolution, (¶55-57)
wherein the system is configured to evaluate the plurality of images to classify the generated images into images that do not comprise the damage and into images that comprise the damage and provide the partial images of those areas of the object whose allocated images comprise the damage, e.g., for classifying or cataloging the detected damages. (¶37)

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 8-10 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Scharf as applied to claims 1 and 18 above, and further in view of Tofte et al. (US Patent 8,818,572).

Regarding claim 8, Tofte teaches wherein in step (a), the unmanned aerial vehicle flies along the object autonomously; step (b) comprises generating waypoints by using the position and/or location information of the images comprising the damage and transmitting the waypoints to the unmanned aerial vehicle; and in step (c), the unmanned aerial vehicle approaches the areas of the object autonomously by using the waypoints.
(C7, L10-20)

It would have been obvious to one of ordinary skill in the art at the time of filing to provide the invention of Scharf with wherein in step (a), the unmanned aerial vehicle flies along the object autonomously, step (b) comprises generating waypoints by using the position and/or location information of the images comprising the damage and transmitting the waypoints to the unmanned aerial vehicle, and in step (c), the unmanned aerial vehicle approaches the areas of the object autonomously by using the waypoints, as taught by Tofte, with a reasonable expectation of success, because the technique for improving a particular class of devices was part of the ordinary capabilities of a person of ordinary skill in the art and, in view of the teaching of the technique for improvement in other situations, would have yielded predictable results to one of ordinary skill in the art at the time of the invention.

Regarding claim 9, Tofte teaches wherein an unmanned aerial vehicle, e.g., a drone comprising the capturing unit, flies along the object autonomously; the unmanned aerial vehicle comprises a computer, wherein step (b) comprises evaluating the images and generating waypoints by using the position and/or location information of the images comprising the damage by the computer of the unmanned aerial vehicle, wherein evaluating the images comprises evaluating the images in an automated manner; and in step (c), the unmanned aerial vehicle approaches the areas of the object autonomously by using the waypoints. (C7, L10-20)

It would have been obvious to one of ordinary skill in the art at the time of filing to provide the invention of Scharf with wherein an unmanned aerial vehicle, e.g.,
a drone comprising the capturing unit, flies along the object autonomously, the unmanned aerial vehicle comprises a computer, wherein step (b) comprises evaluating the images and generating waypoints by using the position and/or location information of the images comprising the damage by the computer of the unmanned aerial vehicle, wherein evaluating the images comprises evaluating the images in an automated manner, and in step (c), the unmanned aerial vehicle approaches the areas of the object autonomously by using the waypoints, as taught by Tofte, with a reasonable expectation of success, because the technique for improving a particular class of devices was part of the ordinary capabilities of a person of ordinary skill in the art and, in view of the teaching of the technique for improvement in other situations, would have yielded predictable results to one of ordinary skill in the art at the time of the invention.

Regarding claim 10, Tofte teaches wherein flying along the object in step (a) is flying along the object autonomously by an unmanned aerial vehicle, wherein the unmanned aerial vehicle comprises the at least one capturing unit; and wherein step (b) comprises generating waypoints by using the position and/or location information of the images comprising the damage and transmitting the waypoints to the unmanned aerial vehicle; and step (c) comprises flying along the area of the object autonomously by using the waypoints.
(C7, L10-20)

It would have been obvious to one of ordinary skill in the art at the time of filing to provide the invention of Scharf with wherein flying along the object in step (a) is flying along the object autonomously by an unmanned aerial vehicle, wherein the unmanned aerial vehicle comprises the at least one capturing unit; wherein step (b) comprises generating waypoints by using the position and/or location information of the images comprising the damage and transmitting the waypoints to the unmanned aerial vehicle; and step (c) comprises flying along the area of the object autonomously by using the waypoints, as taught by Tofte, with a reasonable expectation of success, because the technique for improving a particular class of devices was part of the ordinary capabilities of a person of ordinary skill in the art and, in view of the teaching of the technique for improvement in other situations, would have yielded predictable results to one of ordinary skill in the art at the time of the invention.

Regarding claim 19, Scharf further discloses wherein the unmanned aerial vehicle comprises the at least one capturing unit and wherein the unmanned aerial vehicle can be controlled to fly along the object autonomously and approach those areas of the object autonomously. Scharf appears to be silent as to approaching those areas whose allocated images comprise the damage by using the waypoints, wherein the computer is configured to generate waypoints by using the position and/or location information of the images comprising the damages, and wherein the computer is configured to transmit the waypoints to the unmanned aerial vehicle. Tofte, however, teaches approaching those areas whose allocated images comprise the damage by using the waypoints, wherein the computer is configured to generate waypoints by using the position and/or location information of the images comprising the damages, and wherein the computer is configured to transmit the waypoints to the unmanned aerial vehicle.
(C7, L10-20)

It would have been obvious to one of ordinary skill in the art at the time of filing to provide the invention of Scharf with approaching those areas whose allocated images comprise the damage by using the waypoints, wherein the computer is configured to generate waypoints by using the position and/or location information of the images comprising the damages, and wherein the computer is configured to transmit the waypoints to the unmanned aerial vehicle, as taught by Tofte, with a reasonable expectation of success, because the technique for improving a particular class of devices was part of the ordinary capabilities of a person of ordinary skill in the art and, in view of the teaching of the technique for improvement in other situations, would have yielded predictable results to one of ordinary skill in the art at the time of the invention.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALAN D HUTCHINSON whose telephone number is (571) 272-8413. The examiner can normally be reached 7-5 Mon-Thur.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Navid Mehdizadeh, can be reached on (571) 272-7691. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ALAN D HUTCHINSON/
Primary Examiner, Art Unit 3669

Prosecution Timeline

Jul 20, 2023
Application Filed
Apr 05, 2025
Non-Final Rejection — §102, §103
Aug 10, 2025
Response Filed
Sep 05, 2025
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602150
ENERGY STORAGE MANAGEMENT SYSTEM FOR AN AT LEAST PARTIALLY ELECTRICALLY DRIVEN VEHICLE, AND METHOD
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12576720
SYSTEM AND METHOD FOR OPERATING A VEHICLE WITH ELECTRIC POWER TAKE-OFF
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12570180
CONTROL DEVICE FOR ELECTRIFIED VEHICLE
Granted Mar 10, 2026 (2y 5m to grant)
Patent 12570266
Automotive Electronic Control Unit
Granted Mar 10, 2026 (2y 5m to grant)
Patent 12570156
ELECTRIC VEHICLE EMULATION SYSTEM AND METHOD
Granted Mar 10, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 78%
With Interview: 96% (+17.2%)
Median Time to Grant: 2y 8m
PTA Risk: Moderate
Based on 496 resolved cases by this examiner. Grant probability derived from career allow rate.
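Since grant probability is derived from the career allow rate, the headline projections can be reproduced from the career figures shown above. A minimal sketch (Python), assuming the interview lift is applied additively in percentage points; the tool's exact model is not disclosed:

```python
# Career figures as displayed in the Examiner Intelligence section.
granted = 389          # "389 granted / 496 resolved"
resolved = 496
interview_lift = 17.2  # displayed interview lift, in percentage points

# Grant probability = career allow rate, rounded to a whole percent.
grant_probability = round(granted / resolved * 100)                 # -> 78

# Assumption: "with interview" adds the lift to the base rate.
with_interview = round(granted / resolved * 100 + interview_lift)   # -> 96

print(grant_probability, with_interview)
```

Both values match the dashboard's 78% and 96% figures under this additive assumption.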
