Prosecution Insights
Last updated: April 19, 2026
Application No. 18/591,542

UNMANNED AERIAL VEHICLE SEVERE LOW-POWER PROTECTION METHOD AND UNMANNED AERIAL VEHICLE

Non-Final OA — §103, §DP
Filed: Feb 29, 2024
Examiner: VORCE, AMELIA J.I.
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Autel Robotics Co. Ltd.
OA Round: 2 (Non-Final)
Grant Probability: 72% (Favorable)
Expected OA Rounds: 2-3
Time to Grant: 2y 10m
With Interview: 94%

Examiner Intelligence

Career Allow Rate: 72% (190 granted / 264 resolved; +20.0% vs TC avg, above average)
Interview Lift: +22.5% on resolved cases with interview (strong)
Avg Prosecution: 2y 10m (typical timeline; 23 currently pending)
Total Applications: 287 (career history, across all art units)
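For readers who want to sanity-check the panel, the headline figures above can be reproduced from the raw counts. A minimal sketch, noting one assumption: the 52% Tech Center baseline is not stated directly and is only implied by the +20.0% delta.

```python
# Reproduce the examiner metrics above from the raw counts.
# ASSUMPTION: the TC baseline (0.52) is inferred from the stated
# +20.0% delta, not an official figure.
granted = 190
resolved = 264

allow_rate = granted / resolved      # career allow rate
tc_baseline = 0.52                   # implied: 72% minus 20.0 points

print(f"Career allow rate: {allow_rate:.0%}")                 # 72%
print(f"Delta vs TC avg:   {allow_rate - tc_baseline:+.1%}")  # +20.0%
```

This matches the 72% and +20.0% figures shown in the panel.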

Statute-Specific Performance

§101: 13.1% (-26.9% vs TC avg)
§103: 34.1% (-5.9% vs TC avg)
§102: 16.0% (-24.0% vs TC avg)
§112: 33.1% (-6.9% vs TC avg)
Tech Center averages are estimates, based on career data from 264 resolved cases.
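Each row's Tech Center baseline can be recovered from the examiner rate and its stated delta. A quick illustrative sketch using the table values (the `stats` mapping is a hypothetical structure, and the per-statute metric is whatever the dashboard tracks):

```python
# Recover the implied Tech Center baseline for each statute:
# baseline = examiner rate - stated delta (values from the table above).
stats = {
    "§101": (13.1, -26.9),
    "§103": (34.1, -5.9),
    "§102": (16.0, -24.0),
    "§112": (33.1, -6.9),
}
for statute, (rate, delta) in stats.items():
    baseline = rate - delta
    print(f"{statute}: examiner {rate}% vs TC avg ~{baseline:.1f}%")
```

Run as written, every implied baseline comes out at 40.0%, consistent with a single TC-wide average estimate behind all four deltas.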

Office Action

§103, §DP
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

This Office Action is in response to Applicant’s Amendments and Remarks filed on 1/12/2026.

Response to Arguments

Claim objections of the most recent Office action have been removed due to Applicant’s amendments, except for the claim objections regarding claim 13 as described below. Nonstatutory double patenting rejections of the most recent Office action have been removed due to Applicant’s timely filed Terminal Disclaimer under 37 CFR 1.321 on 1/12/2026.

Claim Objections

Claim(s) 13-18 is/are objected to because of the following informalities: Claim 13 recites an “and” after the “obtaining landing safety judgement information…” limitation. While the scope of the claim(s) is reasonably ascertainable, the examiner suggests deleting the redundant “and”, since it is again recited after the “acquiring a manual control command…” limitation. Appropriate correction is required.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim(s) 1, 4, 7, 10, 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Akanuma et al. (US 20220274717 A1) in view of Dupray et al. (US 20170069214 A1).

Regarding claim 13, and similarly claims 1 and 7, Akanuma teaches An unmanned aerial vehicle (UAV) (“drone aircraft 10”, Figs. 1-2), comprising: a body (see Fig. 1); arms, connected to the body (see Fig. 1); a power apparatus sensor (“motor 16”, Fig. 2), disposed on the arms, configured to provide flight power for the UAV (“The motor 16 controls the rotational speed of the propellers of the drone aircraft 10.”, [0045]); and a flight controller, disposed on the arms (“control unit 11”, [0033], Fig. 2, see also “information processing apparatus 100”, [0050], Fig. 3); a ground detection sensor (“camera 13”, Fig. 2), configured to acquire ground environment information (“images of the landing surface, which are captured by the camera 13 in time series”, [0037]); wherein the flight controller comprises: at least one processor (“The control units 11 and 21 may be the CPU 101.”, [0051], Fig. 3); and a memory (“memory 715”, Fig. 7) communicatively connected to the at least one processor, the memory storing instructions executable by the at least one processor, the instructions, when executed by the at least one processor, enabling the at least one processor to perform the following operations (“a memory 715 for storing information and instructions to be executed by processor(s) 710”, [0051]):

acquiring ground environment information when the UAV is in a severe low-power protection state (“the drone aircraft itself needs to determine whether the drone aircraft can land on the landing surface in order to prevent the battery from being depleted and the drone aircraft from falling, for example, when the battery level is low and emergency landing must be performed.”, [0114], “In response to the instruction from the action generating unit 119, the landing possibility and approach determining unit 116 outputs an instruction to discriminate the environment of the landing surface imaged by the camera 13 to the environment discriminating unit 117.”, [0078], see “S102”, Fig. 4, where the information output by the “camera 13” corresponds to Applicant’s “ground information”);

obtaining landing safety judgment information according to the ground environment information (“the wind shake characteristic calculating unit 118 calculates the wind shake characteristic value of the landing surface in the image subjected to image processing. At that time, for example, the wind shake characteristic calculating unit 118 detects a feature point of the landing surface from the images in which the landing surface is continuously captured by the camera 13, and calculates a temporal change of the feature point between the frames as a wind shake characteristic value. Such a temporal change is, for example, a moving distance between the frames of the detected feature point.”, [0082], see “S103”, Fig. 4, where the “wind shake characteristic value” corresponds to Applicant’s “landing safety judgment information”); and

controlling a flight state of the UAV according to the landing safety judgment information to realize a safe landing of the UAV (“if the altitude of the drone aircraft 10 exceeds threshold values L1 and L3 corresponding to the environment of the landing surface, which is discriminated in the previous Step 102, (YES in Step S104), the camera 13 has difficulty of satisfactorily image the landing surface. Further, the wind pressure of the propellers of the drone aircraft 10 cannot be sufficiently transmitted to the landing surface, and thus the landing possibility and approach determining unit 116 outputs an instruction to the action generating unit 119 to cause the drone aircraft 10 to approach the landing surface to a predetermined altitude (Step S105)”, [0088], Fig. 4);

controlling the UAV to hover and keep still when the landing safety judgment information is dangerous landing information (“If the environment of the landing surface discriminated in the preceding Step S102 is a rocky ground, and the wind shake characteristic value calculated in the preceding Step S103 is equal to or less than a threshold value L2 corresponding to a rocky ground (NO in Step S106), the landing possibility and approach determining unit 116 determines that the environment of the landing surface is an environment in which the shape of the landing surface does not change much even if the landing surface receives a sufficient wind pressure from the drone aircraft 10. Therefore, if the drone aircraft 10 is landed on the landing surface in such an environment, the drone aircraft 10 may fall and be damaged because the landing surface does not completely absorb the landing shock.
Thus, the landing possibility and approach determining unit 116 outputs an instruction not to land the drone aircraft 10 on the landing surface to the action generating unit 119.”, [0092-0093], see “NO” at “S106”, Fig. 4, “In response to the instruction from the landing possibility and approach determining unit 116, the action generating unit 119 notifies the UI 30 of the fact that the drone aircraft 10 cannot be landed on the landing surface, and controls the rotational speed of the motor 16 via the flight controller 15. This maintains the drone aircraft 10 hovering at an altitude equal to or less than the threshold value L1 (Step S108).”, [0094]);

keeping acquiring the ground environment information (“The environment discriminating unit 212 discriminates the environment of the landing surface from the images in which the landing surface is continuously imaged by the camera 13 at a predetermined frame rate by referring to the reference data stored in the environment database 122, and outputs the discrimination result to the landing possibility and approach determining unit 116.”, [0079]), and controlling the flight state of the UAV according to the landing safety judgment information to realize the safe landing of the UAV (“the landing possibility and approach determining unit 116 outputs an instruction to land the drone aircraft 10 on the landing surface to the action generating unit 119 (Step S107).”, [0100], “As a result, the drone aircraft 10 lands on the landing surface.”, [0101]).

Akanuma does not explicitly teach acquiring a manual control command, and controlling the UAV to deviate from a current position according to the manual control command. However, Akanuma teaches “The UI 30 is an interface for exchanging information with the drone aircraft 10.” [0048] and “a remotely controllable drone aircraft to adopt a technique for accurately determining whether or not landing is possible on the landing surface in order to prevent damage” [0114]. One of ordinary skill in the art before the effective filing date would have recognized that the UAV of Akanuma can be controlled by a manual control command to deviate from a current position.

Dupray teaches acquiring a manual control command, and controlling the UAV to deviate from a current position according to the manual control command (“In one type of a operating mode referred to as an “hybrid operating mode”, the UAV may be in automatic operating mode for certain portions (operations) of navigation and/or flight, and directed operating mode for certain portions (operations) of navigation and/or flight. For example, a UAV may be in directed operating mode with a remote human operator responsible for the direct duty of flying the UAV. However, the flight must be within certain rules or parameters (e.g., area, speed, or height restrictions as provided by certain regulatory limits). The UAV may be pre-programmed to take over the flight in an automatic operating mode to satisfy such flight rules and overriding the remote human operator. Further, a UAV in the hybrid operating mode may still retain the ability to direct certain safe operating instructions/procedures (e.g., emergency landing or avoidance procedures) as discussed above and herein with respect to this disclosure with respect to the manual operating mode.”, [0185]).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the invention of Akanuma with the teachings of Dupray such that the processor of Akanuma is further configured to control the UAV to change its position based on a manual control command, as suggested by Dupray, with a reasonable expectation of success. The motivation for doing so would be such that “the UAV may be in automatic operating mode for certain portions (operations) of navigation and/or flight, and directed operating mode for certain portions (operations) of navigation and/or flight” [0185], as taught by Dupray.
Regarding claim 16, and similarly claims 4 and 10, Akanuma in view of Dupray teaches The unmanned aerial vehicle according to claim 13, and Akanuma further teaches wherein the at least one processor further performs the following operations: (“the landing possibility and approach determining unit 116 outputs an instruction to land the drone aircraft 10 on the landing surface to the action generating unit 119 (Step S107).”, [0100], “As a result, the drone aircraft 10 lands on the landing surface.”, [0101]).

However, Dupray further teaches controlling the UAV to deviate from the current position according to the manual control command (“In one type of a operating mode referred to as an “hybrid operating mode”, the UAV may be in automatic operating mode for certain portions (operations) of navigation and/or flight, and directed operating mode for certain portions (operations) of navigation and/or flight. For example, a UAV may be in directed operating mode with a remote human operator responsible for the direct duty of flying the UAV. However, the flight must be within certain rules or parameters (e.g., area, speed, or height restrictions as provided by certain regulatory limits). The UAV may be pre-programmed to take over the flight in an automatic operating mode to satisfy such flight rules and overriding the remote human operator. Further, a UAV in the hybrid operating mode may still retain the ability to direct certain safe operating instructions/procedures (e.g., emergency landing or avoidance procedures) as discussed above and herein with respect to this disclosure with respect to the manual operating mode.”, [0185]).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to further modify the invention of Akanuma with the teachings of Dupray such that the processor of Akanuma is further configured to control the UAV to change its position based on a manual control command, as suggested by Dupray, with a reasonable expectation of success. The motivation for doing so would be such that “the UAV may be in automatic operating mode for certain portions (operations) of navigation and/or flight, and directed operating mode for certain portions (operations) of navigation and/or flight” [0185], as taught by Dupray.

Claim(s) 2-3, 8-9, 14-15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Akanuma et al. (US 20220274717 A1) in view of Dupray et al. (US 20170069214 A1) in view of Ma et al. (US 20210192934 A1).

Regarding claim 14, and similarly claims 2 and 8, Akanuma in view of Dupray teaches The unmanned aerial vehicle according to claim 13, and Akanuma further teaches wherein the at least one processor further performs the following operations: acquiring a hovering control command (“In response to the instruction from the landing possibility and approach determining unit 116, the action generating unit 119 notifies the UI 30 of the fact that the drone aircraft 10 cannot be landed on the landing surface, and controls the rotational speed of the motor 16 via the flight controller 15. This maintains the drone aircraft 10 hovering at an altitude equal to or less than the threshold value L1 (Step S108).”, [0094]); and controlling the UAV to hover and keep still according to the hovering control command (see “This maintains the drone aircraft 10 hovering at an altitude equal to or less than the threshold value L1 (Step S108).”, [0094]).
However, Ma teaches acquiring a (“At S201, transmission instruction information of a remote control command (also referred to as a "remote control command transmission instruction information") is obtained.", [0022], Figs. 3-4, "the remote control device may obtain the transmission instruction information of the remote control command from the mobile platform, that is, the transmission instruction information may include a flag bit fed back by the mobile platform.", [0024]); and controlling the UAV (see “This maintains the drone aircraft 10 hovering at an altitude equal to or less than the threshold value L1 (Step S108).”, [0094]).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the invention of Akanuma in view of Dupray with the teachings of Ma such that the hovering control command of Akanuma has a corresponding bit flag, as suggested by Ma, with a reasonable expectation of success. The motivation for doing so would be to use bit flags to indicate whether a control command is a correct control command, as taught by Ma [0018].

Regarding claim 15, and similarly claims 3 and 9, Akanuma in view of Dupray and Ma teaches The unmanned aerial vehicle according to claim 14, and Akanuma further teaches wherein the at least one processor further performs the following operations: controlling an altitude and a position of the UAV to keep unchanged according to the hovering control command (“In response to the instruction from the landing possibility and approach determining unit 116, the action generating unit 119 notifies the UI 30 of the fact that the drone aircraft 10 cannot be landed on the landing surface, and controls the rotational speed of the motor 16 via the flight controller 15. This maintains the drone aircraft 10 hovering at an altitude equal to or less than the threshold value L1 (Step S108).”, [0094]).

However, Ma further teaches controlling (“In response to the instruction from the landing possibility and approach determining unit 116, the action generating unit 119 notifies the UI 30 of the fact that the drone aircraft 10 cannot be landed on the landing surface, and controls the rotational speed of the motor 16 via the flight controller 15. This maintains the drone aircraft 10 hovering at an altitude equal to or less than the threshold value L1 (Step S108).”, [0094]).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the invention of Akanuma in view of Dupray with the teachings of Ma such that the hovering control command comprising an altitude and a position of Akanuma has a corresponding bit flag, as suggested by Ma, with a reasonable expectation of success. The motivation for doing so would be to use bit flags to indicate whether a control command is a correct control command, as taught by Ma [0018].

Claim(s) 5-6, 11-12, 17-18 is/are rejected under 35 U.S.C. 103 as being unpatentable over Akanuma et al. (US 20220274717 A1) in view of Dupray et al. (US 20170069214 A1) in view of Jin et al. (CN 108803645 A).
Regarding claim 17, and similarly claims 5 and 11, Akanuma in view of Dupray teaches The unmanned aerial vehicle according to claim 13, and Akanuma further teaches wherein the at least one processor further performs the following operations: triggering a re-landing command (“In response to the instruction from the action generating unit 119, the landing possibility and approach determining unit 116 outputs an instruction to discriminate the environment of the landing surface imaged by the camera 13 to the environment discriminating unit 117.”, [0078], “if the altitude of the drone aircraft 10 exceeds threshold values L1 and L3 corresponding to the environment of the landing surface, which is discriminated in the previous Step 102, (YES in Step S104), the camera 13 has difficulty of satisfactorily image the landing surface. Further, the wind pressure of the propellers of the drone aircraft 10 cannot be sufficiently transmitted to the landing surface, and thus the landing possibility and approach determining unit 116 outputs an instruction to the action generating unit 119 to cause the drone aircraft 10 to approach the landing surface to a predetermined altitude (Step S105).”, [0088], “S101” and “S105”, Fig. 4); and controlling the UAV to land safely according to the re-landing command (“In response to the instruction from the landing possibility and approach determining unit 116, the action generating unit 214 controls the rotational speed of the motor 16 via the flight controller 15. Thus, the drone aircraft 10 approaches the landing surface to a preset altitude and hovers while maintaining the altitude equal to or less than the threshold values L1 and L3. Next, the control unit 11 executes Steps S101 to S104 again while the drone aircraft 10 is hovering at an altitude equal to or less than the threshold values L1 and L3.”, [0088]).

However, Jin teaches triggering a re-landing command after a preset interval time since detection of a speed change of the UAV (“In the first aspect, the embodiment of the present invention provides a UAV forced landing method, which is applied to the autopilot in the UAV, and the UAV is equipped with multiple vertical rotor power systems. The method includes: determining the The UAV is in a first emergency state, wherein the first emergency state represents that the UAV is in a state that needs to be adjusted to hover; adjust the horizontal speed command value output to the plurality of vertical rotor power systems, making the UAV in a hovering state; judging whether the current horizontal speed value of the UAV is less than a first preset speed value and lasts for a first preset time; when the current horizontal speed value of the UAV When it is less than the first preset rate value and lasts for the first preset time, it is determined that the UAV is in the second emergency state, so as to adjust the vertical rate command value output to the plurality of vertical rotor power systems, so that the drone A man-machine landing, wherein the second emergency state represents a state that needs to be triggered to adjust the vertical speed command values of multiple vertical rotor power systems to land the UAV.”, [0008], see also [0046-0055]).

Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the invention of Akanuma in view of Dupray with the teachings of Jin such that the triggering of a re-landing command of Akanuma happens after a time since a speed change has occurred, as suggested by Jin, with a reasonable expectation of success. This would require the simple substitution of using altitude to trigger a re-landing as taught by Akanuma [0088] for using a UAV speed to trigger a re-landing as suggested by Jin [0008]. KSR International Co. v. Teleflex Inc. (KSR), 550 U.S. 398, 82 USPQ2d 1385 (2007).

Regarding claim 18, and similarly claims 6 and 12, Akanuma in view of Dupray and Jin teaches The apparatus according to claim 17, and Akanuma further teaches wherein the at least one processor further performs the following operations: acquiring a current flight altitude of the UAV when the UAV is in the severe low-power protection state (“The wind shake characteristic calculating unit 118 outputs the calculated wind shake characteristic value and the sensor data acquired from the altitude sensor 14 to the landing possibility and approach determining unit 116.”, [0086], “S104”, Fig. 4); determining whether the current flight altitude exceeds a preset altitude threshold (“(Step S104: Does Altitude Exceed Threshold Value?)”, [0086], “S104”, Fig. 4); and in response to determining the current flight altitude exceeds the preset altitude threshold, shielding an upward flight control command (“if the altitude of the drone aircraft 10 exceeds threshold values L1 and L3 corresponding to the environment of the landing surface, which is discriminated in the previous Step 102, (YES in Step S104), the camera 13 has difficulty of satisfactorily image the landing surface. Further, the wind pressure of the propellers of the drone aircraft 10 cannot be sufficiently transmitted to the landing surface, and thus the landing possibility and approach determining unit 116 outputs an instruction to the action generating unit 119 to cause the drone aircraft 10 to approach the landing surface to a predetermined altitude (Step S105).”, [0088], “S105”, Fig. 4).

Conclusion

The prior art made of record and not relied upon is considered pertinent to Applicant's disclosure: See Notice of References Cited.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMELIA VORCE whose telephone number is (313) 446-4917. The examiner can normally be reached on Monday-Friday, 9AM-6PM, Central Time.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne Antonucci, can be reached at (313) 446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/AMELIA VORCE/
Primary Examiner, Art Unit 3666

Prosecution Timeline

Feb 29, 2024
Application Filed
Oct 27, 2025
Non-Final Rejection — §103, §DP
Jan 12, 2026
Response Filed
Jan 29, 2026
Non-Final Rejection — §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601121 — CONTROL OF A WORK MACHINE USING GROUND SURFACE WORK RECORDS
Granted Apr 14, 2026 (2y 5m to grant)

Patent 12602055 — VEHICULAR SYSTEM
Granted Apr 14, 2026 (2y 5m to grant)

Patent 12597358 — INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD, AND PROGRAM
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12578731 — POSITION ESTIMATION DEVICE, AUTOMATED DRIVING SYSTEM, POSITION ESTIMATION METHOD, AND STORAGE MEDIUM STORING PROGRAM
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12576993 — METHOD AND SYSTEM FOR OPERATING METAVERSE PLATFORM FOR IMPLEMENTING VIRTUAL UNIVERSE SPACE
Granted Mar 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 2-3
Grant Probability: 72%
With Interview: 94% (+22.5%)
Median Time to Grant: 2y 10m
PTA Risk: Moderate
Based on 264 resolved cases by this examiner. Grant probability derived from career allow rate.
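The "with interview" projection appears to combine the base grant probability with the interview lift additively, capped at 100%. A hedged sketch of that assumed combination rule (the dashboard's actual model is not documented here):

```python
# Combine base grant probability with the interview lift.
# ASSUMPTION: the dashboard adds the lift in percentage points and
# caps at 100%; the real projection model is not documented.
base = 0.72        # career allow rate used as grant probability
lift = 0.225       # +22.5 point interview lift

with_interview = min(base + lift, 1.0)
print(f"With interview: {with_interview:.0%}")
```

Under this assumption, 72% plus 22.5 points formats to the 94% shown above.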
