Prosecution Insights
Last updated: April 19, 2026
Application No. 18/316,399

FLIGHT CONTROL METHOD, DEVICE, AIRCRAFT, SYSTEM, AND STORAGE MEDIUM

Status: Non-Final OA (§103)
Filed: May 12, 2023
Examiner: GOODBODY, JOAN T
Art Unit: 3664
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: SZ DJI Technology Co., Ltd.
OA Round: 3 (Non-Final)
Grant Probability: 49% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 3y 5m
Grant Probability with Interview: 89%

Examiner Intelligence

Career Allow Rate: 49% (98 granted / 199 resolved; -2.8% vs TC avg)
Interview Lift: +39.7% (strong), for resolved cases with an interview vs. without
Typical Timeline: 3y 5m average prosecution; 28 applications currently pending
Career History: 227 total applications across all art units
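As a sanity check on how these headline figures fit together, here is a minimal sketch of the arithmetic (an assumption about how the dashboard derives them; the variable names are illustrative, not from any real API):

```python
# Illustrative arithmetic behind the Examiner Intelligence panel.
# Only the input numbers come from the panel above; the derivation
# itself is an assumption.

granted, resolved, total_apps = 98, 199, 227

allow_rate = granted / resolved     # 98 / 199 = 0.4925 -> the "49%" figure
pending = total_apps - resolved     # 227 - 199 = 28 currently pending

print(f"Career allow rate: {allow_rate:.1%}")   # 49.2%
print(f"Currently pending: {pending}")          # 28
```

The counts are internally consistent: 199 resolved plus 28 pending equals the 227 total applications shown.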

Statute-Specific Performance

§101: 17.0% (-23.0% vs TC avg)
§103: 56.6% (+16.6% vs TC avg)
§102: 6.6% (-33.4% vs TC avg)
§112: 15.6% (-24.4% vs TC avg)

TC average figures are estimates • Based on career data from 199 resolved cases
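Reading each row as this examiner's rate for cases involving that rejection type, the Tech Center baselines can be back-calculated from the stated deltas. A small sketch of that reconstruction (an assumption about what the deltas mean; the TC averages below are derived from the panel's own numbers, not independently sourced):

```python
# Per-statute rates vs. Tech Center average. Assumed relationship:
# examiner rate - TC average = stated delta. TC averages are
# back-calculated, not independently sourced.
examiner = {"§101": 17.0, "§103": 56.6, "§102": 6.6, "§112": 15.6}   # percent
delta    = {"§101": -23.0, "§103": +16.6, "§102": -33.4, "§112": -24.4}

for statute, rate in examiner.items():
    tc_avg = rate - delta[statute]   # e.g. §101: 17.0 - (-23.0) = 40.0
    print(f"{statute}: examiner {rate:.1f}% vs TC avg ~{tc_avg:.1f}%")
```

Notably, all four back-calculated baselines land at about 40%, consistent with a single Tech Center-wide estimate being used for the "black line" in the original chart.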

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Acknowledgment is made of applicant's claim for priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in parent Application PCT/CN2018/07387, filed on January 23, 2018.

Status of Claims

Claims 1, 7-10, 12-15, 17, and 19 are amended. Claims 2, 4-6, 16, 18, 20, 22, and 23 were cancelled or previously cancelled. Claims 24 and 25 are new. Claims 1, 3, 7-15, 17, 19, 21, 24, and 25 are pending.

Claim Objections

Claims 24 and 25 are objected to because of the following informality: the claims use the word "preset" while the specification uses "pre-set." The term should read the same in the claims and the specification. Appropriate correction is required.

Response to Arguments/Remarks

Rejection of Claims 1, 3-17, 19, and 21-23 under 35 U.S.C. § 103

Applicant argues that the art of record, and the obviousness rationale built on it, does not contain a "clear articulation of the reason(s) why the claimed invention would have been obvious..." Without acquiescing to the Office's assertions, Applicant respectfully submits that Wang and Chen, whether taken alone or in combination, do not disclose or suggest "in response to detecting two hands of the target user in the plurality of images and detecting that a horizontal distance between the two hands is gradually changing, generating a moving control command to control the aircraft to fly in a direction moving away from or moving closer to the target user," as recited in amended claim 1 (emphases added). Without acquiescing to the Office's assertions, Applicant respectfully submits that Wang and Chen, whether taken alone or in combination, do not disclose or suggest that "the triggering operation includes a scanning operation of a characteristic object or an interactive operation of a smart accessory," as recited in amended claim 1 (emphases added).

The Examiner respectfully disagrees. Under their broadest reasonable interpretation, the claims analyze multiple images of hand gestures to determine the actions of the vehicle to be controlled at any given time. Merely specializing the gestures would not alter the applicability of the art of record. To further prosecution, however, the Examiner is including another piece of art that explains the gestures in more detail (see the § 103 rejection below).

Also note that under a broadest reasonable interpretation (BRI), words of the claim must be given their plain meaning, unless such meaning is inconsistent with the specification. The plain meaning of a term is the ordinary and customary meaning given to the term by those of ordinary skill in the art at the relevant time. The ordinary and customary meaning of a term may be evidenced by a variety of sources, including the words of the claims themselves, the specification, drawings, and prior art. However, the best source for determining the meaning of a claim term is the specification; the greatest clarity is obtained when the specification serves as a glossary for the claim terms. The words of the claim must be given their plain meaning unless the plain meaning is inconsistent with the specification. MPEP 2111.01(I). See also In re Marosi, 710 F.2d 799, 802, 218 USPQ 289, 292 (Fed. Cir. 1983) ("[C]laims are not to be read in a vacuum, and limitations therein are to be interpreted in light of the specification in giving them their 'broadest reasonable interpretation.'"); MPEP 2111.01(II).

With respect to the interpretation of claim terms, MPEP 2111 states:

The Patent and Trademark Office ("PTO") determines the scope of claims in patent applications not solely on the basis of the claim language, but upon giving claims their broadest reasonable construction "in light of the specification as it would be interpreted by one of ordinary skill in the art." In re Am. Acad. of Sci. Tech. Ctr., 367 F.3d 1359, 1364[, 70 USPQ2d 1827, 1830] (Fed. Cir. 2004). Indeed, the rules of the PTO require that application claims must "conform to the invention as set forth in the remainder of the specification and the terms and phrases used in the claims must find clear support or antecedent basis in the description so that the meaning of the terms in the claims may be ascertainable by reference to the description." 37 CFR 1.75(d)(1).

The words of the claim must be given their plain meaning unless the plain meaning is inconsistent with the specification. In re Zletz, 893 F.2d 319, 13 USPQ2d 1320 (Fed. Cir. 1989). "Though understanding the claim language may be aided by explanations contained in the written description, it is important not to import into a claim limitations that are not part of the claim. For example, a particular embodiment appearing in the written description may not be read into a claim when the claim language is broader than the embodiment." Superguide Corp. v. DirecTV Enterprises, Inc., 358 F.3d 870, 875, 69 USPQ2d 1865, 1868 (Fed. Cir. 2004) (see MPEP 2111.01).

During patent examination, the pending claims must be "given their broadest reasonable interpretation consistent with the specification." The broadest reasonable interpretation does not mean the broadest possible interpretation. Rather, the meaning given to a claim term must be consistent with the ordinary and customary meaning of the term (unless the term has been given a special definition in the specification), and must be consistent with the use of the claim term in the specification and drawings. Further, the broadest reasonable interpretation of the claims must be consistent with the interpretation that those skilled in the art would reach. In re Cortright, 165 F.3d 1353, 1359, 49 USPQ2d 1464, 1468 (Fed. Cir. 1999) (see MPEP 2111). Accordingly, the claims herein will be interpreted in accordance with MPEP 2111.

The Examiner also respectfully disagrees that the art and the obviousness statements are incorrect. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

These inquiries are shown in the action; to further prosecution, the Examiner will attempt to clear up any issues in the obviousness statements. Applicant's arguments with respect to Claims 1, 3, 7-15, 17, 19, 21, 24, and 25 have been considered but are not persuasive; because they primarily deal with the amendments and the additional new claims, they are moot in view of the new ground(s) of rejection necessitated by applicant's amendments.
Examiner's Notes

The Examiner wants to point out that some of the amendments have altered the scope of the claims in minor ways: the gimbal mount is in the preamble of Claim 1 but has mostly been taken out of the other claims; the gestures have been changed to specific gestures (which is supported in the specification), but under BRI this does not alter the scope or the rejection below; and the "point click" has been changed to scanning, which is a change of scope for that limitation (see the action below).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1, 3, 7-15, 17, 19, 21, 24, and 25 are rejected under 35 U.S.C. 103 as being unpatentable over Wang [US 2016/0313742, hereinafter Wang], in view of Chen [US 9,587,804, hereinafter Chen], and further in view of Bedikian et al. [US 2014/0201666, hereinafter Bedikian].

Claim 1

Wang discloses/suggests a method for controlling flight of an aircraft carrying an imaging device, the imaging device being mounted at a gimbal carried by the aircraft [see at least Wang, ¶ 0007 ("The detection of a visual signal can involve detecting a gesture or movement of a human body."); ¶ 0052 ("the positional change or positional state may be detected by onboard and/or off-board sensors such as discussed herein. For example, the positional change may be detected by an inertial sensor, GPS receiver, compass, magnetometer, altimeter, proximity sensor (e.g., infrared sensor, LIDAR sensor), visual or image sensor (such as a camera or video camera), photo sensor, motion detector, and the like."); ¶ 0127], the method comprising:

after obtaining the triggering operation, controlling the imaging device to scan and photograph in a predetermined photographing range [see at least Wang, ¶ 0009 ("In practicing any of the subject methods for launching a UAV, the detected positional change, visual signal, and/or release of the grip to the UAV may trigger the activating the UAV resulting in a lift and/or thrust.")];

in response to recognizing that a palm of the target user is facing the imaging device and moving upwardly or downwardly in a direction perpendicular to a ground, generating a height control command to control the aircraft to adjust a flight height of the aircraft [see at least Wang, ¶ 0001 ("For many years, both amateur and professional operators need to spend many hours of practice and training to master the control of unmanned aerial vehicles (UAVs) including multi-rotor aircraft. In particular, landing and takeoff remain the two most challenging aspects of operating a UAV. Such challenge is exacerbated when encountering uneven surfaces, strong wind, and other environmental factors affecting the operation of the UAV. Therefore, there exists a need for simplified or improved methods, as well as new designs of UAV that would render landing and takeoff easier even for amateur UAV users with little training or practice."); ¶ 0067 ("a gesture or movement of an object such as a body part. FIGS. 13-15 illustrate exemplary methods for launching the UAV, in accordance with this embodiment of the present invention. As illustrated in FIG. 13, a gesture 1310 may be detected by an onboard sensor 1305 and used to trigger the launching of the UAV such as discussed herein. The gesture may be made by any body part such as by a hand, arm, head, facial features, eye, and the like. For example, the gesture may include the wave of a hand or arm, the turn of the head, the movement of the eye, and the like.")]; and

in response to detecting two hands of the target user in the plurality of images and detecting that a horizontal distance between the two hands is gradually changing, generating a moving control command to control the aircraft to fly in a direction moving away from or moving closer to the target user [see at least Wang, ¶ 0001; ¶ 0067 (quoted above); note that if one-hand gestures work, it would be obvious that a two-hand gesture would also work if the gesture file incorporates both two-hand and one-hand gestures].

Wang also discloses using photographs as sensor data [see at least Wang, ¶ 0110 ("an infrared imaging device, or an ultraviolet imaging device. The sensor can provide static sensor data (e.g., a photograph) or dynamic sensor data (e.g., a video).")].

Wang does not specifically teach, but Chen does teach, in response to receiving a triggering operation that triggers the aircraft to operate in an image control mode, obtaining an environment image captured by the imaging device [see at least Chen, Col. 25, lines 10-21 ("The hand gesture image 1920 of FIG. 30B is a front view, taken by a front camera. The hand gesture image 1930 of FIG. 30C is a side view, taken from a side camera. The hand gesture 1910 of FIG. 30A and the hand gesture 1930 of FIG. 30C illustrate a triggering motion by the index finger. However, image shown in FIG. 30A has more motion than the image shown in FIG. 30C because the motion is a rotation in the image plane with respect to the image of FIG. 30A. A button pressing motion by the thumb can be seen in FIGS. 30B and 30C. However, FIG. 30C has more motion than FIG. 30B because the motion is a rotation in the image plane of FIG. 30A.")].

Chen more specifically teaches, after obtaining the triggering operation, controlling the imaging device to photograph in a predetermined photographing range [see at least Chen, Abstract; Col. 2, lines 18-32 ("In an embodiment, the beam steering mechanism includes a two-axis gimbal on which are mounted the light source module and the sensor for steering a line of sight (LOS) of the light source module and the sensor module, and wherein the two-axis gimbal can be of a pitch-yaw type, a pitch-roll type, or yaw-roll type. (14) In an embodiment, the beam steering mechanism comprises a mirror and a gimbal that adjusts gimbal angles of the mirror to steer a LOS of a light source generating the beam of light. (15) In an embodiment, the gimbal includes an outer ring and an inner ring, and the beam steering mechanism includes a motor on the outer ring and rotates the inner ring about an axis."); note that controlling the imaging device can be done using a gimbal, as the limitations have stated in the previous set of claims]; and obtaining the environment image including a characteristic part of the target user that is captured by the imaging device through photographing in the predetermined photographing range [see at least Chen, Col. 2, lines 57-61 ("In an embodiment, the tracking and control sensor comprises a thermal sensor and a visible camera for capturing an image of a hand making the hand gesture and recognizing the hand gesture, and wherein the thermal sensor comprises a lens and a thermal detector.")].

Chen supports Wang and also teaches a method for controlling flight of an aircraft carrying an imaging device, the imaging device being mounted at a gimbal carried by the aircraft [see at least Chen, Abstract; Col. 25, lines 10-21]. Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the "methods and apparatus disclosed herein [that] utilize positional change of the UAV, visual signal, or other means to effect the launch or landing" (Abstract) of Wang with the further gesture control of a vehicle control system for a UAV that also adjusts the sensors/cameras, as taught in Chen, providing a more effective, efficient technique to control a UAV using specific gestures of an operator.

Neither Wang nor Chen specifically discloses/teaches/suggests, but Bedikian does teach, during a flight of the aircraft, controlling the imaging device to capture a plurality of images [see at least Bedikian, Abstract ("Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user."); ¶ 0015 ("sensing a variation of position of at least one control object using an imaging system comprises capturing a plurality of temporally sequential images of one or more control objects manipulated by the user. Determining from the variation one or more primitives describing a motion made by the control object and/or the character of the control object may involve computationally analyzing the images of the control object(s) to recognize a gesture primitive including at least a portion of a trajectory (trajectory portion) describing motion made by the control object…")]; and wherein the triggering operation includes a scanning operation of a characteristic object or an interactive operation of a smart accessory [see at least Bedikian, ¶ 0044 ("capturing image data and tracking a control object based thereon in accordance with various embodiments"); note: capturing an image is the same concept as scanning data; ¶ 0086 ("using a wireless device such as a tablet or smart phone."); ¶ 0143 ("The position and motion sensing device can be embodied as a stand-alone entity or integrated into another device, e.g., a computer, workstation, laptop, notebook, smartphone, tablet, smart watch or other type of wearable intelligent device(s) and/or combinations thereof.")].

Bedikian also teaches, in response to recognizing that a palm of the target user is facing the imaging device and moving upwardly or downwardly in a direction perpendicular to a ground, generating a height control command to control the aircraft to adjust a flight height of the aircraft [see at least Bedikian, ¶ 0019 ("method further includes computationally determining a dominant gesture (e.g., by filtering the plurality of gestures); and presenting an action on a presentation device based on the dominant gesture. For instance, each of the gestures may be computationally represented as a trajectory, and each trajectory may be computationally represented as a vector along six Euler degrees of freedom in Euler space, the vector having a largest magnitude being determined to be the dominant gesture.")].

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the "methods and apparatus disclosed herein [that] utilize positional change of the UAV, visual signal, or other means to effect the launch or landing" (Abstract) of Wang with the further gesture control of a vehicle control system for a UAV that also adjusts the sensors/cameras, as taught in Chen, and further with the use of smart accessories and "images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user" (Abstract) of Bedikian, providing a more effective, efficient technique to control a UAV using specific gestures of an operator.

Claim 2. (Canceled)

Claim 3

Wang in combination with Chen and Bedikian disclose/teach/suggest the method of Claim 1.
Wang further discloses/suggests that the characteristic part of the target user includes a head of the target user [see at least Wang, ¶ 0067; ¶ 0084 ("the methods discussed in connection with FIGS. 13-15 may be similarly used to land the UAV. As illustrated in FIG. 13, a gesture 1310 may be detected by an onboard sensor 1305 and used to trigger the landing of the UAV such as discussed herein. The gesture may be made by any body part such as by a hand, arm, head, facial features, eye, and the like. As illustrated by FIG. 14, a recognizable visual sign, symbol or pattern 1410 may be detected by the onboard sensor 1405 and used to trigger the landing of the UAV. Such predetermined visual sign, symbol or pattern may be of predetermined color, shape, dimension, size and the like.")].

Claim 4. (Canceled)

Claim 5. (Canceled)

Claim 6. (Canceled)

Claim 7

Wang in combination with Chen and Bedikian disclose/teach/suggest the method of Claim 1. Wang further discloses/suggests that, based on the flight control gesture, generating the control command to control the aircraft to perform the action corresponding to the control command includes: in response to recognizing that the flight control gesture is a drag control gesture, generating a drag control command to control the aircraft to fly in a horizontal direction indicated by the drag control command [see at least Wang, ¶ 0026; ¶ 0028 ("For detecting the visual signal (including but not limited to human gesture) can be any visual sensor."); ¶ 0051 ("A positional change may include translational changes (e.g., in altitude, latitude and/or longitude) or rotational changes. A positional change may also include changes in the velocity, acceleration, and/or orientation of the UAV. A positional change may further include a change in location of the UAV with respect to a frame of reference or a reference object."); ¶ 0067; ¶ 0098 ("the controller 1810 can be used to adjust the state of the UAV via one or more actuators 1820. For example, the controller may be used to control the rotors of the UAV (e.g., rotational speed of the rotors) so as to adjust the spatial disposition of the UAV or a component thereof (e.g., a payload, a carrier of the payload) with respect to up to six degrees of freedom…, the controller can be configured to adjust the velocity or acceleration of the UAV with respect to six degrees of freedom. In some embodiments, the controller can control the UAV based on predetermined control data or positional, external contact or external signal information for the UAV obtained by processing data from one or more sensing systems.")].

Examiner's note: If a gesture can control a UAV in some ways, then all commands can be controlled using gestures. Thus, taking the BRI of the instant claims into account, all controls can be operated by a gesture or with a smart accessory.

Claim 8

Wang in combination with Chen and Bedikian disclose/teach/suggest the method of Claim 1. Wang further discloses/suggests wherein, based on the flight control gesture, generating the control command to control the aircraft to perform the action corresponding to the control command includes: in response to recognizing that the flight control gesture is a rotation control gesture, generating a rotation control command to control the aircraft to rotatably fly in a direction indicated by the rotation control command [see at least Wang, ¶ 0051].

Claim 9

Wang in combination with Chen and Bedikian disclose/teach/suggest the method of Claim 1. Wang further teaches wherein, based on the flight control gesture, generating the control command to control the aircraft to perform the action corresponding to the control command includes: in response to recognizing that the flight control gesture is a landing control gesture, generating a landing control command to control the aircraft to land [see at least Wang, Abstract; ¶ 0019 ("In another aspect, the present invention provides methods of decelerating a UAV. In some aspects, the present invention provides alternative methods of landing a UAV.")].

Claim 10

Wang in combination with Chen and Bedikian disclose/teach/suggest the method of Claim 1. Wang further discloses/suggests wherein, based on the flight control gesture, generating the control command to control the aircraft to perform the action corresponding to the control command includes: in response to not recognizing the flight control gesture, but recognizing the characteristic part of the target user in the flight environment image, controlling, based on the characteristic part of the target user, the aircraft to use the target user as a tracking target and to follow a movement of the target user [see at least Wang, ¶ 0100 ("The controller can be operatively coupled to a communication module 1840 configured to transmit and/or receive data from one or more external devices (e.g., a terminal, display device, or other remote controller). Any suitable means of communication can be used, such as wired communication or wireless communication.")]. Chen also teaches this limitation [see at least Chen, Col. 9, lines 15-40 ("In some embodiments, the user can activate and interrupt the illumination region 102 without restriction. In some embodiments, the control system 100 includes a controller (described below) that can adjust the brightness, intensity, distribution, size, color, and/or other characteristic of the emitted light in response to a hand gesture or other signal-generating motion. The control system 100 can perform control operations without aid of a physical device, such as a hand-held device.")]. Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the "methods and apparatus disclosed herein [that] utilize positional change of the UAV, visual signal, or other means to effect the launch or landing" (Abstract) of Wang with the further gesture control of a vehicle control system for a UAV that also adjusts the sensors/cameras, as taught in Chen, providing a more effective, efficient technique to control a UAV using specific gestures of an operator.

Claim 11

Wang in combination with Chen and Bedikian disclose/teach/suggest the method of Claim 10. Wang further discloses/suggests wherein controlling the aircraft to follow the movement of the target user includes: adjusting a photographing state, to cause the target user to be included in an image captured by the imaging device after the photographing state is adjusted [see at least Wang, ¶ 0098 ("The controller 1810 can be used to adjust the state of the UAV via one or more actuators 1820. For example, the controller may be used to control the rotors of the UAV (e.g., rotational speed of the rotors) so as to adjust the spatial disposition of the UAV or a component thereof (e.g., a payload, a carrier of the payload) with respect to up to six degrees of freedom (three translational movement (along the X, Y and Z axes) and three rotational movement (along the roll, pitch and yaw axes)). Alternatively or in combination, the controller can be configured to adjust the velocity or acceleration of the UAV with respect to six degrees of freedom. In some embodiments, the controller can control the UAV based on predetermined control data or positional, external contact or external signal information for the UAV obtained by processing data from one or more sensing systems, as described herein. For example, the controller may provide acceleration or deceleration signals to the actuators based on the determination of whether launching or landing is required.")]; wherein adjusting the photographing state includes adjusting at least one of a location of the aircraft, an attitude of the gimbal carried by the aircraft, or an attitude of the aircraft [see at least Wang, ¶ 0026]. Chen also teaches these limitations [see at least Chen, Col. 2, lines 18-23; Col. 9, lines 15-40; Col. 9, line 60 – Col. 10, line 9]. Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the "methods and apparatus disclosed herein [that] utilize positional change of the UAV, visual signal, or other means to effect the launch or landing" (Abstract) of Wang with the further gesture control of a vehicle control system for a UAV that also adjusts the sensors/cameras, as taught in Chen, providing a more effective, efficient technique to control a UAV using specific gestures of an operator.

Claim 12

Wang in combination with Chen and Bedikian disclose/teach/suggest the method of Claim 1. Wang further discloses/suggests that, based on the flight control gesture, generating the control command to control the aircraft to perform the action corresponding to the control command includes: in response to recognizing that the flight control gesture is a photographing gesture, generating a photographing control command to control the imaging device of the aircraft to capture a target image [see at least Wang, ¶ 0095 ("some sensors (such as visual sensors) may be optionally coupled to a field programmable gate array (FPGA, not shown). The FPGA can be operatively coupled to the controller (e.g., via a general purpose memory controller (GPMC) connection). In some embodiments, some sensors (such as visual sensors) and/or the FPGA can be optionally coupled to a transmission module. The transmission module can be used to transmit data captured by the sensors (e.g., image data) to any suitable external device or system, such as a terminal or remote device as described herein."); ¶ 0110 ("The payload can be coupled to the UAV via the carrier, either directly (e.g., directly contacting the UAV) or indirectly (e.g., not contacting the UAV). Optionally, the payload can be mounted on the UAV without requiring a carrier. The payload can be integrally formed with the carrier. Alternatively, the payload can be releasably coupled to the carrier. In some embodiments, the payload can include one or more payload elements, and one or more of the payload elements can be movable relative to the UAV and/or the carrier, as described above. The payload can include one or more sensors for surveying one or more targets. Any suitable sensor can be incorporated into the payload, such as an image capture device (e.g., a camera), an audio capture device (e.g., a parabolic microphone), an infrared imaging device, or an ultraviolet imaging device. The sensor can provide static sensor data (e.g., a photograph) or dynamic sensor data (e.g., a video). In some embodiments, the sensor provides sensor data for the target of the payload. Alternatively or in combination, the payload can include one or more emitters for providing signals to one or more targets. Any suitable emitter can be used, such as an illumination source or a sound source. In some embodiments, the payload includes one or more transceivers, such as for communication with a module remote from the UAV.")].

Claim 13

Wang in combination with Chen and Bedikian disclose/teach/suggest the method of Claim 1. Wang further discloses/suggests, in response to recognizing a video-recording gesture, generating a video-recording control command to control the imaging device of the aircraft to capture a video; and, while the imaging device of the aircraft captures the video, generating an ending control command to control the imaging device of the aircraft to stop capturing the video in response to recognizing the video-recording gesture of the target user again [see at least Wang, ¶ 0052; ¶ 0096; ¶ 0097 ("The memory units of the non-transitory computer readable medium 1830 store sensor data from the one or more sensing systems to be processed by the controller. In some embodiments, the memory units of the non-transitory computer readable medium can store the positional and/or motion information of the UAV, the detected external contact information, and/or the detected external signal information. Alternatively or in combination, the memory units of the non-transitory computer readable medium can store predetermined or pre-stored data for controlling the UAV (e.g., predetermined threshold values for sensor data, parameters for controlling the actuators, predetermined flight path, velocity, acceleration or orientation of the UAV).")].

Claim 14

Wang in combination with Chen and Bedikian disclose/teach/suggest the method of Claim 1. Wang further discloses/suggests, in response to not recognizing a flight control gesture of the target user, but recognizing a replacement control gesture of a new target user, generating, based on the replacement control gesture, the control command to control the aircraft to perform the action corresponding to the control command [see at least Wang, ¶ 0012 ("the present invention provides an unmanned rotorcraft, comprising: a visual sensor configured to detect a visual signal generated by an operator of said rotorcraft; a controller configured to provide an actuating signal for activating the UAV in response to the detected visual signal; and an actuator configured to cause the UAV rotor blades to move and generate a lift and/or thrust in response to the actuating signal."); note: this is a replacement control gesture]. Chen also teaches this limitation [see at least Chen, Col. 13, line 65 – Col. 14, line 6 ("The controller 308, when in an activation state, can receive command information regarding consecutive images of the hand gesture, which can change a state of the lamp. For example, the controller 308 can receive commands indicating that three consecutive images taken are the same, e.g., no change in hand gestures. The controller 308 can determine from this data that the user wishes to change the state of the system 100 based on the same or similar hand gestures in the images.")]. Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the "methods and apparatus disclosed herein [that] utilize positional change of the UAV, visual signal, or other means to effect the launch or landing" (Abstract) of Wang with the further gesture control of a vehicle control system for a UAV that also adjusts the sensors/cameras, as taught in Chen, providing a more effective, efficient technique to control a UAV using specific gestures of an operator.

Claim 15

Claim 15 is a device comprising a memory [see at least Wang, ¶ 0096] and a processor configured to generate a travel path, as seen in Claim 1, and thus is analogous to Claim 1. For the reasons given above with respect to Claim 1, Claim 15 is rejected.

Claim 16. (Canceled)

Claim 17

Claim 17 has similar limitations to Claim 8; therefore, Claim 17 is rejected with the same rationale as Claim 8.

Claim 18. (Canceled)

Claim 19

Claim 19 has similar limitations to Claim 10; therefore, Claim 19 is rejected with the same rationale as Claim 10.

Claim 20. (Canceled)

Claim 21

Wang in combination with Chen and Bedikian disclose/teach/suggest the method of Claim 1. Wang further teaches that the imaging device carried by the aircraft is not started to scan and photograph for gesture detection before the triggering operation is received [see at least Wang, ¶ 0052 ("In some embodiments, the positional change or positional state may be detected by onboard and/or off-board sensors such as discussed herein. For example, the positional change may be detected by an inertial sensor, GPS receiver, compass, magnetometer, altimeter, proximity sensor (e.g., infrared sensor, LIDAR sensor), visual or image sensor (such as a camera or video camera), photo sensor, motion detector, and the like. For example, an onboard inertial sensor (including one or more gyroscopes and/or accelerometers) may be used to detect a change in acceleration and/or orientation experienced by the UAV."); thus showing onboard sensors are included and activated as needed].

Claim 22. (Canceled)

Claim 23. (Canceled)

Claim 24

Wang in combination with Chen and Bedikian disclose/teach/suggest the method of Claim 1. Wang further teaches, in response to recognizing a hand of the target user in the plurality of images and detecting that the hand of the target user moves downwardly continuously while facing the ground, generating a first landing control command to control the aircraft to land to a location with a preset distance above the ground; and, in response to recognizing the hand of the target user in the plurality of images and detecting that the hand of the target user moves downwardly while facing the ground and stays at the location with the preset distance above the ground for more than a predetermined time period, generating a second landing control command to control the aircraft to land to the ground [see at least Wang, Abstract; ¶ 0001 ("For many years, both amateur and professional operators need to spend many hours of practice and training to master the control of unmanned aerial vehicles (UAVs) including multi-rotor aircraft. In particular, landing and takeoff remain the two most challenging aspects of operating a UAV."); ¶¶ 0083-0084 ("external signals may be used to trigger the landing of the UAV in addition to or instead of the external contact and/or positional changes. Such external signals may include visual signals, audio signals, gesture signals or a combination thereof. In one embodiment, a subject landing method involves the steps of: (a) detecting a visual signal generated by an operator of said UAV; and (b) in response to the detected positional change and/or the visual signal, generating a deceleration signal to said UAV to bring said UAV to a stop… [0084] For example, the methods discussed in connection with FIGS. 13-15 may be similarly used to land the UAV. As illustrated in FIG. 13, a gesture 1310 may be detected by an onboard sensor 1305 and used to trigger the landing of the UAV such as discussed herein.")].

Claim 25

Claim 25 has similar limitations to Claim 1; therefore, Claim 25 is rejected with the same rationale as Claim 1. The limitations are not exact, but under the BRI the limitations of Claim 1 and the limitations of Claim 25 have a similar concept and scope.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOAN T GOODBODY, whose telephone number is (571) 270-7952. The examiner can normally be reached M-Th, 7-3 (US Eastern time).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.html.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, RACHID BENDIDI, can be reached at (571) 272-4896. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at (866) 217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call (800) 786-9199 (IN USA OR CANADA) or (571) 272-1000.

/JOAN T GOODBODY/
Primary Examiner, Art Unit 3664
(571) 270-7952
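For readers less fluent in claim language: the limitations argued above amount to a gesture-to-flight-command mapping. Below is a minimal sketch of that mapping as claimed, purely illustrative: the names, fields, and thresholds are hypothetical, and this is a reading of the language of claims 1 and 24, not the applicant's or any cited reference's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    """Hypothetical output of an image-based gesture recognizer."""
    two_hands: bool = False
    hands_distance_trend: str = ""   # "widening" | "narrowing" | ""
    palm_facing_camera: bool = False
    palm_vertical_motion: str = ""   # "up" | "down" | ""
    hand_facing_ground_moving_down: bool = False
    hover_time_at_preset_height_s: float = 0.0

def command_for(g: Gesture, preset_hover_s: float = 3.0) -> str:
    # Claim 1: two hands detected, horizontal distance gradually changing
    # -> fly away from / toward the target user.
    if g.two_hands and g.hands_distance_trend == "widening":
        return "MOVE_AWAY_FROM_USER"
    if g.two_hands and g.hands_distance_trend == "narrowing":
        return "MOVE_TOWARD_USER"
    # Claim 1: palm facing the imaging device, moving perpendicular
    # to the ground -> adjust flight height.
    if g.palm_facing_camera and g.palm_vertical_motion in ("up", "down"):
        return f"ADJUST_HEIGHT_{g.palm_vertical_motion.upper()}"
    # Claim 24: hand facing the ground moving down -> descend to a preset
    # height; staying there past a threshold -> land fully.
    if g.hand_facing_ground_moving_down:
        if g.hover_time_at_preset_height_s > preset_hover_s:
            return "LAND_TO_GROUND"
        return "DESCEND_TO_PRESET_HEIGHT"
    return "NO_OP"
```

Framed this way, the dispute is whether the cited references' generic single-gesture triggers render obvious the specific two-hand-distance and timed-descent branches above.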

Prosecution Timeline

May 12, 2023: Application Filed
Apr 28, 2025: Non-Final Rejection — §103
Jul 31, 2025: Response Filed
Nov 21, 2025: Final Rejection — §103
Jan 22, 2026: Response after Non-Final Action
Feb 09, 2026: Request for Continued Examination
Mar 04, 2026: Response after Non-Final Action
Mar 19, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this examiner with similar technology

Patent 12595032: SYSTEMS AND METHODS FOR MONITORING BATTERY RANGE FOR AN ELECTRIC MARINE PROPULSION SYSTEM
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12586461: CLOUD-BASED MODEL DEPLOYMENT AND CONTROL SYSTEM (CMDCS) FOR PROVIDING AUTOMATED DRIVING SERVICES
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12560444: JOINT ROUTING OF TRANSPORTATION SERVICES FOR AUTONOMOUS VEHICLES
Granted Feb 24, 2026 (2y 5m to grant)

Patent 12532794: SYSTEM AND METHOD FOR CONTROLLING AN AGRICULTURAL SYSTEM BASED ON SLIP
Granted Jan 27, 2026 (2y 5m to grant)

Patent 12525134: METHODS OF A MOBILE EDGE COMPUTING (MEC) DEPLOYMENT FOR UNMANNED AERIAL SYSTEM TRAFFIC MANAGEMENT (UTM) SYSTEM APPLICATIONS
Granted Jan 13, 2026 (2y 5m to grant)
Study what changed in these cases to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 49%
With Interview: 89% (+39.7%)
Median Time to Grant: 3y 5m
PTA Risk: High
Based on 199 resolved cases by this examiner. Grant probability derived from career allow rate.
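The projection arithmetic appears to be a simple composition of the panel's own numbers. A minimal sketch under that assumption (the capping at 100% and the additive lift are guesses about the dashboard's method, not documented behavior):

```python
# Assumed derivation of the projection figures from the examiner stats:
# base grant probability = career allow rate; an interview adds the lift.
base = 49.0            # % career allow rate (from the examiner panel)
interview_lift = 39.7  # % lift observed for cases with an interview

with_interview = min(base + interview_lift, 100.0)   # 49.0 + 39.7 = 88.7
print(f"Grant probability with interview: {with_interview:.0f}%")  # ~89%
```

This reproduces the 89% shown above, suggesting the dashboard treats the interview lift as a straight additive adjustment to the career allow rate.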
