Prosecution Insights
Last updated: April 19, 2026
Application No. 18/181,780

SYSTEMS AND METHODS FOR MANAGING UNMANNED VEHICLE INTERACTIONS WITH VARIOUS PAYLOADS

Non-Final OA: §102 §103 §112
Filed
Mar 10, 2023
Examiner
INSERRA, MADISON RENEE
Art Unit
3662
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Xtend Reality Expansion Ltd.
OA Round
3 (Non-Final)
68%
Grant Probability
Favorable
3-4
OA Rounds
3y 3m
To Grant
99%
With Interview

Examiner Intelligence

Grants 68% — above average
68%
Career Allow Rate
121 granted / 179 resolved
+15.6% vs TC avg
Strong +38% interview lift
+38.3%
Interview Lift
allowance rate in resolved cases with vs. without an interview
Typical timeline
3y 3m
Avg Prosecution
35 currently pending
Career history
214
Total Applications
across all art units
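The headline figures above are simple ratios of the career counts. A minimal sketch of that arithmetic (the 121/179 counts and the "+15.6% vs TC avg" delta come from the cards above; the variable names and the reconstruction itself are illustrative, not the dashboard's actual code):

```python
# Illustrative arithmetic behind the examiner cards above.
# Counts (121 granted of 179 resolved) are taken from the dashboard itself;
# the rest is a hypothetical reconstruction, not the tool's implementation.
granted = 121
resolved = 179

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")   # ~67.6%, shown rounded as 68%

# "+15.6% vs TC avg" reads as an absolute (percentage-point) delta, so the
# implied Tech Center average allow rate is:
tc_avg = allow_rate - 0.156
print(f"Implied TC average: {tc_avg:.1%}")      # ~52.0%
```

Note the displayed "68%" is the rounded form of 121/179 ≈ 67.6%, so the card values are internally consistent.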

Statute-Specific Performance

§101
17.7%
-22.3% vs TC avg
§103
45.9%
+5.9% vs TC avg
§102
17.8%
-22.2% vs TC avg
§112
15.9%
-24.1% vs TC avg
Black line = Tech Center average estimate • Based on career data from 179 resolved cases
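Because the "vs TC avg" deltas are absolute percentage-point differences, each row implies the Tech Center baseline it was measured against. A hedged sketch of that back-calculation (rates copied from the chart above; the code is illustrative, not the dashboard's own):

```python
# Per-statute rates and percentage-point deltas, copied from the chart above.
rates = {
    "101": (0.177, -0.223),
    "103": (0.459, +0.059),
    "102": (0.178, -0.222),
    "112": (0.159, -0.241),
}

for statute, (rate, delta) in rates.items():
    # rate = tc_avg + delta, so the implied baseline is rate - delta
    implied_tc_avg = rate - delta
    print(f"§{statute}: examiner {rate:.1%}, implied TC avg {implied_tc_avg:.1%}")
```

Every row implies the same ~40.0% baseline, consistent with the legend's single "Tech Center average estimate" line.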

Office Action

§102 §103 §112
DETAILED ACTION

Status of Claims

This Office action is in response to the request for continued examination filed on 01/13/2026. Claims 1-20 are currently pending and are presented for examination.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/13/2026 has been entered.

Response to Arguments

Applicant's arguments filed 01/13/2026 have been fully considered.

Regarding claim objections: The examiner notes that the objections to claims 13 and 18-20 are overcome by the filed amendment and have been withdrawn accordingly.

Regarding claim rejection under 35 U.S.C. § 112(b): Applicant has argued that the rejection of claim 16 under 35 U.S.C. § 112(b) is overcome by the filed amendment. The examiner agrees and has withdrawn this rejection accordingly.

Regarding claim rejections under 35 U.S.C. § 101: Applicant has argued that the claim rejections under 35 U.S.C. § 101 are overcome by the filed amendment. The examiner agrees and has withdrawn these rejections accordingly.

Regarding claim rejections under 35 U.S.C. § 103: On p. 13 of the remarks, applicant has argued that “Hall is directed to optimizing flight of drone swarms, not individually ladened UAV,” and that “isolating one UAV out of the collective UAV configuration for the purpose of somehow inferring the flight of that UAV is optimized, amounts to impermissible ‘stitch[ing] together an obviousness finding from discrete portions of prior art references without considering the references as a whole.’ In re Enhanced Security Research, LLC, 739 F.3d 1347, 1355 (Fed. Cir. 
2014).” The examiner respectfully disagrees, as control of a UAV swarm naturally requires control of the individual UAVs within the swarm. Further, Hall continually mentions that its disclosed operation can be applied to “one or more UAVs” (e.g., see Hall ¶ 43), and Hall ¶ 90 teaches that some missions may require the use of a single UAV rather than a swarm of multiple UAVs.

On p. 13 of the remarks, applicant has additionally argued that “the issue of payload identification is well recognized in Hall,” asserting that the disclosure of Hall ¶ 44 “would certainly be understood by the skilled artisan to identify the payload.” Upon reconsideration, the examiner agrees that Hall ¶ 44 teaches the reception of payload identification data under the broadest reasonable interpretation of the claim, and has updated the grounds of rejection to rely on Hall for teaching this feature.

On pp. 13-14 of the remarks, applicant has referenced the Response to Arguments provided in the advisory action mailed 11/26/2025, arguing that “The Examiners seems to provide a wildly broad interpretation to the claims that is neither claimed nor was contemplated, since all commands are executed when the UAV is already ladened and mated with the proper payload. Assumptions of what ‘Conceivably’ happen in a separate step of the method are irrelevant. (See e.g., ‘Patentability may not be denied on the basis of assumptions or speculation.’—In re Warner, 379 F.2d 1011 (CCPA 1967).” The examiner notes that the statement in the advisory action was merely meant to clarify that the UAV being ladened with the payload would not preclude the prior art from being combined. No assumption has been made regarding whether the UAV recited in the instant claims actually does or does not disconnect from the payload upon identifying an incorrect mating. On p. 
14 of the remarks, applicant has additionally argued that “in the Advisory Action, the Examiner did not respond to the argument that there is no motivation for any skilled artisan to combine features recognized in Hall by the redundant and very narrow features taught by Jones.” The examiner notes that this argument was fully addressed in the advisory action mailed 10/10/2025, which was then referenced in the last paragraph of the advisory action mailed 11/26/2025. Regardless, this argument is now moot in view of the new grounds of rejection that do not rely on Jones to teach the limitations related to payload identification data.

On p. 14 of the remarks, applicant has also argued that the previously cited prior art fails to teach the amended step of, “based on the payload identification data received from the UAV, the determined UAV context, and the determined burdened flight profile, executing a command sequence” as recited in claim 1, or the amended step of “executing a command sequence based on the payload identification data received from the UAV, the determined UAV context, and the determined burdened flight profile” as recited in claim 12. The examiner respectfully disagrees, because Hall ¶ 112 discloses that “Based on the determined UAVs, determined collective UAV configuration, and determined resource distribution, instructions are sent to each UAV that is be included in the collective UAV to configure into the collective UAV and distribute resources according to the determined resource distribution, as in 1308. The instructions may be sent to the UAVs as the ordered items are packed and prepared for departure, as part of their navigation instructions, etc.” Instructing the UAVs to follow the determined flight plan reads on executing a command sequence as claimed. 
Also, Hall ¶ 76 discloses that “The collective UAV configuration may take any form and may vary depending on, for example, … the number and/or weight of items carried by UAVs of the collective UAV,” which demonstrates that the command sequence is executed based on the payload identification data as claimed. Additionally, Hall ¶ 117 discloses that the IMU sensor can be used during the navigation process, which means that the command sequence is executed based on the determined UAV context as claimed.

On p. 17 of the remarks, applicant has argued that Downey fails to teach the queueing being initiated by the GCS as claimed. The examiner respectfully disagrees, because Downey ¶ 79 states that “the system can store data messages provided by payload modules in a data structure, e.g., a queue, and provide the data messages to their intended targets upon the occurrence of a different flight phase” and Downey ¶ 83 discloses that “radio system 604 transmits, and receives, data and messages from the camera system 606 to, and from, communication systems outside of the UAV, e.g., ground based control systems.” Therefore, Downey teaches the use of a ground control system with the functionality to initiate the queueing system as claimed. Additionally, Downey ¶ 58 states that “payload processing engine 320 can communicate with a configuration utility, e.g., operated by a user, which can provide configuration and programming information to the payload modules 122,” and Downey ¶ 96 discloses that “the configuration utility can ensure that a ground datalink includes complementary settings to the UAV radio datalink. That is, the configuration utility can connect to a ground based system, and present a user interface with selectable options already selected based on the UAV selections 802.” Therefore, Downey also teaches that the ground control system can be used to receive a configuration package associated with the payload as claimed. On pp. 
18-19 of the remarks, applicant has further argued that “The probability of success in implementing the methods of correcting lifting and lowering payload disclosed in Shannon, with those in Hall does not seem to be based on sound rationale. After all, Hall specifically states ‘Rather than having to maintain multiple UAV configurations or utilize a UAV configuration that is not necessary for the majority of the item deliveries, the implementations described herein utilize multiple UAVs to form a collective UAV that is capable of transporting larger and/or heavier items or aerially navigating longer distances’.” The examiner respectfully disagrees, as it is not clear why this disclosure of Hall would preclude the incorporation of the UAV-payload attachment verification as taught by Shannon. Hall continually mentions that its disclosed operation can be applied to “one or more UAVs” (e.g., see Hall ¶ 43), and Hall ¶ 90 teaches that some missions may require the use of a single UAV rather than a swarm of multiple UAVs. Further, checking whether a payload is coupled to the UAV(s) could be applied to the method of Hall regardless of whether a specific mission requires one UAV or multiple UAVs.

On p. 20 of the remarks, applicant has argued that “Lopez is entirely silent with regard to the ‘UAV context corresponds to a ground truth reading’” as required by claim 20 since Lopez “will not have use for ground truth reading, since the neural network is based on ‘estimating a current camera pose corresponding to a current point in time using a previous camera pose corresponding to a previous point in time’.” The examiner respectfully disagrees since Lopez ¶¶ 62-65 discuss the use of training data such as training image frames for training the neural network. It is implied that such a training process would use ground truth data to properly align the neural network.

Applicant’s remaining arguments regarding the claim rejections under 35 U.S.C. 
§ 103 are moot in view of the new grounds of rejection provided in this Office action.

Claim Objections

Claims 1 and 12 are objected to because of the following informalities: Line 4 of claim 1 should end with a semicolon. In line 4 of claim 12, it appears that “the UAVs” should be changed to “the UAV[[s]].” Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-11 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Regarding claim 1: Claim 1 includes the limitations that “a microprocessor-based controller associated with a UAV” is configured to perform a process including “determining a UAV context based at least in part on data received from the UAV” and “determining a burdened flight profile based at least in part on the payload identification data received from the UAV.” Based on the claim language and in light of ¶ 166 of the instant specification and FIG. 1A, it appears that the microprocessor-based controller is a component of the UAV. Therefore, it is unclear how the UAV can receive the UAV context and payload identification data from itself. 
For examination purposes, the claim is interpreted as if the microprocessor-based controller is configured for receiving the UAV context and payload identification data from other component(s) of the UAV (e.g., sensor(s) that gather this data). Regardless of whether this interpretation is correct, clarification is required.

Regarding claims 2-11: Claims 2-11 are rejected under 35 U.S.C. 112(b) due to their dependency upon rejected claim 1.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-3, 7, and 10-11 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Hall et al. (US 2017/0144757 A1), hereinafter referred to as Hall.

Regarding claim 1: Hall discloses the following limitations: “A system for optimizing flight of an unmanned aerial vehicle (UAV) including a payload, the system comprising: the payload ladened UAV.” (Hall ¶ 44: “the user may input one or more instructions for the operation of the one or more UAVs. In some embodiments, the instructions include delivery instructions for a payload.”) “an inertial measurement unit (IMU) operably coupled to the UAV.” (Hall ¶ 113: “The UAV control system 110 may also include … inertial measurement unit (IMU) 1412.”) “a ground control system (GCS).” (Hall ¶ 71 and FIG. 
20 disclose “a collective UAV configuration system 1528 (FIG. [20]) operating on a remote computing resource and provided wirelessly to one or more of the UAVs 200A, 200B.”) “and a microprocessor-based controller associated with a UAV, the microprocessor-based controller including a non-transitory computer-readable storage medium having executable instructions stored thereon that when executed by the controller cause the controller to perform a method.” (Hall ¶ 113: “UAV control system 110 includes one or more processors 1402, coupled to a memory, e.g., a non-transitory computer readable storage medium 1420.”) “including: i. determining a UAV context based at least in part on IMU data received from the UAV.” (Hall ¶ 117: “ESCs 1404 communicate with the navigation system 1407 and/or the IMU 1412 and adjust the rotational speed of each lifting motor to stabilize the UAV and guide the UAV along a determined flight plan. The navigation system 1407 may include a GPS, indoor positioning system (IPS), IMU or other similar system and/or sensors that can be used to navigate the UAV 100 to and/or from a location.”) “ii. receiving payload identification data.” (Hall ¶ 101: “Upon coupling, the UAV configuration information of the coupled UAV is received from the coupled UAV, as in 1104. The UAV configuration information may include … weight of the UAV and/or payload.” Also, the examiner notes that on p. 13 of the remarks filed 01/13/2026, applicant has argued that “the issue of payload identification is well recognized in Hall, for example in Para. [0044], referring to ‘instructions […] include one or more of a payload weight, a payload shape, or one or more payload length dimensions’ which would certainly be understood by the skilled artisan to identify the payload.”) “iii. 
determining a burdened flight profile based at least in part on the payload identification data received from the UAV.” (Hall ¶ 76: “The collective UAV configuration may take any form and may vary depending on, for example, the number of UAVs forming the collective UAV, the weather, the number and/or weight of items carried by UAVs of the collective UAV, power requirements, whether one or more of the UAVs of the collective UAV is damaged or inoperable, etc.” Further, Hall ¶ 110: “The positioning of the UAVs in the collective UAV configuration may be determined based on the power capabilities of the UAVs, the motors, propellers and/or lifting capabilities of the UAVs, the size of the UAVs, the payload weight of the UAVs, the location of the delivery destinations of the UAVs, etc.” Determining UAV configuration and positioning based on the payload identification data is equivalent to determining a burdened flight profile based on the payload identification data as claimed.) “iv. determining one or more burdened flight parameters, wherein the one or more burdened flight parameters are based at least in part on the UAV context and the burdened flight profile.” (Hall ¶ 101: “The collective UAV configuration information may identify, for example, the navigation information of the collective UAV, operating parameters, the configuration of the collective UAV, the sensor locations of sensors that are being used by the collective UAV, etc.” Further, Hall ¶ 117: “The ESCs 1404 communicate with the navigation system 1407 and/or the IMU 1412 and adjust the rotational speed of each lifting motor to stabilize the UAV and guide the UAV along a determined flight plan.”) “and v. 
based on the payload identification data received from the UAV, the determined UAV context, and the determined burdened flight profile, executing a command sequence.” (Hall ¶ 76: “The collective UAV configuration may take any form and may vary depending on, for example, … the number and/or weight of items carried by UAVs of the collective UAV, … etc.” Further, Hall ¶ 112: “Based on the determined UAVs, determined collective UAV configuration, and determined resource distribution, instructions are sent to each UAV that is be included in the collective UAV to configure into the collective UAV and distribute resources according to the determined resource distribution, as in 1308. The instructions may be sent to the UAVs as the ordered items are packed and prepared for departure, as part of their navigation instructions, etc.” Additionally, Hall ¶ 117: “The navigation system 1407 may include a GPS, indoor positioning system (IPS), IMU or other similar system and/or sensors that can be used to navigate the UAV 100 to and/or from a location.” Instructing the UAV(s) to navigate according to the determined UAV configuration is equivalent to executing a command sequence as claimed.)

Regarding claim 2: Hall discloses “The system of claim 1,” and Hall additionally discloses the system “further comprising at least one automated command sequence, wherein the at least one automated command sequence further comprises one or more of an object recognition sequence, an obstacle collision avoidance sequence, a pedestrian collision avoidance sequence, and an environmental collision avoidance sequence.” (Hall ¶ 121: “Input/output devices 1417 may, in some implementations, include one or more displays, imaging devices, thermal sensors, infrared sensors, time of flight sensors, accelerometers, pressure sensors, weather sensors, cameras, gimbals, landing gear, etc. Multiple input/output devices 1417 may be present and controlled by the UAV control system 110. 
One or more of these sensors may be utilized to assist in landing as well as to avoid obstacles during flight.” This at least teaches the “obstacle collision avoidance sequence” as claimed.) Note that under the broadest reasonable interpretation (BRI) of claim 2, consistent with the specification, the automated command sequence(s) comprising “one or more of an object recognition sequence, an obstacle collision avoidance sequence, a pedestrian collision avoidance sequence, and an environmental collision avoidance sequence” is being treated as an alternative limitation. Applicant has elected to use the phrase “one or more” in the claim language, and therefore, the BRI covers the scenario in which only one of the limitations applies. Accordingly, while only the “obstacle collision avoidance sequence” has been addressed here, the claim is still rejected in its entirety.

Regarding claim 3: Hall discloses “The system of claim 1,” and Hall also discloses the following limitations: “further comprising: a. a plurality of UAVs.” (Hall ¶ 47: “the system may include two or more UAVs, and in some such embodiments the two or more UAVs may be joined to form a collective UAV.”) “and b. a transceiver operably coupled to the GCS, in communication with each of the plurality of UAVs; and c. a microprocessor-based GCS controller associated with the GCS, the microprocessor-based GCS controller including a non-transitory computer-readable storage medium having executable instructions stored thereon that when executed by the GCS controller cause the GCS controller to perform a method.” (Hall ¶¶ 71 and 128 disclose that “The collective UAV configuration may be determined by one or more of the UAVs 200A, 200B, and/or may be determined by a collective UAV configuration system 1528 (FIG. 
70) operating on a remote computing resource and provided wirelessly to one or more of the UAVs 200A, 200B,” wherein “The memory 1512 additionally stores program code and data for providing network services to UAVs, materials handling facilities, the inventory management system 1526, and/or the collective UAV configuration system 1528. The program instructions enable communication with a data store manager application 1521 to facilitate data exchange between the data store 1509, the inventory management system 1526 and/or the collective UAV configuration system 1528.” Also, Hall ¶ 119: “network interface 1416 may be configured to allow data to be exchanged between the UAV control system 110, other devices attached to a network, such as other computer systems (e.g., remote computing resources), and/or with UAV control systems of other UAVs.”) “including: i. associating a set of UAVs as group members within a group membership.” (Hall ¶ 75: “collective UAV configuration system 1528 may wirelessly send instructions to the collective UAV 202 and/or the UAV 200C instructing the coupling of the UAV 200C to the collective UAV 202.”) “ii. designating at least one UAV from the set of UAVs as a lead UAV within the group membership; iii. designating at least one UAV from the set of UAVs as a follower UAV within the group membership.” (Hall ¶ 77 and FIG. 8 disclose a collective UAV group with leading UAV 300A and follower UAVs 300B-300G.) “iv. receiving, by the GCS controller, a lead UAV flight command; v. determining, by the GCS controller, at least one follower flight path instruction for the at least one follower UAV based at least in part on the lead UAV flight command.” (Hall ¶ 82: “a collective UAV may operate in a distributed manner, with each UAV maintaining and operating the motors and/or other components of the UAV. 
Alternatively, the collective UAV may operate in a master-slave configuration in which one of the UAVs of the collective UAV operates as a master, providing navigation instructions, motor speed control instructions, etc., to the other UAVs of the collective UAV. Any control scheme may be utilized to maintain the operation and control of the collective UAV and the distributed configuration and master-slave configuration are provided only as examples. For example, the collective UAV configuration system 1528 may provide navigation instructions to each of the UAVs of the collective UAV.”) “vi. transmitting, by the transceiver, the at least one follower flight path instruction to at least one follower UAV within the group membership.” (Hall ¶ 82: “Any control scheme may be utilized to maintain the operation and control of the collective UAV and the distributed configuration and master-slave configuration are provided only as examples. For example, the collective UAV configuration system 1528 may provide navigation instructions to each of the UAVs of the collective UAV.”)

Regarding claim 7: Hall discloses “The system of claim 1,” and Hall also discloses “wherein the burdened flight profile is determined based at least in part on one or more of dynamic payload management, payload identification, and semi-autonomous interception of a target using a queuing methodology.” (Hall ¶ 76: “The collective UAV configuration may take any form and may vary depending on, for example, the number of UAVs forming the collective UAV, the weather, the number and/or weight of items carried by UAVs of the collective UAV, power requirements, whether one or more of the UAVs of the collective UAV is damaged or inoperable, etc.” This at least teaches the burdened flight profile being determined based in part on “dynamic payload management” as claimed.) 
Note that under the BRI of claim 7, consistent with the instant specification, the burdened flight profile being determined “based at least in part on one or more of dynamic payload management, payload identification, and semi-autonomous interception of a target using a queuing methodology” is treated as an alternative limitation. Applicant has elected to use the phrase “one or more” in the claim language, and therefore, the BRI covers the scenario in which only one of the limitations applies. Accordingly, while only “dynamic payload management” has been addressed here, the claim is still rejected in its entirety.

Regarding claim 10: Hall discloses “The system of claim 1,” and Hall also discloses “wherein the burdened flight profile comprises one or more payload-specific modes of operation.” (Hall ¶ 88: “In this example, the payload 804 is heavier than a single UAV can aerially transport, so two UAVs 800B, 800C are coupled to form a collective UAV that is coupled to the payload 804 to enable aerial transport of the payload 804.” Flying as a collective UAV reads on the payload-specific mode of operation as claimed.)

Regarding claim 11: Hall discloses “The system of claim 10,” and Hall also discloses “wherein the one or more payload-specific modes of operation comprises at least one of: a flight mode; a navigation mode; a power consumption mode; a VR display mode; a payload deployment mode; a security mode; a communication mode; a defense mode; or a failure mode.” (Hall ¶ 88: “In this example, the payload 804 is heavier than a single UAV can aerially transport, so two UAVs 800B, 800C are coupled to form a collective UAV that is coupled to the payload 804 to enable aerial transport of the payload 804.” Flying as a collective UAV reads on the “flight mode” as claimed.) 
Note that under the broadest reasonable interpretation (BRI) of claim 11, consistent with the instant specification, the one or more payload-specific modes of operation comprising “at least one of: a flight mode; a navigation mode; a power consumption mode; a VR display mode; a payload deployment mode; a security mode; a communication mode; a defense mode; or a failure mode” is treated as an alternative limitation. Applicant has elected to use the phrase “at least one” in the claim language, and therefore, the BRI covers the scenario in which only one of the limitations applies. Accordingly, while only the “flight mode” has been addressed here, the claim is still rejected in its entirety.

Claims 1-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Liani et al. (US 2025/0028335 A1), hereinafter referred to as Liani.

Regarding claim 1: Liani discloses the following limitations: “A system for optimizing flight of an unmanned aerial vehicle (UAV) including a payload, the system comprising: the payload ladened UAV.” (Liani ¶ 36 discloses “a system for optimizing flight of an unmanned aerial vehicle (UAV) including a payload.”) “an inertial measurement unit (IMU) operably coupled to the UAV.” (Liani ¶ 161 discloses “using an inertial measurement unit (IMU) in a UAV.”) “a ground control system (GCS).” (Liani ¶ 40: “Embodiments may also include a ground command station (GCS).”) “and a microprocessor-based controller associated with a UAV, the microprocessor-based controller including a non-transitory computer-readable storage medium having executable instructions stored thereon that when executed by the controller cause the controller to perform a method including: i. 
determining a UAV context based at least in part on IMU data received from the UAV.” (Liani ¶ 36 discloses “the system including a microprocessor-based controller associated with a UAV, the microprocessor-based controller including a non-transitory computer-readable storage medium having instructions stored thereon that when executed by the controller cause the controller to perform a method including determining a UAV context based at least in part on Inertial Measurement Unit (IMU) data from the UAV.”) “ii. receiving payload identification data; iii. determining a burdened flight profile based at least in part on the payload identification data received from the UAV; iv. determining one or more burdened flight parameters, wherein the one or more burdened flight parameters are based at least in part on the UAV context and the burdened flight profile.” (Liani ¶ 37: “Embodiments may also include receiving payload identification data. Embodiments may also include determining a burdened flight profile based at least in part on the payload identification data. Embodiments may also include determining one or more burdened flight parameters. In some embodiments, the one or more burdened flight parameters may be based at least in part on the UAV context and the burdened flight profile.”) “and v. based on the payload identification data received from the UAV, the determined UAV context, and the determined burdened flight profile, executing a command sequence.” (Liani ¶¶ 39-40: “Embodiments may also include receiving one or more payload-initiated flight instructions includes receiving at least one automated command sequence. 
… In some embodiments, the automated command sequence includes one or more of a return home command, a takeoff command, a calibration maneuver, a landing command, a payload approach, a motor-on mode, a standby mode, a breach command, skid mode, and a fly-to-waypoint command.” Also, Liani ¶ 54: “the burdened flight profile may include one or more payload-specific modes of operation.” Further, Liani ¶ 56: “the system may include an instruction for initializing the burdened flight profile. In some embodiments, the instruction for initializing the burdened flight profile may be at least partially based on the payload identification data.”)

Regarding claim 2: Liani discloses “The system of claim 1,” and Liani further discloses the system “further comprising at least one automated command sequence, wherein the at least one automated command sequence further comprises one or more of an object recognition sequence, an obstacle collision avoidance sequence, a pedestrian collision avoidance sequence, and an environmental collision avoidance sequence.” (Liani ¶ 39: “In some embodiments, the at least one automated command sequence includes one or more of an object recognition sequence, an obstacle collision avoidance sequence, a pedestrian collision avoidance sequence, and an environmental collision avoidance sequence.”)

Regarding claim 3: Liani discloses “The system of claim 1,” and Liani also discloses the following limitations: “further comprising: a. a plurality of UAVs.” (Liani ¶ 40: “In some embodiments, the system may include a plurality of UAVs.”) “and b. a transceiver operably coupled to the GCS, in communication with each of the plurality of UAVs; and c. a microprocessor-based GCS controller associated with the GCS, the microprocessor-based GCS controller including a non-transitory computer-readable storage medium having executable instructions stored thereon that when executed by the GCS controller cause the GCS controller to perform a method including: i. 
associating a set of UAVs as group members within a group membership.” (Liani ¶ 40: “the GCS may include a transceiver in communication with the plurality of UAVs. Embodiments may also include a microprocessor-based GCS controller associated with the GCS, the microprocessor-based GCS controller including a non-transitory computer-readable storage medium having instructions stored thereon that when executed by the GCS controller cause the GCS controller to perform a method including associating a set of UAVs as group members within a group membership.”) “ii. designating at least one UAV from the set of UAVs as a lead UAV within the group membership; iii. designating at least one UAV from the set of UAVs as a follower UAV within the group membership; iv. receiving, by the GCS controller, a lead UAV flight command.” (Liani ¶ 41: “Embodiments may also include designating at least one UAV from the set of UAVs as a lead UAV within the group membership. Embodiments may also include designating at least one UAV from the set of UAVs as a follower UAV within the group membership. Embodiments may also include receiving, by the GCS controller, a lead UAV flight command.”) “v. determining, by the GCS controller, at least one follower flight path instruction for the at least one follower UAV based at least in part on the lead UAV flight command; vi. transmitting, by the transceiver, the at least one follower flight path instruction to at least one follower UAV within the group membership.” (Liani ¶ 42: “Embodiments may also include determining, by the GCS controller, at least one follower flight path instruction for the at least one follower UAV based at least in part on the lead UAV flight command. 
Embodiments may also include transmitting, by the transceiver, the at least one follower flight path instruction to at least one follower UAV within the group membership.”) Regarding claim 4: Liani discloses “The system of claim 1,” and Liani further discloses the system “further comprising an electrical connection with the payload, wherein the electrical connection is configured to allow transmission of payload identification data between the payload and the UAV.” (Liani ¶ 48: “The system may include an electrical connection with the payload in some embodiments. In some embodiments, the electrical connection may be configured to allow the transmission of payload identification data between the payload and the UAV.”) Regarding claim 5: Liani discloses “The system of claim 4,” and Liani also discloses “wherein the transmission of payload identification data between the payload and the UAV comprises at least one payload attribute.” (Liani ¶ 48: “In some embodiments, the transmission of payload identification data between the payload and the UAV may include at least one payload attribute.”) Regarding claim 6: Liani discloses “The system of claim 5,” and Liani also discloses “wherein the at least one payload attribute comprises one or more of a payload classification, a payload unique identifier, a payload weight distribution, and a flight performance model, wherein the at least one payload attribute is used to at least partially determine the burdened flight profile.” (Liani ¶ 49: “the at least one payload attribute may include one or more of a payload classification, a payload unique identifier, a payload weight distribution, and a flight performance model. 
In some embodiments, the at least one payload attribute may be used to at least partially determine the burdened flight profile.”) Regarding claim 7: Liani discloses “The system of claim 1,” and Liani additionally discloses “wherein the burdened flight profile is determined based at least in part on one or more of dynamic payload management, payload identification, and semi-autonomous interception of a target using a queuing methodology.” (Liani ¶ 50: “In some embodiments, the burdened flight profile may be determined based at least in part on one or more of dynamic payload management, payload identification, and semi-autonomous interception of a target using a queuing methodology.”) Regarding claim 8: Liani discloses “The system of claim 1,” and Liani also discloses “wherein determining the burdened flight profile is partially based on a rule set, the rule set including one or more of: a recommended maximum UAV velocity; a recommended UAV acceleration; a recommended UAV deceleration; a minimum UAV turning radius; a minimum distance from an object in a flight path; a maximum flight altitude; a formula for calculating a maximum safe distance; a maximum burdened weight value; a maximum angle of one or more axes of an in-flight UAV command; a monitor-and-adjust arming status; a hover travel based at least in part on an IMU or a LIDAR sensor; a coordinate of a ground command station or other UAVs; a monitor-and-adjust power consumption mode; and one or more guidelines to modify one or more pilot input parameters.” (Liani ¶¶ 50-52: “Embodiments may also include determining that the burdened flight profile may be partially based on a rule set, including one or more of a recommended maximum UAV velocity. Embodiments may also include a recommended UAV acceleration. Embodiments may also include a recommended UAV deceleration. Embodiments may also include a minimum UAV turning radius. Embodiments may also include a minimum distance from an object in a flight path. 
Embodiments may also include a maximum flight altitude. Embodiments may also include a formula for calculating a maximum safe distance. Embodiments may also include a maximum burdened weight value. Embodiments may also include a maximum angle of one or more axis of an in-flight UAV command. Embodiments may also include a monitor-and-adjust arming status. Embodiments may also include a hover travel based at least in part on an IMU or a LIDAR sensor. Embodiments may also include a coordinate of a ground command station or other UAVs. Embodiments may also include a monitor-and-adjust power consumption mode. Embodiments may also include one or more guidelines to modify one or more pilot input parameters.”) Regarding claim 9: Liani discloses “The system of claim 1,” and Liani also discloses “wherein the executable instructions, when executed by the controller, are further configured to cause the GCS controller to initialize a queuing system and a visual tracker; transmit a video feed to a Visual Guidance Computer (VGC) and the visual tracker; and receive a configuration package associated with the payload.” (Liani ¶ 53: “In some embodiments, the instructions stored thereon that when executed by the controller cause the controller to perform a method further including transmitting a video feed to a Visual Guidance Computer (VGC). In some embodiments, the instructions stored thereon that when executed by the controller cause the controller to perform a method further including initializing a queuing system and a visual tracker. Embodiments may also include transmitting a video feed to a Visual Guidance Computer (VGC) and the visual tracker. 
Embodiments may also include receiving a configuration package associated with the payload.”) Regarding claim 10: Liani discloses “The system of claim 1,” and Liani additionally discloses “wherein the burdened flight profile comprises one or more payload-specific modes of operation.” (Liani ¶ 54: “the burdened flight profile may include one or more payload-specific modes of operation.”) Regarding claim 11: Liani discloses “The system of claim 10,” and Liani also discloses “wherein the one or more payload-specific modes of operation comprises at least one of: a flight mode; a navigation mode; a power consumption mode; a VR display mode; a payload deployment mode; a security mode; a communication mode; a defense mode; or a failure mode.” (Liani ¶ 54: “In some embodiments, the one or more payload-specific modes of operation may include at least one of a flight mode. Embodiments may also include a navigation mode. Embodiments may also include a power consumption mode. Embodiments may also include a VR display mode. Embodiments may also include a payload deployment mode. Embodiments may also include a security mode. Embodiments may also include a communication mode. Embodiments may also include a defense mode. Embodiments may also include a failure mode.”) Regarding claim 12: Liani discloses the following limitations: “A method for optimizing flight of an unmanned aerial vehicle (UAV) including a payload, implemented in a system comprising: the payload ladened UAV.” (Liani ¶ 36 discloses “a system for optimizing flight of an unmanned aerial vehicle (UAV) including a payload,” where the system can perform a method.) “an inertial measurement unit (IMU) operably coupled to the UAV.” (Liani ¶ 161 discloses “using an inertial measurement unit (IMU) in a UAV.”) “and a ground control station (GCS) in communication with the UAVs.” (Liani ¶ 40 discloses that “Embodiments may also include a ground command station (GCS). 
In some embodiments, the GCS may include a transceiver in communication with the plurality of UAVs.”) “the method comprising: receiving one or more human-initiated flight instructions; determining a UAV context based at least in part on IMU data received from the UAV.” (Liani ¶ 58: “Embodiments of the present disclosure may also include a method for optimizing flight of an unmanned aerial vehicle (UAV) including a payload, the method including receiving one or more human-initiated flight instructions. Embodiments may also include determining a UAV context based at least in part on Inertial Measurement Unit (IMU) data from the UAV.”) “receiving payload identification data; accessing a laden flight profile based at least in part on the payload identification data received from the UAV; determining one or more laden flight parameters, wherein the one or more laden flight parameters are based at least in part on the one or more human-initiated flight instructions, the UAV context, and the laden flight profile.” (Liani ¶ 59: “Embodiments may also include receiving payload identification data. Embodiments may also include accessing a laden flight profile based at least in part on the payload identification data. Embodiments may also include determining one or more laden flight parameters. In some embodiments, the one or more laden flight parameters may be based at least in part on the one or more human-initiated flight instructions, the UAV context, and the laden flight profile.”) “and executing a command sequence based on the payload identification data received from the UAV, the determined UAV context, and the determined burdened flight profile wherein the at least one command sequence executed comprises at least one navigation mode, including at least one of an obstacle-avoidance mode or a UAV avoidance mode.” (Liani ¶ 56: “the system may include an instruction for initializing the burdened flight profile. 
In some embodiments, the instruction for initializing the burdened flight profile may be at least partially based on the payload identification data.” Also, Liani ¶ 54: “the burdened flight profile may include one or more payload-specific modes of operation. In some embodiments, the one or more payload-specific modes of operation may include at least one of a flight mode. Embodiments may also include a navigation mode.” Additionally, Liani ¶ 22: “the at least one payload-specific mode may include at least one navigation mode, including at least one of a road avoidance mode or a UAV avoidance mode.”) Regarding claim 13: Liani discloses “The method of claim 12,” and Liani also discloses the method “further comprising implementing a load authentication sequence, wherein implementing the load authentication sequence further comprises the UAV interrogating an attached smart payload with an authentication protocol based at least in part on the payload identification data.” (Liani ¶ 60: “the method may include a load authentication sequence. In some embodiments, the unmanned aerial vehicle (UAV) interrogates an attached smart payload with an authentication protocol based at least in part on the payload identification data.”) Regarding claim 14: Liani discloses “The method of claim 12,” and Liani additionally discloses the method “further comprising implementing a load verification sequence, wherein implementing the load verification sequence further comprises the UAV interrogating an attached smart payload with a verification protocol based at least in part on the payload identification data.” (Liani ¶ 61: “the method may include a load verification sequence. 
In some embodiments, the unmanned aerial vehicle (UAV) interrogates an attached smart payload with a verification protocol based at least in part on the payload identification data.”) Regarding claim 15: Liani discloses “The method of claim 12,” and Liani additionally discloses the method “further comprising: implementing a mechanical load attachment verification sequence, wherein implementing the mechanical load attachment verification sequence further comprises the UAV confirming a mechanical connection between the UAV and an attached mechanical payload.” (Liani ¶ 62: “the method may include a mechanical load attachment verification sequence. In some embodiments, the unmanned aerial vehicle (UAV) confirms a mechanical connection between the unmanned aerial vehicle (UAV) and an attached payload.”) Regarding claim 16: Liani discloses “The method of claim 15,” and Liani further discloses “wherein a payload send communication protocol comprises: receiving payload communication from the attached mechanical payload; and transmitting the payload identification data via a communications channel with the GCS.” (Liani ¶ 62: “a payload send communication protocol may include receiving payload communication from an attached payload. 
Embodiments may also include transmitting the payload data via a communications channel with a Ground Control Station.”) Regarding claim 17: Liani discloses “The method of claim 12,” and Liani further discloses “wherein receiving one or more human-initiated flight instructions comprises receiving an automated command sequence.” (Liani ¶ 72: “Embodiments may also include instructions for receiving one or more human-initiated flight instructions may include an automated command sequence.”) Regarding claim 18: Liani discloses “The method of claim 12,” and Liani further discloses “wherein the UAV context is one or more of a payload armed status, an authentication status, a group membership, a lead drone status, a follower drone status, a mission status, a mission objective, engagement in an automated command, a maintenance alert status, a reduced operational capacity, a maximum range, and a battery life status.” (Liani ¶ 66: “Embodiments may also include a drone context may be one or more of a payload armed status, an authentication status, a group membership, a lead drone status, a follower drone status, a mission status, a mission objective, engagement in an automated command, a maintenance alert status, a reduced operational capacity, a maximum range, and a battery life status.”) Regarding claim 19: Liani discloses “The method of claim 12,” and Liani further discloses “wherein the UAV context is one or more of an indoor/outdoor flight transition, an environmental low visibility status, a high-wind status, an air pollutant status, a chemical presence status, a munitions status, a high electromagnetic radiation alert, a humidity status, a temperature alert status, a detected audible alert.” (Liani ¶ 66: “Embodiments may also include a drone context may be one or more of an indoor/outdoor flight transition, an environmental low visibility status, a high-wind status, an air pollutant status, a chemical presence status, a munitions status, a high electromagnetic 
radiation alert, a humidity status, a temperature alert status, a detected audible alert.”) Regarding claim 20: Liani discloses “The method of claim 12,” and Liani also discloses the method “further comprising determining the UAV context based at least in part on inertial measurement unit (IMU) data from the UAV, wherein the UAV context corresponds to a ground truth reading; and the IMU attribute comprises an IMU dataset wherein the IMU dataset uses a neural network to filter the IMU dataset.” (Liani ¶ 67: “Embodiments may also include determining a drone context based at least in part on Inertial Measurement Unit (IMU) data from the UAV. In some embodiments, the drone context may be a ground truth reading. In some embodiments, the inertial measurement unit (IMU) attribute may include an IMU dataset, the IMU dataset created by applying a neural network to filter the IMU data.”) Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claims 4-6 are rejected under 35 U.S.C. 103 as being unpatentable over Hall as applied to claim 1 above, and further in view of Jones et al. (US 2019/0389577 A1), hereinafter referred to as Jones. 
Regarding claim 4: Hall discloses “The system of claim 1,” and Hall further discloses the system “further comprising an electrical connection with the payload.” (Hall ¶ 67: “The payload engagement mechanism communicates with (via wired or wireless communication) and is controlled by the UAV control system 110.”) Hall does not explicitly disclose “wherein the electrical connection is configured to allow transmission of payload identification data between the payload and the UAV.” However, Jones does teach this limitation. (Jones ¶ 60: “sprayer payloads can be identified within ground station 300 by a UAV. For example, a wired or wireless communication connection between the UAV and payload can identify a particular payload.”) Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to modify the system of Hall by incorporating an electrical connection to allow for the transmission of the payload identification data as taught by Jones, because this is a simple substitution of one known element (i.e., an electrical connection) for another (i.e., a wireless connection) to obtain predictable results (see MPEP 2143(I)(B)). A person having ordinary skill in the art could have replaced the wireless connection of Hall with a wired electrical connection as taught by Jones to achieve the predictable result of transmitting payload data between the payload and UAV without any need for a wireless network. Regarding claim 5: The combination of Hall and Jones teaches “The system of claim 4,” and Jones further teaches “wherein the transmission of payload identification data between the payload and the UAV comprises at least one payload attribute.” (Jones ¶ 75 discloses that “a multi-factor optical confirmation can be conducted to ensure that the correct vehicle is mated with the correct chemical payload. 
… If the vehicle identifier identified by the payload, and the payload identifier identified by the vehicle match the configuration prescribed in the mission by, for example control subsystem 312, then the visual handshake is successful, and the payload is released from the landing receptacle.” The “payload identifier” reads on the recited “payload attribute” in light of ¶ 20 of the instant specification, which states that “the payload attribute may include one or more of a payload classification, a payload unique identifier, payload weight data, payload weight distribution data, or a flight performance model.”) Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to modify the system of Hall by using a payload identifier as taught by Jones with a reasonable expectation of success. A person having ordinary skill in the art could have been motivated to do this because Jones ¶ 75 teaches that this security measure can help to ensure that the correct vehicle is mated with the correct payload. Regarding claim 6: The combination of Hall and Jones teaches “The system of claim 5,” and Jones further teaches the following limitations: “wherein the at least one payload attribute comprises one or more of a payload classification, a payload unique identifier, a payload weight distribution, and a flight performance model.” (Jones ¶ 75 discloses that “a multi-factor optical confirmation can be conducted to ensure that the correct vehicle is mated with the correct chemical payload. … If the vehicle identifier identified by the payload, and the payload identifier identified by the vehicle match the configuration prescribed in the mission by, for example control subsystem 312, then the visual handshake is successful, and the payload is released from the landing receptacle.” This at least teaches the payload attribute comprising “a payload unique identifier” as claimed.) 
“wherein the at least one payload attribute is used to at least partially determine the burdened flight profile.” (Jones ¶ 60 states that “sprayer payloads can be identified within ground station 300 by a UAV. For example, a wired or wireless communication connection between the UAV and payload can identify a particular payload.” Also, Jones ¶ 80: “Control subsystem 312 can receive inputs from each of the subsystems of ground station 300 to coordinate vehicle flight, landing, and general refilling while docked with ground station 300. … control system 312 can receive information from the vehicle or sensors on the vehicle to coordinate landing and takeoff.”) Note that under the broadest reasonable interpretation (BRI) of claim 6, consistent with the specification, the payload attribute(s) comprising “one or more of a payload classification, a payload unique identifier, a payload weight distribution, and a flight performance model” is being treated as an alternative limitation. Applicant has elected to use the phrase “one or more” in the claim language, and therefore, the BRI covers the scenario in which only one of the limitations applies. Accordingly, while only the “payload unique identifier” has been addressed here, the claim is still rejected in its entirety. Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to modify the system of Hall by using a payload identifier and determining the flight profile based on this payload data as taught by Jones with a reasonable expectation of success. A person having ordinary skill in the art could have been motivated to do this since Jones ¶¶ 75 and 80 teach that this security measure can help ensure that the correct vehicle is mated with the correct payload and to create flight plans that account for particular payloads. 
A person having ordinary skill in the art would have recognized that payloads may differ from one another, which could affect the UAV pairing and flight plan that is optimal for the situation. Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Hall as applied to claim 1 above, and further in view of White (US 2021/0325907 A1). Regarding claim 8: Hall discloses “The system of claim 1,” but Hall does not specifically disclose “wherein determining the burdened flight profile is partially based on a rule set, the rule set including one or more of: a recommended maximum UAV velocity; a recommended UAV acceleration; a recommended UAV deceleration; a minimum UAV turning radius; a minimum distance from an object in a flight path; a maximum flight altitude; a formula for calculating a maximum safe distance; a maximum burdened weight value; a maximum angle of one or more axes of an in-flight UAV command; a monitor-and-adjust arming status; a hover travel based at least in part on an IMU or a LIDAR sensor; a coordinate of a ground command station or other UAVs; a monitor-and-adjust power consumption mode; and one or more guidelines to modify one or more pilot input parameters.” However, White does teach this limitation. (White ¶ 103: “the drone 101, UE 109, drone routing platform 121, and the services platform 113 communicate with each other and other components of the system 100 using well known, new or still developing protocols. In this context, a protocol includes a set of rules defining how the network nodes within the system 100 interact with each other based on information sent over the communication links.” Additionally, White ¶ 97: “the drone 101 can be configured to observe restricted paths or routes. 
For example, the restricted paths may be based on governmental regulations that govern/restrict the path that the drone 101 may fly (e.g., Federal Aviation Administration (FAA) policies regarding required distances between objects).” This at least teaches the rule set comprising “a minimum distance from an object in a flight path” as claimed.) Note that under the broadest reasonable interpretation (BRI) of claim 8, consistent with the instant specification, “the rule set including one or more of: a recommended maximum UAV velocity; a recommended UAV acceleration; a recommended UAV deceleration; a minimum UAV turning radius; a minimum distance from an object in a flight path; a maximum flight altitude; a formula for calculating a maximum safe distance; a maximum burdened weight value; a maximum angle of one or more axes of an in-flight UAV command; a monitor-and-adjust arming status; a hover travel based at least in part on an IMU or a LIDAR sensor; a coordinate of a ground command station or other UAVs; a monitor-and-adjust power consumption mode; and one or more guidelines to modify one or more pilot input parameters” is treated as an alternative limitation. Applicant has elected to use the phrase “one or more” in the claim language, and therefore, the BRI covers the scenario in which only one of the limitations applies. Accordingly, while only “a minimum distance from an object in a flight path” has been addressed here, the claim is still rejected in its entirety. Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to modify the system of Hall by imposing a rule requiring that the UAV stay a certain distance away from other objects in the flight path as taught by White with a reasonable expectation of success. 
A person having ordinary skill in the art could have been motivated to do this to ensure that the flight profile complies with flight regulations, as White ¶ 97 teaches that there may be “Federal Aviation Administration (FAA) policies regarding required distances between objects.” A person having ordinary skill in the art would have recognized that the modification could also help with collision avoidance. Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Hall as applied to claim 1 above, and further in view of Downey et al. (US 2016/0114886 A1), hereinafter referred to as Downey. Regarding claim 9: Hall discloses “The system of claim 1,” and Hall further teaches to “transmit a video feed to a Visual Guidance Computer (VGC) and the visual tracker.” (Hall ¶ 44: “In some embodiments, the instructions include delivery instructions for a payload… The instructions may also include real-time monitoring of a secured enclosure, such as via a video feed relays from the one or more UAVs, through the cloud-based network, and to the remote input receptor.”) The following limitations are not specifically disclosed by Hall, but are taught by Downey: “wherein the executable instructions, when executed by the controller, are further configured to cause the GCS controller to initialize a queuing system and a visual tracker.” (Downey ¶ 1: “a shipping company can include particular payload, e.g., global positioning system sensors, cameras, and a cargo holder, to track its progression along a pre-defined shipping route, and deposit the cargo at a particular location.” 
Further, Downey ¶ 79: “the system can store data messages provided by payload modules in a data structure, e.g., a queue, and provide the data messages to their intended targets upon the occurrence of a different flight phase, e.g., a camera taking pictures during landing can provide its images after the plane lands.” Further, Downey ¶ 83: “The radio system 604 transmits, and receives, data and messages from the camera system 606 to, and from, communication systems outside of the UAV, e.g., ground based control systems.”) “and receive a configuration package associated with the payload.” (Downey ¶ 58: “In some implementations, the payload processing engine 320 can communicate with a configuration utility, e.g., operated by a user, which can provide configuration and programming information to the payload modules 122.” Further, Downey ¶ 96: “the configuration utility can ensure that a ground datalink includes complementary settings to the UAV radio datalink. That is, the configuration utility can connect to a ground based system, and present a user interface with selectable options already selected based on the UAV selections 802.”) Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to modify the system of Hall by allowing the ground control system to initialize a queuing and tracking system and receive a payload configuration package as taught by Downey with a reasonable expectation of success. A person having ordinary skill in the art could have been motivated to do this since Downey ¶¶ 68-69 teach that this allows for tracking flight progression and the need for contingency operations, and Downey ¶ 91 teaches that this allows for configuring the UAV’s modules while ensuring correct functionality of the UAV. Claims 12 and 17-19 are rejected under 35 U.S.C. 103 as being unpatentable over Hall et al. 
(US 2017/0144757 A1), hereinafter referred to as Hall, in view of White (US 2021/0325907 A1). Regarding claim 12: Hall discloses the following limitations: “A method for optimizing flight of an unmanned aerial vehicle (UAV) including a payload, implemented in a system comprising: the payload ladened UAV.” (Hall Abstract and ¶ 44 disclose a system and method for the remote operation of one or more UAVs, wherein “the user may input one or more instructions for the operation of the one or more UAVs. In some embodiments, the instructions include delivery instructions for a payload.”) “an inertial measurement unit (IMU) operably coupled to the UAV.” (Hall ¶ 113: “The UAV control system 110 may also include … inertial measurement unit (IMU) 1412.”) “and a ground control station (GCS) in communication with the UAVs.” (Hall ¶ 71 and FIG. 20 disclose “a collective UAV configuration system 1528 (FIG. [20]) operating on a remote computing resource and provided wirelessly to one or more of the UAVs 200A, 200B.”) “the method comprising: receiving one or more human-initiated flight instructions.” (Hall ¶ 44: “the user has permission to send operation commands to the one or more UAVs.”) “determining a UAV context based at least in part on IMU data received from the UAV.” (Hall ¶ 117: “ESCs 1404 communicate with the navigation system 1407 and/or the IMU 1412 and adjust the rotational speed of each lifting motor to stabilize the UAV and guide the UAV along a determined flight plan. The navigation system 1407 may include a GPS, indoor positioning system (IPS), IMU or other similar system and/or sensors that can be used to navigate the UAV 100 to and/or from a location.”) “receiving payload identification data.” (Hall ¶ 101: “Upon coupling, the UAV configuration information of the coupled UAV is received from the coupled UAV, as in 1104. The UAV configuration information may include … weight of the UAV and/or payload.” Also, the examiner notes that on p. 
13 of the remarks filed 01/13/2026, applicant has argued that “the issue of payload identification is well recognized in Hall, for example in Para. [0044], referring to ‘instructions […] include one or more of a payload weight, a payload shape, or one or more payload length dimensions’ which would certainly be understood by the skilled artisan to identify the payload.”) “accessing a laden flight profile based at least in part on the payload identification data received from the UAV.” (Hall ¶ 76: “The collective UAV configuration may take any form and may vary depending on, for example, the number of UAVs forming the collective UAV, the weather, the number and/or weight of items carried by UAVs of the collective UAV, power requirements, whether one or more of the UAVs of the collective UAV is damaged or inoperable, etc.” Further, Hall ¶ 110: “The positioning of the UAVs in the collective UAV configuration may be determined based on the power capabilities of the UAVs, the motors, propellers and/or lifting capabilities of the UAVs, the size of the UAVs, the payload weight of the UAVs, the location of the delivery destinations of the UAVs, etc.” Determining UAV configuration and positioning based on payload identification data is equivalent to accessing a laden flight profile based on the payload identification data as claimed.) “determining one or more laden flight parameters, wherein the one or more laden flight parameters are based at least in part on the one or more human-initiated flight instructions, the UAV context, and the laden flight profile.” (Hall ¶ 44: “When a user is identified as an authorized user, the user may input one or more instructions for the operation of the one or more UAVs. 
In some embodiments, the instructions include delivery instructions for a payload, such as a payload pick-up location and/or a payload drop-off location.” Further, Hall ¶ 101: “The collective UAV configuration information may identify, for example, the navigation information of the collective UAV, operating parameters, the configuration of the collective UAV, the sensor locations of sensors that are being used by the collective UAV, etc.” Also, Hall ¶ 117: “The ESCs 1404 communicate with the navigation system 1407 and/or the IMU 1412 and adjust the rotational speed of each lifting motor to stabilize the UAV and guide the UAV along a determined flight plan.”) “and executing a command sequence based on the payload identification data received from the UAV, the determined UAV context, and the determined burdened flight profile.” (Hall ¶ 76: “The collective UAV configuration may take any form and may vary depending on, for example, … the number and/or weight of items carried by UAVs of the collective UAV, … etc.” Further, Hall ¶ 112: “Based on the determined UAVs, determined collective UAV configuration, and determined resource distribution, instructions are sent to each UAV that is be included in the collective UAV to configure into the collective UAV and distribute resources according to the determined resource distribution, as in 1308. The instructions may be sent to the UAVs as the ordered items are packed and prepared for departure, as part of their navigation instructions, etc.” Also, Hall ¶ 117: “The navigation system 1407 may include a GPS, indoor positioning system (IPS), IMU or other similar system and/or sensors that can be used to navigate the UAV 100 to and/or from a location.” Instructing the UAV(s) to navigate according to the determined UAV configuration is equivalent to executing a command sequence as claimed.) 
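As mapped onto Hall above, the claim-12 method amounts to a short pipeline: receive human-initiated flight instructions, derive a UAV context from IMU data, receive payload identification data, access a laden flight profile, determine laden flight parameters, and execute a command sequence. A minimal sketch of that pipeline follows; every name, threshold, and profile value is a hypothetical illustration, not drawn from Hall, White, or the claims:

```python
from dataclasses import dataclass

# Hypothetical laden flight profiles keyed by payload identification data.
LADEN_PROFILES = {
    "PKG-5KG": {"max_speed_mps": 8.0, "max_climb_mps": 2.0},
    "PKG-1KG": {"max_speed_mps": 14.0, "max_climb_mps": 4.0},
}

@dataclass
class UavContext:
    stable: bool          # derived from IMU angular rates (assumed heuristic)
    heading_deg: float

def determine_context(imu: dict) -> UavContext:
    # UAV context from IMU data: "stable" here simply means small angular rates.
    rates = (imu["roll_rate"], imu["pitch_rate"], imu["yaw_rate"])
    return UavContext(stable=all(abs(r) < 0.1 for r in rates),
                      heading_deg=imu["heading_deg"])

def plan_laden_flight(instruction: dict, imu: dict, payload_id: str) -> dict:
    ctx = determine_context(imu)              # UAV context from IMU data
    profile = LADEN_PROFILES[payload_id]      # laden flight profile lookup
    # Laden flight parameters combine the instruction, context, and profile.
    speed = profile["max_speed_mps"] if ctx.stable else profile["max_speed_mps"] / 2
    return {"target": instruction["target"], "speed_mps": speed,
            "climb_mps": profile["max_climb_mps"]}

def command_sequence(params: dict) -> list:
    # Command sequence derived from the determined laden flight parameters.
    return [("ARM",), ("TAKEOFF", params["climb_mps"]),
            ("GOTO", params["target"], params["speed_mps"]), ("LAND",)]
```

The sketch only illustrates the claimed data flow (instructions + context + profile → parameters → commands); Hall's actual disclosure operates on collective UAV configurations rather than a single lookup table.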
Hall does not specifically disclose “wherein the at least one command sequence executed comprises at least one navigation mode, including at least one of an obstacle-avoidance mode or a UAV avoidance mode.” However, White does teach this limitation. (White ¶ 96: “the drone 101 is configured to travel using one or more modes of operation over population or unpopulated areas.” Also, White ¶ 33: “map data in the geographic database 123 can then be used to calculate a payload survivability estimate over a geographic area, or to generate drone routes that avoids or minimizes potential exposure to map features 105 likely to render a payload 107 inoperable upon impact.” Further, White ¶¶ 34-42 teach that the data used to map the payload survivability can include real-time drone data and payload data such as payload type, payload value, payload outside construction, and payload sensitivity. White at least teaches the command sequence comprising a navigation mode including an obstacle-avoidance mode as claimed.) Note that under the broadest reasonable interpretation (BRI) of claim 12, consistent with the specification, the navigation mode(s) “including at least one of an obstacle-avoidance mode or a UAV avoidance mode” is treated as an alternative limitation. Applicant has elected to use the phrase “at least one” in the claim language, and therefore, the BRI covers the scenario in which only one of the limitations applies. Accordingly, while only the “obstacle-avoidance mode” has been addressed here, the claim is still rejected in its entirety. Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to modify the method of Hall by incorporating a mode of operation in which the drone avoids potential exposure to certain map features based on UAV context data and payload data as taught by White with a reasonable expectation of success.
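White's payload-survivability routing (¶¶ 33-42), cited above for the obstacle-avoidance mode, scores candidate routes by the risk that traversed map features render the payload inoperable. That idea can be sketched as a weighted route selection; the feature names, risk weights, and scoring rule below are illustrative assumptions, not values from White:

```python
# Hypothetical per-feature risk weights for payload loss upon impact.
FEATURE_RISK = {"water": 0.9, "highway": 0.6, "forest": 0.3, "field": 0.1}

def route_risk(route: list, payload_sensitivity: float) -> float:
    """Probability the payload is lost somewhere along a route of map features."""
    survival = 1.0
    for feature in route:
        # Each feature independently threatens the payload, scaled by sensitivity.
        survival *= 1.0 - FEATURE_RISK[feature] * payload_sensitivity
    return 1.0 - survival

def pick_route(routes: list, payload_sensitivity: float) -> list:
    # Obstacle-avoidance mode: choose the candidate route minimizing payload risk.
    return min(routes, key=lambda r: route_risk(r, payload_sensitivity))
```

A fragile payload (sensitivity near 1.0) thus steers routing away from water and highways, while a rugged payload (sensitivity near 0.0) leaves all routes nearly equivalent, matching White's payload-type-dependent routing.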
A person having ordinary skill in the art could have been motivated to do this because White ¶ 33 teaches that with this modification, “the system 100 advantageously enables drone operators to navigate their drones 101 with reduced risks or with a greater understanding of the risks arising from encountering map features 105 likely to render a payload 107 inoperable on a route.” Regarding claim 17: The combination of Hall and White teaches “The method of claim 12,” and Hall further teaches “wherein receiving one or more human-initiated flight instructions comprises receiving an automated command sequence.” (Hall ¶ 44: “When a user is identified as an authorized user, the user may input one or more instructions for the operation of the one or more UAVs.”) Regarding claim 18: The combination of Hall and White teaches “The method of claim 12,” and Hall further teaches “wherein the UAV context is one or more of a payload armed status, an authentication status, a group membership, a lead drone status, a follower drone status, a mission status, a mission objective, engagement in an automated command, a maintenance alert status, a reduced operational capacity, a maximum range, and a battery life status.” (Hall ¶ 88: “the delivery destination is beyond the range that two coupled UAVs 800B, 800C can reach under their own power so two additional UAVs 800A, 800D are coupled with the UAVs 800B, 800C to form the collective UAV 802 to enable aerial transport of the payload 804.” This at least teaches the drone context being “a maximum range” as claimed.) 
Note that under the broadest reasonable interpretation (BRI) of claim 18, consistent with the instant specification, the UAV context being “one or more of a payload armed status, an authentication status, a group membership, a lead drone status, a follower drone status, a mission status, a mission objective, engagement in an automated command, a maintenance alert status, a reduced operational capacity, a maximum range, and a battery life status” is treated as an alternative limitation. Applicant has elected to use the phrase “one or more” in the claim language, and therefore, the BRI covers the scenario in which only one of the limitations applies. Accordingly, while only the “maximum range” has been addressed here, the claim is still rejected in its entirety. Regarding claim 19: The combination of Hall and White teaches “The method of claim 12,” and Hall further teaches “wherein the UAV context is one or more of an indoor/outdoor flight transition, an environmental low visibility status, a high-wind status, an air pollutant status, a chemical presence status, a munitions status, a high electromagnetic radiation alert, a humidity status, a temperature alert status, a detected audible alert.” (Hall ¶ 70: “Other factors may also be considered in determining if UAVs should couple. For example, the remaining power of each UAV may be considered, weather and/or other external factors may also be considered. For example, if the UAVs are in an area with other aircraft, it may be determined that the UAVs should couple to form a collective UAV to increase visibility of the UAVs to other aircraft.” This at least teaches the drone context being “an environmental low visibility status” as claimed.) 
Note that under the broadest reasonable interpretation (BRI) of claim 19, consistent with the instant specification, the UAV context being “one or more of an indoor/outdoor flight transition, an environmental low visibility status, a high-wind status, an air pollutant status, a chemical presence status, a munitions status, a high electromagnetic radiation alert, a humidity status, a temperature alert status, a detected audible alert” is treated as an alternative limitation. Applicant has elected to use the phrase “one or more” in the claim language, and therefore, the BRI covers the scenario in which only one of the limitations applies. Accordingly, while only the “environmental low visibility status” has been addressed here, the claim is still rejected in its entirety. Claims 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Hall in view of White as applied to claim 12 above, and further in view of Byers et al. (US 20160244187 A1), hereinafter referred to as Byers. Regarding claim 13: The combination of Hall and White teaches “The method of claim 12,” but does not specifically teach the method “further comprising implementing a load authentication sequence, wherein implementing the load authentication sequence further comprises the UAV interrogating an attached smart payload with an authentication protocol based at least in part on the payload identification data.” However, Byers does teach this limitation. (Byers ¶ 90 discloses use of an intelligent payload, and Byers ¶ 107 discloses that “During operation, a UAV may be required to visit such a station before crossing into a different flight area. For example, a UAV crossing from flight area 702 into flight area 704 (e.g., from A→B) may be required to check in at inspection gateway station 710 before entering flight area 704. 
In general, an inspection gateway station may be operable to validate the UAV, its flight plan within the new flight area, and/or the cargo being carried by the UAV.”) Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to modify the method disclosed by the combination of Hall and White by requiring authentication of the attached cargo as taught by Byers with a reasonable expectation of success. A person having ordinary skill in the art could have been motivated to do this since Byers ¶ 129 teaches that the “landing and takeoff procedures are disclosed herein that may be used to ensure safe operation of the UAV during takeoff, flight, and landing of the UAV … In yet another aspect, techniques are disclosed herein that ensure that a delivery UAV maintains compliance with any regulatory agencies, such as an ATC service.” A person having ordinary skill in the art would have recognized that it would save time and energy to ensure that the correct UAV is attached to the correct payload before the mission starts. Regarding claim 14: The combination of Hall and White teaches “The method of claim 12,” but does not specifically teach the method “further comprising implementing a load verification sequence, wherein implementing the load verification sequence further comprises the UAV interrogating an attached smart payload with a verification protocol based at least in part on the payload identification data.” However, Byers does teach this limitation. (Byers ¶ 90 discloses use of an intelligent payload, and Byers ¶ 107 discloses that “During operation, a UAV may be required to visit such a station before crossing into a different flight area. For example, a UAV crossing from flight area 702 into flight area 704 (e.g., from A→B) may be required to check in at inspection gateway station 710 before entering flight area 704. 
In general, an inspection gateway station may be operable to validate the UAV, its flight plan within the new flight area, and/or the cargo being carried by the UAV.”) Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to modify the method disclosed by the combination of Hall and White by requiring authentication of the attached cargo as taught by Byers with a reasonable expectation of success. A person having ordinary skill in the art could have been motivated to do this since Byers ¶ 129 teaches that the “landing and takeoff procedures are disclosed herein that may be used to ensure safe operation of the UAV during takeoff, flight, and landing of the UAV … In yet another aspect, techniques are disclosed herein that ensure that a delivery UAV maintains compliance with any regulatory agencies, such as an ATC service.” A person having ordinary skill in the art would have recognized that it would save time and energy to ensure that the correct UAV is attached to the correct payload before the mission starts. Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Hall in view of White as applied to claim 12 above, and further in view of Shannon et al. (US 20180257779 A1), hereinafter referred to as Shannon. Regarding claim 15: The combination of Hall and White teaches “The method of claim 12,” but does not specifically teach the method “further comprising: implementing a mechanical load attachment verification sequence, wherein implementing the mechanical load attachment verification sequence further comprises the UAV confirming a mechanical connection between the UAV and an attached mechanical payload.” However, Shannon does teach this limitation. 
(Shannon ¶ 400: “when the control system determines that the payload coupling apparatus is not mechanically coupled to the payload, the control system can cause the UAV to repeat the lowering of the payload coupling apparatus and the attachment verification process in order to reattempt pickup of the payload, and in some embodiments these processes may only be repeated up to a predetermined number of times.”) Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to modify the method disclosed by the combination of Hall and White by verifying the mechanical attachment of the payload as taught by Shannon with a reasonable expectation of success. A person having ordinary skill in the art could have been motivated to do this because Shannon ¶ 400 teaches that this modification allows the UAV to reattempt to pick up the payload if failure occurs, and move on to a new payload in the case of repeated failures. A person having ordinary skill in the art would have recognized that it would save time and energy to verify the attachment of the payload before the UAV flies off to start the mission. Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Hall in view of White and Shannon as applied to claim 15 above, and further in view of Jones et al. (US 20190389577 A1), hereinafter referred to as Jones. Regarding claim 16: The combination of Hall, White, and Shannon teaches “The method of claim 15,” but does not specifically teach “wherein a payload send communication protocol comprises: receiving payload communication from the … mechanical payload; and transmitting the payload identification data via a communications channel with the GCS.” However, Jones does teach this limitation. 
(Jones ¶ 73: “the payload mating process using a wireless communication network can be implemented as follows: A sprayer vehicle approaching landing receptacle 302 establishes a connection to a wireless network of ground station 300. The vehicle lands on the desired sprayer payload, which is locked to a specific pad or receptacle. The individual payload, or central computer system, verifies the vehicle using an identifier code unique to each vehicle. If the vehicle identifier matches the expected code, the payload is unlocked from its receptacle.” Further, Jones ¶ 75: “Both the vehicle and payload or landing receptacle then communicate through a wireless or wired communication network to security subsystem 310. If the vehicle identifier identified by the payload, and the payload identifier identified by the vehicle match the configuration prescribed in the mission by, for example control subsystem 312, then the visual handshake is successful, and the payload is released from the landing receptacle.”) Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to modify the method disclosed by the combination of Hall, White, and Shannon by having the UAV receive payload data and communicating this data with a ground station as taught by Jones with a reasonable expectation of success. A person having ordinary skill in the art could have been motivated to do this because Jones ¶ 77 teaches that coordinating with the ground station can help to prevent tampering and ensure security of the system. As explained regarding claim 15 above, Shannon ¶ 400 teaches the use of an “attached mechanical payload.” Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Hall in view of White as applied to claim 12 above, and further in view of Lopez Mendez et al. (US 2022/0036577 A1), hereinafter referred to as Lopez. 
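The Jones payload-mating handshake quoted above for claim 16 (mutual identifier matching against the mission configuration before the payload is released), combined with Shannon's bounded reattempt loop for claim 15 (¶ 400), can be sketched as follows. The identifier scheme, mission-record shape, and retry limit are illustrative assumptions:

```python
MAX_ATTEMPTS = 3  # Shannon par. 400 caps reattempts; the specific value is assumed

def handshake(vehicle_id: str, payload_id: str, mission: dict) -> bool:
    """Mutual verification in the style of Jones parr. 73-75: both the vehicle
    identifier seen by the payload and the payload identifier seen by the
    vehicle must match the mission's prescribed configuration."""
    return (mission["expected_vehicle"] == vehicle_id
            and mission["expected_payload"] == payload_id)

def attempt_pickup(vehicle_id: str, payload_id: str, mission: dict,
                   couple) -> bool:
    # Repeat the lowering/attachment step and the verification up to
    # MAX_ATTEMPTS times (Shannon-style), then give up and move on.
    for _ in range(MAX_ATTEMPTS):
        if couple() and handshake(vehicle_id, payload_id, mission):
            return True   # payload unlocked/released from its receptacle
    return False
```

Here `couple` stands in for the mechanical attachment attempt; only when it succeeds and the two-sided identifier handshake matches is the payload released.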
Regarding claim 20: The combination of Hall and White teaches “The method of claim 12,” but does not explicitly teach the limitations listed below. However, Lopez does teach these limitations: “further comprising determining the UAV context based at least in part on inertial measurement unit (IMU) data from the UAV, wherein the UAV context corresponds to a ground truth reading.” (Lopez ¶¶ 62-65: “In examples in which the one more neural networks are trained to generate a pose, the operations performed by system SY include: receiving a plurality of training image frames for training the one or more neural networks NN; inputting the one or more training image frames to the one or more neural networks NN; and training the one or more neural networks NN to perform the: generating, using the one or more neural networks NN, a neural network pose prediction PNNT1 for the current image frame CIF.” Training a neural network in this way implies the use of a ground truth reading as claimed.) “and the IMU attribute comprises an IMU dataset wherein the IMU dataset uses a neural network to filter the IMU dataset.” (Lopez ¶ 48: “the combining of the inertial measurement unit pose prediction PIMUT1 for the current point in time T1, and the neural network pose prediction PNNT1 for the current image frame CIF, may be achieved by inputting these values to a non-linear filter NLF.”) Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to modify the method disclosed by the combination of Hall and White by training a neural network with vehicle ground truth data and then using the neural network to filter IMU data as taught by Lopez with a reasonable expectation of success. 
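Lopez's pose estimation, as cited above, combines an IMU pose prediction with a neural-network pose prediction through a non-linear filter (¶ 48). In its simplest form this reduces to a weighted fusion step; the complementary-filter weighting below is a common stand-in used only for illustration and is an assumption, not Lopez's disclosed filter or network:

```python
def fuse_pose(imu_pose: float, nn_pose: float, alpha: float = 0.8) -> float:
    """Complementary-filter fusion of IMU and neural-network pose predictions.

    alpha weights the fast but drift-prone IMU prediction against the slower
    but drift-free NN prediction; 0 < alpha < 1 (value assumed).
    """
    return alpha * imu_pose + (1.0 - alpha) * nn_pose

def track(imu_deltas, nn_poses, initial_pose=0.0, alpha=0.8):
    # Integrate IMU increments, then correct each step with the NN prediction,
    # so accumulated IMU drift is repeatedly pulled back toward the NN estimate.
    pose = initial_pose
    for delta, nn in zip(imu_deltas, nn_poses):
        pose = fuse_pose(pose + delta, nn, alpha)
    return pose
```

The point of the sketch is the filtering relationship the rejection relies on: the NN output acts as a correction channel over raw IMU integration, which is the sense in which the IMU dataset is "filtered" by a neural network.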
A person having ordinary skill in the art could have been motivated to do this because Lopez ¶ 33 teaches “Examples of the system SY that employ a neural network to estimate the camera pose may offer improvements including reduced power consumption, and a faster estimation of camera pose,” and that “Examples of the system SY that estimate the camera pose by combining the inertial measurement unit data IMUDAT with the predicted pose may offer improvements including improved accuracy and a more robust estimation of camera pose.”

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Madison R Inserra whose telephone number is (571)272-7205. The examiner can normally be reached Monday - Friday: 9:30 AM - 6:30 PM EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Aniss Chad can be reached at 571-270-3832. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /Madison R. Inserra/Primary Examiner, Art Unit 3662

Prosecution Timeline

Mar 10, 2023
Application Filed
Dec 03, 2024
Non-Final Rejection — §102, §103, §112
May 05, 2025
Response Filed
Jun 30, 2025
Final Rejection — §102, §103, §112
Oct 01, 2025
Response after Non-Final Action
Nov 02, 2025
Response after Non-Final Action
Jan 13, 2026
Request for Continued Examination
Mar 11, 2026
Response after Non-Final Action
Mar 25, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597339
TOKENIZATION FOR ON-DEMAND TRAFFIC RESOURCE ALLOCATION
2y 5m to grant Granted Apr 07, 2026
Patent 12591237
MOVING BODY CONTROL METHOD, MOVING BODY CONTROL SYSTEM, AND NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM
2y 5m to grant Granted Mar 31, 2026
Patent 12576866
CALIBRATION FRAMEWORK FOR AUTONOMOUS VEHICLE SIMULATION TECHNOLOGY
2y 5m to grant Granted Mar 17, 2026
Patent 12579901
SYSTEMS AND METHODS FOR DETERMINING INTERSECTION THREAT INDICES
2y 5m to grant Granted Mar 17, 2026
Patent 12565223
VEHICLE HAVING SENSOR REDUNDANCY
2y 5m to grant Granted Mar 03, 2026
Based on the 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
68%
Grant Probability
99%
With Interview (+38.3%)
3y 3m
Median Time to Grant
High
PTA Risk
Based on 179 resolved cases by this examiner. Grant probability derived from career allow rate.
