Prosecution Insights
Last updated: April 19, 2026
Application No. 17/972,301

MULTILAYER DRONE OPERATING SYSTEM

Final Rejection: §102, §103, §112
Filed: Oct 24, 2022
Examiner: MILLER, LEAH NICOLE
Art Unit: 3663
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Black Sesame Technologies Inc.
OA Round: 4 (Final)
Grant Probability: 56% (Moderate)
OA Rounds: 5-6
Time to Grant: 3y 4m
Grant Probability with Interview: 48%

Examiner Intelligence

Career Allow Rate: 56% (grants 56% of resolved cases; 18 granted / 32 resolved; +4.3% vs TC avg)
Interview Lift: -8.3% (minimal; measured over resolved cases with interview)
Typical Timeline: 3y 4m average prosecution
Currently Pending: 32
Career History: 64 total applications across all art units

Statute-Specific Performance

§101: 9.3% (-30.7% vs TC avg)
§103: 38.3% (-1.7% vs TC avg)
§102: 23.6% (-16.4% vs TC avg)
§112: 27.3% (-12.7% vs TC avg)
Tech Center averages are estimates. Based on career data from 32 resolved cases.
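The headline figures above follow arithmetically from the career counts shown. A minimal sketch of that arithmetic (illustrative only; the dashboard's actual methodology is not published here, and the Tech Center average below is an assumed value back-derived from the stated +4.3% delta):

```python
# Illustrative recomputation of the examiner statistics shown above.
# The formulas are assumptions about how such dashboard figures are derived.

granted = 18          # career grants ("18 granted / 32 resolved")
resolved = 32         # career resolved cases
tc_avg_allow = 0.52   # assumed Tech Center average (not stated in the source)

allow_rate = granted / resolved                     # 0.5625, displayed as 56%
delta_vs_tc = allow_rate - tc_avg_allow             # displayed as +4.3%

interview_lift = -0.083                             # reported interview lift
allow_with_interview = allow_rate + interview_lift  # displayed as 48%

print(f"Career allow rate: {allow_rate:.1%}")
print(f"vs TC average: {delta_vs_tc:+.1%}")
print(f"Projected with interview: {allow_with_interview:.1%}")
```

Note how the 48% "with interview" figure is consistent with the 56% career rate plus the -8.3% lift, which suggests the two cards are derived from the same underlying counts.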

Office Action

Grounds of rejection: §102, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims
This Office Action is in response to the application filed on 10 November 2025. Claims 1-13 are presently pending and are presented for examination.

Response to Amendments
In response to Applicant's amendments dated 10 November 2025, Examiner withdraws the previous claim objections; withdraws the previous 35 U.S.C. 112(b) rejections; and maintains the previous prior art rejections.

Response to Arguments
Applicant's arguments, see Remarks, filed 10 November 2025, have been fully considered but they are not persuasive. Applicant argues, see Remarks, pg. 5-6, regarding claims 1, 12, and 13, that US-20190265694-A1 ("Chen") "may teach a server assembly, but Chen does not teach that the hardware of each server in the server assembly is implemented in isolation." However, the server assembly in Chen is the "third layer" of a three-layer drone flight control system; in other words, Chen's server assembly is one layer of "the system." The amended limitation states "the hardware of each layer of the system is implemented in isolation," not "subcomponents of the hardware of the third layer of the system is implemented in isolation." Chen teaches a drone flight control system that includes a user computing device [i.e., a first layer], a drone [i.e., a second layer], and a server assembly [i.e., a third layer], which are in wireless communication with each other but are implemented on independent hardware [i.e., implemented in isolation]. For these reasons, examiner is unpersuaded and maintains the corresponding rejections. The remaining arguments are essentially the same as those addressed above and/or below and are unpersuasive for at least the same reasons.
Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-13 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claims 1, 12, and 13 recite the limitation "the hardware." There is insufficient antecedent basis for this limitation in these claims. Claim 13 recites the limitation "the system." There is insufficient antecedent basis for this limitation in the claim. As claims 2-11 depend on independent claim 1, they are similarly rejected.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim 13 is rejected under 35 U.S.C. 102(a)(1) as being anticipated by US-20190265694-A1, hereinafter "Chen" (previously of record).

Regarding claim 13, Chen discloses a method for controlling a flight of a drone (Chen, para. 0042: "The present disclosure solution systems, methods, and computer-readable mediums for providing an automated drone security system 200."; para. 0044: "System 200 may also include executive flight control logic [i.e., controlling a flight of a drone] and flight system security considerations that include the following processes. High-level executive logic may include state machine with predefined flight and operation procedures, wherein certain flight and operation procedures may be triggered by sensor states such as timer, measurements from an inertial measurement unit (IMU), a global positioning system (GPS), velocity, atmospheric conditions, or the like."), wherein the method comprising: providing a visual interface for a user to access surveillance functions within a first layer (Chen, para. 0005: "A user computing device [i.e., within a first layer] may also be included in the system and in communication with the server assembly [i.e., third layer] and the drone [i.e., second layer], the user computing device having a non-transitory storage medium, a processor for processing data (including the surveillance data) [i.e., to access surveillance functions] between the server assembly and the drone, and a user interface for receiving user input and displaying data transmitted from the drone [i.e., providing a visual interface for a user].
Flight operations associated with surveilling the location may be automatically and/or manually controlled by the user computing device or the server assembly in connection with the location.”); receiving flight functions of the drone from one or more sensors within a second layer (Chen, para. 0006: “In this respect, the system [i.e., the drone, a second layer] may also include flight and security control logic comprising one or more of the following processes: executive logic defined by one or more predefined flight and operation procedures [i.e., receiving flight functions of the drone] triggered by sensor states associated with the one or more onboard sensors of the drone [i.e., from one or more sensors within a second layer] including velocity, a timer, an inertial measurement unit, and/or a global positioning system (GPS)…”); receiving a status of the drone via the one or more sensors within a third layer (Chen, para. 0005: “The drone may be capable of executing one or multiple flight operations for a period of time as well as storing and transmitting the surveillance data to a server assembly. The server assembly [i.e., within a third layer] of the system in turn may be operable for coordinating the drone and receiving the surveillance data [i.e., receiving a status of the drone].”; para. 0010: “Further, one or more of the drones of the system may maintain connectivity with the server assembly [i.e., within a third layer] or the user computing device through 3G/4G, RF, and/or a local wireless network [i.e., via the one or more sensors].”); and communicating between each layer located in independent processors to transmit the status of the drone for controlling the flight of the drone (Chen, para. 0005: “The drone [i.e., second layer] may be capable of executing one or multiple flight operations [i.e., independent processor] for a period of time as well as storing and transmitting the surveillance data to a server assembly. 
The server assembly of the system in turn may be operable for coordinating the drone [i.e., transmit the status of the drone for controlling the flight of the drone] and receiving the surveillance data…A user computing device [i.e., a first layer] may also be included in the system and in communication with the server assembly [i.e., third layer, independent processor] and the drone [i.e., second layer; communicating between each layer], the user computing device having a non-transitory storage medium, a processor for processing data [i.e., first layer, independent processor] (including the surveillance data) between the server assembly and the drone, and a user interface for receiving user input and displaying data transmitted from the drone.”), wherein the hardware of each layer of the system is implemented in isolation (Chen, para. 0005: “A user computing device [i.e., first layer; hardware of each layer is implemented in isolation] may also be included in the system [i.e., the system] and in communication with the server assembly [i.e., third layer; hardware of each layer is implemented in isolation] and the drone [i.e., second layer; hardware of each layer is implemented in isolation], the user computing device having a non-transitory storage medium, a processor for processing data (including the surveillance data) between the server assembly and the drone, and a user interface for receiving user input and displaying data transmitted from the drone.”). Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-3 and 7-11 are rejected under 35 U.S.C. 103 as being unpatentable over US-20200369384-A1, hereinafter "Kelly" (previously of record), in view of WO-2022063730-A2, hereinafter "Aan Den Toorn" (previously of record), and US-20190265694-A1, hereinafter "Chen" (previously of record).
Regarding claim 1, Kelly discloses a system for controlling a flight of a drone (Kelly, para. 0010: “Preferably, the at least one flight system may comprise at least one of: a thrust control system; a lift control system; a directional control system; a navigation control system, and a communications system.”), comprising… …a closed-source operating module within a second layer (Kelly, para. 0015: “…a navigation control system [i.e., a closed-source operating module within a second layer] comprising a plurality of different navigation sensors and an external feedback system adapted to receive and provide to the onboard flight controller real-time external flight characteristic data [i.e., receives flight-based functions of the drone]…”; para. 0051: “The navigation control system 20 may also comprise at least one sensor input, via which information may be relayed from one or more sensors 22 of the unmanned aerial vehicle 12, and/or one or more data input for receiving information from other relevant sources, such as from the communications system 16 [i.e., receives flight-based functions of the drone from one or more sensors].”; Note: Examiner is interpreting a “closed-source operating module” as a module that contains “autopilot flight solutions,” based on paragraph 0015 of the specification. The “navigation control system 20” also contains “autopilot flight solutions,” and is therefore also considered a “closed-source operating module.”), wherein the closed-source operating module acquires and saves original sensor data, communicates with the open-source operating module to provide flight-based functions (Kelly, para. 
0051: ” The navigation control system 20 [i.e., closed-source operating module] may also comprise at least one sensor input [i.e., acquires and saves original sensor data], via which information may be relayed [i.e., communicates with the open-source operating module to provide flight-based functions] from one or more sensors 22 of the unmanned aerial vehicle 12, and/or one or more data input for receiving information from other relevant sources, such as from the communications system 16.”); and a controlling module within a third layer, wherein the controlling module receives a status of the drone via one or more sensors and transmits the status to the closed-source operating module for controlling the flight of the drone (Kelly, para. 0046: “A communications system 16 [i.e., a controlling module within a third layer] may be provided as a flight system of the unmanned aerial vehicle 10, preferably comprising a wireless communication means [i.e., one or more sensors] via which communications, such as orders, instructions or control signals [i.e., receives a status of the drone via one or more sensors] may be sent to the unmanned aerial vehicle 10, for instance, from a remote controller or air traffic control”; para. 0051: “The navigation control system 20 [i.e., closed-source operating module] may also comprise at least one sensor input, via which information may be relayed from one or more sensors 22 of the unmanned aerial vehicle 12, and/or one or more data input for receiving information from other relevant sources, such as from the communications system 16 [i.e., controlling module…transmits the status to the closed-source operating module for controlling the flight of the drone].”). 
Kelly does not appear to explicitly disclose the following: …an open-source operating module within a first layer, wherein the open-source operating module provides visual interface through which users can access all non-driving functions; and…wherein each layer of the system runs on at least one independent processor; and wherein the hardware of each layer of the system is implemented in isolation. However, in the same field of endeavor, Aan Den Toorn teaches: …an open-source operating module within a first layer, wherein the open-source operating module provides visual interface through which users can access all non-driving functions (Aan Den Toorn, pg. 8: lines 10-19: “HW module with a matching software module may be added as a package, or a specific software module may be added later or replace an existing software module in a HW module. Software modules may be configured by a supplier, such as a software company, or may be configured or changed by a user [i.e., an open-source operating module within a first layer]. Changing of software may preferably be done with some restrictions, in order to guarantee proper functioning of the modular system as a whole. Readily available or configurable software modules comprise software modules for path planning, sensor fusion, simultaneous localization and mapping (SLAM), world modelling, trajectory planning, haptic feedback control, vision processing, Artificial Intelligence, deep learning, neural networks, swarm control, fleet management, etc. 
[i.e., provides visual interface through which users can access all non-driving functions]”); and… Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success to modify the invention disclosed by Kelly, with the concept of an interface for users to access non-driving functions, taught by Aan Den Toorn, in order to facilitate application developers in creating customized functionality for the drone and/or drone operators to successfully control the drone to complete desired functionality (i.e., inspection, security, etc.) (Aan Den Toorn, pg. 8, lines 16-19: “Readily available or configurable software modules comprise software modules for path planning, sensor fusion, simultaneous localization and mapping (SLAM), world modelling, trajectory planning, haptic feedback control, vision processing, Artificial Intelligence, deep learning, neural networks, swarm control, fleet management, etc.”; pg. 2, lines 9-10: “Mobile robots are also deployed in for example inspection, cleaning, maintenance, military and security settings.”). Kelly and Aan Den Toorn do not appear to explicitly disclose the following: wherein each layer of the system runs on at least one independent processor; and wherein the hardware of each layer of the system is implemented in isolation. However, in the same field of endeavor, Chen teaches: wherein each layer of the system runs on at least one independent processor (Chen, para. 0005: “The drone [i.e., second layer] may be capable of executing one or multiple flight operations [i.e., second layer, independent processor] for a period of time as well as storing and transmitting the surveillance data to a server assembly. 
The server assembly of the system in turn may be operable for coordinating the drone and receiving the surveillance data…A user computing device may also be included in the system and in communication with the server assembly [i.e., third layer, independent processor] and the drone, the user computing device [i.e., a first layer] having a non-transitory storage medium, a processor for processing data [i.e., first layer, independent processor] (including the surveillance data) between the server assembly and the drone, and a user interface for receiving user input and displaying data transmitted from the drone.”); and wherein the hardware of each layer of the system is implemented in isolation (Chen, para. 0005: “A user computing device [i.e., first layer; hardware of each layer is implemented in isolation] may also be included in the system [i.e., the system] and in communication with the server assembly [i.e., third layer; hardware of each layer is implemented in isolation] and the drone [i.e., second layer; hardware of each layer is implemented in isolation], the user computing device having a non-transitory storage medium, a processor for processing data (including the surveillance data) between the server assembly and the drone, and a user interface for receiving user input and displaying data transmitted from the drone.”). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success to modify the invention disclosed by Kelly, as modified by Aan Den Toorn, with the concept of having layers of a drone flight control system be implemented on independent processors and on isolated hardware, taught by Chen, in order to increase system redundancy, capability, and system reliability (Chen, para. 0086: “An exemplary drone 220 useable with the drone security system 200 may meet one or more of the following criteria: 1. 
Flight time/range: a fully charged drone 220 may be able to fly for 10-30 minutes depending on drone type; 2. Loiter time: drone loiter time at each waypoint may be independently configurable; 3. Camera focus, tilt, and zoom control: camera may be controlled via app; 4. Safety systems: deployable parachute, radio-based self-position broadcast, emergency landing system, and e-stop functionality; and/or 5. System redundancy: drivetrain hardware, communication hardware, onboard computation hardware, and onboard sensors (e.g. IMU, GPS) may have backups or be able to perform in a reduced capacity.”). Regarding claim 2, Kelly, Aan Den Toorn, and Chen teach the system of claim 1, and Kelly further discloses the following: wherein the status of the drone is based on one or more auto-pilot flight algorithms (Kelly, para. 0063: “In practice, this allows the unmanned aerial vehicle 10 to control its flight autonomously [i.e., based on one or more auto-pilot flight algorithms], as and when it receives flight-relevant information [i.e., status of the drone], either internal or external to itself .”). Regarding claim 3, Kelly, Aan Den Toorn, and Chen teach the system of claim 1, and Kelly further discloses the following: wherein the flight-based functions include an autopilot function and a user operated function (Kelly, para. 0079: “The unmanned aerial vehicle 10 is able to contact the local air traffic management to request transit through the restricted airspace 134. Here, the air traffic management has directed the unmanned aerial vehicle 10 to modify its course through the restricted airspace 134 via the shortest route [i.e., user operated function], rather than via its direct-most and preferred route. 
Machine-readable flight control instructions provided on the onboard controller 18 will permit the air traffic management instructions to request and/or take priority [i.e., user operated function] over the autonomous flight control [i.e., autopilot function], thereby allowing the unmanned aerial vehicle 10 to fly in a safe and compliant manner.”). Regarding claim 7, Kelly, Aan Den Toorn, and Chen teach the system of claim 1, and Kelly further discloses the following: wherein the closed-source operating module and the controlling module each contain a real-time operating system (RTOS) (Kelly, para. 0008: “According to a first aspect of the invention, there is provided a method of providing automated safe-fail operation for an unmanned aerial vehicle, the method comprising the steps of: a] obtaining real-time internal flight characteristic data [i.e., real-time operating system] which is indicative of at least one flight system [i.e., the closed-source operating module and the controlling module] of the unmanned aerial vehicle; b] obtaining real-time external flight characteristic data which is indicative of flight-relevant parameters which are external to the unmanned aerial vehicle; c] using an onboard flight controller of the unmanned aerial vehicle, determining, based on the real-time external flight characteristic data [i.e., real-time operating system]…”; para. 0024: ” The autonomous unmanned aerial vehicle may further comprise a ranking circuit for prioritizing the selection and implementation of one of a plurality of safe-fail operations. 
The ability to rank one or more safe-fail operations continuously during the flight of the unmanned aerial vehicle permits the assessment of the risk upon critical system failure to be determined in real-time [i.e., real-time operating system], therefore hopefully averting the greatest dangers to persons or property at ground level which could be at risk were the unmanned aerial vehicle to crash unexpectedly during routine operation.”; para. 0066: “However, it is presently possible to utilise a lightweight flight controller 18, given the weight reduction in processors 26 for a given computational power, that onboard processing can be achieved. Improved algorithmic analysis also allows such calculations of the 3D flight plan model to be determined in or near to real-time.”). Regarding claim 8, Kelly, Aan Den Toorn, and Chen teach the system of claim 7, and Kelly further discloses the following: wherein the RTOS within the controlling module provides the flight-based functions (Kelly, para. 0008: “…e] determining a safe-fail condition which is triggerable based on the real-time internal flight characteristic data [i.e., real-time operating system]; and f] in the event that the safe-fail condition is triggered, selecting and implementing one of the plurality of different safe-fail operations [i.e., flight-based functions] for the unmanned aerial vehicle in accordance with the real-time external flight characteristic data [i.e., real-time operating system] and machine-readable flight control instructions [i.e., flight-based functions] of the onboard flight controller.”). Regarding claim 9, Kelly, Aan Den Toorn, and Chen teach the system of claim 8, and Kelly further discloses the following: wherein the RTOS within the controlling module provides the status of the drone to the closed-source operating module (Kelly, para. 
0021: “Furthermore, the real-time external flight characteristic data [i.e., real-time operating system] may be indicative of one or more flight-relevant parameters including at least one of: air traffic control communications; airspace control data; environmental information data; mission parameter data; collision prediction data; safe landing information data; geographical information data; and payload information data [i.e., status of the drone].”; para. 0056: “Any or all of the flight-relevant data could be additionally or alternatively provided by an external source, for example, relayed to the unmanned aerial vehicle 10 via the communication system 16 [i.e., controlling module].”; Note. External flight characteristic data, used to determine the status of the drone, is an input to the communication system 16 [i.e., controlling module], therefore any other flight system module [i.e., the closed-source operating module] that require that data as input will receive it from the communication system 16 [i.e., controlling module].). Regarding claim 10, Kelly, Aan Den Toorn, and Chen teach the system of claim 8, and Kelly further discloses the following: wherein the flight-based functions include at least one of autopilot solutions or navigation solutions (Kelly, para. 0063: “In practice, this allows the unmanned aerial vehicle 10 to control its flight [i.e., flight-based functions] autonomously [i.e., autopilot solutions], as and when it receives flight-relevant information, either internal or external to itself.”). Regarding claim 11, Kelly, Aan Den Toorn, and Chen teach the system of claim 1, and Kelly further discloses the following: wherein the status of the drone includes at least one of state control, path control, landing control, take-off control, or elevation control (Kelly, para. 
0093: “Where there are deviations from expected behaviour, this may limit in particular the range of the unmanned aerial vehicle 10 to such a degree that the full return journey may not be possible. In this instance, an optimum safe-fail condition [i.e., status of the drone] may be determined by the unmanned aerial vehicle 10 to permit it to land safely [i.e., includes at least one of…landing control] without causing damage to persons or property on the ground, as well as attempting to avoid damage to itself.”). Claim(s) 4-6 and 12 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kelly, in view of Aan Den Toorn, Chen, and further in view of “MiniTEE—A Lightweight TrustZone-Assisted TEE for Real-Time Systems,” hereinafter “Liu” (previously of record). Regarding claim 4, Kelly, Aan Den Toorn, and Chen teach the system of claim 1, but do not appear to explicitly disclose the following: wherein the open-source operating module and the closed-source operating module both include an ARM TrustZone driver. However, in the same field of endeavor, Liu teaches: wherein the open-source operating module and the closed-source operating module both include an ARM TrustZone driver (Liu, Fig. 1: see annotated figure, below; pg. 1, paragraph 2: “Trusted Execution Environments (TEEs) [3] are widely used to power the security of embedded systems. TEE shields sensitive information in an execution environment which runs in isolation from the main operating system”; pg. 3, section 2.2: “As described in the previous parts, TrustZone has been widely used as a cornerstone hardware technology for enabling TEE on ARM-based platforms.”; Note: It is necessarily required in the implementation of a trusted execution environment (TEE) to have a driver, or equivalent method, in the “normal world” (not-secure) to send requests to the TEE and for the TEE to have a “secure monitor”, or equivalent method, to receive those requests. 
Regarding the open-source operating module and the closed-source operating module both including a TEE driver, examiner is interpreting this as a duplication of parts, as each implementation of a TEE requires the pair of a driver, to send requests, and a secure monitor, to receive those requests. In re Harza, 274 F.2d 669, 124 USPQ 378 (CCPA 1960), the court held that mere duplication of parts has no patentable significance unless a new and unexpected result is produced. One of ordinary skill in the art, at the time of the application, would find it obvious and with expected results, that each additional module or layer with a TEE, would require the implementation of another pair of a driver and a secure monitor.).

[Image: Liu, annotated Fig. 1]

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success to modify the invention disclosed by Kelly, as modified by Aan Den Toorn and Chen, with the concept of implementing a trusted execution environment, using Arm® TrustZone®, taught by Liu, in order to protect a UAV/drone from unauthorized access (Liu, pg. 1, Introduction: "As real-time embedded systems become more complex and interconnected, certain sections or parts of the systems may need to be protected to prevent unauthorized access, or isolated to ensure functional/non-functional correctness. For example, consider an autonomous delivery drone used to transport packages to remote locations. The drone may communicate with the base station via real-time communication protocols to update mission-related information (e.g., new customer location). If those information are tampered with an ill-intentioned entity, the entire drone can be mis-routed, leading failed delivery [1].").
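The duplication-of-parts reasoning above (each TEE instance needs its own normal-world driver to send requests and its own secure-world monitor to receive them) can be sketched as a toy model. This is a conceptual illustration only, not ARM TrustZone code, and all class and key names are invented for the example:

```python
# Conceptual model of the per-module TEE pairing discussed above:
# each module hosting a TEE gets its own (driver, secure monitor) pair.
# Names and structure are illustrative, not taken from any cited reference.

class SecureMonitor:
    """Secure-world entry point: receives requests, guards secure resources."""
    def __init__(self, secrets):
        self._secrets = secrets  # normal-world code cannot reach this directly

    def smc(self, request):  # loosely analogous to handling a Secure Monitor Call
        return self._secrets.get(request, "denied")

class TrustZoneDriver:
    """Normal-world side: the only channel into the secure world."""
    def __init__(self, monitor):
        self._monitor = monitor

    def request(self, what):
        return self._monitor.smc(what)

# Duplication of parts: one driver/monitor pair per module with a TEE.
layers = {}
for name in ("open_source_module", "closed_source_module"):
    monitor = SecureMonitor({"drone_status": f"status-for-{name}"})
    layers[name] = TrustZoneDriver(monitor)

print(layers["open_source_module"].request("drone_status"))
print(layers["closed_source_module"].request("navigation_key"))  # not provisioned
```

The point of the model is structural: adding a second TEE-bearing module does not change how either pair works, it simply repeats the pattern, which is the expected-results observation underlying the In re Harza rationale.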
Regarding claim 5, Kelly, Aan Den Toorn, and Chen teach the system of claim 1, but do not appear to explicitly disclose the following: wherein the closed-source operating module and the controlling module both include a secure monitor. However, in the same field of endeavor, Liu teaches: wherein the closed-source operating module and the controlling module both include a secure monitor (Liu, Fig. 1: see annotated figure, above; pg. 3, 2nd paragraph: “Software stacks in the two worlds can be bridged via a new privileged instruction – Secure Monitor Call (SMC) running in monitor mode [i.e., secure monitor]. The Normal World software cannot access the Secure World’s resources while the latter can access all the resources.” Note: It is necessarily required in the implementation of a trusted execution environment (TEE) to have a driver, or equivalent method, in the “normal world” (not-secure) to send requests to the TEE and for the TEE to have a “secure monitor”, or equivalent method, to receive those requests. Regarding the closed-source operating module and the controlling operation module both including a TEE secure monitor, examiner is interpreting this as a duplication of parts, as each implementation of a TEE requires the pair of a driver, to send requests, and a secure monitor, to receive those requests. In re Harza, 274 F.2d 669, 124 USPQ 378 (CCPA 1960), the court held that mere duplication of parts has no patentable significance unless a new and unexpected result is produced. One of ordinary skill in the art, at the time of the application, would find it obvious and with expected results, that each additional module or layer with a TEE, would require the implementation of another pair of a driver and a secure monitor.). 
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success to modify the invention disclosed by Kelly, as modified by Aan Den Toorn and Chen, with the concept of implementing a trusted execution environment, using Arm® TrustZone®, taught by Liu, in order to protect a UAV/drone from unauthorized access (Liu, pg. 1, Introduction: “As real-time embedded systems become more complex and interconnected, certain sections or parts of the systems may need to be protected to prevent unauthorized access, or isolated to ensure functional/non-functional correctness. For example, consider an autonomous delivery drone used to transport packages to remote locations. The drone may communicate with the base station via real-time communication protocols to update mission-related information (e.g., new customer location). If those information are tampered with an ill-intentioned entity, the entire drone can be mis-routed, leading failed delivery [1].”). Regarding claim 6, Kelly, Aan Den Toorn, Chen, and Liu teach the system of claim 5, and Liu further teaches the following: wherein the secure monitor exchanges the status of the drone with an ARM TrustZone driver (Liu, Fig. 1: see annotated figure, above; pg. 3, 2nd paragraph: “Software stacks in the two worlds can be bridged via a new privileged instruction – Secure Monitor Call (SMC) running in monitor mode [i.e., secure monitor]. The Normal World software cannot access the Secure World’s resources while the latter can access all the resources.”; Note: The secure monitor and secure monitor calls are what allows information to be transmitted from the trusted module to the requesting module.). 
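The claim 6 status exchange maps onto the SMC pattern quoted from Liu and can be sketched as follows. This is a Python simulation under stated assumptions: the names (`TrustZoneDriver`, `smc_read_status`, the status fields) are hypothetical, and real TrustZone world switching is a hardware/monitor-mode mechanism, not method calls.

```python
class SecureWorld:
    """Holds the drone status; only the monitor's SMC handler exposes it."""

    def __init__(self):
        self._drone_status = {"altitude_m": 120, "armed": True}

    def smc_read_status(self):
        # SMC handler: copies the status out to the normal world on request
        return dict(self._drone_status)


class TrustZoneDriver:
    """Normal-world driver: exchanges the drone status through the secure monitor."""

    def __init__(self, secure_world):
        self._secure_world = secure_world

    def read_status(self):
        # The normal world never touches secure memory directly; it issues an SMC
        return self._secure_world.smc_read_status()


driver = TrustZoneDriver(SecureWorld())
print(driver.read_status())  # {'altitude_m': 120, 'armed': True}
```

Note the asymmetry the Liu quote describes: the secure world decides what leaves via the SMC handler, while the normal-world driver has no other path to the data.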
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success to modify the invention disclosed by Kelly, as modified by Aan Den Toorn, Chen, and Liu, with the concept of implementing a trusted execution environment, using Arm® TrustZone®, taught by Liu, in order to protect a UAV/drone from unauthorized access (Liu, pg. 1, Introduction: “As real-time embedded systems become more complex and interconnected, certain sections or parts of the systems may need to be protected to prevent unauthorized access, or isolated to ensure functional/non-functional correctness. For example, consider an autonomous delivery drone used to transport packages to remote locations. The drone may communicate with the base station via real-time communication protocols to update mission-related information (e.g., new customer location). If those information are tampered with an ill-intentioned entity, the entire drone can be mis-routed, leading failed delivery [1].”). Regarding claim 12, Kelly discloses A system for controlling a flight of a drone (Kelly, para. 0010: “Preferably, the at least one flight system may comprise at least one of: a thrust control system; a lift control system; a directional control system; a navigation control system, and a communications system.”), comprising… …a closed-source operating module within a second layer, wherein the closed-source operating module…for receiving flight functions of the drone from one or more sensors (Kelly, para. 0015: “…a navigation control system [i.e., a closed-source operating module within a second layer] comprising a plurality of different navigation sensors and an external feedback system adapted to receive and provide to the onboard flight controller real-time external flight characteristic data [i.e., receives flight-based functions of the drone]…”; para. 
0051: “The navigation control system 20 may also comprise at least one sensor input, via which information may be relayed from one or more sensors 22 of the unmanned aerial vehicle 12, and/or one or more data input for receiving information from other relevant sources, such as from the communications system 16 [i.e., receives flight-based functions of the drone from one or more sensors].”; Note: It is necessarily required in the implementation of a trusted execution environment (TEE) to have a driver, or equivalent method, in the “normal world” (not-secure) to send requests to the TEE and for the TEE to have a “secure monitor,” or equivalent method, to receive those requests.); a controlling module within a third layer…wherein the controlling module receives a status of the drone via the one or more sensors (Kelly, para. 0046: “A communications system 16 [i.e., a controlling module within a third layer] may be provided as a flight system of the unmanned aerial vehicle 10, preferably comprising a wireless communication means [i.e., one or more sensors] via which communications, such as orders, instructions or control signals [i.e., receives a status of the drone] may be sent to the unmanned aerial vehicle 10, for instance, from a remote controller or air traffic control”; para. 0051: “The navigation control system 20 may also comprise at least one sensor input, via which information may be relayed from one or more sensors 22 of the unmanned aerial vehicle 12, and/or one or more data input for receiving information from other relevant sources, such as from the communications system 16.”) and transmits the status…for controlling the flight of the drone (Kelly, para. 
0046: “A communications system 16 [i.e., a controlling module] may be provided as a flight system of the unmanned aerial vehicle 10, preferably comprising a wireless communication means via which communications, such as orders, instructions or control signals [i.e., receives a status of the drone] may be sent to the unmanned aerial vehicle 10, for instance, from a remote controller or air traffic control”; para. 0051: “The navigation control system 20 may also comprise at least one sensor input, via which information may be relayed from one or more sensors 22 of the unmanned aerial vehicle 12, and/or one or more data input for receiving information from other relevant sources, such as from the communications system 16 [i.e., transmits the status…for controlling the flight of the drone].”); and …communicate through a peripheral communication bus (Kelly, para. 0057: “The unmanned aerial vehicle 10 is provided with an onboard flight controller 18 which is in communication with the or each flight system.”; Note: One of ordinary skill in the art, at the time of the application, would consider communication amongst processors, or between processors and peripheral devices, via a peripheral communication channel obvious.)… inter-layer communication within the system is through a printed circuit board (PCB) (Kelly, para. 0057: “The unmanned aerial vehicle 10 is provided with an onboard flight controller 18 which is in communication with the or each flight system.”; Note: One of ordinary skill in the art, at the time of the application, would consider a printed circuit board for communication between processors (i.e., motherboard) to be obvious.), and authorization certificates are verified before communication to ensure security (Kelly, para. 0047: “Preferably, the communications system 16 is configured such that the onboard flight controller 18 may include a communications verification circuit. 
This can be arranged to determine an authenticity of incoming communications signals to the unmanned aerial vehicle 10.”). Kelly does not appear to explicitly disclose the following: …an open-source operating module within a first layer, wherein the open-source operating module provides applications that support various drone application developers…includes one or more drivers… and …with a secure monitor…transmits the status to and through the secure monitor…each module runs on at least one independent processors, is installed in different processors…wherein the hardware of each layer of the system is implemented in isolation… However, in the same field of endeavor, Aan Den Toorn teaches: …an open-source operating module within a first layer, wherein the open-source operating module provides applications that support various drone application developers (Aan Den Toorn, pg. 8, lines 9-15: “The functionalities of the system can be extended with other HW modules and/or software modules that applicant also offers. A HW module with a matching software module may be added as a package, or a specific software module may be added later or replace an existing software module in a HW module. Software modules [i.e., open-source operating module within a first layer] may be configured by a supplier, such as a software company, or may be configured or changed by a user [i.e., module provides applications that support various drone application developers]. 
Changing of software may preferably be done with some restrictions, in order to guarantee proper functioning of the modular system as a whole.”)… Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success to modify the invention disclosed by Kelly, with the concept of a drone module that supports drone application developers, taught by Aan Den Toorn, in order to facilitate application developers in creating customized functionality for the drone (Aan Den Toorn, pg. 8, lines 16-19: “Readily available or configurable software modules comprise software modules for path planning, sensor fusion, simultaneous localization and mapping (SLAM), world modelling, trajectory planning, haptic feedback control, vision processing, Artificial Intelligence, deep learning, neural networks, swarm control, fleet management, etc.”). Kelly and Aan Den Toorn do not appear to explicitly teach the following: includes one or more drivers… and …with a secure monitor…transmits the status to and through the secure monitor…each module runs on at least one independent processors, is installed in different processors…wherein the hardware of each layer of the system is implemented in isolation… However, in the same field of endeavor, Liu teaches: …includes one or more drivers (Liu, Fig. 1: see annotated figure, above; pg. 1, paragraph 2: “Trusted Execution Environments (TEEs) [3] are widely used to power the security of embedded systems. TEE shields sensitive information in an execution environment which runs in isolation from the main operating system”; pg. 
3, section 2.2: “As described in the previous parts, TrustZone has been widely used as a cornerstone hardware technology for enabling TEE on ARM-based platforms.”; Note: It is inherent in the implementation of a trusted execution environment (TEE) to have a driver, or equivalent method, in the “normal world” (not-secure) to send requests to the TEE and for the TEE to have a “secure monitor” to receive those requests.)… and …with a secure monitor (Liu, Fig. 1: see annotated figure, above; pg. 3, 2nd paragraph: “Software stacks in the two worlds can be bridged via a new privileged instruction – Secure Monitor Call (SMC) running in monitor mode [i.e., secure monitor]. The Normal World software cannot access the Secure World’s resources while the latter can access all the resources.”)… …transmits the status to and through the secure monitor… (Liu, Fig. 1: see annotated figure, above; pg. 3, 2nd paragraph: “Software stacks in the two worlds can be bridged via a new privileged instruction – Secure Monitor Call (SMC) running in monitor mode [i.e., secure monitor]. The Normal World software cannot access the Secure World’s resources while the latter can access all the resources.”; Note: The secure monitor and secure monitor calls are what allows information to be transmitted from the trusted module to the requesting module.). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success to modify the invention disclosed by Kelly, as modified by Aan Den Toorn, with the concept of implementing a trusted execution environment, using Arm® TrustZone®, taught by Liu, in order to protect a UAV/drone from unauthorized access (Liu, pg. 
1, Introduction: “As real-time embedded systems become more complex and interconnected, certain sections or parts of the systems may need to be protected to prevent unauthorized access, or isolated to ensure functional/non-functional correctness. For example, consider an autonomous delivery drone used to transport packages to remote locations. The drone may communicate with the base station via real-time communication protocols to update mission-related information (e.g., new customer location). If those information are tampered with an ill-intentioned entity, the entire drone can be mis-routed, leading failed delivery [1].”). Kelly, Aan Den Toorn, and Liu do not appear to explicitly teach the following: …each module runs on at least one independent processors, is installed in different processors…wherein the hardware of each layer of the system is implemented in isolation… However, in the same field of endeavor, Chen teaches: …each module runs on at least one independent processors, is installed in different processors (Chen, para. 0005: “The drone [i.e., second layer] may be capable of executing one or multiple flight operations [i.e., second layer, runs on at least one independent processor] for a period of time as well as storing and transmitting the surveillance data to a server assembly. 
The server assembly of the system in turn may be operable for coordinating the drone and receiving the surveillance data…A user computing device may also be included in the system and in communication with the server assembly [i.e., third layer, runs on at least one independent processor] and the drone, the user computing device [i.e., a first layer] having a non-transitory storage medium, a processor for processing data [i.e., first layer, runs on at least one independent processor] (including the surveillance data) between the server assembly and the drone, and a user interface for receiving user input and displaying data transmitted from the drone.”)… …wherein the hardware of each layer of the system is implemented in isolation (Chen, para. 0005: “A user computing device [i.e., first layer; hardware of each layer is implemented in isolation] may also be included in the system [i.e., the system] and in communication with the server assembly [i.e., third layer; hardware of each layer is implemented in isolation] and the drone [i.e., second layer; hardware of each layer is implemented in isolation], the user computing device having a non-transitory storage medium, a processor for processing data (including the surveillance data) between the server assembly and the drone, and a user interface for receiving user input and displaying data transmitted from the drone.”). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success to modify the invention disclosed by Kelly, as modified by Aan Den Toorn and Liu, with the concept of having layers of a drone flight control system be implemented on independent processors and on isolated hardware, taught by Chen, in order to increase system redundancy, capability, and system reliability (Chen, para. 
0086: “An exemplary drone 220 useable with the drone security system 200 may meet one or more of the following criteria: 1. Flight time/range: a fully charged drone 220 may be able to fly for 10-30 minutes depending on drone type; 2. Loiter time: drone loiter time at each waypoint may be independently configurable; 3. Camera focus, tilt, and zoom control: camera may be controlled via app; 4. Safety systems: deployable parachute, radio-based self-position broadcast, emergency landing system, and e-stop functionality; and/or 5. System redundancy: drivetrain hardware, communication hardware, onboard computation hardware, and onboard sensors (e.g. IMU, GPS) may have backups or be able to perform in a reduced capacity.”). Conclusion Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Leah N Miller whose telephone number is (703)756-1933. The examiner can normally be reached M-Th 8:30am - 5:30pm ET. 
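The shortened-statutory-period rules recited in the conclusion above can be sketched numerically. This is an illustrative helper only, implementing just the scenario the action describes (reply within two months, advisory action mailed after the three-month date); the example dates are hypothetical, day-of-month clamping for short months is omitted, and 37 CFR 1.136(a) and MPEP § 706.07 remain controlling.

```python
from datetime import date


def add_months(d, months):
    # Simple calendar-month arithmetic; assumes the day exists in the target month
    m = d.month - 1 + months
    return date(d.year + m // 12, m % 12 + 1, d.day)


def extension_fee_start(final_mailed, reply_filed, advisory_mailed):
    """Date from which any 37 CFR 1.136(a) extension fee would be measured.

    Per the paragraph above: if the first reply is filed within TWO MONTHS of
    the final action and the advisory action mails after the THREE-MONTH
    shortened period, fees run from the advisory mailing date; otherwise from
    the three-month date.
    """
    three_month = add_months(final_mailed, 3)
    if reply_filed <= add_months(final_mailed, 2) and advisory_mailed > three_month:
        return advisory_mailed
    return three_month


def statutory_deadline(final_mailed):
    # The reply can never be later than SIX MONTHS from the mailing date
    return add_months(final_mailed, 6)


# Example using this action's Feb 17, 2026 mailing date (other dates invented)
final = date(2026, 2, 17)
print(extension_fee_start(final, date(2026, 4, 10), date(2026, 5, 28)))  # 2026-05-28
print(statutory_deadline(final))  # 2026-08-17
```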
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Flynn can be reached on (571) 272-9855. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /L.N.M./Examiner, Art Unit 3663 /ABBY J FLYNN/Supervisory Patent Examiner, Art Unit 3663

Prosecution Timeline

Oct 24, 2022
Application Filed
Sep 03, 2024
Non-Final Rejection — §102, §103, §112
Dec 06, 2024
Response Filed
Mar 03, 2025
Final Rejection — §102, §103, §112
Jun 06, 2025
Request for Continued Examination
Jun 11, 2025
Response after Non-Final Action
Aug 04, 2025
Non-Final Rejection — §102, §103, §112
Nov 10, 2025
Response Filed
Feb 17, 2026
Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585279
Navigating a robotic mower along a guide wire
2y 5m to grant Granted Mar 24, 2026
Patent 12579894
MULTI-LANE TRAFFIC MANAGEMENT SYSTEM FOR PLATOONS OF AUTONOMOUS VEHICLES
2y 5m to grant Granted Mar 17, 2026
Patent 12565229
SYSTEM FOR CONTROLLING VEHICLE BASED ON STATE OF CONTROLLER AND SYSTEM FOR CONTROLLING VEHICLE BASED ON COMMUNICATION STATE
2y 5m to grant Granted Mar 03, 2026
Patent 12560930
IDENTIFYING TRANSPORT STRUCTURES
2y 5m to grant Granted Feb 24, 2026
Patent 12552361
HYBRID VEHICLE
2y 5m to grant Granted Feb 17, 2026
Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
56%
Grant Probability
48%
With Interview (-8.3%)
3y 4m
Median Time to Grant
High
PTA Risk
Based on 32 resolved cases by this examiner. Grant probability derived from career allow rate.
