Prosecution Insights
Last updated: April 19, 2026
Application No. 18/181,366

Autonomous Drone System and Method

Final Rejection (§101, §103)
Filed: Mar 09, 2023
Examiner: KOLOSOWSKI-GAGER, KATHERINE
Art Unit: 3687
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Mavrik Technologies LLC
OA Round: 4 (Final)
Grant Probability: 26% (At Risk)
Projected OA Rounds: 5-6
Projected Time to Grant: 4y 3m
Grant Probability with Interview: 60%

Examiner Intelligence

Grants only 26% of cases:
Career Allow Rate: 26% (95 granted / 358 resolved; -25.5% vs TC avg)
Interview Lift: +33.6% (resolved cases with interview; strong +34% headline)
Avg Prosecution: 4y 3m (typical timeline)
Currently Pending: 54
Total Applications: 412 (career history, across all art units)

Statute-Specific Performance

§101: 35.0% (-5.0% vs TC avg)
§103: 33.9% (-6.1% vs TC avg)
§102: 14.5% (-25.5% vs TC avg)
§112: 12.5% (-27.5% vs TC avg)
Deltas are relative to the Tech Center average estimate • Based on career data from 358 resolved cases
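The headline figures above are simple ratios and offsets; a minimal sketch reproducing them from the reported counts (the function name and rounding are illustrative, not from any analytics API):

```python
# Reconstructing the examiner metrics shown above from the reported counts.
# The input numbers come from the report; everything else is illustrative.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

career = allow_rate(95, 358)        # ~26.5%, displayed as 26%
with_interview = career + 33.6      # interview lift applied: ~60.1%, shown as 60%
tc_average = career + 25.5          # implied Tech Center average: ~52.0%
```

Each statute row reads the same way: the rate plus the stated delta recovers that statute's Tech Center average (e.g. §101: 35.0% + 5.0% = 40.0%).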

Office Action

§101 §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

This action is in reference to the communication filed on 30 OCT 2025. Amendments to claims 1, 2, 11, 12, 21, and 22 have been entered and considered. Claims 1-30 are present and have been examined.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-30 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. As explained below, the claims are directed to an abstract idea without significantly more.

Step One: Is the claim directed to a process, machine, manufacture or composition of matter? YES. With respect to claims 1-30, the independent claims 1, 11, 21 recite a method, a computer program product, and a system, each of which is a statutory category of invention.

Step 2A – Prong One: Is the claim directed to a law of nature, a natural phenomenon (product of nature) or an abstract idea?
YES. With respect to claims 1-30, the independent claims (claims 1, 11, 21) are directed, in part, to “processing a medical assistance request from a requester; defining an incident location for the medical assistance request; assigning a(n) [responder] to the medical assistance request…; dispatching the [responder] to the incident location; navigating the assigned responder through a controlled airspace to avoid/observe rules/capabilities.”

These claim elements are considered to be abstract ideas because they are directed to a method of organizing human activity, which includes managing personal behavior or relationships or interactions, including following rules or instructions – processing a request and reacting to it by assigning a resource is following rules or instructions. The claims are further directed to a mental process as they are directed to concepts performed in the human mind, including observation, evaluation, judgment and opinion – receiving information about a request, and assigning and dispatching a responder to said incident, are all concepts that can be performed in the human mind. If a claim limitation, under its broadest reasonable interpretation, covers commercial and legal interactions, then it falls within the “method of organizing human activity” grouping of abstract ideas. If a claim limitation, under its broadest reasonable interpretation, covers concepts performed in the human mind, then it falls within the “mental process” grouping of abstract ideas. Accordingly, the claim recites an abstract idea.

Step 2A – Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application? NO. This judicial exception is not integrated into a practical application. In particular, the claims recite additional elements: claim 1 recites a computing device, claim 11 recites a computer readable medium having instructions as well as a processor, and claim 21 recites a processor and a memory.
Each of claims 1, 11, 21 further recites assigning and dispatching an autonomous drone, and wherein the request may be made from a virtual assistant or chatbot. The computing device, computer readable medium, and processor/memory in claims 1, 11, 21 are recited at a high level of generality and as such amount to no more than adding the words “apply it” to the judicial exception, or mere instructions to implement the abstract idea on a computer, or merely use the computer as a tool to perform the abstract idea (see MPEP 2106.05(f)), or generally link the use of the judicial exception to a particular technological field of use/computing environment (see MPEP 2106.05(h)). Examiner finds no improvement to the functioning of the computer or any other technology or technical field in the aforementioned elements as claimed (see MPEP 2106.05(a)), nor any other application or use of the judicial exception in some meaningful way beyond a general link between the use of the judicial exception and a particular technological environment (see MPEP 2106.05(e)). Examiner finds similarly with respect to the autonomous drone: neither the drone itself nor its functionality is being improved in a meaningful way, and at best the inclusion of a drone is a general link between the use of the judicial exception and a particular technological environment (see MPEP 2106.05(h)), or mere instructions to implement the abstract idea(s) on a computer/computing device as a tool (see MPEP 2106.05(f)). With regard to the chatbot and virtual assistant, Examiner finds similarly as to the drone – these elements are no more than a general link between the technology (i.e., communications technology) and the abstract idea(s) noted above. No improvement is present, nor any other meaningful limitation beyond “apply it.” Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
The claim is directed to an abstract idea. Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? NO. The independent claim(s) is/are additionally directed to claim elements: Claim 1 recites a computing device, claim 11 recites a computer readable medium having instructions as well as a processor, and claim 21 recites a processor and a memory. Each of claims 1, 11, 21 further recite assigning and dispatching an autonomous drone. When considered individually, the computing device/computer readable medium, as well as processor(s) and memory claim elements only contribute generic recitations of technical elements to the claims. It is readily apparent, for example, that the claim is not directed to any specific improvements of these elements. Examiner looks to Applicant’s specification in: [020] Examples of computing device 12 may include, but are not limited to: a personal computer, a server computer, a series of server computers, a mini computer, a mainframe computer, a smartphone, or a cloud-based computing platform. [021] The instruction sets and subroutines of drone navigation process 10s, which may be stored on storage device 16 coupled to computing device 12, may be executed by one or more processors (not shown) and one or more memory architectures (not shown) included within computing device 12. Examples of storage device 16 may include but are not limited to: a hard disk drive; a RAID device; a random access memory (RAM); a read-only memory (ROM); and all forms of flash memory storage devices. [029] Referring to FIGS. 2A-2E, there is shown autonomous drone 100. As is known in the art, an autonomous drone is a type of unmanned aerial vehicle (UAV) that is capable of operating without the need for direct human input or control. These drones can be programmed with pre-set flight paths and instructions, allowing them to navigate through an environment and complete specific tasks autonomously. 
As used in this disclosure, autonomous drone (e.g., autonomous drone 100) is intended to mean any drone that is capable of self-navigating (regardless of whether or not it is carrying people or payloads). [0038] As is known in the art, a virtual assistant is an AI-powered software application that can perform various tasks and services for users. Virtual assistants are designed to mimic human interactions and provide personalized assistance to users through natural language processing and machine learning algorithms. Virtual assistants can perform a wide range of tasks, including scheduling appointments, setting reminders, sending messages, making phone calls, ordering food, providing weather updates, answering questions, and even playing music or videos. Virtual assistants are commonly integrated into popular mobile devices, smart speakers, and other internet-connected devices, and can be accessed through voice commands or through text-based chat interfaces. Some examples of popular virtual assistants include Apple's Siri, Amazon's Alexa, Google Assistant, and Microsoft's Cortana. Virtual assistants have become increasingly popular in recent years as more people rely on technology to help them manage their daily tasks and activities. [0042] As is known in the art, a chatbot is a software program that uses artificial intelligence (AI) and natural language processing (NLP) to simulate human conversation through text interactions. Chatbots are designed to mimic human communication and provide personalized assistance to users, often in the form of automated customer service. Chatbots can be integrated into websites, messaging apps, or social media platforms, allowing users to interact with them through chat interfaces. Chatbots can perform a wide range of tasks, such as answering frequently asked questions, providing customer support, booking appointments, making reservations, and even providing recommendations. 
Chatbots use machine learning algorithms to understand and interpret user inputs, allowing them to respond appropriately and provide relevant information. They can also learn from user interactions over time, becoming more accurate and effective in their responses. Chatbots have become increasingly popular in recent years as more businesses adopt them to improve their customer service and streamline their operations. Finally, at [081] Any suitable computer usable or computer readable medium may be utilized. The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer-readable medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a transmission media such as those supporting the Internet or an intranet, or a magnetic storage device. The computer-usable or computer-readable medium may also be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. 
The computer-usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave. The computer usable program code may be transmitted using any appropriate medium, including but not limited to the Internet, wireline, optical fiber cable, RF, etc.

These passages, as well as others, make it clear that the invention is not directed to a technical improvement. When the claims are considered individually and as a whole, the additional elements noted above appear to merely apply the abstract concept to a technical environment in a very general sense – i.e., a generic computer receives information from another generic computer, processes the information and then sends information back. The most significant elements of the claims, that is, the elements that really outline the inventive elements of the claims, are set forth in the elements identified as an abstract idea. The fact that the generic computing devices are facilitating the abstract concept is not enough to confer statutory subject matter eligibility.

As per dependent claims 2-10, 12-20, 22-30: Dependent claims 2, 3, 12, 13, 22, 23 do not recite any additional abstract ideas beyond those identified above. However, they recite non-abstract elements including the use of an API to receive the request, as well as a location database, a GPS chipset, and the use of cell tower triangulation to obtain the location. While some of these elements are recited in the alternative, in the interest of compact prosecution Examiner finds that these elements as recited are at best a general link between the judicial exception(s) and the technological environment/field of use (see MPEP 2106.05(h)). No improvement to the communications nor the location identification means is found (see MPEP 2106.05(a)), and as such these elements do not recite a practical application.
Similarly, these elements do not constitute significantly more, at least per Applicant’s spec at [048]: “As is known in the art, a GPS chipset (e.g., GPS chipset 70) is a specialized integrated circuit that is used to receive, process, and decode signals from GPS (Global Positioning System) satellites. The GPS chipset (e.g., GPS chipset 70) is an essential component of GPS-enabled devices such as smartphones, smartwatches, and navigation systems.” Finally, at [050]: “As is known in the art, cell tower triangulation is a technique used to determine the approximate location of a mobile device (e.g., smartphone 32) by using the signal strength of nearby cell towers (not shown). This technique is often used when GPS or other location-based services are unavailable or inaccurate.” These passages clearly state a reliance on the application of a known technology to an abstract idea, and clearly do not identify any improvement to the functionality thereof. As such, these claims recite an abstract idea without significantly more.

Dependent claims 4, 5, 14, 15, 24, 25 are not directed to any additional abstract ideas and are also not directed to any additional non-abstract claim elements. Rather, these claims offer further descriptive limitations of elements found in the independent claims and addressed above – such as the types of incident information collected and its intended use. While these descriptive elements may provide further helpful context for the claimed invention, these elements do not serve to confer subject matter eligibility to the invention since their individual and combined significance is still not heavier than the abstract concepts at the core of the claimed invention.

Dependent claims 6, 7, 8, 9, 10, 16, 17, 18, 19, 20, 26, 27, 28, 29, 30 are not directed to any additional abstract ideas and are also not directed to any additional non-abstract claim elements.
Rather, these claims offer further descriptive limitations of elements found in the independent claims and addressed above – such as the uses and capabilities of the drone itself in the form of communication and transporting capabilities. Examiner notes that the drone itself is addressed with respect to the independent claims, and that these additional claims recite only descriptive matter regarding the capabilities of a drone and therefore do not merit additional consideration. While these descriptive elements may provide further helpful context for the claimed invention, these elements do not serve to confer subject matter eligibility to the invention since their individual and combined significance is still not heavier than the abstract concepts at the core of the claimed invention.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C.
103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-5, 9-15, 19-25, 29, 30 are rejected under 35 U.S.C. 103 as being unpatentable over Millhouse (US 20190041854 A1), in view of Kantor et al. (US 20160189101 A1), further in view of Philbin (US 20170289350 A1).

In reference to claims 1, 11, 21: Millhouse teaches: A computer-implemented method executed on a computing device (as in claim 1) (at least [fig 1, 3 and related text]); A computer program product residing on a computer readable medium having a plurality of instructions stored thereon which, when executed by a processor, cause the processor to perform operations (as in claim 11) (at least [fig 1, 3 and related text]); A computing system including a processor and memory configured to perform operations (as in claim 21) (at least [fig 1, 3 and related text]). All comprising: processing a medical assistance request from a requester (at least [fig 1, 2 and related text including 014, 020] “The mobile application 112 may also generate a user interface enabling the individual operating the mobile device to report an incident such as a medical emergency.” “As noted above, the mobile application 112 may also provide a user interface to self-report an incident…”); defining an incident location for the medical assistance request (at least [fig 1, 2 and related text including 014] “The mobile application 112 is configured to communicate with the incident computing device 105 via a communications network 114. The mobile computing device 108 is able to generate a location identifier 116, such as a location determined via GPS, Wi-Fi geolocation, or other location-based protocol.” – i.e.
incident location); assigning an autonomous drone to the medical assistance request, thus defining an assigned autonomous drone (at least [010] “The automated robotic vehicle may be, but is not limited to, a drone, an unmanned ground vehicle, an unmanned aerial vehicle, an autonomous guided vehicle or an autonomous cart.” and [fig 1, 4, and related text including 017, 034] “At step 408, the incident computing device 105 transmits the location of the incident occurrence to the automated robotic vehicle 110.”); dispatching the assigned autonomous drone to the incident location (at least [fig 2, 4 and related text] “The automated robotic vehicle 110 travels to the location 204 of the incident occurrence to provide emergency medical supplies for the incident occurrence to the individual. Depending on the type of incident, the incident computing device 105 may also send further instructions to the automated robotic vehicle regarding what to display on a display screen and what bins to unlock or prepare to unlock (once provided with authorized input).” “At step 410, the automated robotic vehicle 110 travels to the location of the incident occurrence to provide emergency medical supplies for the incident occurrence to an individual.”); and autonomously navigating the assigned autonomous drone through a controlled airspace to one or more of [predetermined obstacles] (at least [026] “Traffic control for the automated robotic vehicle 110 can be carried out locally or by software running on a fixed computer elsewhere in the facility. Local methods include zone control, forward sensing control, and combination control. For example, forward sensing control uses collision avoidance sensors to avoid the automated robotic vehicle 110 colliding with objects and customers in the area.”). While Millhouse teaches all of the limitations above, it does not specifically disclose the hazards as claimed.
Kantor however does teach: Autonomously navigating the autonomous drone through an airspace to one or more of Avoid undesirable weather conditions (at [011] “The UAV platform may utilize information associated with the UAV (e.g., components of the UAV, the requested flight path, etc.) to identify capabilities of the UAV and other information in the data storage. For example, the UAV platform may retrieve capability information associated with the UAV and/or other information (e.g., the weather information, the obstacle information, the regulatory information, the historical information, etc. associated with the geographical region) from the data storage.” At [020] “UAV platform 230 may calculate the flight path from the origination location to the destination location based on the capability information and/or other information (e.g., weather information, air traffic information, etc.), and may generate flight path instructions, for the flight path, that include delivery confirmation and/or safety instructions. UAV platform 230 may provide the flight path instructions to UAV 220, and UAV 220 may traverse the flight path until UAV 220 arrives at the destination location.” At [046] “In some implementations, UAV platform 230 may calculate the flight path based on the capability information associated with UAV 220 and the weather information. For example, UAV platform 230 may determine that, without weather issues, the flight path may take UAV 220 two hours to complete at an altitude of five-hundred meters. UAV platform 230 may further determine that wind conditions at five-hundred meters may create a headwind of fifty kilometers per hour on UAV 220, but that wind conditions at one-thousand meters may create a tailwind of fifty kilometers per hour on UAV 220. 
In such an example, UAV platform 230 may alter the flight path from an altitude of five-hundred meters to an altitude of one-thousand meters (e.g., if UAV 220 is capable of reaching the altitude of one-thousand meters). Assume that the tailwind at the altitude of one-thousand meters decreases the flight time from two hours to one hour and thirty minutes. Alternatively, UAV platform 230 may not alter the flight path, but the headwind at the altitude of five-hundred meters may increase the flight time from two hours to two hours and thirty minutes.”); Avoid restricted airspace (at least [039] “For example, if the credentials of UAV 220 include a government registration number of UAV 220, UAV platform 230 may compare the government registration number to the UAV account information in data storage 235 to determine whether UAV 220 is registered with a government agency to legally fly in airspace regulated by the government agency….” At [044] “UAV platform 230 may still deny the request for the flight path if UAV platform 230 determines that UAV 220 has not properly followed the maintenance schedule. This may enable UAV platform 230 to ensure that only properly maintained UAVs 220 are permitted to fly, which may increase safety associated with UAVs 220 utilizing airspace.”); and Observe available charge/range of the autonomous drone (at least [043] “UAV platform 230 may determine capability information for UAV 220 based on the request for the flight path and component information of UAV 220 (e.g., provided with the request for the flight path). For example, data storage 235 may include capability information associated with different components of UAVs 220, such as battery life, thrusts provided by rotors, flight times associated with amounts of fuel, etc. In some implementations, UAV platform 230 may utilize the component information of UAV 220 (e.g., UAV 220 has a particular type of battery, engine, rotors, etc.)
to retrieve the capability information for components of UAV 220 from data storage 235. For example, if UAV 220 has a particular type of battery and a particular type of rotor, UAV platform 230 may determine that the particular type of battery of UAV 220 may provide two hours of flight time and that the particular type of rotor may enable UAV 220 to reach an altitude of one-thousand meters.”). Millhouse and Kantor are analogous as both references disclose deployment of unmanned vehicles. One of ordinary skill in the art would have found the additional obstacle considerations as taught by Kantor to be an obvious addition to the collision considerations as taught by Millhouse, as weather/airspace considerations are common flying considerations. As such, these obstacles are essentially obvious variants of the obstacles taught by Millhouse. The combination as cited teaches all the limitations above, and while Millhouse as cited teaches an interface through which a user may “self-report” an incident, and one of ordinary skill in the art could infer a self-reporting interface to either take the form of a virtual assistant or chatbot, in the interest of compact prosecution Examiner finds that the combination does not specifically disclose requesting assistance from a virtual assistant nor a chatbot. Philbin however does disclose: A medical assistance request from a requester, including one or more of: Processing the medical assistance request from the requestor via voice-based virtual assistant (at least [fig 1 and related text] “The home 110 can have a virtual assistant 112. The virtual assistant 112 can receive verbal commands (from, e.g., the user) such as the emergency communication 106 and convert the received audio into data.
The data can be sent via, for example, from the virtual assistant 112 via a Wi-Fi connection 104 over the internet 102 and an IP connection 108 to a call-routing system (CRS) 170… For example, if the user says “call 911 I'm hurt”, the virtual assistant 112 can place an emergency call to the PSAP 150 and provide information regarding the user's medical history. The same action can also include notifying the user's emergency contacts or sending the emergency contacts a text message or placing a voice call to them. Further, the virtual assistant 112 can interact with the user to hone in on specific issues. For example, the virtual assistant 112 can ask further questions, such as “what's wrong? Are you in pain? Where does it hurt?” Triage, by the virtual assistant 112, in this manner can ensure the PSAP 150 dispatches an appropriate emergency responder 160, and that the emergency responder 160 provides the proper care, tailored to the emergency communication 106 and the user's needs.”); and Processing the medical assistance request from the requestor via chatbot. Philbin is analogous to Millhouse, as both references disclose a means of summoning emergency assistance through a networked request. As noted above, Millhouse teaches wherein the request is made via an interactive application, and similarly, Philbin discloses that the virtual home assistant interacts with an application as needed in order to make the request. As such one would have been motivated to include the virtual assistant means of communication as taught by Philbin, as Philbin teaches that a virtual assistant is generally ready to receive a command, which is especially pertinent as many households do away with traditional landlines, and further, provides for an additional means of calling for help if the user is somehow incapacitated. As such the inclusion of a virtual assistant would provide increased safety and access to emergency care. 
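The flight-time reasoning Kantor is cited for above (choosing an altitude against head/tailwinds, subject to a capability ceiling) can be sketched in code. This is a purely illustrative model, not anything from the cited references: the function names and the 300 km / 150 km/h figures are assumptions, and Kantor's quoted round-number times are themselves only illustrative rather than derived from one consistent airspeed.

```python
# Illustrative model of the altitude-selection reasoning attributed to Kantor
# (assumed figures; not from the reference).

def flight_time_h(distance_km: float, airspeed_kmh: float, wind_kmh: float) -> float:
    """Hours to cover the distance given a signed along-track wind
    (positive = tailwind, negative = headwind)."""
    ground_speed = airspeed_kmh + wind_kmh
    if ground_speed <= 0:
        raise ValueError("UAV cannot make headway against this wind")
    return distance_km / ground_speed

def pick_altitude(distance_km: float, airspeed_kmh: float,
                  winds_by_altitude_m: dict[int, float], ceiling_m: int) -> int:
    """Choose the reachable altitude (at or below the UAV's ceiling)
    that minimizes flight time."""
    times = {alt: flight_time_h(distance_km, airspeed_kmh, wind)
             for alt, wind in winds_by_altitude_m.items() if alt <= ceiling_m}
    return min(times, key=times.get)

# Headwind at 500 m, tailwind at 1000 m, as in Kantor's example:
best = pick_altitude(300, 150, {500: -50.0, 1000: 50.0}, ceiling_m=1000)
# best == 1000: the tailwind altitude wins if the UAV can reach it.
# With a 500 m ceiling, the platform must accept the slower headwind leg.
```

Under these assumptions, the cited behavior falls out directly: the same request yields different flight plans depending on the UAV's capability information (its ceiling) and the weather at each altitude.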
In reference to claims 2, 12, 22: Millhouse further teaches: wherein processing a medical assistance request from a requester includes one or more of: processing the medical assistance request from the requester via an application program interface (at least [fig 1, 2 and related text including 014, 020] “The mobile application 112 may also generate a user interface enabling the individual operating the mobile device to report an incident such as a medical emergency.” “As noted above, the mobile application 112 may also provide a user interface to self-report an incident…”).

In reference to claims 3, 13, 23: Millhouse further teaches: wherein defining an incident location for the medical assistance request includes one or more of: obtaining the incident location from the requester (at least [fig 1 and related text including 014, 020] self-reported incident includes location); obtaining the incident location from a location database; obtaining the incident location from a GPS chipset included within a handheld electronic device (at least [fig 1 and related text including 014] “The mobile application 112 is configured to communicate with the incident computing device 105 via a communications network 114. The mobile computing device 108 is able to generate a location identifier 116, such as a location determined via GPS, Wi-Fi geolocation, or other location-based protocol.”); and obtaining the incident location via cell tower triangulation of a handheld electronic device.

In reference to claims 4, 14, 24: Millhouse further teaches: wherein processing a medical assistance request from a requester includes: identifying an incident type for the medical assistance request (at least [021, 034] “Upon identifying an incident occurrence, the mobile application 112 automatically transmits incident information to an incident computing device 105. The incident information includes a location of the incident occurrence and an incident type. FIG.
2 identifies an exemplary location 204 of an incident occurrence. The incident type is at least one of a fall, a collision, and a medical incident depending on the incident occurrence.”, see also [017] for further discussion of incident types).

In reference to claims 5, 15, 25: Millhouse further teaches: wherein assigning an autonomous drone to the medical assistance request, thus defining an assigned autonomous drone includes: assigning an autonomous drone to the medical assistance request based, at least in part, upon the incident type (at least [fig 1, 5, and related text]; assigned bins 502 are shown in Fig 5 and related text: “…bins 122 are locked and organized based on incident types. In such an embodiment, the automated robotic vehicle 110 is further configured to receive an identification of the incident type from the incident computing device 105 and automatically unlock one or more bins of the bins 122 based on the incident type… As a non-limiting example, the automated robotic vehicle may have one bin that includes supplies used to treat diabetics, such as insulin and needles and may have a separate second bin that is used to treat cardiac situations such as nitroglycerin pills or cardiac stimulation devices. It will be appreciated that in other embodiments the automated robotic vehicle 110 may include only a single bin.”).

In reference to claims 9, 19, 29: Millhouse further teaches: wherein the assigned autonomous drone is configured to communicate with a medical facility (at least [019] “The automated robotic vehicle 110 transmits the identifier to the incident computing device 105 via communications network 114. The incident computing device 105 retrieves medical records associated with the identifier from a remote database 130, such as a hospital database or a medical records repository. The incident computing device 105 transmits the medical records to the automated robotic vehicle 110.
The automated robotic vehicle 110 displays the medical records on the display 124, enabling, for example, the responder to view the medical records of the individual.”)

In reference to claims 10, 20, 30, Millhouse further teaches: wherein the assigned autonomous drone is configured to communicate with a subject of the medical assistance request (at least [016] “…the automated robotic vehicle 110 includes at least one of a camera 126 and a microphone 128, and provides a transceiver or other two way communication capability…”; at [015, 021] the display on the autonomous robot may include instructions for the subject and/or responders).

Claims 6, 8, 16, 18, 26, 28 are rejected under 35 U.S.C. 103 as being unpatentable over Millhouse in view of Kantor, in view of Philbin, and further in view of Wilson (“This Drone Ambulance Is Totally Wild, And Totally Inevitable,” available at: https://www.fastcompany.com/3041696/this-drone-ambulance-is-totally-wild-and-totally-inevitable, hereinafter Drone Ambulance).

In reference to claims 6, 16, 26, Millhouse/Kantor/Philbin teaches all the limitations above. Millhouse further teaches wherein the medical professional is able to communicate with the drone and with the patient (see 016, 017, 019). However, Millhouse does not specifically disclose movement of a medical professional. Drone Ambulance, however, does disclose: wherein the assigned autonomous drone is configured to transport a medical professional to the incident location (second paragraph: “It’s a one-person ambulance drone modeled after a standard quadcopter–driven by a GPS, pilot, or combination of both–that could be dispatched to an emergency scene with a single EMT. It’s designed to land almost anywhere, thanks to a footprint the size of a compact car.
The EMT stabilizes the patient, loads him up, and sends him back to the hospital for further treatment.”)

Millhouse and Drone Ambulance are analogous references in that both extol the virtues of drone technology in providing fast and effective medical care. One of ordinary skill in the art would find it obvious to include drone transport of a medical professional, as taught by Drone Ambulance, as Drone Ambulance teaches: “In emergencies, seconds count. An estimated 1,000 “saveable” lives are lost a year because of slow emergency response in the nation’s biggest cities. But in traffic-jammed urban environments, how can a four-wheeled ambulance be expected to make it anywhere and back quickly?” – i.e., the use of a drone ambulance would be a massive improvement in emergency triage. Drone Ambulance further teaches that, as a single pilot could manage an entire fleet of ambulances remotely, the ambulances would have a far greater reach and benefit in situations of traffic congestion and confusing terrain. Finally, Drone Ambulance teaches: “It’s basic product innovation: Faster, cheaper, better,” Rolston says. “Many more of these would cost less to service.” As such, it would have been obvious to include personnel transport as taught by Drone Ambulance in the drone medical attention system as taught by Millhouse.

In reference to claims 8, 18, 28, Millhouse/Kantor/Philbin teaches all the limitations above. Millhouse further teaches: wherein the patient is able to communicate with a medical professional via the drone, and wherein the drone can communicate with a hospital (see 016, 017, 019). However, Millhouse does not specifically disclose transporting a subject to a medical facility. Drone Ambulance, however, does disclose: wherein the assigned autonomous drone is configured to transport a subject of the medical assistance request to a medical facility.
(Second paragraph: “It’s a one-person ambulance drone modeled after a standard quadcopter–driven by a GPS, pilot, or combination of both–that could be dispatched to an emergency scene with a single EMT. It’s designed to land almost anywhere, thanks to a footprint the size of a compact car. The EMT stabilizes the patient, loads him up, and sends him back to the hospital for further treatment.”)

Millhouse and Drone Ambulance are analogous references in that both extol the virtues of drone technology in providing fast and effective medical care. One of ordinary skill in the art would find it obvious to include drone transport of a patient to a medical facility, as taught by Drone Ambulance, as Drone Ambulance teaches: “In emergencies, seconds count. An estimated 1,000 “saveable” lives are lost a year because of slow emergency response in the nation’s biggest cities. But in traffic-jammed urban environments, how can a four-wheeled ambulance be expected to make it anywhere and back quickly?” – i.e., the use of a drone ambulance would allow for many more lives saved, as those patients would be quickly transported. Drone Ambulance further teaches that, as a single pilot could manage an entire fleet of ambulances remotely, the ambulances would have a far greater reach and benefit in situations of traffic congestion and confusing terrain. Finally, Drone Ambulance teaches: “It’s basic product innovation: Faster, cheaper, better,” Rolston says. “Many more of these would cost less to service.” As such, it would have been obvious to include patient transport as taught by Drone Ambulance in the drone medical attention system as taught by Millhouse.

Claims 7, 17, 27 are rejected under 35 U.S.C. 103 as being unpatentable over Millhouse in view of Kantor, in view of Philbin, and further in view of McCain et al. (US 20230047041 A1, hereinafter McCain).

In reference to claims 7, 17, 27, Millhouse/Kantor/Philbin teaches all the limitations above.
Millhouse further teaches the use of a camera while locating the patient (see 016), but does not explicitly teach searching for the subject. McCain teaches: wherein the assigned autonomous drone is configured to search the incident location for a subject of the medical assistance request (at least [fig 3 and related text] “At block 310, the method of surveillance program 175, determines whether the user is located. In an embodiment, the method utilizes client device 110 to locate a user of client device 104. In another embodiment, the method utilizes a machine learning model and images of client device 110 to locate a user of client device 104. In one embodiment, if the method determines that client device 110 does not locate a user of client device 104, then the method continues to utilize client device 110 to traverse the defined path or alternative defined paths until all paths are traversed.”).

McCain and Millhouse are analogous references, as both disclose the use of drones in providing emergency medical assistance, specifically using location information of the subject. One of ordinary skill in the art would have found it obvious to include or modify the camera of Millhouse to search for the subject requesting assistance, as taught by McCain, as McCain teaches that many situations in which a drone medical response is especially helpful do not have cellular or other communication services: “Various embodiments of the present disclosure recognize that hiking is a common source of injury in the wild, accounting for thousands of deaths annually. Also, falls account for 17% of all unintentional deaths and thousands of injuries annually. Embodiments of the present disclosure provide a system and method to locate, assist, and communicate with hikers and personnel in emergency situations where time is of the essence.
Additionally, hikers are often without cellular service, local access networks, or any other method of communication in the event of an emergency.” (see 014) As such, a need for more precisely locating an injured subject during a stressful time would have represented an obvious improvement to the functionality of the geolocation as taught by both references, and to the camera of Millhouse or any other autonomous features of a given drone.

Response to Arguments

Applicant’s remarks as filed on 30 OCT 2025 have been fully considered. Applicant’s remarks regarding the 101 rejection begin on page 8. Applicant asserts that the claim is not directed to an abstract idea; Examiner respectfully disagrees, as noted above. Examiner respectfully notes that the “responder” is analyzed as part of the abstract idea, while the application thereof via a drone is analyzed at subsequent steps. Dispatching is a mental process. Navigating is also a mental process.

Applicant discusses a practical application on page 9. Examiner notes Applicant’s remarks regarding the drone; however, the disclosure itself makes it clear that the drone itself is in no way improved, nor is any other technical area. Applicant’s application is not itself directed to a drone, merely the use of one to provide medical assistance. Applicant’s specification does not support a finding of a particular machine as claimed, as the Application relies on known capabilities of existing drones for implementation. The solution is itself technical; however, the problem solved is not. Examiner notes that the limitations referenced on page 8 are analyzed at a subsequent step of the analysis and are not asserted to be a part of the abstract idea. Applicant’s remaining remarks do not require any further response from Examiner.

Applicant’s remarks regarding the prior art begin on page 10 of the remarks. Applicant reproduces claim 1 on page 10 and provides additional citations to the specification to support the amendments on pages 10 and 11.
Examiner notes the remarks; however, Examiner finds them to be moot in view of the new grounds of rejection identified above.

Relevant Prior Art

The following prior art, while not cited, is believed to be relevant and as such is made a part of the record:

US 20180068567 A1, to Gong, discloses geofencing UAVs and providing situational navigation assistance during flight.
US 20230227183, to Sampsel, discloses the use of a drone to provide medical assistance.
US Patent 11288936, to Kumar, discloses the use of autonomous vehicles in emergency monitoring and detection.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KATHERINE KOLOSOWSKI-GAGER, whose telephone number is (571) 270-5920. The examiner can normally be reached Monday - Friday. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Mamon Obeid, can be reached at 571-270-1813. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/KATHERINE KOLOSOWSKI-GAGER/
Primary Examiner, Art Unit 3687

Prosecution Timeline

Mar 09, 2023
Application Filed
Sep 25, 2024
Non-Final Rejection — §101, §103
Dec 19, 2024
Response Filed
Jan 11, 2025
Final Rejection — §101, §103
Apr 16, 2025
Response after Non-Final Action
May 19, 2025
Request for Continued Examination
May 22, 2025
Response after Non-Final Action
May 23, 2025
Non-Final Rejection — §101, §103
Oct 30, 2025
Response Filed
Jan 10, 2026
Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12499467
PREDICTING THE EFFECTIVENESS OF A MARKETING CAMPAIGN PRIOR TO DEPLOYMENT
2y 5m to grant Granted Dec 16, 2025
Patent 12462273
SYSTEM AND METHOD FOR USING DEVICE DISCOVERY TO PROVIDE ADVERTISING SERVICES
2y 5m to grant Granted Nov 04, 2025
Patent 12462938
MACHINE-LEARNING MODEL FOR GENERATING HEMOPHILIA PERTINENT PREDICTIONS USING SENSOR DATA
2y 5m to grant Granted Nov 04, 2025
Patent 12444507
BAYESIAN CAUSAL INFERENCE MODELS FOR HEALTHCARE TREATMENT USING REAL WORLD PATIENT DATA
2y 5m to grant Granted Oct 14, 2025
Patent 12437315
SYSTEMS AND METHODS FOR DYNAMICALLY DETERMINING EVENT CONTENT ITEMS
2y 5m to grant Granted Oct 07, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

5-6
Expected OA Rounds
26%
Grant Probability
60%
With Interview (+33.6%)
4y 3m
Median Time to Grant
High
PTA Risk
Based on 358 resolved cases by this examiner. Grant probability derived from career allow rate.
