DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Examiner’s Note
Examiner has cited particular paragraphs/columns and line numbers or figures in the references as applied to the claims below for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. It is respectfully requested that, in preparing responses, applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or as disclosed by the examiner. Applicant is reminded that the Examiner is entitled to give the broadest reasonable interpretation to the language of the claims. Furthermore, the Examiner is not limited by definitions that Applicant has not specifically set forth in the claims.
Specification
The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant's cooperation is requested in correcting any errors of which applicant may become aware in the specification.
Status of Application
Claims 1-20 are pending in this application. In the claim set filed 10/14/2025:
Claims 1, 8, and 15 are the independent claims in the application.
Claims 1, 4, 6, 8, 11, 15, and 18 have been indicated as amended.
Claims 2, 3, 5, 7, 9, 10, 12-14, 16, 17, 19, and 20 have been indicated as originally presented.
Response to Arguments
With respect to Applicant's remarks filed on 10/14/2025, Applicant's "Amendments and Remarks" have been fully considered. Applicant's remarks will be addressed in the sequential order in which they were presented.
With respect to the objections to claims 4, 6, 11, and 18, Applicant's "Amendments and Remarks" have been fully considered and are persuasive. Therefore, the objections to claims 4, 6, 11, and 18 have been withdrawn.
With respect to the rejection of claims 1-20 under 35 U.S.C. § 101, Applicant's "Amendments and Remarks" have been fully considered and are persuasive. Therefore, the rejection of claims 1-20 under 35 U.S.C. § 101 has been withdrawn.
With respect to the rejections of claims 1-20 under 35 U.S.C. § 102(a)(1) and 35 U.S.C. § 103, Applicant's "Amendments and Remarks" have been fully considered and are persuasive. Therefore, the rejections of claims 1-20 under 35 U.S.C. § 102(a)(1) and 35 U.S.C. § 103 have been withdrawn.
Office Note: Due to Applicant's amendments, new grounds of rejection are set forth in the Final Office Action below.
Final Office Action
Claim Objections
Claim 6 is objected to because of the following minor informalities:
Claim 6 contains a minor antecedent basis issue as follows: "The autonomous vehicle of claim 1, wherein the at least one processor is further configured to: determine presence of the emergency situation and absence of the emergency vehicle at the site of the emergency situation based on one or more machine-learning models trained to identify the emergency situation." However, claim 1, from which claim 6 depends, already recites: "determine, from the sensor data, a presence of an emergency situation" and "determine, from the sensor data, an absence of an emergency vehicle at a site of the emergency situation."
Therefore, claim 6 should be amended to instead recite: “The autonomous vehicle of claim 1, wherein the at least one processor is further configured to: determine the presence of the emergency situation and the absence of the emergency vehicle at the site of the emergency situation based on one or more machine-learning models trained to identify the emergency situation.”
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1, 3-8, 10-15, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Graham et al. (U.S. Patent No. 12,087,158 B1) in view of Bradley (U.S. Patent Application Publication No. 2024/0111305 A1), hereinafter referred to as Graham and Bradley, respectively.
With respect to claim 1, Graham discloses:
“An autonomous vehicle, comprising: one or more sensors positioned on a body of the autonomous vehicle”[Graham; "With continued reference to FIG. 1 , data analysis engine 120, traffic impact engine 130, and sensor management engine 140 are logical entities that may be implemented in the form of software (e.g., computer-executable instructions) stored in a memory of, and executing on a processor of unmanned vehicle 100, server 150, or another computer system such as illustrated in FIG. 4;" Fig. 4; Col: 3, Lines: 66-67; Col: 4, Lines: 1-5];
“at least one processor; at least one memory storing instructions, which, when executed by the at least one processor, configure the at least one processor to:”[Graham; "With continued reference to FIG. 1 , data analysis engine 120, traffic impact engine 130, and sensor management engine 140 are logical entities that may be implemented in the form of software (e.g., computer-executable instructions) stored in a memory of, and executing on a processor of unmanned vehicle 100, server 150, or another computer system such as illustrated in FIG. 4;" Fig. 4; Col: 3, Lines: 66-67; Col: 4, Lines: 1-5];
“receive sensor data from the one or more sensors; determine, from the sensor data, a presence of an emergency situation, wherein the emergency situation impacts an object other than the autonomous vehicle”[Graham; "For example, unmanned vehicle 100 may be incorporated into a nearby light pole, a nearby traffic signal, an emergency vehicle (e.g., police, fire, or ambulance), a structure that is built particularly for unmanned vehicles, or the like. In an example, unmanned vehicle 100 may gather information about the scene of the accident. The information may be from other mobile devices in the area (e.g., mobile devices of nearby motorists or pedestrians), other unmanned vehicles in the area, sensors located on the unmanned vehicle (e.g., video camera, photo camera, infrared camera, motion detectors, etc.), or the like. An unmanned vehicle may fly into the air and begin taking photos of the car crash for information and immediately process the scene of the accident. Any gathered information may be assessed all or in part locally or remotely (e.g., server 150);" Fig. 2; Col: 2, Lines: 34-50];
“determine, from the sensor data, an absence of an emergency vehicle at a site of the emergency situation”[Graham; "If EMT or other emergency vehicles (e.g., police or fire) are in route to the accident in area 147, unmanned vehicle may alert vehicles in the way (between the emergency vehicle and the accident) that EMS is arriving for a particular vehicle at a particular location. This may assist in vehicles proactively pulling over in anticipation of an emergency vehicle. A map may be displayed to a device associated with a user of the device/vehicle. The map may include the position of the emergency vehicle and the vehicle of the user in traffic. The emergency vehicle or mobile devices associated with emergency personnel may be provided a map of vehicle positions (e.g., particularly occupied and unoccupied lanes) so the emergency vehicle can efficiently navigate traffic to arrive at the accident in area 147. Unmanned vehicle 100 may assist in clearing a path for the emergency vehicle en route to the accident in area 147. Whether a vehicle is en route may be based on examining GPS coordinates associated with the emergency vehicle;" Col: 5, Lines: 12-39];
“in accordance with the determining, trigger a first responder mode of the autonomous vehicle; in response to triggering of the first responder mode, serve functions of a first responder by controlling movement of the autonomous vehicle to block traffic in the emergency situation”[Graham; "Unmanned vehicle 100 may fly around an area 147 and along nearby roads, and set digital points for navigation or mapping programs (e.g., Google maps, Garmin, auto manufacturers, Siri, etc.), which provide a warning that a lane is closed because of a vehicular crash or the like. Autonomous cars may immediately be communicated with by unmanned vehicle 100 or through remote server 150. The information about the accident may direct autonomous vehicles to switch over to another lane, avoid an accident area, or assist in the directing of traffic via displays on the autonomous vehicle;" Col: 2, Lines: 51-60;
"In an example scenario, after unmanned vehicle 100 is deployed in response to an accident, unmanned vehicle 100 may travel 50 yards along a road and flash an arrow indicating to drivers to move over to the other lane. The flashing arrow or other graphic or text may be placed on a retractable surface of on unmanned vehicle 100. Unmanned vehicle 100 may know which direction to fly along the road based on an assessment of the accident scene;" Col: 3, Lines: 7-14];
While Graham discloses a wireless network that enables wireless communication between a UAV and a remote server [Graham; Fig. 1; Col: 7, Lines: 38-62], Graham does not specifically state:
“a transceiver;” in which the system is configured to: “transmit, via the transceiver to a computing system, a notification corresponding to the emergency situation, the notification including a geolocation of the site of the emergency situation;” and “receive, via the transceiver from the computing system, a request to transmit particular sensor data of a sensor of the one or more sensors; and transmit, via the transceiver to the computing system, the requested particular sensor data.”
Bradley, which is in the same field of endeavor of control systems/methods for UAVs, teaches:
“a transceiver” [Bradley; "Sensors 476 can include sensors including cameras as well as those configured to measure ambient temperature, cabin temperature, moisture, interior cabin pressure of a vehicle, accelerometers to detect acceleration, telemetry data, location data, etc. Sensors 476 can also include an inertial measurement unit (IMU) having one or more of an accelerometer, a gyroscope, and a magnetometer which may be used to estimate acceleration and speed of UAV 270. Sensors 476 can also include infrared sensors, thermal sensors, LIDAR sensors, GPS sensors, magnetic sensors, current sensors, and the like. Sensors 476 can include wireless transceivers so as to transmit sensor data (e.g. temperature data feeds, moisture data feeds, pressure feeds, accelerometer feeds, telemetry data feed, location data feeds, etc.). In some aspects, sensors 476 can be used to anticipate, by utilizing AI models, a pending threat based on historical data prior to it happening, as discussed more particularly below in FIG. 3 . Based on information from the one or more sensors 476, as soon as an initial threat is detected one or more UAVs 270 can be immediately launched. Sensors 476 of UAV 270 can also include one or more cameras with night vision, infrared cameras, microphones, and the like, so as to allow UAV 270 to capture video and provide a live feed and perform threat assessment logic at night or in low light conditions;" Fig. 4; ¶: 0132; See also: ¶: 0158, 0161];
“transmit, via the transceiver to a computing system, a notification corresponding to the emergency situation, the notification including a geolocation of the site of the emergency situation” [Bradley; "In some aspects, this disclosure relates to a system configured to enhance safety of citizens and state officials (e.g., law enforcement officers as well as other first responders). In operation, the system can be configured to enhance safety of citizens who encounter state officials, such as law enforcement personnel. The system can include one or more UAVs that relay information to each other, a central command center, and/or a social media network in a transparent and unbiased format. Each UAV can be equipped with sensors and firmware to assist in relaying salient information of an event to appropriate personnel (e.g., a law enforcement officer who is closest to a site of interest), assess the situation, and safety level assessment (e.g., the type of situation and related safety of an encounter between a citizen and police) as well as notify other citizens and law enforcement officials regarding the event (e.g., via a remote command center, notifications in one or more social media networks, emergency push notifications to devices within a geofence of an area, an alarm and/or LED light patterns, etc.);" ¶: 0120; See also: ¶: 0158, 0161];
“receive, via the transceiver from the computing system, a request to transmit particular sensor data of a sensor of the one or more sensors; and transmit, via the transceiver to the computing system, the requested particular sensor data” [Bradley; "In some aspects, a combination of cellphones, vehicle 260 sensors, sensors 476 on UAV 270, speaker systems, AI, and/or other sensors can enable a user to communicate with vehicle 260 and/or intruder 420. For example, vehicle 260 may be equipped with an AI-powered communication system that integrates with the vehicle's 260 cameras, sensors, and speaker system. This system would process data from the cameras and sensors, allowing a user remotely monitor vehicle's 260 surroundings in real-time by communicating all the data to a mobile device. In another example, a mobile app may connect to vehicle's 260 AI-powered communication system. The mobile app may have access to live camera feeds from vehicle's 260 cameras, and receive alerts about security events, and communicate with vehicle 260 remotely. In another example, a UAV 270 equipped with camera(s) may be integrated in the above system. For example, if the AI system detects a potential security threat or intrusion, you can deploy the UAV 270 to get a better view of the situation. The UAV 270 camera feed may be streamed to a mobile device, enabling a user to assess the situation from different angles;" ¶: 0191;
"In step 1004, the method may include transmitting, by the one or more UAVs, the video and/or audio data to a user device of the vehicle's owner;" ¶: 0211; See also: Fig. 4; ¶: 0132, 0158, 0161].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system/method for controlling a UAV to perform traffic control upon detecting a vehicular accident, as disclosed by Graham, to incorporate the teachings regarding using an on-board transceiver of a UAV to transmit information to and from a remote computing system, for example to relay information from the UAV to emergency personnel, as taught by Bradley, with a reasonable expectation of success. By combining these teachings, the outcome is a system/method for controlling a UAV to perform traffic control upon detecting a vehicular accident that is more robust in its ability to assess patients at the site of an accident before paramedics arrive at the scene, thereby enabling emergency responders to respond to critical situations more effectively and efficiently, for example [Bradley; ¶: 0134, 0150, 0151].
With respect to claim 3, Graham discloses: “wherein the one or more sensors include at least one or more acoustic sensors, one or more cameras, one or more radio detection and ranging (RADAR) sensors, or one or more light detection and ranging (LiDAR) sensors”[Graham; "Alternatively, unmanned vehicle 100 may be located at a nearby structure 131. For example, unmanned vehicle 100 may be incorporated into a nearby light pole, a nearby traffic signal, an emergency vehicle (e.g., police, fire, or ambulance), a structure that is built particularly for unmanned vehicles, or the like. In an example, unmanned vehicle 100 may gather information about the scene of the accident. The information may be from other mobile devices in the area (e.g., mobile devices of nearby motorists or pedestrians), other unmanned vehicles in the area, sensors located on the unmanned vehicle (e.g., video camera, photo camera, infrared camera, motion detectors, etc.), or the like. An unmanned vehicle may fly into the air and begin taking photos of the car crash for information and immediately process the scene of the accident. Any gathered information may be assessed all or in part locally or remotely (e.g., server 150);" Col: 2, Lines: 34-50].
With respect to claim 4, Graham discloses: “wherein the at least one processor is further configured to transmit, via the transceiver to the computing system, three dimensional (3D) visualization images generated based on the sensor data from one or more radio detection and ranging (RADAR) sensors, one or more light detection and ranging (LiDAR) sensors, or one or more cameras”[Graham; "Alternatively, unmanned vehicle 100 may be located at a nearby structure 131. For example, unmanned vehicle 100 may be incorporated into a nearby light pole, a nearby traffic signal, an emergency vehicle (e.g., police, fire, or ambulance), a structure that is built particularly for unmanned vehicles, or the like. In an example, unmanned vehicle 100 may gather information about the scene of the accident. The information may be from other mobile devices in the area (e.g., mobile devices of nearby motorists or pedestrians), other unmanned vehicles in the area, sensors located on the unmanned vehicle (e.g., video camera, photo camera, infrared camera, motion detectors, etc.), or the like. An unmanned vehicle may fly into the air and begin taking photos of the car crash for information and immediately process the scene of the accident. Any gathered information may be assessed all or in part locally or remotely (e.g., server 150);" Col: 2, Lines: 34-50].
With respect to claim 5, Graham discloses: “further comprising one or more display devices mounted on one or more exterior sides of the autonomous vehicle; and wherein the at least one processor is further configured to display one or more messages on the one or more display devices to alert a passerby of the emergency situation or to request help, or initiate a two-way communication between the passerby and an agent at a mission control, a police department, a fire department, or an emergency medical services department”[Graham; "There may be multiple different visual alerts such as: a graphic or text displayed on a screen attached with unmanned vehicle 100; a graphic or text displayed on a screen communicatively connected with the unmanned vehicle 100 or vehicle 132; a graphic or text displayed on a screen in proximity to vehicle 132 or unmanned vehicle 100; a graphic or text displayed on a screen of mobile device 133 or vehicle 132; or a light or flashing light (e.g., strobe, rotating, vehicle signal or emergency lights) attached or communicatively connected with unmanned vehicle 100 or vehicle 132. It is contemplated that multimedia alerts (e.g., audio or video) may be communicated to devices discussed herein (e.g., in system 90);" Col: 2, Lines: 61-67; Col: 3, Lines: 1-6].
With respect to claim 6, Graham does not specifically state: “wherein the at least one processor is further configured to: determine presence of the emergency situation and absence of the emergency vehicle at the site of the emergency situation based on one or more machine-learning models trained to identify the emergency situation including an accident, a fire, a smoke, a damaged property, or an injured person, based at least in part on acoustic data or visual data of the sensor data using one or more radio detection and ranging (RADAR) sensors, one or more light detection and ranging (LiDAR) sensors, one or more cameras, one or more ultrasound sensors, or one or more microphones.”
Bradley teaches: “wherein the at least one processor is further configured to: determine presence of the emergency situation and absence of the emergency vehicle at the site of the emergency situation based on one or more machine-learning models trained to identify the emergency situation including an accident, a fire, a smoke, a damaged property, or an injured person, based at least in part on acoustic data or visual data of the sensor data using one or more radio detection and ranging (RADAR) sensors, one or more light detection and ranging (LiDAR) sensors, one or more cameras, one or more ultrasound sensors, or one or more microphones”[Bradley; "In some aspects, the one or more UAV threat response operations include one or more UAVs launching from the vehicle to identify an intruder associated with the one or more threats, causing the one or more UAVs to follow and/or distract the intruder from trying to harm or break into the vehicle, alerting, by the one or more UAVs, one or more persons in a vicinity of the vehicle regarding the determined one or more threats, and tracking and/or identifying, by the one or more UAVs, a location of a police officer in the vicinity and/or guide the police officer to the vehicle and/or the intruder. In some aspects, the UAVs are configured to identify citizens or suspects through AI facial recognition wherein deep learning models are used. The deep learning models can be trained with facial data from private and/or government databases containing facial data. In some aspects, the UAVs are configured to identify citizens or suspects through AI voice recognition wherein deep learning models are used;" ¶: 0059;
"In operation, controller 309 can be configured to manage data streams from each corresponding UAV 270. Such data streams can be used by controller 309 to perform threat assessment logic so as to analyze and identify, by utilizing AI, possible events arising to a predetermined threshold and initiate corresponding response actions in a distributed environment. Examples of such data streams can include audio feeds, video feeds, image feeds, related historical data, and any other feedback received from UAV 270 to identify events of interest (e.g., a conflict arising between a law enforcement officer and a citizen, an intruder breaking into a structure or a vehicle, a fire that is actively burning, an active shooter in a public setting, etc.). In some aspects, data processed by controller 309 can include a status feed of a nearby crowd-sourced data mesh (e.g., from database 233) to identify events of interest so as to identify and/or predict events of interest or threats based thereon;" ¶: 0169; See also: Fig. 1; ¶: 0042, 0043, 0088, 0122, 0123].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system/method for controlling a UAV to perform traffic control upon detecting a vehicular accident, as disclosed by Graham, to incorporate the teachings regarding using one or more trained machine-learning models to identify emergency situations from the sensor data gathered by a UAV, as taught by Bradley, with a reasonable expectation of success. By combining these teachings, the outcome is a system/method for controlling a UAV to perform traffic control upon detecting a vehicular accident that is more robust in its ability to assess patients at the site of an accident before paramedics arrive at the scene, thereby enabling emergency responders to respond to critical situations more effectively and efficiently, for example [Bradley; ¶: 0134, 0150, 0151].
With respect to claim 7, Graham discloses: “wherein the at least one processor is further configured to trigger the first responder mode of the autonomous vehicle based on a command received from the computing system in accordance with a current geolocation of the autonomous vehicle”[Graham; "At step 181, unmanned vehicle 100 may receive an alert associated with an accident of vehicle 132 in area 147. The alert of step 181 may include a command to activate unmanned vehicle 100, information identifying vehicle 132, GPS information, data/information (e.g., speed of vehicles, types of vehicles, etc.) from surrounding devices in or around vehicles (e.g., vehicle 148) of area 147;" Fig. 2; Col: 4, Lines: 34-41].
With respect to claim 8, Graham discloses:
“A computer-implemented method comprising: receiving sensor data from one or more sensors positioned on a body of an autonomous vehicle”[Graham; "With continued reference to FIG. 1 , data analysis engine 120, traffic impact engine 130, and sensor management engine 140 are logical entities that may be implemented in the form of software (e.g., computer-executable instructions) stored in a memory of, and executing on a processor of unmanned vehicle 100, server 150, or another computer system such as illustrated in FIG. 4;" Fig. 4; Col: 3, Lines: 66-67; Col: 4, Lines: 1-5];
“determining a presence of an emergency situation, wherein the emergency situation impacts an object other than the autonomous vehicle”[Graham; "For example, unmanned vehicle 100 may be incorporated into a nearby light pole, a nearby traffic signal, an emergency vehicle (e.g., police, fire, or ambulance), a structure that is built particularly for unmanned vehicles, or the like. In an example, unmanned vehicle 100 may gather information about the scene of the accident. The information may be from other mobile devices in the area (e.g., mobile devices of nearby motorists or pedestrians), other unmanned vehicles in the area, sensors located on the unmanned vehicle (e.g., video camera, photo camera, infrared camera, motion detectors, etc.), or the like. An unmanned vehicle may fly into the air and begin taking photos of the car crash for information and immediately process the scene of the accident. Any gathered information may be assessed all or in part locally or remotely (e.g., server 150);" Fig. 2; Col: 2, Lines: 34-50];
“determining an absence of an emergency vehicle at a site of the emergency situation based on the received sensor data”[Graham; "If EMT or other emergency vehicles (e.g., police or fire) are in route to the accident in area 147, unmanned vehicle may alert vehicles in the way (between the emergency vehicle and the accident) that EMS is arriving for a particular vehicle at a particular location. This may assist in vehicles proactively pulling over in anticipation of an emergency vehicle. A map may be displayed to a device associated with a user of the device/vehicle. The map may include the position of the emergency vehicle and the vehicle of the user in traffic. The emergency vehicle or mobile devices associated with emergency personnel may be provided a map of vehicle positions (e.g., particularly occupied and unoccupied lanes) so the emergency vehicle can efficiently navigate traffic to arrive at the accident in area 147. Unmanned vehicle 100 may assist in clearing a path for the emergency vehicle en route to the accident in area 147. Whether a vehicle is en route may be based on examining GPS coordinates associated with the emergency vehicle;" Col: 5, Lines: 12-39];
“in accordance with the determining, triggering a first responder mode of the autonomous vehicle; in response to the triggering of the first responder mode, serve functions of a first responder by controlling movement of the autonomous vehicle to block traffic in the emergency situation”[Graham; "Unmanned vehicle 100 may fly around an area 147 and along nearby roads, and set digital points for navigation or mapping programs (e.g., Google maps, Garmin, auto manufacturers, Siri, etc.), which provide a warning that a lane is closed because of a vehicular crash or the like. Autonomous cars may immediately be communicated with by unmanned vehicle 100 or through remote server 150. The information about the accident may direct autonomous vehicles to switch over to another lane, avoid an accident area, or assist in the directing of traffic via displays on the autonomous vehicle;" Col: 2, Lines: 51-60;
"In an example scenario, after unmanned vehicle 100 is deployed in response to an accident, unmanned vehicle 100 may travel 50 yards along a road and flash an arrow indicating to drivers to move over to the other lane. The flashing arrow or other graphic or text may be placed on a retractable surface of on unmanned vehicle 100. Unmanned vehicle 100 may know which direction to fly along the road based on an assessment of the accident scene;" Col: 3, Lines: 7-14];
While Graham discloses a wireless network that enables wireless communication between a UAV and a remote server [Graham; Fig. 1; Col: 7, Lines: 38-62], Graham does not specifically state:
“transmitting, via a transceiver of the autonomous vehicle to a computing system, a notification corresponding to the emergency situation, the notification including a geolocation of the site of the emergency situation;”
Or “receiving, via the transceiver from the computing system, a request to transmit particular sensor data of a sensor of the one or more sensors; and transmitting, via the transceiver to the computing system, the requested particular sensor data”
Bradley teaches:
“transmitting, via a transceiver of the autonomous vehicle to a computing system, a notification corresponding to the emergency situation, the notification including a geolocation of the site of the emergency situation” [Bradley; "In some aspects, this disclosure relates to a system configured to enhance safety of citizens and state officials (e.g., law enforcement officers as well as other first responders). In operation, the system can be configured to enhance safety of citizens who encounter state officials, such as law enforcement personnel. The system can include one or more UAVs that relay information to each other, a central command center, and/or a social media network in a transparent and unbiased format. Each UAV can be equipped with sensors and firmware to assist in relaying salient information of an event to appropriate personnel (e.g., a law enforcement officer who is closest to a site of interest), assess the situation, and safety level assessment (e.g., the type of situation and related safety of an encounter between a citizen and police) as well as notify other citizens and law enforcement officials regarding the event (e.g., via a remote command center, notifications in one or more social media networks, emergency push notifications to devices within a geofence of an area, an alarm and/or LED light patterns, etc.);" ¶: 0120;
"Sensors 476 can include wireless transceivers so as to transmit sensor data (e.g. temperature data feeds, moisture data feeds, pressure feeds, accelerometer feeds, telemetry data feed, location data feeds, etc.);" Fig. 4; ¶: 0132; See also: ¶: 0158, 0161];
“receiving, via the transceiver from the computing system, a request to transmit particular sensor data of a sensor of the one or more sensors; and transmitting, via the transceiver to the computing system, the requested particular sensor data” [Bradley; "In some aspects, a combination of cellphones, vehicle 260 sensors, sensors 476 on UAV 270, speaker systems, AI, and/or other sensors can enable a user to communicate with vehicle 260 and/or intruder 420. For example, vehicle 260 may be equipped with an AI-powered communication system that integrates with the vehicle's 260 cameras, sensors, and speaker system. This system would process data from the cameras and sensors, allowing a user remotely monitor vehicle's 260 surroundings in real-time by communicating all the data to a mobile device. In another example, a mobile app may connect to vehicle's 260 AI-powered communication system. The mobile app may have access to live camera feeds from vehicle's 260 cameras, and receive alerts about security events, and communicate with vehicle 260 remotely. In another example, a UAV 270 equipped with camera(s) may be integrated in the above system. For example, if the AI system detects a potential security threat or intrusion, you can deploy the UAV 270 to get a better view of the situation. The UAV 270 camera feed may be streamed to a mobile device, enabling a user to assess the situation from different angles;" ¶: 0191;
"In step 1004, the method may include transmitting, by the one or more UAVs, the video and/or audio data to a user device of the vehicle's owner;" ¶: 0211; See also: Fig. 4; ¶: 0132, 0158, 0161].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system/method for controlling a UAV to perform traffic control upon detecting a vehicular accident, as disclosed by Graham, to incorporate the teachings regarding using an on-board transceiver of a UAV to transmit information to and from a remote computing system, for example to relay information from the UAV to emergency personnel, as taught by Bradley, with a reasonable expectation of success. By combining these teachings, the outcome is a system/method for controlling a UAV to perform traffic control upon detecting a vehicular accident that is more robust in its ability to assess patients at the site of an accident before paramedics arrive at the scene, thereby enabling emergency responders to respond to critical situations more effectively and efficiently, for example [Bradley; ¶: 0134, 0150, 0151].
With respect to claim 10, Graham discloses: “wherein the one or more sensors includes at least one or more acoustic sensors, one or more cameras, one or more radio detection and ranging (RADAR) sensors, or one or more light detection and ranging (LiDAR) sensors”[Graham; "Alternatively, unmanned vehicle 100 may be located at a nearby structure 131. For example, unmanned vehicle 100 may be incorporated into a nearby light pole, a nearby traffic signal, an emergency vehicle (e.g., police, fire, or ambulance), a structure that is built particularly for unmanned vehicles, or the like. In an example, unmanned vehicle 100 may gather information about the scene of the accident. The information may be from other mobile devices in the area (e.g., mobile devices of nearby motorists or pedestrians), other unmanned vehicles in the area, sensors located on the unmanned vehicle (e.g., video camera, photo camera, infrared camera, motion detectors, etc.), or the like. An unmanned vehicle may fly into the air and begin taking photos of the car crash for information and immediately process the scene of the accident. Any gathered information may be assessed all or in part locally or remotely (e.g., server 150);" Col: 2, Lines: 34-50].
With respect to claim 11, Graham discloses: “further comprising transmitting, via the transceiver to the computing system, three dimensional (3D) visualization images generated based on the sensor data from the one or more radio detection and ranging (RADAR) sensors, the one or more light detection and ranging (LiDAR) sensors, one or more ultrasound sensors, or the one or more cameras”[Graham; "Alternatively, unmanned vehicle 100 may be located at a nearby structure 131. For example, unmanned vehicle 100 may be incorporated into a nearby light pole, a nearby traffic signal, an emergency vehicle (e.g., police, fire, or ambulance), a structure that is built particularly for unmanned vehicles, or the like. In an example, unmanned vehicle 100 may gather information about the scene of the accident. The information may be from other mobile devices in the area (e.g., mobile devices of nearby motorists or pedestrians), other unmanned vehicles in the area, sensors located on the unmanned vehicle (e.g., video camera, photo camera, infrared camera, motion detectors, etc.), or the like. An unmanned vehicle may fly into the air and begin taking photos of the car crash for information and immediately process the scene of the accident. Any gathered information may be assessed all or in part locally or remotely (e.g., server 150);" Col: 2, Lines: 34-50].
With respect to claim 12, Graham discloses: "further comprising displaying one or more messages on one or more display devices mounted on one or more exterior sides of the autonomous vehicle to alert a passerby of the emergency situation or to request help"[Graham; "There may be multiple different visual alerts such as: a graphic or text displayed on a screen attached with unmanned vehicle 100; a graphic or text displayed on a screen communicatively connected with the unmanned vehicle 100 or vehicle 132; a graphic or text displayed on a screen in proximity to vehicle 132 or unmanned vehicle 100; a graphic or text displayed on a screen of mobile device 133 or vehicle 132; or a light or flashing light (e.g., strobe, rotating, vehicle signal or emergency lights) attached or communicatively connected with unmanned vehicle 100 or vehicle 132. It is contemplated that multimedia alerts (e.g., audio or video) may be communicated to devices discussed herein (e.g., in system 90);" Col: 2, Lines: 61-67; Col: 3, Lines: 1-6].
With respect to claim 13, Graham discloses: “further comprising initiating a two-way communication between a passerby and an agent at a mission control, a police department, a fire department, or an emergency medical services department”[Graham; "Unmanned vehicle 100 may also send a text message to other vehicles in area 147, such as vehicle 148 in area 147, based on an accident (or other mechanical failure) of vehicle 132;" Col: 3, Lines: 14-17;
"At step 181, unmanned vehicle 100 may receive an alert associated with an accident of vehicle 132 in area 147. The alert of step 181 may include a command to activate unmanned vehicle 100, information identifying vehicle 132, GPS information, data/information (e.g., speed of vehicles, types of vehicles, etc.) from surrounding devices in or around vehicles (e.g., vehicle 148) of area 147;" Fig. 2; Col: 4, Lines: 34-41].
With respect to claim 14, Graham discloses: “further triggering the first responder mode of the autonomous vehicle based on a command received from the computing system in accordance with a current geolocation of the autonomous vehicle”[Graham; "At step 181, unmanned vehicle 100 may receive an alert associated with an accident of vehicle 132 in area 147. The alert of step 181 may include a command to activate unmanned vehicle 100, information identifying vehicle 132, GPS information, data/information (e.g., speed of vehicles, types of vehicles, etc.) from surrounding devices in or around vehicles (e.g., vehicle 148) of area 147;" Fig. 2; Col: 4, Lines: 34-41].
With respect to claim 15, Graham discloses:
“A non-transitory computer-readable medium (CRM) embodying programmed instructions which, when executed by at least one processor of an autonomous vehicle, cause the at least one processor to perform operations comprising:”[Graham; "With continued reference to FIG. 1 , data analysis engine 120, traffic impact engine 130, and sensor management engine 140 are logical entities that may be implemented in the form of software (e.g., computer-executable instructions) stored in a memory of, and executing on a processor of unmanned vehicle 100, server 150, or another computer system such as illustrated in FIG. 4;" Fig. 4; Col: 3, Lines: 66-67; Col: 4, Lines: 1-5;
"The drives and their associated computer-readable media provide non-volatile storage of computer readable instructions, data structures, program modules and other data for the computer 920. As described herein, computer-readable media is a tangible, physical, and concrete article of manufacture and thus not a signal per sc;" Fig. 4; Col: 8, Lines: 49-55];
“receiving sensor data from one or more sensors positioned on a body of the autonomous vehicle; based on the sensor data, determining a presence of an emergency situation, wherein the emergency situation impacts an object other than the autonomous vehicle”[Graham; "For example, unmanned vehicle 100 may be incorporated into a nearby light pole, a nearby traffic signal, an emergency vehicle (e.g., police, fire, or ambulance), a structure that is built particularly for unmanned vehicles, or the like. In an example, unmanned vehicle 100 may gather information about the scene of the accident. The information may be from other mobile devices in the area (e.g., mobile devices of nearby motorists or pedestrians), other unmanned vehicles in the area, sensors located on the unmanned vehicle (e.g., video camera, photo camera, infrared camera, motion detectors, etc.), or the like. An unmanned vehicle may fly into the air and begin taking photos of the car crash for information and immediately process the scene of the accident. Any gathered information may be assessed all or in part locally or remotely (e.g., server 150);" Fig. 2; Col: 2, Lines: 34-50];
“based on the sensor data, determining an absence of an emergency vehicle at a site of the emergency situation”[Graham; "If EMT or other emergency vehicles (e.g., police or fire) are in route to the accident in area 147, unmanned vehicle may alert vehicles in the way (between the emergency vehicle and the accident) that EMS is arriving for a particular vehicle at a particular location. This may assist in vehicles proactively pulling over in anticipation of an emergency vehicle. A map may be displayed to a device associated with a user of the device/vehicle. The map may include the position of the emergency vehicle and the vehicle of the user in traffic. The emergency vehicle or mobile devices associated with emergency personnel may be provided a map of vehicle positions (e.g., particularly occupied and unoccupied lanes) so the emergency vehicle can efficiently navigate traffic to arrive at the accident in area 147. Unmanned vehicle 100 may assist in clearing a path for the emergency vehicle en route to the accident in area 147. Whether a vehicle is en route may be based on examining GPS coordinates associated with the emergency vehicle;" Col: 5, Lines: 12-39];
“in accordance with the determining, triggering a first responder mode of the autonomous vehicle; in response to the triggering of the first responder mode, serve functions of a first responder by controlling movement of the autonomous vehicle to block traffic in the emergency situation”[Graham; "Unmanned vehicle 100 may fly around an area 147 and along nearby roads, and set digital points for navigation or mapping programs (e.g., Google maps, Garmin, auto manufacturers, Siri, etc.), which provide a warning that a lane is closed because of a vehicular crash or the like. Autonomous cars may immediately be communicated with by unmanned vehicle 100 or through remote server 150. The information about the accident may direct autonomous vehicles to switch over to another lane, avoid an accident area, or assist in the directing of traffic via displays on the autonomous vehicle;" Col: 2, Lines: 51-60;
"In an example scenario, after unmanned vehicle 100 is deployed in response to an accident, unmanned vehicle 100 may travel 50 yards along a road and flash an arrow indicating to drivers to move over to the other lane. The flashing arrow or other graphic or text may be placed on a retractable surface of on unmanned vehicle 100. Unmanned vehicle 100 may know which direction to fly along the road based on an assessment of the accident scene;" Col: 3, Lines: 7-14];
While Graham discloses a wireless network that enables wireless communication between a UAV and a remote server [Graham; Fig. 1; Col: 7, Lines: 38-62], Graham does not specifically state:
“transmitting, via a transceiver of the autonomous vehicle to a computing system, a notification corresponding to the emergency situation, the notification including a geolocation of the site of the emergency situation;”
Or “receiving, via the transceiver from the computing system, a request to transmit particular sensor data of a sensor of the one or more sensors; and transmitting, via the transceiver to the computing system, the requested particular sensor data”
Bradley teaches:
“transmitting, via a transceiver of the autonomous vehicle to a computing system, a notification corresponding to the emergency situation, the notification including a geolocation of the site of the emergency situation” [Bradley; "In some aspects, this disclosure relates to a system configured to enhance safety of citizens and state officials (e.g., law enforcement officers as well as other first responders). In operation, the system can be configured to enhance safety of citizens who encounter state officials, such as law enforcement personnel. The system can include one or more UAVs that relay information to each other, a central command center, and/or a social media network in a transparent and unbiased format. Each UAV can be equipped with sensors and firmware to assist in relaying salient information of an event to appropriate personnel (e.g., a law enforcement officer who is closest to a site of interest), assess the situation, and safety level assessment (e.g., the type of situation and related safety of an encounter between a citizen and police) as well as notify other citizens and law enforcement officials regarding the event (e.g., via a remote command center, notifications in one or more social media networks, emergency push notifications to devices within a geofence of an area, an alarm and/or LED light patterns, etc.);" ¶: 0120;
"Sensors 476 can include wireless transceivers so as to transmit sensor data (e.g. temperature data feeds, moisture data feeds, pressure feeds, accelerometer feeds, telemetry data feed, location data feeds, etc.);" Fig. 4; ¶: 0132; See also: ¶: 0158, 0161];
“receiving, via the transceiver from the computing system, a request to transmit particular sensor data of a sensor of the one or more sensors; and transmitting, via the transceiver to the computing system, the requested particular sensor data” [Bradley; "In some aspects, a combination of cellphones, vehicle 260 sensors, sensors 476 on UAV 270, speaker systems, AI, and/or other sensors can enable a user to communicate with vehicle 260 and/or intruder 420. For example, vehicle 260 may be equipped with an AI-powered communication system that integrates with the vehicle's 260 cameras, sensors, and speaker system. This system would process data from the cameras and sensors, allowing a user remotely monitor vehicle's 260 surroundings in real-time by communicating all the data to a mobile device. In another example, a mobile app may connect to vehicle's 260 AI-powered communication system. The mobile app may have access to live camera feeds from vehicle's 260 cameras, and receive alerts about security events, and communicate with vehicle 260 remotely. In another example, a UAV 270 equipped with camera(s) may be integrated in the above system. For example, if the AI system detects a potential security threat or intrusion, you can deploy the UAV 270 to get a better view of the situation. The UAV 270 camera feed may be streamed to a mobile device, enabling a user to assess the situation from different angles;" ¶: 0191;
"In step 1004, the method may include transmitting, by the one or more UAVs, the video and/or audio data to a user device of the vehicle's owner;" ¶: 0211; See also: Fig. 4; ¶: 0132, 0158, 0161].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system/method for controlling a UAV to perform traffic control upon detecting a vehicular accident, as disclosed by Graham, to incorporate the teachings regarding using an on-board transceiver of a UAV to transmit information to and from a remote computing system, for example to relay information from the UAV to emergency personnel, as taught by Bradley, with a reasonable expectation of success. By combining these teachings, the outcome is a system/method for controlling a UAV to perform traffic control upon detecting a vehicular accident that is more robust in its ability to assess patients at the site of an accident before paramedics arrive at the scene, thereby enabling emergency responders to respond to critical situations more effectively and efficiently, for example [Bradley; ¶: 0134, 0150, 0151].