DETAILED ACTION
Remarks
This final Office action is in response to the amendments filed on 11/14/2025. Claims 1, 4 and 13 are amended. Claims 10 and 18 are cancelled. Claims 1-9, 11-17, 19 and 20 are pending and examined below.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 4, 5, 9, 11-15, 17, 19 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over US 2024/0019864 (“Elshenawy”) in view of US 2024/0180062 (“Morimoto”), and further in view of US 2019/0138029 (“Ryll”).
Regarding claim 1 (and similarly claim 13), Elshenawy discloses a method of controlling a vehicle-based drone (see at least fig 1 and [0037]), comprising:
determining a need for secondary navigation data within a sensing area of one or more vehicle sensors (see at least [0142], where “At 1802, a vehicle detects an occlusion in an environment around the vehicle. For example, using the AV sensor suite 108 the AV 102 can detect occlusions or areas that are blind to a sensor in the AV sensor suite 108.”; see also fig 18, where detecting an occlusion is interpreted as determining a need for secondary navigation data.);
commanding the drone to depart from the vehicle (see at least fig 18, where at 1804 a drone is deployed; see also [0143]);
receiving at the vehicle the secondary navigation data from the drone (see at least [0143], where “At 1806, using the one or more sensors, the drone provides sensor data to the vehicle to clear the occlusion in the environment around the vehicle. For example, using the one or more sensors in the drone sensor suite 204, the drone can be deployed to the area that includes the occlusion and provide sensor data to the AV vehicle 102 to clear the occlusion in the environment around the AV vehicle 102.”); and
operating the vehicle at least in part as a function of the secondary navigation data (see at least [0145], where “If there were obstructions detected along the route, a new route is calculated to avoid the detected obstruction, as in 1910 and, using the one or more sensors, the drone analyzes the (new) route for any obstructions along the route, as in 1906.”; see also fig 19, where a new route is generated if obstructions are detected using the drone sensor data (secondary navigation data); thus, the vehicle is operated using the secondary navigation data.).
Regarding the additional limitation of claim 13, Elshenawy discloses using primary navigation data including one or both of map data and a global positioning system (GPS) of the vehicle to at least in part control operation of the vehicle (see at least [0076] of Elshenawy).
Elshenawy does not disclose the following limitation:
the need for secondary navigation data is determined to exist when a vehicle is travelling or will be travelling within an area that is unmapped such that map data is not available for the area and when information from a Global Positioning System (GPS) of the vehicle is not available.
However, Morimoto discloses a method wherein the need for secondary navigation data is determined to exist when a vehicle is travelling or will be travelling within an area that is unmapped such that map data is not available for the area (see at least [0069], where “As illustrated in FIG. 4, with an agricultural assistance system 100, an agricultural field map MP2 is created by capturing airborne images using an unmanned aerial vehicle (aerial vehicle) 70, and the planned travel line L1 is created based on the agricultural field map MP2.”; see also [0078], fig 10 and fig 16. The agricultural field map is created by a UAV carried by the tractor so that the tractor can travel within the agricultural field; because the map must first be created, the agricultural field was unmapped, i.e., map data for the area was not available.).
Regarding the additional limitation of claim 13, Morimoto discloses a method wherein the vehicle sensors do not provide primary navigation data (see at least [0058], where the tractor’s sensors are not used for mapping; the UAV performs the mapping).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have modified Elshenawy to incorporate the teachings of Morimoto by including the above feature in order to avoid confusion during navigation and reach the destination faster.
Elshenawy in view of Morimoto does not disclose the following limitation:
when information from a Global Positioning System (GPS) of the vehicle is not available.
However, Ryll discloses a method wherein information from a Global Positioning System (GPS) of the vehicle is not available (see at least [0058], where “there may be many areas where an autonomous operation of a drone may be desired (for inspections, rescue operations, etc.) but where the GPS information is either not available or faulty.”; see also [0089], where “FIG. 8 shows a schematic flow diagram of a method 800 for operating an unmanned aerial vehicle (e.g. the unmanned aerial vehicle 100 as illustrated in FIG. 1 or any other drone), according to various aspects. The method 800 may include: in 810, providing a spherical map of a vicinity of the unmanned aerial vehicle”; the UAV generates a map when GPS data is not available.).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have modified Elshenawy in view of Morimoto to incorporate the teachings of Ryll by including the above feature in order to avoid navigation disruption.
Regarding claim 4, Elshenawy further discloses a method wherein the secondary navigation data includes information about the existence of a road or path, the location of roads or paths determined to exist along an intended path of travel of the vehicle (see at least [0145], where “If there were obstructions detected along the route, a new route is calculated to avoid the detected obstruction, as in 1910 and, using the one or more sensors, the drone analyzes the (new) route for any obstructions along the route, as in 1906.”; an obstruction along the path is detected, so the existence of a path is also detected, and the secondary information includes a path (the new path) along an intended path (the original path)), and the location of one or more obstacles relative to the location of the vehicle (see at least [0034], where “The AV assist drone can act as an eye in the sky to detect a just occurring obstruction (e.g., a double-parked car, a traffic jam, etc.). When an obstacle is detected that will block the route of the AV, a new route can be determined for the AV that avoids the detected obstacle.”).
Regarding claim 5 (and similarly claim 15), Elshenawy further discloses a method that includes commanding the drone to fly away from the vehicle to an area of interest (see at least [0033], where “the AV deploys the AV assist drone to review a route the AV is current following. The AV assist drone can autonomously, without specific instructions from the AV, determine areas along the route that are occluded or will be occluded and deploy to a location to fill in the occlusion.”; deploying is interpreted as flying away, and the route of the AV is interpreted as the area of interest.).
Regarding claim 9 (and similarly claim 17), Elshenawy further discloses a method wherein the secondary navigation data is communicated with a control system of the vehicle and the control system determines a path of travel of the vehicle as a function of the secondary navigation data, and the control system operates the vehicle along the path of travel (see at least [0039], where the drone communicates the sensor data to the vehicle; see also [0040], where a new route is generated based on the drone sensor information (secondary navigation data); see also [0052]).
Regarding claim 11 (and similarly claim 19), Elshenawy further discloses a method wherein the drone includes a sensor and the secondary navigation data includes signals from the sensor (see at least [0039]).
Regarding claim 12, Elshenawy further discloses a method wherein the sensor data is used to determine the size and location of objects in an intended or optional path of travel of the vehicle (see at least [0034], where the new route is interpreted as an optional path of travel of the vehicle. See also [0096], where “The drone perception module 412 may also identify other features or characteristics of objects in the environment of the AV assist drone 104 and/or the AV 102 based on image data or other sensor data, for example, colors, sizes (e.g., heights of people or buildings in the environment, makes and models of vehicles, pictures and/or words on billboards, etc.).”; see also [0123], where “the AV assist drone 104 can be deployed to determine a location of and/or identify the obstruction 802.”).
Regarding claim 14, Elshenawy further discloses a method wherein the secondary navigation data includes information about one or more of the location of roads or paths along an intended path of travel of the vehicle (see at least [0145], where a new route is determined based on drone information, so the drone provides secondary information regarding the new route; the secondary information includes a path (the new path) along an intended path (the original path)), and the location of one or more obstacles relative to the location of the vehicle (see at least [0034]).
Regarding claim 20, Elshenawy further discloses a method wherein the need for secondary navigation data is determined by a vehicle control system (see at least [0143]).
Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over US 2024/0019864 (“Elshenawy”) in view of US 2024/0180062 (“Morimoto”) and US 2019/0138029 (“Ryll”), as applied to claim 1 above, and further in view of US 2020/0164981 (“Chundi”).
Regarding claim 2, Elshenawy in view of Morimoto and Ryll does not disclose claim 2. However, Chundi discloses a method which includes: determining that secondary navigation data is no longer needed (see at least fig 7 and fig 8, where the drone returns upon a determination that it needs to return); and
commanding the drone to return to the vehicle (see at least [0036], where “FIG. 7 shows drone 4 returning its docking station on law enforcement vehicle 2 via flight path 8.”).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have modified Elshenawy in view of Morimoto and Ryll to incorporate the teachings of Chundi by including the above feature in order to reduce unnecessary use of the drone.
Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over US 2024/0019864 (“Elshenawy”) in view of US 2024/0180062 (“Morimoto”) and US 2019/0138029 (“Ryll”), as applied to claim 1 above, and US 2020/0164981 (“Chundi”), as applied to claim 2 above, and further in view of US 2021/0171197 (“Anderson”).
Regarding claim 3, Elshenawy in view of Morimoto, Ryll and Chundi does not disclose claim 3. However, Anderson discloses a method wherein the determination that the drone needs to return to the vehicle is made as a function of weather conditions experienced by the drone (see at least [0271], where “the UAV 1802 may receive instructions from a distribution center 1801 or the global services 1803 that command the UAV 1802 to return to base immediately due to bad weather”).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have modified Elshenawy in view of Morimoto, Ryll and Chundi to incorporate the teachings of Anderson by including the above feature in order to avoid damage to or loss of the drone by returning it in view of inclement weather conditions.
Claims 6-8 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over US 2024/0019864 (“Elshenawy”) in view of US 2024/0180062 (“Morimoto”) and US 2019/0138029 (“Ryll”), as applied to claim 1 above, and further in view of US 2024/0166047 (“Sanderson”).
Regarding claim 6 (and similarly claim 16), Elshenawy further discloses a method which also includes providing images or video from a camera of the drone to the vehicle, where the images or video includes at least the terrain near the vehicle (see at least [0039], where “a method for clearing one or more occlusions in an environment around a vehicle can include deploying a vehicle assist drone from the vehicle, where the vehicle assist drone includes one or more sensors and is in communication with the vehicle, using the one or more sensors on the vehicle assist drone to collect sensor data related to the environment around the vehicle, and communicating the collected sensor data to the vehicle from the vehicle assist drone, where the collected sensor data is used to clear one or more occlusions in the environment around the vehicle. The one or more sensors can include a camera, LIDAR, a time-of-flight sensor, and other sensors.”; the environment around the vehicle is interpreted as the terrain near the vehicle.).
Elshenawy in view of Morimoto and Ryll does not disclose the following limitation:
the images or video includes at least a portion of the vehicle.
However, Sanderson discloses a method wherein the images or video includes at least a portion of the vehicle (see at least [0033], where “example guidance systems may employ a drone to provide image data and other information to assist a vehicle operator with positioning a tire patch of one or more wheels of the vehicle, facilitating the operator maneuvering the vehicle through tight spaces, e.g., rocks or other obstacles.”; see also fig 4).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have modified Elshenawy in view of Morimoto and Ryll to incorporate the teachings of Sanderson by including the above feature in order to provide maneuvering assistance to the vehicle by depicting vehicle parts and the surrounding terrain.
Regarding claim 7, Elshenawy further discloses a method that includes displaying the images or video on a screen in the vehicle (see at least [0061], where “The AV onboard controller 106 processes sensor data generated by the AV sensor suite 108 and/or other data (e.g., data received from the AV assist drone 104, from the fleet management system 2202, etc.) to determine the state of the AV 102.”; see also [0079], where “as illustrated in FIG. 3, the user guidance devices 208 can include a laser pointer 334, a light source 336, a speaker 338, and a display 340.”; see also [0095] and fig 5).
Regarding claim 8, Elshenawy in view of Morimoto and Ryll does not disclose claim 8. However, Sanderson further discloses a method wherein the portion of the vehicle includes at least one wheel of the vehicle (see at least [0033] and fig 4). The same motivation as for claim 6 applies.
Response to Arguments
Applicant’s arguments with respect to claims 1-9, 11-17, 19 and 20 have been considered but are moot because the arguments do not apply to the new combination of references used in the current rejection, which was necessitated by the newly added claim amendments.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SOHANA TANJU KHAYER whose telephone number is (408) 918-7597. The examiner can normally be reached Monday - Thursday, 7:00 am - 5:30 pm PT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Lin, can be reached at 571-270-3976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SOHANA TANJU KHAYER/Primary Examiner, Art Unit 3657