Prosecution Insights
Last updated: April 19, 2026
Application No. 18/960,818

METHOD FOR OPERATING AN UNMANNED AIRCRAFT OF A MOTOR VEHICLE CHARGING STATION, UNMANNED AIRCRAFT, AND MOTOR VEHICLE CHARGING STATION

Final Rejection (§103, §112)
Filed: Nov 26, 2024
Examiner: NORRIS, URSULA LEE
Art Unit: 3676
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Audi AG
OA Round: 2 (Final)
Grant Probability: 87% (Favorable)
OA Rounds: 3-4
To Grant: 2y 0m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 87% (46 granted / 53 resolved; +34.8% vs TC avg; above average)
Interview Lift: +12.5% on resolved cases with interview (moderate lift)
Avg Prosecution: 2y 0m (fast prosecutor); 29 applications currently pending
Total Applications: 82 across all art units (career history)
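The headline examiner figures above are internally consistent; a quick arithmetic check (plain Python, values taken directly from this page):

```python
# Career allow rate: 46 applications granted out of 53 resolved (figures above).
granted, resolved = 46, 53
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 86.8%, shown rounded as 87%

# The "+34.8% vs TC avg" delta implies a Tech Center average allow rate of about:
tc_avg = allow_rate - 0.348
print(f"Implied TC average: {tc_avg:.1%}")  # roughly 52%
```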

Statute-Specific Performance

§101: 15.0% (-25.0% vs TC avg)
§103: 34.1% (-5.9% vs TC avg)
§102: 24.6% (-15.4% vs TC avg)
§112: 23.8% (-16.2% vs TC avg)

Tech Center averages are estimates; based on career data from 53 resolved cases.
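The per-statute deltas are mutually consistent: subtracting each delta from its rate recovers the same baseline, suggesting a single Tech Center average estimate of about 40% underlies the comparison (a quick check in plain Python, values from the list above):

```python
# (rate %, delta vs Tech Center average %) for each statute, as listed above
rates = {"§101": (15.0, -25.0), "§103": (34.1, -5.9),
         "§102": (24.6, -15.4), "§112": (23.8, -16.2)}

# rate - delta recovers the baseline each comparison was made against
implied_tc_avg = {k: round(rate - delta, 1) for k, (rate, delta) in rates.items()}
print(implied_tc_avg)  # every statute implies the same 40.0% TC average estimate
```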

Office Action

Grounds of rejection: §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

The following is a Final Office Action in response to the communication filed on 02/23/2026. Claims 1-10 are currently pending.

Priority

The Applicant’s claim for benefit of German Patent Application DE 10202023133519.7, filed on 11/30/2024 in the Federal Republic of Germany, has been received and acknowledged.

Information Disclosure Statement

The Information Disclosure Statement received 11/26/2024 has been reviewed and considered.

Response to Arguments

Applicant's arguments and amendments filed 02/23/2026 with respect to the rejection of claim 7 under 35 U.S.C. 112(b) have been fully considered and are persuasive. The amendments overcome the previously recited rejection, and the rejection of claim 7 under 35 U.S.C. 112(b) is withdrawn.

Applicant's arguments and amendments filed 02/23/2026 with respect to the rejection of claims 1 and 3-10 under 35 U.S.C. 103 have been fully considered but are not persuasive. The independent claims, including claims 1, 9, and 10, have been amended to recite the limitation of, or substantially similar to, “providing the projection surface on request by displacing a motor vehicle component of the motor vehicle,” which was previously included in claim 4. With regard to the prior art disclosure of Baur, the Response states “[t]he alleged user input of Baur only affects the position of the drone. Baur does not provide any teaching of displacement of a motor vehicle component such that a projection surface is provided by way of the movement of the motor vehicle component,” with which the Examiner does not agree. To start, the drone of Baur may be controlled by a user in the vehicle using a joystick (e.g., see Baur para. [0019]). 
The drone is configured to gather at least image data and project the image data to a vehicle-based vision system (e.g., see para. [0013] of Baur). In some situations this operation may be performed in real time (e.g., see para. [0021] of Baur). As such, the image displayed on the vehicle-based vision system is directly controlled by the operation of the joystick, where the vehicle-based vision system provides the visual display for projecting the image data gathered by the drone.

In addition to the foregoing, the current drafting of the limitation as amended into at least claim 1 does not require any special interpretation, such as the features pointed to in the Response at page 6. For example, the claim does not recite “motor vehicle components may be, for example, a hood, a trunk lid, a door, a tank cap or a tank cover or the like and these components may be moved to provide the projection surface.” While the instant Specification may recite features which are not taught by Sychov as modified by Baur, the claims do not currently include these features. Moreover, under the broadest reasonable interpretation as set forth in both the previous and current rejections, the combination of Sychov as modified by Baur (and Baur in particular) renders the amended limitations obvious. For the foregoing reasons, the rejection of claims 1 and 3-10 under 35 U.S.C. 103 is maintained as modified below in view of the claim amendments.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention. 
Claims 3, 9, and 10 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 3 depends from claim 1, where claim 1 recites “providing the projection surface on request by displacing a motor vehicle component of the motor vehicle.” Claim 3 recites multiple potential projection surfaces, including “a wall surface of the motor vehicle charging station, a floor service of the surrounding motor vehicle.” It is unclear how the aforementioned surfaces can be provided by displacing a component of the motor vehicle, given that they are not part of the motor vehicle yet are provided by the motor vehicle.

Claims 9 and 10 recite the limitation “wherein the projector is configured to project onto a projection surface provided on request…” where both claims already have antecedent basis for a projection surface. As such, the limitation related to the second recitation of a projection surface is rendered indefinite.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 
103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1 and 3-10 are rejected under 35 U.S.C. 103 as being unpatentable over Published US Patent Application to Sychov (US 20180186247 A1) in view of Published US Patent Application to Baur (US 20180141658 A1).

Regarding claim 1, Sychov discloses [a] method for operating [a device] (projector unit 504, control unit 505, detector module 506, speaker unit 510, and media server 513; para. [0016], “[t]he projector unit is integrated into the housing. Further elements such as the charging unit, the control unit, and/or the detecting module or parts of these elements may be partly or completely integrated into the housing.”) of a motor vehicle charging station having at least one charging point for a motor vehicle (see FIG. 1 and 2 including parking space 501 and charging unit 500), comprising: projecting by a projector (projector unit 504, see para. [0005], [0007], [0009])… at least temporarily, an image aimed at supporting a user of the motor vehicle onto a projection surface (para. [0013], “[i]n comparison with conventional charging systems the inventive system has several advantages in terms of user friendliness, user-safety and economic operation. 
In particular, by detecting status data of the system and controlling the projector unit as a function of the status data, it is possible to display images or moving images and, thus, to provide useful information for the user of the charging system, wherein different information may be provided in different conditions or states of the system.”). Sychov discloses an autonomous charging station which utilizes a stationary device to gather and project information for use by an automobile user. As such, Sychov may not explicitly disclose that a drone is used to gather and project the information. However, the prior art is replete with examples of drones which interact with automobiles, and specifically drones that gather information, perform some level of an analysis on the information, and display/transmit/project data to an automobile user (e.g., see additional cited art). One such reference includes Baur, which is in the same field of endeavor as the instant application insofar as it is directed to drones which gather data and provide information/guidance (e.g., by way of projecting the data to an automobile) to an automobile user. Baur teaches each and every positively recited limitation of claim 1: projecting by a projector of the aircraft (Baur, para. [0021], “[t]he drone may communicate (in real time) captured image data to the vision system of the vehicle (such as for display of captured video images or for processing of captured image data)… Optionally, the drone may include an image processor that processes captured image data, whereby the drone may communicate a signal to the vehicle vision system that is indicative of a detected object or the like.”; para. [0011], “[t]he aerial platform or drone includes one or more cameras 16a for transmission of captured images for viewing by an operator of the drone. By incorporating such a drone in or at a vehicle, numerous advantages can be achieved. 
The drone, when deployed to fly over the vehicle (or around or near the vehicle), captures image data via its camera, whereby the captured image data may be communicated to the vision system of the vehicle… for display of images for viewing by the driver of the vehicle.” Examiner notes that signals and communications projected by the drone to the vehicle constitute a projection), at least temporarily, an image aimed at supporting a user (para. [0011], “whereby the captured image data may be communicated to the vision system of the vehicle… for display of images for viewing by the driver of the vehicle.”) of the motor vehicle onto a projection surface (vehicle-based vision system, see citations below); and providing the projection surface (para. [0023], “[t]he vehicle-based vision system receives image data captured by… the drone…”; The vehicle-based vision system provides for a projection surface for the data projected by the drone) on request by displacing a motor vehicle component (para. [0019], “the control may control the drone responsive to a user input, such as a voice command from the driver or such as a joystick control or such as a touch screen or the like in the vehicle.”) of the motor vehicle (para. [0013], “[t]he drone, when deployed to fly over the vehicle (or around or near the vehicle), captures image data via its camera, whereby the captured image data may be communicated to the vision system of the vehicle, such as for image processing for object detection or the like, or for display of images for viewing by the driver of the vehicle.” Examiner notes that controlling the drone using the joystick may cause the drone to gather image data which is projected on the vehicle-based vision system). Baur provides motivation to utilize a drone to collect data for the car rather than using a stationary sensor. 
For example, Baur teaches “cameras disposed at a vehicle are limited by the physics and the physical location where the camera is mounted at the vehicle.” (Baur, para. [0011]). Baur further teaches “[t]he aerial platform or drone includes one or more cameras 16a for transmission of captured images for viewing by an operator of the drone. By incorporating such a drone in or at a vehicle, numerous advantages can be achieved. The drone, when deployed to fly over the vehicle (or around or near the vehicle), captures image data via its camera, whereby the captured image data may be communicated to the vision system of the vehicle, such as for image processing for object detection or the like, or for display of images for viewing by the driver of the vehicle.” (Baur, para. [0013]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the data gathering and projection features of the stationary device of Sychov into a drone capable of gathering data and communicating with a vehicle, such as that of Baur. The components and functions of both the stationary device of Sychov and the drone of Baur are known, and the combination provides the predictable result of a high-mobility device capable of gathering data and presenting the data to a user of an automobile. Moreover, Baur provides motivation for incorporating the data gathering and data projecting features into a drone, where Baur teaches that drones are capable of gathering data in locations that a stationary sensor cannot access. 
Regarding claim 3, Sychov modified by Baur teaches wherein any of a wall surface of the motor vehicle charging station, a floor surface of the surroundings of the motor vehicle, a body surface of the motor vehicle, and an interior surface of the motor vehicle is used as the projection surface (Baur teaches projecting a signal to the vehicle such that the vehicle displays the relevant information gathered by the drone on a surface of the vehicle, see para. [0010], [0017], [0021], and [0023] of Baur).

Regarding claim 4, Sychov modified by Baur teaches wherein the motor vehicle is connected to the motor vehicle charging station and/or the aircraft via a data link (Baur, para. [0019], “[a] vehicle-based control may control the drone (via wired or wireless communication) to follow the vehicle or to stray from the vehicle to view other areas remote of the vehicle.”).

Regarding claim 5, Sychov modified by Baur teaches wherein the aircraft projects the image to implement any of guiding the motor vehicle to the charging station (Baur, para. [0021], “the drone may include an image processor that processes captured image data, whereby the drone may communicate a signal to the vehicle vision system that is indicative of a detected object or the like.”; para. [0023], “[r]esponsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.”), providing the user with an instruction for a usage action of the motor vehicle, and providing the user with an instruction for charging an energy storage device of the motor vehicle by way of the motor vehicle charging station. 
Regarding claim 6, Sychov modified by Baur teaches wherein the aircraft includes at least one of a screen, a loudspeaker, a microphone, and a camera (cameras 16a), wherein the screen is configured to display an additional image, at least temporarily, wherein the loudspeaker is configured to output an output sound signal, at least temporarily, wherein the microphone is configured to record an input sound signal, at least temporarily, and wherein the camera is configured to take a recording (Baur, para. [0013], “[t]he aerial platform or drone includes one or more cameras 16a for transmission of captured images for viewing by an operator of the drone.”; para. [0017], “the communication may comprise display of images derived from the image data captured by the drone camera.”), at least temporarily.

Regarding claim 7, Sychov modified by Baur teaches wherein the aircraft is configured to communicate via a communication link with a central location (Baur, para. [0014], “the drone may be battery operated and may communicate with the vehicle vision system via a wireless communication link or the like.” Examiner notes that, under the broadest reasonable interpretation, a central location, without any additional limitations, could include the vehicle vision system), wherein the central location provides the image, an additional image, and/or an output sound signal, and/or evaluates an input sound signal and/or the recording (para. [0013], “[t]he drone, when deployed to fly over the vehicle (or around or near the vehicle), captures image data via its camera, whereby the captured image data may be communicated to the vision system of the vehicle, such as for image processing for object detection or the like, or for display of images for viewing by the driver of the vehicle.”; para. 
[0023], “[r]esponsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.” See also para. [0017]).

Sychov modified by Baur may not explicitly disclose the limitations of claim 8; however, the inclusion and utilization of a redundant or duplicative element (e.g., an additional camera included in the drone) would be obvious in view of Sychov modified by Baur. For example, Sychov modified by Baur discloses the claimed invention except for the express utilization of a secondary drone camera to generate additional image data of the vehicle or the user. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have included and utilized a secondary camera since it has been held that mere duplication of essential working parts of a device involves only routine skill in the art. In re Harza, 274 F.2d 669, 124 USPQ 378 (CCPA 1960). See MPEP § 2144.04(VI)(B).

Regarding claim 9, Sychov discloses [a device] (projector unit 504, control unit 505, detector module 506, speaker unit 510, and media server 513; para. [0016], “[t]he projector unit is integrated into the housing. Further elements such as the charging unit, the control unit, and/or the detecting module or parts of these elements may be partly or completely integrated into the housing.”) for a motor vehicle charging station (see FIG. 1 and 2 including parking space 501 and charging unit 500), comprising: a projector (projector unit 504, see para. [0005], [0007], [0009]) configured to project, at least temporarily, an image aimed at supporting a user of the motor vehicle onto a projection surface (para. 
[0013], “[i]n comparison with conventional charging systems the inventive system has several advantages in terms of user friendliness, user-safety and economic operation. In particular, by detecting status data of the system and controlling the projector unit as a function of the status data, it is possible to display images or moving images and, thus, to provide useful information for the user of the charging system, wherein different information may be provided in different conditions or states of the system.”), wherein the motor vehicle charging station has at least one charging point for a motor vehicle (see FIG. 1 and 2 including parking space 501 and charging unit 500). Sychov discloses an autonomous charging station which utilizes a stationary device to gather and project information for use by an automobile user. As such, Sychov may not explicitly disclose that a drone is used to gather and project the information. However, the prior art is replete with examples of drones which interact with automobiles, and specifically drones that gather information, perform some level of an analysis on the information, and display/transmit/project data to an automobile user (e.g., see additional cited art). One such reference includes Baur, which is in the same field of endeavor as the instant application insofar as it is directed to drones which gather data and provide information/guidance (e.g., by way of projecting the data to an automobile) to an automobile user. Baur teaches a projector (drone of Baur) configured to project (Baur, para. [0021], “[t]he drone may communicate (in real time) captured image data to the vision system of the vehicle (such as for display of captured video images or for processing of captured image data)… Optionally, the drone may include an image processor that processes captured image data, whereby the drone may communicate a signal to the vehicle vision system that is indicative of a detected object or the like.”; para. 
[0011], “[t]he aerial platform or drone includes one or more cameras 16a for transmission of captured images for viewing by an operator of the drone. By incorporating such a drone in or at a vehicle, numerous advantages can be achieved. The drone, when deployed to fly over the vehicle (or around or near the vehicle), captures image data via its camera, whereby the captured image data may be communicated to the vision system of the vehicle… for display of images for viewing by the driver of the vehicle.” Examiner notes that signals and communications projected by the drone to the vehicle constitute a projection), at least temporarily, an image aimed at supporting a user (para. [0011], “whereby the captured image data may be communicated to the vision system of the vehicle… for display of images for viewing by the driver of the vehicle.”) of the motor vehicle onto a projection surface (vehicle-based vision system, see citations below); and wherein the projector is configured to project onto the projection surface (para. [0023], “[t]he vehicle-based vision system receives image data captured by… the drone…”; The vehicle-based vision system provides for a projection surface for the data projected by the drone) provided on request by displacing a motor vehicle component (para. [0019], “the control may control the drone responsive to a user input, such as a voice command from the driver or such as a joystick control or such as a touch screen or the like in the vehicle.”) of the motor vehicle (para. 
[0013], “[t]he drone, when deployed to fly over the vehicle (or around or near the vehicle), captures image data via its camera, whereby the captured image data may be communicated to the vision system of the vehicle, such as for image processing for object detection or the like, or for display of images for viewing by the driver of the vehicle.” Examiner notes that controlling the drone using the joystick may cause the drone to gather image data which is projected on the vehicle-based vision system).

Baur provides motivation to utilize a drone to collect data for the car rather than using a stationary sensor. For example, Baur teaches “cameras disposed at a vehicle are limited by the physics and the physical location where the camera is mounted at the vehicle.” (Baur, para. [0011]). Baur further teaches “[t]he aerial platform or drone includes one or more cameras 16a for transmission of captured images for viewing by an operator of the drone. By incorporating such a drone in or at a vehicle, numerous advantages can be achieved. The drone, when deployed to fly over the vehicle (or around or near the vehicle), captures image data via its camera, whereby the captured image data may be communicated to the vision system of the vehicle, such as for image processing for object detection or the like, or for display of images for viewing by the driver of the vehicle.” (Baur, para. [0013]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the data gathering and projection features of the stationary device of Sychov into a drone capable of gathering data and communicating with a vehicle, such as that of Baur. The components and functions of both the stationary device of Sychov and the drone of Baur are known, and the combination provides the predictable result of a high-mobility device capable of gathering data and presenting the data to a user of an automobile. 
Moreover, Baur provides motivation for incorporating the data gathering and data projecting features into a drone, where Baur teaches that drones are capable of gathering data in locations that a stationary sensor cannot access.

Regarding claim 10, Sychov discloses [a] motor vehicle charging station (see FIG. 1 and 2 including parking space 501 and charging unit 500) comprising: at least one charging point (charging unit 500) for a motor vehicle; and [a device] (projector unit 504, control unit 505, detector module 506, speaker unit 510, and media server 513; para. [0016], “[t]he projector unit is integrated into the housing. Further elements such as the charging unit, the control unit, and/or the detecting module or parts of these elements may be partly or completely integrated into the housing.”) including a projector configured to project, at least temporarily, an image aimed at supporting a user of the motor vehicle onto a projection surface (para. [0013], “[i]n comparison with conventional charging systems the inventive system has several advantages in terms of user friendliness, user-safety and economic operation. In particular, by detecting status data of the system and controlling the projector unit as a function of the status data, it is possible to display images or moving images and, thus, to provide useful information for the user of the charging system, wherein different information may be provided in different conditions or states of the system.”). Sychov discloses an autonomous charging station which utilizes a stationary device to gather and project information for use by an automobile user. As such, Sychov may not explicitly disclose that a drone is used to gather and project the information. 
However, the prior art is replete with examples of drones which interact with automobiles, and specifically drones that gather information, perform some level of an analysis on the information, and display/transmit/project data to an automobile user (e.g., see additional cited art). One such reference includes Baur, which is in the same field of endeavor as the instant application insofar as it is directed to drones which gather data and provide information/guidance (e.g., by way of projecting the data to an automobile) to an automobile user. Baur teaches an unmanned aircraft including a projector (the drone of Baur) configured to project (Baur, para. [0021], “[t]he drone may communicate (in real time) captured image data to the vision system of the vehicle (such as for display of captured video images or for processing of captured image data)… Optionally, the drone may include an image processor that processes captured image data, whereby the drone may communicate a signal to the vehicle vision system that is indicative of a detected object or the like.”; para. [0011], “[t]he aerial platform or drone includes one or more cameras 16a for transmission of captured images for viewing by an operator of the drone. By incorporating such a drone in or at a vehicle, numerous advantages can be achieved. The drone, when deployed to fly over the vehicle (or around or near the vehicle), captures image data via its camera, whereby the captured image data may be communicated to the vision system of the vehicle… for display of images for viewing by the driver of the vehicle.” Examiner notes that signals and communications projected by the drone to the vehicle constitute a projection), at least temporarily, an image aimed at supporting a user (para. 
[0011], “whereby the captured image data may be communicated to the vision system of the vehicle… for display of images for viewing by the driver of the vehicle.”) of the motor vehicle onto a projection surface (vehicle-based vision system, see citations below); and wherein the projector is configured to project onto the projection surface (para. [0023], “[t]he vehicle-based vision system receives image data captured by… the drone…”; The vehicle-based vision system provides for a projection surface for the data projected by the drone) provided on request by displacing a motor vehicle component (para. [0019], “the control may control the drone responsive to a user input, such as a voice command from the driver or such as a joystick control or such as a touch screen or the like in the vehicle.”) of the motor vehicle (para. [0013], “[t]he drone, when deployed to fly over the vehicle (or around or near the vehicle), captures image data via its camera, whereby the captured image data may be communicated to the vision system of the vehicle, such as for image processing for object detection or the like, or for display of images for viewing by the driver of the vehicle.” Examiner notes that controlling the drone using the joystick may cause the drone to gather image data which is projected on the vehicle-based vision system).

Baur provides motivation to utilize a drone to collect data for the car rather than using a stationary sensor. For example, Baur teaches “cameras disposed at a vehicle are limited by the physics and the physical location where the camera is mounted at the vehicle.” (Baur, para. [0011]). Baur further teaches “[t]he aerial platform or drone includes one or more cameras 16a for transmission of captured images for viewing by an operator of the drone. By incorporating such a drone in or at a vehicle, numerous advantages can be achieved. 
The drone, when deployed to fly over the vehicle (or around or near the vehicle), captures image data via its camera, whereby the captured image data may be communicated to the vision system of the vehicle, such as for image processing for object detection or the like, or for display of images for viewing by the driver of the vehicle.” (Baur, para. [0013]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the data gathering and projection features of the stationary device of Sychov into a drone capable of gathering data and communicating with a vehicle, such as that of Baur. The components and functions of both the stationary device of Sychov and the drone of Baur are known, and the combination provides the predictable result of a high-mobility device capable of gathering data and presenting the data to a user of an automobile. Moreover, Baur provides motivation for incorporating the data gathering and data projecting features into a drone, where Baur teaches that drones are capable of gathering data in locations that a stationary sensor cannot access.

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Published US Patent Application to Sychov (US 20180186247 A1) in view of Published US Patent Application to Baur (US 20180141658 A1) as applied to claim 1 above, and further in view of Published German Patent Application to Sass et al., hereinafter “Sass” (DE 102020211941 A1).

Regarding claim 2, Baur teaches a drone system which functions to assist a user of an automobile where the drone may be remote controlled (e.g., para. [0013], “maneuver or fly the aerial platform or drone via remote control or autonomous control or the like.”); as such, Sychov modified by Baur teaches a drone which is responsive to a request for support. 
However, Sychov as modified by Baur may not explicitly teach that the drone is housed in a hangar prior to interacting with the vehicle. Sass, which is in the same field of endeavor as the instant application insofar as it is directed to a drone (28) that operates in an area dedicated to housing cars (12), where the drone is operational with multiple cars and is housed and charged at a base (16), teaches the deficient limitations. For example, Sass teaches the deficient limitations of claim 2 where Sass states “[t]he drone (28) flies from its base (16) to motor vehicles (12) parked on a parking deck. The drone (28) recognizes motor vehicles (12) that are willing to share data.” (Sass, Abstract).

Notably, the disclosure of Sass is intended to bridge the gap between drones which function as an extension of a single vehicle and drones capable of servicing multiple vehicles, such as drones that work in shared spaces. For example, Sass states “[t]he patent application US 2016/0332748 A1 describes systems and procedures for docking an unmanned aerial vehicle (UAV) to a vehicle… The UAV can be used to take pictures and stream the images live to a display inside the vehicle. The vehicle can control the UAV. The UAV can communicate with the support vehicle during the flight. The disadvantage here is that the UAV is an extension of a vehicle. The UAV is not trained to operate on different vehicles. Furthermore, only data exchange between the vehicle and the UAV takes place here; data transmission to a base station is not taught.” (Sass, para. [0003]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the hangar (e.g., drone charging station and/or base station) and the methodology of sharing drones among multiple vehicles as taught by Sass in order to provide a charging and data transfer hub for the vehicle charging station of Sychov as modified by Baur. 
The combination achieves the predictable result of a drone system which allows multiple cars to be serviced by the same drone. Moreover, Sass identified the need for such a system when using drones that are owned and operated by a facility rather than an individual vehicle.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to URSULA NORRIS, whose telephone number is (703) 756-4731. The examiner can normally be reached Monday to Friday, 7 AM to 4 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, TARA SCHIMPF, can be reached at 571-270-7741. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. 
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/U.L.N./
Examiner, Art Unit 3676

/TARA SCHIMPF/
Supervisory Patent Examiner, Art Unit 3676

Prosecution Timeline

Nov 26, 2024
Application Filed
Nov 15, 2025
Non-Final Rejection — §103, §112
Feb 23, 2026
Response Filed
Mar 18, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601237
TEMPORARY SUSPENSION OF COMPLETED HYDROCARBON WELLS
2y 5m to grant Granted Apr 14, 2026
Patent 12601249
DEVICES, SYSTEMS, AND METHODS FOR MITIGATING DOWNHOLE MOTOR DYSFUNCTION
2y 5m to grant Granted Apr 14, 2026
Patent 12565833
METHODS AND SYSTEMS FOR REAL-TIME MULTIPHASE FLOW PREDICTION USING SENSOR FUSION AND PHYSICS-BASED HYBRID AI MODEL(S)
2y 5m to grant Granted Mar 03, 2026
Patent 12546210
DEVICE AND SYSTEM FOR ORIENTING CORE SAMPLES
2y 5m to grant Granted Feb 10, 2026
Patent 12523144
SYSTEM AND METHOD FOR NON-INVASIVE DETECTION AT A WELLSITE
2y 5m to grant Granted Jan 13, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

3-4
Expected OA Rounds
87%
Grant Probability
99%
With Interview (+12.5%)
2y 0m
Median Time to Grant
Moderate
PTA Risk
Based on 53 resolved cases by this examiner. Grant probability derived from career allow rate.
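The projection figures above can be reproduced from the examiner's career counts. The sketch below assumes the dashboard simply rounds the raw career allow rate (46 granted of 53 resolved) and adds the interview lift as percentage points, capped at 100; that reading is an inference from the "derived from career allow rate" note, not a documented methodology.

```python
# Reproduce the dashboard's headline figures from the examiner's career counts.
# Assumption: allow rate is rounded to a whole percent, and the interview
# lift is added as percentage points (capped at 100).
granted, resolved = 46, 53     # "46 granted / 53 resolved" from the stats above
interview_lift = 12.5          # "+12.5% Interview Lift"

base = round(granted / resolved * 100)                       # career allow rate
with_interview = min(round(granted / resolved * 100 + interview_lift), 100)

print(base)            # 87  -> matches "Grant Probability"
print(with_interview)  # 99  -> matches "With Interview (+12.5%)"
```

Under this reading, 46/53 ≈ 86.8% rounds to the displayed 87%, and 86.8 + 12.5 ≈ 99.3 rounds to the displayed 99% with-interview figure.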
