Prosecution Insights
Last updated: April 19, 2026
Application No. 18/934,847

AUTONOMOUS INVENTORY MANAGEMENT USING BATTERY SWAPPING DRONES

Status: Non-Final OA (§103)
Filed: Nov 01, 2024
Examiner: HILAIRE, CLIFFORD
Art Unit: 2488
Tech Center: 2400 — Computer Networks
Assignee: Brookhurst Garage Inc.
OA Round: 1 (Non-Final)

Grant Probability: 72% (Favorable)
Expected OA Rounds: 1-2
Est. Time to Grant: 2y 8m
Grant Probability With Interview: 87%

Examiner Intelligence

Career Allow Rate: 72% (313 granted / 438 resolved; +13.5% vs TC avg) — above average
Interview Lift: +15.7% among resolved cases with interview — strong
Typical Timeline: 2y 8m avg prosecution; 32 applications currently pending
Career History: 470 total applications across all art units
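The headline numbers above can be cross-checked from the raw counts on the card. A minimal sketch, assuming the allow rate is simply granted over resolved and the interview lift is a percentage-point difference (the helper names are illustrative, not from any real analytics API):

```python
# Illustrative sketch: recompute the examiner metrics shown above from the
# raw counts on the card. Function names and rounding behavior are assumptions.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved applications."""
    return 100.0 * granted / resolved

def implied_baseline(rate_with_interview: float, lift: float) -> float:
    """Back out the no-interview allow rate from the with-interview figure."""
    return rate_with_interview - lift

career = allow_rate(313, 438)            # ~71.5%, displayed as 72% on the card
baseline = implied_baseline(87.0, 15.7)  # ~71.3%, close to the career rate

print(f"career allow rate: {career:.1f}%")
print(f"implied no-interview baseline: {baseline:.1f}%")
```

The recomputed rate (313/438 ≈ 71.5%) is within rounding of the displayed 72%, and subtracting the +15.7-point lift from the 87% with-interview figure lands near the same baseline, so the three numbers appear internally consistent.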

Statute-Specific Performance

§101: 3.1% (-36.9% vs TC avg)
§103: 47.9% (+7.9% vs TC avg)
§102: 19.6% (-20.4% vs TC avg)
§112: 28.9% (-11.1% vs TC avg)
Tech Center average is an estimate • Based on career data from 438 resolved cases
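One consistency check on the table above: if each "vs TC avg" delta is the examiner's per-statute rate minus the Tech Center average, the implied TC average can be recovered by subtraction. A hedged sketch — the subtraction model is an assumption about how the dashboard computes its deltas:

```python
# Recover the implied Tech Center average for each statute from the table
# above, assuming: delta = examiner_rate - tc_avg.

examiner_rate = {"101": 3.1, "103": 47.9, "102": 19.6, "112": 28.9}
delta_vs_tc = {"101": -36.9, "103": 7.9, "102": -20.4, "112": -11.1}

implied_tc_avg = {
    statute: round(examiner_rate[statute] - delta_vs_tc[statute], 1)
    for statute in examiner_rate
}

for statute, avg in implied_tc_avg.items():
    print(f"§{statute}: examiner {examiner_rate[statute]}% vs implied TC avg {avg}%")
```

Under that model all four statutes recover the same 40.0% figure, consistent with the Tech Center average being a single estimated baseline rather than a per-statute one.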

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C.
102(a)(2) prior art against the later invention.

Claims 1, 4, 6 and 7 are rejected under 35 U.S.C. 103 as being unpatentable over Spencer Williams et al. [US 20220019970 A1] in view of Thomas Galluzzo et al. [US 20150332213 A1].

Regarding claim 1, Spencer teaches and/or suggests: 1. A method for operating an inventory aerial robot (i.e. A system that employs aerial drones for inventory management is disclosed- Abstract), the method comprising: generating a command for performing inventory management of a storage site based on inventory management data associated with the storage site (i.e. a map of the warehouse- ¶0124); causing the inventory aerial robot to perform an inventory management trip based on the command (i.e. application software and/or control algorithms may be loaded and/or stored on the external processor which may be used to control the drone 100 over the wireless connection… They may give instructions to the drone to fly to a certain location in the warehouse, such as using a map of the warehouse and/or by altering its roll/pitch/yaw/throttle, take off, land, fly to another item in the list of items stored in the data storage system, hover, scan an item, otherwise collect data about an item, a shelf, or the warehouse, update a 3D map, collect and/or transport an item as payload, or other such instructions- ¶0124 In some embodiments, one or more sequences of commands can be entered by the user prior to drone take-off for providing an automated flight plan and/or mission profile for the drone.
It will be apparent in view of this disclosure that any command, commands, command sequences, automated flight plans, or automated mission profiles can be configured for using a single drone to complete a task or mission or for using multiple drones to complete a task or mission- ¶0125), wherein the inventory management trip comprises: departing from a base station, navigating through the storage site to the plurality of target locations, capturing, at one of the target locations (i.e. FIG. 26A is an illustration of an aerial drone with an optical sensor and a camera having a wider field of view than the optical sensor, wherein the aerial drone is configured to detect an identifier with the optical sensor, capture an image of the identifier with the camera, and perform an image processing and/or machine learning algorithm on the captured image of the identifier, wherein the optical sensor and the camera are communicatively coupled to a graphics processor, in accordance with an example embodiment of the present disclosure- ¶0071… the optical sensor 116 includes a camera having a global shutter to reduce image blur from flying by an identifier 1104 too quickly. A global shutter camera may be used to instantaneously capture an image of an identifier 1104 with less image blur than a rolling shutter camera that captures image pixels sequentially, for example. Thus, the aerial drone 100 can employ an optical sensor 116 with a global shutter to improve readability of captured images of identifiers 1104, which may be especially useful in implementations where the controller 102 performs OCR analysis on the image- ¶0103), an image associated with an inventory item location at the one of the target locations (i.e. The controller 102 can be configured to capture image data for a plurality of inventory items 802 (e.g., an image, multiple images, or video footage of several adjacent inventory items 802) via the camera 118. 
The controller 102 can be further configured to detect locations (e.g., x, y, and/or z coordinates) of the identifiers 804 for the plurality of inventory items 802 based on the image data and configured to generate a flight path 808 (which may be an updated version of an original flight path 806) for the aerial drone based on the detected locations of the identifiers 804 in order to cause the optical sensor 116 to align with and detect respective ones of the identifiers 804 (e.g., as shown in FIG. 8B)- ¶0100… The controller 102 can be configured to cause the aerial drone 100 to stop at a first position 1410 (e.g., remain at a constant position or at a nearly constant position (e.g., within a restricted range of motion)) and maintain an alignment between the optical sensor 116 and first identifier 1404 for a predetermined time period or until the identifier 1404 is recognized (e.g., until the detected identifier 1404 is successfully correlated with an identifier from a list of stored identifiers and/or until a threshold data set for the inventory item 1402 can be determined/derived from the detected identifier 1404). The controller 102 may be configured to cause the aerial drone 100 to fly to second position 1412, third position 1414, and so on while scanning identifiers for respective inventory items at each of the positions- ¶0107), and returning to the base station (i.e. for example, in some embodiments, a plurality of drones can be assigned to work in concert to perform a comprehensive warehouse inventory, wherein each drone can inventory a single shelf, rack, etc. before returning to a base station to recharge- ¶0125); performing an analysis of the image captured by the inventory aerial robot (i.e. controller 102 performs OCR analysis on the image- ¶0103). 
However, Spencer does not explicitly teach: receiving an input that includes coordinates of a plurality of target locations in the storage site; causing a swap of a battery pack of the inventory aerial robot at the base station to prepare the inventory aerial robot for another inventory management trip.

In the same field of endeavor, Thomas teaches and/or suggests: receiving an input that includes coordinates of a plurality of target locations in the storage site (i.e. As discussed above, and with reference to FIGS. 2, 6 and 7, the central server 200 may be responsible for receiving orders from the WMS 201. The order may contain information such as, for example, UPC, product description, location in the warehouse (which rack, which shelf, which slot on the shelf), order number and quantity of each product to be shipped. This information may be processed by software running on the central server 200, and the best manipulation robot(s) 600 to retrieve the tote(s) based on current location or availability may be determined- ¶0099); causing a swap of a battery pack of the inventory aerial robot at the base station to prepare the inventory aerial robot for another inventory management trip (i.e. For separate charging, a battery hot-swap may be performed by using permanently installed smaller short-life (minutes) onboard batteries to maintain power while a larger modular battery 190 is replaced with a fully charged battery 190 of equivalent design- ¶0085).

It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention, to modify the teachings of Spencer with the teachings of Thomas to prevent the manipulation robot from needing to power down during battery swap, which saves time (Thomas- ¶0085).
Regarding claim 4, Spencer and Thomas teach and/or suggest all the limitations of claim 1 and Spencer teaches and/or suggests: wherein the inventory aerial robot navigates through the storage site to the plurality of target locations by: capturing images using one or more image sensors as the inventory aerial robot moves along a path (i.e. FIG. 8B is an illustration of an aerial drone with an optical sensor and a camera having a wider field of view than the optical sensor, wherein the aerial drone is configured to follow a flight path based on image data from the camera, in accordance with an example embodiment of the present disclosure- ¶0034); and analyzing the images captured by the one or more image sensors to determine a current location of the inventory aerial robot in the path by tracking a number of structures in the storage site passed by the inventory aerial robot, the structures comprising racks with rows and columns (i.e. FIG. 8B is an illustration of an aerial drone with an optical sensor and a camera having a wider field of view than the optical sensor, wherein the aerial drone is configured to follow a flight path based on image data from the camera, in accordance with an example embodiment of the present disclosure- ¶0082).

Regarding claim 6, Spencer and Thomas teach and/or suggest all the limitations of claim 1 and Spencer teaches and/or suggests: wherein the image associated with the inventory item location that is captured by the inventory aerial robot is a barcode (i.e. The aerial drone 100 includes at least one optical sensor 116 configured to detect identifiers on inventory items (e.g., labeling information, such as, but not limited to, shipping labels, packaging labels, text, images, barcodes, combinations thereof, and the like)- ¶0097).
Regarding claim 7, Spencer and Thomas teach and/or suggest all the limitations of claim 1. However, Spencer does not explicitly teach: wherein causing the inventory aerial robot to perform the inventory management trip and causing the swap of the battery pack of the inventory aerial robot at the base station are conducted in a fully autonomous manner. In the same field of endeavor, Thomas teaches and/or suggests: wherein causing the inventory aerial robot to perform the inventory management trip and causing the swap of the battery pack of the inventory aerial robot at the base station are conducted in a fully autonomous manner (i.e. Hot-swapping may be done manually by a human operator, or may be done automatically by internal mechanisms of the manipulation robot 100 and charging station being used to physically swap batteries 190 while the robot 100 coordinates the procedure- ¶0085). It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention, to modify the teachings of Spencer with the teachings of Thomas to prevent the manipulation robot from needing to power down during battery swap, which saves time (Thomas- ¶0085).

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Spencer Williams et al. [US 20220019970 A1] in view of Thomas Galluzzo et al. [US 20150332213 A1] and further in view of John J. O’Brien et al. [US 20190392380 A1].

Regarding claim 2, Spencer and Thomas teach and/or suggest all the limitations of claim 1 and Spencer teaches and/or suggests: wherein generating the command for performing inventory management of the storage site based on inventory management data associated with the storage site comprises: receiving configuration information of the storage site, the configuration information comprising information about regularly shaped structures in the storage site (i.e.
The graphical user interface generated by the WMS can include a mapping of a plurality of inventory items. The graphical user interface can be configured to receive user inputs (e.g., data entries, selections, etc.) via an I/O device (e.g., keyboard, mouse, touch panel, microphone (e.g., for voice commands), and the like). In response to receiving a selection of an inventory item of the plurality of mapped inventory items, the WMS may be configured to cause the graphical user interface to display information corresponding to the selected inventory item based on information received from the aerial drone- ¶0089… warehouse structures, such as aisles, shelves, signs, floors, paths, and so forth- ¶0097… FIG. 29C includes values (e.g., A1, A2, A3, B1, C, . . . ) populated by the WMS 2900 based on the identifiers of inventory items and/or other information (e.g., time, date, location, sensor info (e.g., altitude, temperature, humidity, etc.), and so forth) detected by the aerial drone 100. As shown in FIGS. 30A and 30B, in some embodiments, the graphical user interface generated by the WMS 2900 can include a mapping 3000 of a plurality of inventory items 3002. For example, the mapping 3000 can correspond to an aisle selection 3001 input by the user- ¶0121); creating a map of aisles and racks of the storage site based on the configuration information (i.e. wherein the graphical user interface includes a mapping of the inventory items, in accordance with an example embodiment of the present disclosure- ¶0078-79); and generating a plan for specifying scans and locations that one or more inventory aerial robot need to go in conducting the inventory management (i.e. In some embodiments, one or more sequences of commands can be entered by the user prior to drone take-off for providing an automated flight plan and/or mission profile for the drone. 
It will be apparent in view of this disclosure that any command, commands, command sequences, automated flight plans, or automated mission profiles can be configured for using a single drone to complete a task or mission or for using multiple drones to complete a task or mission- ¶0125). However, Spencer and Thomas do not teach/suggest explicitly: receiving the inventory management data specifying management specification of the aisles and racks. In the same field of endeavor, John teaches/suggests: receiving the inventory management data specifying management specification of the aisles and racks (i.e. The example JIT replenishment management system 100 may maintain a database storing product information for each of the products in the whole inventory of the retail store 110. The product information may include a product name, a product code, a location code (e.g., zone, aisle, shelf, bin, etc.), a frozen status or chilled status, a quantity of the product displayed on a sales floor, a category, a department, a priority to be dispensed, a quantity of the product to be dispensed, a time to be dispensed, a scheduled pickup time, stock status, and a product supplier- ¶0020). It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention, to modify the teachings of Spencer and Thomas with the teachings of John to smooth a supply chain flow by aligning the inventory flow and reducing flow times from delivery vehicles to sales floor (e.g., store shelf, refrigerator, freezer, etc.) in a retail store (John- ¶0015). Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Spencer Williams et al. [US 20220019970 A1] in view of Thomas Galluzzo et al. [US 20150332213 A1] and further in view of Dylan Schwesinger et al. [A 3D Approach to Infrastructure-Free Localization in Large Scale Warehouse Environments: already of record]. 
Regarding claim 3, Spencer and Thomas teach and/or suggest all the limitations of claim 1. However, Spencer and Thomas do not teach/suggest explicitly: wherein the inventory aerial robot navigates through the storage site to the plurality of target locations by counting a number of regularly shaped structures that the inventory aerial robot has passed through. In the same field of endeavor, Dylan teaches/suggests: wherein the inventory aerial robot navigates through the storage site to the plurality of target locations by counting a number of regularly shaped structures that the inventory aerial robot has passed through (i.e. In this paper, we present a method for infrastructure-free localization of Automated Guided Vehicles (AGVs) in a warehouse environment. To accomplish this, our approach leverages 3D data for both mapping and feature segmentation. First, a 3D reconstruction of the warehouse is created to extract salient natural features — in this case the shelving uprights — as landmarks. Next, the map-based localization approach leverages 3D LIDAR to enable 3D feature-to-landmark matching which minimizes the potential for data association errors- Abstract... in warehouse environments, we propose using the pole-like pallet rack supports as landmarks- page 275, ¶3... In this work, we chose to use the vertical, pole-like supports of pallet racks as landmark features, which we will refer to simply as pole features- page 275, ¶9... To validate the effectiveness of the landmark segmentation of the Mapping Trike, the number of landmarks was counted by hand as a ground truth measure. This value was compared against the map generated by the mapping process. In total, 74 of 74 visible landmarks were successfully segmented- page 278, ¶2).
It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention, to modify the teachings of Spencer and Thomas with the teachings of Dylan to enable 3D feature-to-landmark matching which minimizes the potential for data association errors (Dylan- Abstract).

Claims 8, 9, 12, 15, 16 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Spencer Williams et al. [US 20220019970 A1] in view of Thomas Galluzzo et al. [US 20150332213 A1] and further in view of Jorg Hartung [US 20190307106 A1: already of record].

Regarding claim 8, Spencer and Thomas teach and/or suggest all the limitations of claim 1. However, Spencer and Thomas do not teach/suggest explicitly: wherein upon returning to the base station, the inventory aerial robot transmits captured images to the base station, the base station performs the swap of the battery pack, and the inventory aerial robot receives a second command for performing another inventory management trip. In the same field of endeavor, Jorg teaches/suggests: wherein upon returning to the base station, the inventory aerial robot transmits captured images to the base station, the base station performs the swap of the battery pack, and the inventory aerial robot receives a second command for performing another inventory management trip (i.e. The shed 10 also includes a docking station 7 which is designed to couple with a docking port 14 of robot 3 automatically for charging of the on-board battery 28, and optionally downloading of information such as images, videos or test results- ¶0120... the robot 3 can store data on-board and download it when placed at the docking station 7- ¶0150).
It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention, to modify the teachings of Spencer and Thomas with the teachings of Jorg to provide on-board rechargeable electric energy storage device such as one or more batteries (Jorg- ¶0119). Regarding claim 9, Spencer teaches and/or suggests: 9. An inventory aerial robot (i.e. FIG. 1F is a block diagram illustrating electronics for an aerial drone, in accordance with an example embodiment of the present disclosure- ¶0013), comprising: one or more image sensors configured to capture images of a storage site (i.e. camera 118- fig. 1f); one or more processors (i.e. processor 104- fig. 1f); and memory coupled to the one or more processors (i.e. memory 106- fig. 1f), the memory storing instructions, wherein the instructions, when executed by the one or more processors (i.e. The memory 106 can be an example of tangible, computer-readable storage medium that provides storage functionality to store various data and or program code associated with operation of the controller 102/drone 100, such as software programs and/or code segments, or other data to instruct the processor 104, and possibly other components of the controller 102/drone 100, to perform the functionality described herein. Thus, the memory 106 can store data, such as a program of instructions (e.g., software module(s)) for operating the controller 102/drone 100 (including its components), and so forth. It should be noted that while a single memory 106 is described, a wide variety of types and combinations of memory (e.g., tangible, non-transitory memory) can be employed. The memory 106 can be integral with the processor 104, can comprise stand-alone memory, or can be a combination of both- ¶0093), cause the one or more processors to: receive a command for performing inventory management of the storage site based on inventory management data associated with the storage site (i.e. 
a map of the warehouse- ¶0124); perform an inventory management trip based on the command (i.e. application software and/or control algorithms may be loaded and/or stored on the external processor which may be used to control the drone 100 over the wireless connection… They may give instructions to the drone to fly to a certain location in the warehouse, such as using a map of the warehouse and/or by altering its roll/pitch/yaw/throttle, take off, land, fly to another item in the list of items stored in the data storage system, hover, scan an item, otherwise collect data about an item, a shelf, or the warehouse, update a 3D map, collect and/or transport an item as payload, or other such instructions- ¶0124 In some embodiments, one or more sequences of commands can be entered by the user prior to drone take-off for providing an automated flight plan and/or mission profile for the drone. It will be apparent in view of this disclosure that any command, commands, command sequences, automated flight plans, or automated mission profiles can be configured for using a single drone to complete a task or mission or for using multiple drones to complete a task or mission- ¶0125), wherein the inventory management trip comprises: receiving an input that includes coordinates of a plurality of target locations in the storage site, departing from a base station, navigating through the storage site to the plurality of target locations, capturing, at one of the target locations (i.e. FIG. 
26A is an illustration of an aerial drone with an optical sensor and a camera having a wider field of view than the optical sensor, wherein the aerial drone is configured to detect an identifier with the optical sensor, capture an image of the identifier with the camera, and perform an image processing and/or machine learning algorithm on the captured image of the identifier, wherein the optical sensor and the camera are communicatively coupled to a graphics processor, in accordance with an example embodiment of the present disclosure- ¶0071… the optical sensor 116 includes a camera having a global shutter to reduce image blur from flying by an identifier 1104 too quickly. A global shutter camera may be used to instantaneously capture an image of an identifier 1104 with less image blur than a rolling shutter camera that captures image pixels sequentially, for example. Thus, the aerial drone 100 can employ an optical sensor 116 with a global shutter to improve readability of captured images of identifiers 1104, which may be especially useful in implementations where the controller 102 performs OCR analysis on the image- ¶0103), an image associated with an inventory item location at the one of the target locations (i.e. The controller 102 can be configured to capture image data for a plurality of inventory items 802 (e.g., an image, multiple images, or video footage of several adjacent inventory items 802) via the camera 118. The controller 102 can be further configured to detect locations (e.g., x, y, and/or z coordinates) of the identifiers 804 for the plurality of inventory items 802 based on the image data and configured to generate a flight path 808 (which may be an updated version of an original flight path 806) for the aerial drone based on the detected locations of the identifiers 804 in order to cause the optical sensor 116 to align with and detect respective ones of the identifiers 804 (e.g., as shown in FIG.
8B)- ¶0100… The controller 102 can be configured to cause the aerial drone 100 to stop at a first position 1410 (e.g., remain at a constant position or at a nearly constant position (e.g., within a restricted range of motion)) and maintain an alignment between the optical sensor 116 and first identifier 1404 for a predetermined time period or until the identifier 1404 is recognized (e.g., until the detected identifier 1404 is successfully correlated with an identifier from a list of stored identifiers and/or until a threshold data set for the inventory item 1402 can be determined/derived from the detected identifier 1404). The controller 102 may be configured to cause the aerial drone 100 to fly to second position 1412, third position 1414, and so on while scanning identifiers for respective inventory items at each of the positions- ¶0107), and returning to the base station (i.e. for example, in some embodiments, a plurality of drones can be assigned to work in concert to perform a comprehensive warehouse inventory, wherein each drone can inventory a single shelf, rack, etc. before returning to a base station to recharge- ¶0125).

However, Spencer does not explicitly teach: receives a swap of a battery pack at the base station to prepare the inventory aerial robot for another inventory management trip. In the same field of endeavor, Thomas teaches and/or suggests: receives a swap of a battery pack at the base station to prepare the inventory aerial robot for another inventory management trip (i.e. For separate charging, a battery hot-swap may be performed by using permanently installed smaller short-life (minutes) onboard batteries to maintain power while a larger modular battery 190 is replaced with a fully charged battery 190 of equivalent design- ¶0085).
It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention, to modify the teachings of Spencer with the teachings of Thomas to prevent the manipulation robot from needing to power down during battery swap, which saves time (Thomas- ¶0085).

However, Spencer and Thomas do not teach/suggest explicitly: wherein upon returning to the base station, the inventory aerial robot transmits the image captured. In the same field of endeavor, Jorg teaches/suggests: wherein upon returning to the base station, the inventory aerial robot transmits the image captured (i.e. The shed 10 also includes a docking station 7 which is designed to couple with a docking port 14 of robot 3 automatically for charging of the on-board battery 28, and optionally downloading of information such as images, videos or test results- ¶0120... the robot 3 can store data on-board and download it when placed at the docking station 7- ¶0150). It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention, to modify the teachings of Spencer and Thomas with the teachings of Jorg to provide on-board rechargeable electric energy storage device such as one or more batteries (Jorg- ¶0119).

Regarding claim 12, Spencer, Thomas and Jorg teach and/or suggest all the limitations of claim 9 and Spencer further teaches/suggests: wherein the inventory aerial robot is configured to navigate through the storage site to the plurality of target locations by: capturing images using one or more image sensors as the inventory aerial robot moves along a path (i.e. FIG.
8B is an illustration of an aerial drone with an optical sensor and a camera having a wider field of view than the optical sensor, wherein the aerial drone is configured to follow a flight path based on image data from the camera, in accordance with an example embodiment of the present disclosure- ¶0034); and analyzing the images captured by the one or more image sensors to determine a current location of the inventory aerial robot in the path by tracking a number of structures in the storage site passed by the inventory aerial robot, the structures comprising racks with rows and columns (i.e. FIG. 8B is an illustration of an aerial drone with an optical sensor and a camera having a wider field of view than the optical sensor, wherein the aerial drone is configured to follow a flight path based on image data from the camera, in accordance with an example embodiment of the present disclosure- ¶0082).

Regarding claim 14, Spencer, Thomas and Jorg teach and/or suggest all the limitations of claim 9 and Spencer further teaches/suggests: wherein the image associated with the inventory item location that is captured by the inventory aerial robot is a barcode (i.e. The aerial drone 100 includes at least one optical sensor 116 configured to detect identifiers on inventory items (e.g., labeling information, such as, but not limited to, shipping labels, packaging labels, text, images, barcodes, combinations thereof, and the like)- ¶0097).

Regarding claim 15, Spencer, Thomas and Jorg teach and/or suggest all the limitations of claim 9. However, Spencer does not explicitly teach: wherein the inventory aerial robot is configured to perform the inventory management trip and receive the swap of the battery pack at the base station in a fully autonomous manner.
In the same field of endeavor, Thomas teaches and/or suggests: wherein the inventory aerial robot is configured to perform the inventory management trip and receive the swap of the battery pack at the base station in a fully autonomous manner (i.e. Hot-swapping may be done manually by a human operator, or may be done automatically by internal mechanisms of the manipulation robot 100 and charging station being used to physically swap batteries 190 while the robot 100 coordinates the procedure- ¶0085). It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention, to modify the teachings of Spencer with the teachings of Thomas to prevent the manipulation robot from needing to power down during battery swap, which saves time (Thomas- ¶0085).

Regarding claim 16, apparatus claim 16 is drawn to the apparatus using/performing the same method as claimed in claim 8. Therefore, apparatus claim 16 corresponds to method claim 8, and is rejected for the same reasons of obviousness as used above.

Regarding claim 20, Spencer, Thomas and Jorg teach and/or suggest all the limitations of claim 16. However, Spencer does not explicitly teach: wherein the inventory aerial robot is configured to perform the inventory management trip and the base station is configured to perform the swap of the battery pack in a fully autonomous manner. In the same field of endeavor, Thomas teaches and/or suggests: wherein the inventory aerial robot is configured to perform the inventory management trip and the base station is configured to perform the swap of the battery pack in a fully autonomous manner (i.e. Hot-swapping may be done manually by a human operator, or may be done automatically by internal mechanisms of the manipulation robot 100 and charging station being used to physically swap batteries 190 while the robot 100 coordinates the procedure- ¶0085).
It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention, to modify the teachings of Spencer with the teachings of Thomas to prevent the manipulation robot from needing to power down during battery swap, which saves time (Thomas- ¶0085). Claims 10 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Spencer Williams et al. [US 20220019970 A1] in view of Thomas Galluzzo et al. [US 20150332213 A1], further in view of Jorg Hartung [US 20190307106 A1: already of record] and even further in view of John J. O’Brien et al. [US 20190392380 A1]. Regarding claim 10, Spencer, Thomas and Jorg teach and/or suggest all the limitations of claim 9 and Spencer further teaches/suggests: wherein the command for performing inventory management of the is generated by: receiving configuration information of the storage site, the configuration information comprising information about regularly shaped structures in the storage site (i.e. The graphical user interface generated by the WMS can include a mapping of a plurality of inventory items. The graphical user interface can be configured to receive user inputs (e.g., data entries, selections, etc.) via an I/O device (e.g., keyboard, mouse, touch panel, microphone (e.g., for voice commands), and the like). In response to receiving a selection of an inventory item of the plurality of mapped inventory items, the WMS may be configured to cause the graphical user interface to display information corresponding to the selected inventory item based on information received from the aerial drone- ¶0089… warehouse structures, such as aisles, shelves, signs, floors, paths, and so forth- ¶0097… FIG. 29C includes values (e.g., A1, A2, A3, B1, C, . . .
) populated by the WMS 2900 based on the identifiers of inventory items and/or other information (e.g., time, date, location, sensor info (e.g., altitude, temperature, humidity, etc.), and so forth) detected by the aerial drone 100. As shown in FIGS. 30A and 30B, in some embodiments, the graphical user interface generated by the WMS 2900 can include a mapping 3000 of a plurality of inventory items 3002. For example, the mapping 3000 can correspond to an aisle selection 3001 input by the user- ¶0121); creating a map of aisles and racks of the storage site based on the configuration information (i.e. wherein the graphical user interface includes a mapping of the inventory items, in accordance with an example embodiment of the present disclosure- ¶0078-79); and generating a plan for specifying scans and locations that one or more inventory aerial robot need to go in conducting the inventory management (i.e. In some embodiments, one or more sequences of commands can be entered by the user prior to drone take-off for providing an automated flight plan and/or mission profile for the drone. It will be apparent in view of this disclosure that any command, commands, command sequences, automated flight plans, or automated mission profiles can be configured for using a single drone to complete a task or mission or for using multiple drones to complete a task or mission- ¶0125). However, Spencer and Thomas and Jorg do not teach/suggest explicitly: receiving the inventory management data specifying management specification of the aisles and racks. In the same field of endeavor, John teaches/suggests: receiving the inventory management data specifying management specification of the aisles and racks (i.e. The example JIT replenishment management system 100 may maintain a database storing product information for each of the products in the whole inventory of the retail store 110. 
The product information may include a product name, a product code, a location code (e.g., zone, aisle, shelf, bin, etc.), a frozen status or chilled status, a quantity of the product displayed on a sales floor, a category, a department, a priority to be dispensed, a quantity of the product to be dispensed, a time to be dispensed, a scheduled pickup time, stock status, and a product supplier- ¶0020). It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention, to modify the teachings of Spencer, Thomas and Jorg with the teachings of John to smooth a supply chain flow by aligning the inventory flow and reducing flow times from delivery vehicles to sales floor (e.g., store shelf, refrigerator, freezer, etc.) in a retail store (John- ¶0015). Regarding claim 17, Spencer, Thomas and Jorg teach and/or suggest all the limitations of claim 16 and Spencer further teaches/suggests: receiving configuration information of the storage site, the configuration information comprising information about regularly shaped structures in the storage site (i.e. The graphical user interface generated by the WMS can include a mapping of a plurality of inventory items. The graphical user interface can be configured to receive user inputs (e.g., data entries, selections, etc.) via an I/O device (e.g., keyboard, mouse, touch panel, microphone (e.g., for voice commands), and the like). In response to receiving a selection of an inventory item of the plurality of mapped inventory items, the WMS may be configured to cause the graphical user interface to display information corresponding to the selected inventory item based on information received from the aerial drone- ¶0089… warehouse structures, such as aisles, shelves, signs, floors, paths, and so forth- ¶0097… FIG. 29C includes values (e.g., A1, A2, A3, B1, C, . . . 
) populated by the WMS 2900 based on the identifiers of inventory items and/or other information (e.g., time, date, location, sensor info (e.g., altitude, temperature, humidity, etc.), and so forth) detected by the aerial drone 100. As shown in FIGS. 30A and 30B, in some embodiments, the graphical user interface generated by the WMS 2900 can include a mapping 3000 of a plurality of inventory items 3002. For example, the mapping 3000 can correspond to an aisle selection 3001 input by the user- ¶0121); creating a map of aisles and racks of the storage site based on the configuration information (i.e. wherein the graphical user interface includes a mapping of the inventory items, in accordance with an example embodiment of the present disclosure- ¶0078-79); and generating a plan for specifying scans and locations that one or more inventory aerial robot need to go in conducting the inventory management (i.e. In some embodiments, one or more sequences of commands can be entered by the user prior to drone take-off for providing an automated flight plan and/or mission profile for the drone. It will be apparent in view of this disclosure that any command, commands, command sequences, automated flight plans, or automated mission profiles can be configured for using a single drone to complete a task or mission or for using multiple drones to complete a task or mission- ¶0125). However, Spencer and Thomas and Jorg do not teach/suggest explicitly: receiving the inventory management data specifying management specification of the aisles and racks. In the same field of endeavor, John teaches/suggests: receiving the inventory management data specifying management specification of the aisles and racks (i.e. The example JIT replenishment management system 100 may maintain a database storing product information for each of the products in the whole inventory of the retail store 110.
The product information may include a product name, a product code, a location code (e.g., zone, aisle, shelf, bin, etc.), a frozen status or chilled status, a quantity of the product displayed on a sales floor, a category, a department, a priority to be dispensed, a quantity of the product to be dispensed, a time to be dispensed, a scheduled pickup time, stock status, and a product supplier- ¶0020). It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention, to modify the teachings of Spencer, Thomas and Jorg with the teachings of John to smooth a supply chain flow by aligning the inventory flow and reducing flow times from delivery vehicles to sales floor (e.g., store shelf, refrigerator, freezer, etc.) in a retail store (John- ¶0015). wherein the command for performing inventory management of the is generated by: receiving configuration information of the storage site, the configuration information comprising information about regularly shaped structures in the storage site; creating a map of aisles and racks of the storage site based on the configuration information; and receiving the inventory management data specifying management specification of the aisles and racks; and generating a plan for specifying scans and locations that one or more inventory aerial robot need to go in conducting the inventory management. Claims 11 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Spencer Williams et al. [US 20220019970 A1] in view of Thomas Galluzzo et al. [US 20150332213 A1], further in view of Jorg Hartung [US 20190307106 A1: already of record] and even further in view of Dylan Schwesinger et al. [A 3D Approach to Infrastructure-Free Localization in Large Scale Warehouse Environments: already of record]. Regarding claim 11, Spencer, Thomas and Jorg teach and/or suggest all the limitations of claim 9. 
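The planning flow recited in claims 10 and 17 (receive site configuration describing regular structures, build an aisle/rack map, apply the management specification, emit an ordered scan plan) can be sketched as below. The data layout is an illustrative assumption of this sketch, not taken from any cited reference:

```python
# Rough sketch of the claimed planning pipeline: configuration in,
# aisle/rack map out, then an ordered list of scan locations.

def build_rack_map(config):
    """Expand a regular-structure description into (aisle, rack) cells."""
    return [(aisle, rack)
            for aisle in range(config["aisles"])
            for rack in range(config["racks_per_aisle"])]

def generate_scan_plan(rack_map, spec):
    """Keep only the aisles the management specification asks to scan,
    serpentining through them so consecutive aisles alternate direction."""
    plan = []
    for i, aisle in enumerate(spec["scan_aisles"]):
        cells = [c for c in rack_map if c[0] == aisle]
        if i % 2 == 1:                  # reverse every other aisle
            cells.reverse()
        plan.extend(cells)
    return plan

site_config = {"aisles": 3, "racks_per_aisle": 4}   # "regularly shaped structures"
management_spec = {"scan_aisles": [0, 2]}           # which aisles to manage

rack_map = build_rack_map(site_config)
plan = generate_scan_plan(rack_map, management_spec)
print(len(rack_map), plan[0], plan[-1])
```

In use, each (aisle, rack) cell in `plan` would become a waypoint plus a scan command in the drone's automated mission profile.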
However, Spencer and Thomas do not teach/suggest explicitly: wherein the inventory aerial robot is configured to navigate through the storage site to the plurality of target locations by counting a number of regularly shaped structures that the inventory aerial robot has passed through. In the same field of endeavor, Dylan teaches/suggests: wherein the inventory aerial robot is configured to navigate through the storage site to the plurality of target locations by counting a number of regularly shaped structures that the inventory aerial robot has passed through (i.e. In this paper, we present a method for infrastructure-free localization of Automated Guided Vehicles (AGVs) in a warehouse environment. To accomplish this, our approach leverages 3D data for both mapping and feature segmentation. First, a 3D reconstruction of the warehouse is created to extract salient natural features — in this case the shelving uprights — as landmarks. Next, the map-based localization approach leverages 3D LIDAR to enable 3D feature-to-landmark matching which minimizes the potential for data association errors- Abstract... in warehouse environments, we propose using the pole-like pallet rack supports as landmarks- page 275, ¶3... In this work, we chose to use the vertical, pole-like supports of pallet racks as landmark features, which we will refer to simply as pole features- page 275, ¶9... To validate the effectiveness of the landmark segmentation of the Mapping Trike, the number of landmarks was counted by hand as a ground truth measure. This value was compared against the map generated by the mapping process. In total, 74 of 74 visible landmarks were successfully segmented- page 278, ¶2).
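A minimal sketch of localization by counting structures passed, in the spirit of the pole-like upright landmarks quoted above: the drone's position down an aisle is the number of uprights counted times an assumed uniform rack pitch. The pitch value, the detection format, and the debounce threshold are all assumptions of this sketch:

```python
# Localization by counting rack uprights: each detection is the odometry
# distance (in meters) at which an upright was sighted; a debounce window
# keeps the same upright from being counted twice.

RACK_PITCH_M = 2.5  # assumed uniform spacing between rack uprights

def localize_by_counting(detections, min_gap_m=1.0):
    """Return (uprights_counted, estimated_position_m) along the aisle."""
    count, last = 0, None
    for d in sorted(detections):
        if last is None or d - last >= min_gap_m:  # ignore re-detections
            count += 1
            last = d
    return count, count * RACK_PITCH_M

# Sightings at ~2.5 m intervals; 2.55 and 5.1 are noisy re-detections of
# uprights already counted, and the debounce discards them.
sightings = [2.4, 2.55, 5.0, 5.1, 7.6]
count, position = localize_by_counting(sightings)
print(count, position)  # 3 uprights passed, ~7.5 m down the aisle
```

This is the appeal of structure counting as a navigation cue: it needs no installed infrastructure, only the regularly spaced racks the warehouse already has.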
It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention, to modify the teachings of Spencer and Thomas with the teachings of Dylan to enable 3D feature-to-landmark matching which minimizes the potential for data association errors (Dylan- Abstract). Regarding claim 18, Spencer, Thomas and Jorg teach and/or suggest all the limitations of claim 16. However, Spencer and Thomas do not teach/suggest explicitly: wherein the inventory aerial robot is configured to navigate through the storage site to the plurality of target locations by counting a number of regularly shaped structures that the inventory aerial robot has passed through. In the same field of endeavor, Dylan teaches/suggests: wherein the inventory aerial robot is configured to navigate through the storage site to the plurality of target locations by counting a number of regularly shaped structures that the inventory aerial robot has passed through (i.e. In this paper, we present a method for infrastructure-free localization of Automated Guided Vehicles (AGVs) in a warehouse environment. To accomplish this, our approach leverages 3D data for both mapping and feature segmentation. First, a 3D reconstruction of the warehouse is created to extract salient natural features — in this case the shelving uprights — as landmarks. Next, the map-based localization approach leverages 3D LIDAR to enable 3D feature-to-landmark matching which minimizes the potential for data association errors- Abstract... in warehouse environments, we propose using the pole-like pallet rack supports as landmarks- page 275, ¶3... In this work, we chose to use the vertical, pole-like supports of pallet racks as landmark features, which we will refer to simply as pole features- page 275, ¶9... To validate the effectiveness of the landmark segmentation of the Mapping Trike, the number of landmarks was counted by hand as a ground truth measure.
This value was compared against the map generated by the mapping process. In total, 74 of 74 visible landmarks were successfully segmented- page 278, ¶2). It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention, to modify the teachings of Spencer and Thomas with the teachings of Dylan to enable 3D feature-to-landmark matching which minimizes the potential for data association errors (Dylan- Abstract). Allowable Subject Matter Claims 5, 13 and 19 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to CLIFFORD HILAIRE whose telephone number is (571)272-8397. The examiner can normally be reached 5:30-14:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, SATH V PERUNGAVOOR can be reached at (571)272-7455. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. CLIFFORD HILAIRE Primary Examiner Art Unit 2488 /CLIFFORD HILAIRE/Primary Examiner, Art Unit 2488

Prosecution Timeline

Nov 01, 2024
Application Filed
Jan 24, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602591
TRAINING REINFORCEMENT LEARNING AGENTS USING AUGMENTED TEMPORAL DIFFERENCE LEARNING
2y 5m to grant Granted Apr 14, 2026
Patent 12596427
REWARD GENERATING METHOD FOR REDUCING PEAK LOAD OF POWER CONSUMPTION AND COMPUTING DEVICE FOR PERFORMING THE SAME
2y 5m to grant Granted Apr 07, 2026
Patent 12576797
ROTATING DEVICE FOR DISPLAY OF VEHICLE
2y 5m to grant Granted Mar 17, 2026
Patent 12573211
SYSTEMS AND METHODS FOR MANEUVER IDENTIFICATION FROM CONDENSED REPRESENTATIONS OF VIDEO
2y 5m to grant Granted Mar 10, 2026
Patent 12568310
TARGET TRACKING DEVICE, TARGET TRACKING METHOD, AND RECORDING MEDIUM FOR STORING TARGET TRACKING PROGRAM
2y 5m to grant Granted Mar 03, 2026
Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
72%
Grant Probability
87%
With Interview (+15.7%)
2y 8m
Median Time to Grant
Low
PTA Risk
Based on 438 resolved cases by this examiner. Grant probability derived from career allow rate.
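The headline figures above can be reproduced with back-of-envelope arithmetic: the career allow rate is granted over resolved cases, and the "with interview" number appears consistent with simply adding the interview lift to the base rate (the additive model is an assumption of this sketch, not a documented methodology):

```python
# Sanity check of the dashboard figures from the examiner's career data.
granted, resolved = 313, 438
allow_rate = granted / resolved                 # ~71.5%, shown as 72%
interview_lift = 0.157                          # +15.7 percentage points
with_interview = allow_rate + interview_lift    # ~87.2%, shown as 87%
print(f"{allow_rate:.1%} base, {with_interview:.1%} with interview")
```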
