Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments filed Nov. 25, 2025 have been fully considered but they are not persuasive.
Applicant argues, with respect to claim 1, that neither Hance nor Wintz, alone or in combination, teaches or suggests the specific per-location information set now recited in claim 1, nor the way in which that information is generated. Even assuming, solely for the sake of argument, that Hance's "map" may be read as a "digital twin" under a broadest reasonable interpretation, Hance uses its mapping and real-time item information to locate pallets, determine capacities, and decide when to condense pallets based on shipment expectations, histories, and depletion timing. Hance does not disclose, for any individual storage location, an information set that comprises product information, visual images, a percentage volume of that stored location filled, and a percentage of product present compared to a previous quantity. In particular, Hance's logic is expressed in terms of capacities, absolute quantities, and thresholds for condensing pallets; it does not describe computing or presenting a percentage of product present compared to a previous quantity at a given location, nor does it describe surfacing that comparative percentage as part of a per-location view of the map or digital representation.
Wintz likewise fails to cure these deficiencies. Wintz determines a "percent of fill capacity" for candidate destinations based on the quantity of items currently stored and the storage location's capacity, and uses that percent fill as one of multiple cost-function sub-scores to select a storage destination. That percent of fill capacity is a ratio of current occupancy to capacity, not a percentage of product present compared to a previous quantity at that location as now claimed.
Wintz does not disclose storing or computing a time-based comparative product metric (current quantity versus a prior quantity at the same location), and it does not disclose providing, for each storage location, an information set that comprises both that comparative product percentage and a percentage volume of the stored location filled, together with product information and visual images of the storage location. Indeed, Wintz's per-location information is described as structured warehouse-management metadata and scoring parameters; there is no teaching of per-location visual imagery within a digital twin of the storage area.
Accordingly, even if Wintz's percent-fill calculations were incorporated into Hance's system, the combination would at most suggest computing a single percent-of-capacity value used internally by a control system to manage space and routing. The combination still would not teach or suggest (i) computing a percentage of product present compared to a previous quantity for each storage location, (ii) providing, for each storage location in a lidar/vision-derived digital twin, an information set that comprises all four expressly recited elements - product information, visual images, percentage volume of the stored location filled, and percentage of product present compared to a previous quantity - or (iii) making that complete per-location information set accessible via the digital twin as claimed. Furthermore, Applicant submits that, even as modified in view of Wintz, Hance's system would still rely on Wintz-style transactional capacity estimates (based on inventory and location capacity data) rather than lidar/vision-based measurements of actual product volume and free space at each storage location, and thus is neither taught nor suggested as determining which portion of a pallet has been picked, the true layout and shape of newly unoccupied space for re-use by future items, or discrepancies between expected and sensed product volume at a given location indicative of shrinkage or theft. These capabilities flow from the claimed sensor-derived per-location percentage metrics and digital-twin information set, which are not taught or rendered obvious by the cited art.
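Purely for illustration and not part of the prosecution record, the arithmetic distinction on which the parties' dispute turns can be sketched as follows. Wintz's "percent of fill capacity" divides current occupancy by the location's capacity, whereas the claimed metric divides the current quantity by a prior quantity at the same location. The function names and numeric values below are hypothetical:

```python
def percent_of_fill_capacity(current_qty: float, capacity: float) -> float:
    """Wintz-style metric: how full the location is now (occupancy / capacity)."""
    return 100.0 * current_qty / capacity

def percent_vs_previous_quantity(current_qty: float, previous_qty: float) -> float:
    """Claim-style metric: current quantity relative to a prior quantity
    at the same storage location (a time-based comparison, not a capacity ratio)."""
    return 100.0 * current_qty / previous_qty

# Hypothetical location: capacity 200 units, previously held 100 units, now holds 50.
fill = percent_of_fill_capacity(50, 200)         # ratio to capacity -> 25.0
vs_prev = percent_vs_previous_quantity(50, 100)  # ratio to prior quantity -> 50.0
```

As the differing results for the same location show, the two percentages coincide only by accident; knowing one does not yield the other without additional data (capacity or the prior quantity).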
In response, the examiner respectfully disagrees. As discussed in the last Office Action and recognized by applicant, the “map” of Hance anticipates the “digital twin” of the claimed invention. HANCE et al. discloses in pages 29-30, paragraph #0098, “As
another example condition, the WCS may not instruct the robotic devices to condense the set of pallets
if there is not much time until the set of pallets are due to be shipped out, emptied, or otherwise no
longer present a disadvantage to the warehouse. In other words, the WCS may not instruct the
robotic devices to condense the set of pallets if the space taken up by some or all of the set of pallets
would become available sooner (as a result of depleting the pallets naturally) than if the robotic devices
condensed the set of pallets. For instance, the WCS may use any records/histories, scheduled task
information, or other information discussed above to estimate an amount of time until the set of pallets will be depleted a result of warehouse activities other than condensing items (e.g., loading full or less-than-full pallets onto trucks for shipping). If the estimated amount of time is greater than how long it would take for the robotic devices to complete the condensing of the set of pallets, then the WCS may decide to instruct the robotic devices to condense the set of pallets. But if it would likely take the
robotic devices longer to condense the set of pallets than it would take for the set of pallets to no longer
present a disadvantage to the warehouse, the WCS may instead decide to not instruct the robotic
devices to condense the set of pallets (or may cancel the condensing if the task of condensing is already
underway …”. The claimed 1) product information is anticipated by the records/histories, scheduled task information, or other information discussed above to estimate an amount of time until the set of pallets will be depleted as a result of warehouse activities other than condensing items (e.g., loading full or less-than-full pallets onto trucks for shipping) of HANCE et al. The claimed 2) visual images are anticipated by the robotic devices that include one or more sensors, as disclosed in pages 8-9, paragraph #0026, “Any of the robotic devices described herein may include one or more sensor(s) such as force sensors, proximity sensors, load sensors, position sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, Global Positioning System (GPS) receivers, sonar, optical sensors, biosensors, Radio Frequency identification (RFID) sensors, Near Field Communication (NFC) sensors, wireless sensors, compasses, smoke sensors, light sensors, audio sensors, microphones, speakers, radar, cameras (e.g., color cameras, grayscale cameras, and/or infrared cameras), depth sensors (e.g., Red Green Blue plus Depth (RGB-D), lasers, a light detection and ranging (LIDAR) device, a structured-light scanner, and/or a time-of-flight camera), a stereo camera, motion sensors (e.g., gyroscope, accelerometer, inertial measurement unit (IMU), and/or foot step or wheel odometry), and/or range sensors (e.g., ultrasonic and/or infrared), among others. The sensor(s) may provide sensor data to a processor(s) to allow for appropriate interaction of a robotic device with the environment. …”.
The secondary reference, Wintz et al., teaches in page 6, paragraph #0061, “FIG. 2A is a conceptual diagram for determining a size score (e.g., refer to FIG. 4A). As shown in FIG. 2A, candidate destination 110A is at 100% size capacity, meaning this destination cannot accommodate item 112 size. Therefore, destination 110A can be assigned a least favorable score of 5 and the size capacity of 110B and 110C can be compared (200). Destination 110B can receive a more favorable size score in comparison to destination 110C because destination 110B has 0% size capacity filled”. From the above passage, the claimed 3) percentage volume of the stored location filled is anticipated by the sum (total) of the percentage sizes of storage locations 110A to 110C of Wintz et al., and the claimed 4) percentage of product present compared to a previous quantity is met by the percentage size of one of storage locations 110A, 110B, or 110C of Wintz et al. because the percentage size is a percentage of product present compared to a previous quantity. Thus, the proposed combination of HANCE et al. and Wintz et al. does disclose the claimed “processing said data to produce processed data in the form of a digital twin representing said storage area and accessible using a computer device to provide information about storage locations and information about products stored in said storage locations and in said storage area, wherein said information for that storage location comprises product information, visual images, percentage volume of the stored location filled and percentage of product present compared to a previous quantity” of claim 1.
In re page 6, applicant states that the remaining claims are not rendered obvious by the cited art for the same reasons as discussed in claim 1 above.
In response, as discussed above with respect to claim 1, the combination of HANCE et al. and Wintz et al. discloses all the claimed limitations.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-8 are rejected under 35 U.S.C. 103 as being unpatentable over HANCE et al. (WO 2018/038816 A1) in view of Wintz et al. (US 2022/0388783 A1).
Regarding claim 1, HANCE et al. discloses a method of using a warehouse management system (Fig. 1) comprising: providing a data gathering apparatus including lidar and visual image capture devices (see pages 8-9, paragraph #0026, "Any of the robotic devices described herein may include one or more sensor(s) such as force sensors, proximity sensors, load sensors, position sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, Global Positioning System (GPS) receivers, sonar, optical sensors, biosensors, Radio Frequency identification (RFID) sensors, Near Field Communication (NFC) sensors, wireless sensors, compasses, smoke sensors, light sensors, audio sensors, microphones, speakers, radar, cameras (e.g., color cameras, grayscale cameras, and/or infrared cameras), depth sensors (e.g., Red Green Blue plus Depth (RGB-D), lasers, a light detection and ranging (LIDAR) device, a structured-light scanner, and/or a time-of-flight camera), a stereo camera, motion sensors (e.g., gyroscope, accelerometer, inertial measurement unit (IMU), and/or foot step or wheel odometry), and/or range sensors (e.g., ultrasonic and/or infrared), among others. The sensor(s) may provide sensor data to a processor(s) to allow for appropriate interaction of a robotic device with the environment. .."); causing said data gathering apparatus to move around a storage area having a plurality of storage locations therein and gathering lidar and visual image data while moving (see page 11, paragraph #0035, "Within examples, one or more robotic devices may be brought into the warehouse 100 to create a map of the warehouse 100 space before determining placement of other objects, such as any fixed robotic devices or components discussed above, as well as any pallets. 
Herein, a "map" refers to information representative of a positioning of elements in a particular area of an environment, and/or representative of a relationship of certain elements to other elements or to the environment. Within example implementations, a map is a digital map, determined by collecting and compiling data representative of relationships between elements in the given environment, and then formatting such data into a virtual form, such as a virtual 2D or 3D image. ..."); and processing said data to produce processed data in the form of a digital twin representing said storage area and accessible using a computer device to provide information about storage locations and information about products stored in said storage locations and in said storage area, wherein said information for that storage location comprises product information, visual images, volume of the stored location filled and product present compared to a previous quantity (see pages 28-29, paragraphs #0095-#0096, "As another example condition, the WCS may use the item shipment expectation to determine how much storage capacity (i.e., storage space) would be required for items that the WCS expects to be received at the warehouse. … . For instance, given an example threshold of two thousand square feet, the WCS may be aware of or predict that there is an upcoming delivery of one thousand of a type of item and, based on that number, determine an amount of storage space that would be needed to accommodate pallets of the one thousand items (e.g., five hundred pallet rack positions, three thousand square feet, or other manners of quantifying required storage space). …” and pages 29-30, paragraph #0098, "As another example condition, the WCS may not instruct the robotic devices to condense the set of pallets if there is not much time until the set of pallets are due to be shipped out, emptied, or otherwise no longer present a disadvantage to the warehouse. 
In other words, the WCS may not instruct the robotic devices to condense the set of pallets if the space taken up by some or all of the set of pallets would become available sooner (as a result of depleting the pallets naturally) than if the robotic devices condensed the set of pallets. For instance, the WCS may use any records/histories, scheduled task information, or other information discussed above to estimate an amount of time until the set of pallets will be depleted a result of warehouse activities other than condensing items (e.g., loading full or less-than-full pallets onto trucks for shipping). If the estimated amount of time is greater than how long it would take for the robotic devices to complete the condensing of the set of pallets, then the WCS may decide to instruct the robotic devices to condense the set of pallets. But if it would likely take the robotic devices longer to condense the set of pallets than it would take for the set of pallets to no longer present a disadvantage to the warehouse, the WCS may instead decide to not instruct the robotic devices to condense the set of pallets (or may cancel the condensing if the task of condensing is already underway). ").
However, HANCE et al. does not specifically disclose percentage volume of the stored location filled and percentage of product present compared to a previous quantity.
Wintz et al. teaches, in the same field of warehouse management systems (WMS), determining a percentage of size capacity of destinations (see Fig. 4A and page 6, paragraph #0061, "FIG. 2A is a conceptual diagram for determining a size score (e.g., refer to FIG. 4A). As shown in FIG. 2A, candidate destination 110A is at 100% size capacity, meaning this destination cannot accommodate item 112 size. Therefore, destination 110A can be assigned a least favorable score of 5 and the size capacity of 110B and 110C can be compared (200). Destination 110B can receive a more favorable size score in comparison to destination 110C because destination 110B has 0% size capacity filled").
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the fill percentage as taught by Wintz et al. into HANCE et al.'s system in order to reduce bottlenecks in a storage facility.
Regarding claim 2, HANCE et al. also discloses wherein said data gathering apparatus is attached to a warehouse vehicle and caused to move by the movement of that vehicle (see pages 8-9, paragraph #0026, "Any of the robotic devices described herein may include one or more sensor(s) such as force sensors, proximity sensors, load sensors, position sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, Global Positioning System (GPS) receivers, sonar, optical sensors, biosensors, Radio Frequency identification (RFID) sensors, Near Field Communication (NFC) sensors, wireless sensors, compasses, smoke sensors, light sensors, audio sensors, microphones, speakers, radar, cameras (e.g., color cameras, grayscale cameras, and/or infrared cameras), depth sensors (e.g., Red Green Blue plus Depth (RGB-D), lasers, a light detection and ranging (LIDAR) device, a structured-light scanner, and/or a time-of-flight camera), a stereo camera, motion sensors (e.g., gyroscope, accelerometer, inertial measurement unit (IMU), and/or foot step or wheel odometry), and/or range sensors (e.g., ultrasonic and/or infrared), among others. The sensor(s) may provide sensor data to a processor(s) to allow for appropriate interaction of a robotic device with the environment. ...").
Regarding claim 3, HANCE et al. further discloses wherein said visual capture device comprises a 3D camera (see page 9, paragraph #0029, "In some examples, any or all of the robotic devices in the warehouse 100 may include one or more sensors, one or more computers, and one or more robotic arms. A sensor may be used to perform various operations, such as scanning areas within the warehouse 100 in order to capture visual data and/or three-dimensional (3D) depth information. …" and page 11, paragraph #0035, "Within examples, one or more robotic devices may be brought into the warehouse 100 to create a map of the warehouse 100 space before determining placement of other objects … such as a virtual 2D or 3D image. …").
Regarding claim 4, HANCE et al. discloses scanning products to determine a volume occupied by said products prior to being placed in said storage locations (see pages 28-29, paragraph #0095, "As another example condition, the WCS may use the item shipment expectation to determine how much storage capacity (i.e., storage space) would be required for items that the WCS expects to be received at the warehouse. …").
Regarding claim 5, HANCE et al. discloses communicating at least some of said processed data to another warehouse management system (see page 5, paragraph #0015, "A group of robotic devices may be used in a warehouse setting for a number of different applications. Another possible application includes distribution (e.g., to stores or other warehouses), in which mixed pallets may be constructed containing groups of different types of items (i.e., types of products) to ship to stores. …").
Regarding claim 6, HANCE et al. discloses a warehouse management system (Fig. 1) comprising: at least one data gathering apparatus including lidar and visual image capture devices movable around a storage area having a plurality of storage locations therein for gathering lidar and visual image data while moving (see pages 8-9, paragraph #0026, "Any of the robotic devices described herein may include one or more sensor(s) such as force sensors, proximity sensors, load sensors, position sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, Global Positioning System (GPS) receivers, sonar, optical sensors, biosensors, Radio Frequency identification (RFID) sensors, Near Field Communication (NFC) sensors, wireless sensors, compasses, smoke sensors, light sensors, audio sensors, microphones, speakers, radar, cameras (e.g., color cameras, grayscale cameras, and/or infrared cameras), depth sensors (e.g., Red Green Blue plus Depth (RGB-D), lasers, a light detection and ranging (LIDAR) device, a structured-light scanner, and/or a time-of-flight camera), a stereo camera, motion sensors (e.g., gyroscope, accelerometer, inertial measurement unit (IMU), and/or foot step or wheel odometry), and/or range sensors (e.g., ultrasonic and/or infrared), among others. The sensor(s) may provide sensor data to a processor(s) to allow for appropriate interaction of a robotic device with the environment. …” and page 11, paragraph #0035, "Within examples, one or more robotic devices may be brought into the warehouse 100 to create a map of the warehouse 100 space before determining placement of other objects, such as any fixed robotic devices or components discussed above, as well as any pallets. Herein, a "map" refers to information representative of a positioning of elements in a particular area of an environment, and/or representative of a relationship of certain elements to other elements or to the environment.
Within example implementations, a map is a digital map, determined by collecting and compiling data representative of relationships between elements in the given environment, and then formatting such data into a virtual form, such as a virtual 2D or 3D image. …"); and at least one processor for processing said data to produce processed data in the form of a digital twin representing said storage area and accessible using a computer device to provide information about storage locations and information about products stored in said storage locations and in said storage area, wherein said information for that storage location comprises product information, visual images, volume of the stored location filled and product present compared to a previous quantity (see page 12, paragraph #0038, "FIG. 2 is a simplified block diagram illustrating components of an example computing system 200, in accordance with at least some implementations described herein. … As such, computing system 200 can include various components, such as processor 202, …” and pages 28-29, paragraphs #0095-#0096, "As another example condition, the WCS may use the item shipment expectation to determine how much storage capacity (i.e., storage space) would be required for items that the WCS expects to be received at the warehouse. … For instance, given an example threshold of two thousand square feet, the WCS may be aware of or predict that there is an upcoming delivery of one thousand of a type of item and, based on that number, determine an amount of storage space that would be needed to accommodate pallets of the one thousand items (e.g., five hundred pallet rack positions, three thousand square feet, or other manners of quantifying required storage space). 
…” and pages 29-30, paragraph #0098, "As another example condition, the WCS may not instruct the robotic devices to condense the set of pallets if there is not much time until the set of pallets are due to be shipped out, emptied, or otherwise no longer present a disadvantage to the warehouse. In other words, the WCS may not instruct the robotic devices to condense the set of pallets if the space taken up by some or all of the set of pallets would become available sooner (as a result of depleting the pallets naturally) than if the robotic devices condensed the set of pallets. For instance, the WCS may use any records/histories, scheduled task information, or other information discussed above to estimate an amount of time until the set of pallets will be depleted a result of warehouse activities other than condensing items (e.g., loading full or less-than-full pallets onto trucks for shipping). If the estimated amount of time is greater than how long it would take for the robotic devices to complete the condensing of the set of pallets, then the WCS may decide to instruct the robotic devices to condense the set of pallets. But if it would likely take the robotic devices longer to condense the set of pallets than it would take for the set of pallets to no longer present a disadvantage to the warehouse, the WCS may instead decide to not instruct the robotic devices to condense the set of pallets (or may cancel the condensing if the task of condensing is already underway). ...").
However, HANCE et al. does not specifically disclose percentage volume of the stored location filled and percentage of product present compared to a previous quantity.
Wintz et al. teaches, in the same field of warehouse management systems (WMS), determining a percentage of size capacity of destinations (see Fig. 4A and page 6, paragraph #0061, "FIG. 2A is a conceptual diagram for determining a size score (e.g., refer to FIG. 4A). … As shown in FIG. 2A, candidate destination 110A is at 100% size capacity, meaning this destination cannot accommodate item 112 size. Therefore, destination 110A can be assigned a least favorable score of 5 and the size capacity of 110B and 110C can be compared (200). Destination 110B can receive a more favorable size score in comparison to destination 110C because destination 110B has 0% size capacity filled. … ").
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the fill percentage as taught by Wintz et al. into HANCE et al.'s system in order to reduce bottlenecks in a storage facility.
Regarding claim 7, HANCE et al. discloses wherein said data gathering apparatus is attached to a warehouse vehicle and caused to move by the movement of that vehicle (see pages 8-9, paragraph #0026, "Any of the robotic devices described herein may include one or more sensor(s) such as force sensors, proximity sensors, load sensors, position sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, Global Positioning System (GPS) receivers, sonar, optical sensors, biosensors, Radio Frequency identification (RFID) sensors, Near Field Communication (NFC) sensors, wireless sensors, compasses, smoke sensors, light sensors, audio sensors, microphones, speakers, radar, cameras (e.g., color cameras, grayscale cameras, and/or infrared cameras), depth sensors (e.g., Red Green Blue plus Depth (RGB-D), lasers, a light detection and ranging (LIDAR) device, a structured-light scanner, and/or a time-of-flight camera), a stereo camera, motion sensors (e.g., gyroscope, accelerometer, inertial measurement unit (IMU), and/or foot step or wheel odometry), and/or range sensors (e.g., ultrasonic and/or infrared), among others. The sensor(s) may provide sensor data to a processor(s) to allow for appropriate interaction of a robotic device with the environment. ").
Regarding claim 8, HANCE et al. discloses wherein said visual capture device comprises a 3D camera (see page 9, paragraph #0029, "In some examples, any or all of the robotic devices in the warehouse 100 may include one or more sensors, one or more computers, and one or more robotic arms. A sensor may be used to perform various operations, such as scanning areas within the warehouse 100 in order to capture visual data and/or three-dimensional (3D) depth information. …", and page 11, paragraph #0035, "Within examples, one or more robotic devices may be brought into the warehouse 100 to create a map of the warehouse 100 space before determining placement of other objects … such as a virtual 2D or 3D image. …").
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to THAI Q TRAN whose telephone number is (571)272-7382. The examiner can normally be reached Monday to Friday from 10:00am to 6:30pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Colleen Fauz can be reached at (571) 272-1667. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/THAI Q TRAN/Supervisory Patent Examiner, Art Unit 2484