DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This Final Office Action is in response to the applicant’s amendment/response of 24 November 2025.
Claims 8 and 17 have been canceled.
Claims 21-23 have been newly added.
Claims 1-7, 9-10, 16, and 18-23 are currently pending and addressed below.
Response to Arguments
Applicant’s arguments/amendments with respect to the drawing objection have been fully considered and are persuasive. Therefore, the drawing objection has been withdrawn.
Applicant’s arguments/amendments with respect to the rejection of claims under 35 U.S.C. 112(b) have been fully considered and are persuasive. Therefore, the rejection of claims under 35 U.S.C. 112(b) has been withdrawn.
Applicant’s arguments/amendments with respect to the rejection of claims under 35 U.S.C. 103 have been considered but are moot because the new ground of rejection does not rely on any combination of references applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 5-6, 9-10, 16, 18-20, and 23 are rejected under 35 U.S.C. 103 as being unpatentable over Lutz et al. (US 20170113352 A1) in view of Thubert et al. (US 20230284288 A1) and further in view of Loughran (US 20170239816 A1).
Regarding claim 1, and similarly with respect to claim 16, Lutz et al. discloses A navigation
alignment system to assist in a navigation of a material handling vehicle, the navigation alignment system comprising: a housing; ([0046] “The robotic fleet 100 could include one or more of various mobile components 110, such as AGV's 112, autonomous fork trucks 114, robotic truck loaders/unloaders 116”). Examiner Notes: See any one of the “mobile components 110” as the housing.
a navigation controller coupled to the housing, the navigation controller to wirelessly communicate with the material handling vehicle; ([0042] “The robotic fleet 100 may include various types of mobile vehicles. One example type of robotic device shown within robotic fleet 100 is an autonomous guided vehicle (AGV) 112”, [0046] “To coordinate actions of separate components, a control system 150, such as a remote, cloud-based server system, may communicate (e.g., by way of wireless communication interfaces) with some or all of the system components and/or with separate local control systems of individual components.”, [0047] “Any of the mobile components 110 may include one or more processors 113a and a non-transitory computer-readable storage medium 113b that stores instructions executable by the one or more processors 113a to perform any function or action described herein. The mobile components 110 may also each include a wireless communication interface (e.g., WIFI, Bluetooth, etc.) so that the mobile components 110 may transmit data to and/or receive data from any of the other mobile components 110, the pedestal robots 122, and/or the control system 150.”).
a lighting system coupled to the housing, the lighting system including at least one directable light; ([0031] “the first robot may use a light source such as a light-emitting diode (LED) to display a series of flashing visible or infrared pulses as the optical identifier. Thereafter, the control system may periodically, or upon identification of a potential security breach of the warehouse, send to the first robot new data encoding a new optical identifier of the first robot. The first robot may then display the new optical identifier for detection by other robots or the control system.”, [0034] “the first robot may detect the optical identifier of the second robot, determine that the priority status of the first robot is higher than the priority status of the second robot, and proceed to bring the first item to the third robot without yielding to the second robot. Similarly, the second robot may detect the optical identifier of the first robot, determine that the priority status of the first robot is higher than the priority status of the second robot, and proceed to bring the second item to the third robot only after determining that the first robot has departed the vicinity of the third robot.”, and [0070] “The truck unloader 200 may include an optical communication interface, such as a display screen 219a and/or a light source 219b.”, and [0072] “The light source 219b may include an incandescent light bulb, an LED, or any other light source configured to generate pulses of visible or infrared light.”)
at least one reflector coupled to the housing; ([0082] “the AGV 240 may include a camera 238 configured to capture images of the environment of the AGV 240 (including other robots).”, and [0101] “first optical identifier may include a two-dimensional matrix code, such as a quick response (QR) code or an augmented reality tag (ARTag). An example first optical identifier 221a takes the form of a QR code in FIG. 2A. The first optical identifier 221a may be displayed by the display screen 219. In this case, the first optical identifier 221a is recognizable (e.g., with reference to the database stored by control system 150) as being associated with the truck unloader 200. The first optical identifier 221a may convey additional information about the truck unloader 200 as well. In other examples, the first optical identifier 221a may take the form of any fiducial marker recognizable as being associated with the truck unloader 200. Generally, an optical identifier may include any fiducial marker, symbol, or information that is detectable by an optical sensor such as a camera, light detector, photosensor, photodiode, charge-coupled device, photoresistor, photomultiplier, image sensor, or photodetector, for example. An optical identifier may be communicated via visible, infrared, and/or ultraviolet light, for example.”)
a communication system ([0047] “The mobile components 110 may also each include a wireless communication interface (e.g., WIFI, Bluetooth, etc.) so that the mobile components 110 may transmit data to and/or receive data from any of the other mobile components 110, the pedestal robots 122, and/or the control system 150.”)
(Figure 1A and 3A, [0046] “a robotic fleet 100, according to an example embodiment. The robotic fleet 100 could include one or more of various mobile components 110, such as AGV's 112, autonomous fork trucks 114, robotic truck loaders/unloaders 116, and delivery trucks 118.”, and [0047] “The mobile components 110 may also each include a wireless communication interface (e.g., WIFI, Bluetooth, etc.) so that the mobile components 110 may transmit data to and/or receive data from any of the other mobile components 110…the control system 150.”). Examiner Notes: See figure 1A (e.g. Figure 3A-3E) as the first navigation area.
a set of rollers coupled to the housing, the set of rollers to allow navigation of the navigation alignment system. (200, Figure 2A, and [0066] “The robotic truck unloader 200 may include a robotic arm 202 with a gripping component 204 for gripping objects within the environment. The robotic arm 202 may use the gripping component 204 to pick up and place boxes to load or unload trucks or other containers. The truck unloader 200 may also include a moveable cart 212 with wheels 214 for locomotion. The wheels 214 may be holonomic wheels that allow the cart 212 to move with two degrees of freedom.”)
Lutz et al. fails to explicitly disclose wherein the at least one antenna is configured to move and direct a Wi-Fi signal to a first navigation area.
Thubert et al. teaches wherein the at least one antenna is configured to move and direct a Wi-Fi signal to a first navigation area, and (Figures 3 and 4A, [0041] “a client device 402, such as a mobile robot 304… or any other wireless client requiring connectivity, may include a transceiver or transceiver that comprises a directional antenna 404 that is pointed substantially upward relative to client device 402. For instance, in the case of a warehouse, factory, or the like, directional antenna 404 may be pointed substantially towards the ceiling of the building.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to modify the system of Lutz et al. to incorporate a directed antenna as taught by Thubert et al. for the purpose of “ensuring that only one access point is able to transmit to a given location at any given time, so that there is no interference from other access points from the perspective of the client device.” ([0130], Thubert et al.)
However, Lutz et al. in combination with Thubert et al. fails to explicitly disclose wherein the at least one antenna is configured to wirelessly connect the navigation controller to the material handling vehicle to allow the navigation controller to wirelessly communicate with the material handling vehicle.
Loughran teaches wherein the at least one antenna is configured to move and direct a Wi-Fi signal (Figure 5, 524) and wherein the at least one antenna is configured to wirelessly connect the navigation controller to the material handling vehicle to allow the navigation controller to wirelessly communicate with the material handling vehicle (Figure 8, [0066] “communication subsystem 524 enabling the robot to communicate to other robots or devices, using a variety of wireless communication methods (for example, WiFi™, Cellular, short-range interconnected device protocol, and the like) as described previously. In some embodiment, communications may be wired. Communication subsystem 524 may communicate with external channel 504, for transporting various types of information (as outlined previously) between a robot and central point station 716 (referring to FIG. 7) or for communications to other components. Communication subsystem 524 may also include, a means for communicating with peer channel 536, which may facilitate information transport such as, location info, relative position, system status, information form external channels, information from service channels, information form sensors, information to actuators between other robots working within that team.”, and [0092] “It is noted that with such spatial separation, a wireless interfacing means embodied as part of communications subsystem 524, can operate as one of “n” elements in a wireless interface array. For example, in RF wireless technology these would operate as antenna array elements.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to modify the system of Lutz et al. in combination with Thubert et al. to incorporate peer channel communication (e.g., using antenna(s)) as taught by Loughran for the purpose of “facilitat[ing] information transport such as, location info, relative position,…information to actuators between other robots working within the team.” ([0066], Loughran)
Regarding claim 2, Lutz et al. in view of Thubert et al. and Loughran discloses The navigation
alignment system of claim 1,
Lutz et al. discloses wherein the navigation controller is communicatively coupled to the
set of rollers to cause the navigation of the navigation alignment system. ([0042] “The robotic fleet 100 may include various types of mobile vehicles. One example type of robotic device shown within robotic fleet 100 is an autonomous guided vehicle (AGV) 112, which may be a relatively small, mobile device with wheels that may function to transport individual packages, cases, or totes from one location to another within the warehouse.”, [0046] “To coordinate actions of separate components, a control system 150, such as a remote, cloud-based server system, may communicate (e.g., by way of wireless communication interfaces) with some or all of the system components and/or with separate local control systems of individual components.”, [0047] “Any of the mobile components 110 may include one or more processors 113a and a non-transitory computer-readable storage medium 113b that stores instructions executable by the one or more processors 113a to perform any function or action described herein. The mobile components 110 may also each include a wireless communication interface (e.g., WIFI, Bluetooth, etc.) so that the mobile components 110 may transmit data to and/or receive data from any of the other mobile components 110, the pedestal robots 122, and/or the control system 150.”).
Regarding claim 3, Lutz et al. in view of Thubert et al. and Loughran discloses The navigation
alignment system of claim 1,
Lutz et al. discloses wherein the housing is adjustable in any of height, width, or depth so
as to achieve different shapes and sizes. ([0067] “a sensing system of robotic truck unloader 200 may use one or more sensors attached to a robotic arm 202, such as sensor 206 and sensor 208, which may be 2D sensors and/or 3D depth sensors that sense information about the environment as the robotic arm 202 moves.”, and [0073] “The pedestal robot 220 may include a robotic arm 222 with an end-effector-mounted gripper 224, which may be of the same type as the robotic manipulator 202 and gripper 204 described with respect to the robotic truck unloader 200…the pedestal 226 may include an actuator to allow a control system to change the height of the robotic arm 222.”)
Regarding claim 5, Lutz et al. in view of Thubert et al. and Loughran discloses The navigation
alignment system of claim 1,
Lutz et al. discloses wherein the navigation controller communicates directly and/or indirectly with any of at least one of the navigation alignment system, the material handling vehicle, a trailer, a trailer bay, a warehouse management system, and/or any other device. ([0042] “The robotic fleet 100 may include various types of mobile vehicles. One example type of robotic device shown within robotic fleet 100 is an autonomous guided vehicle (AGV) 112”, [0046] “To coordinate actions of separate components, a control system 150, such as a remote, cloud-based server system, may communicate (e.g., by way of wireless communication interfaces) with some or all of the system components and/or with separate local control systems of individual components.”, [0047] “Any of the mobile components 110 may include one or more processors 113a and a non-transitory computer-readable storage medium 113b that stores instructions executable by the one or more processors 113a to perform any function or action described herein. The mobile components 110 may also each include a wireless communication interface (e.g., WIFI, Bluetooth, etc.) so that the mobile components 110 may transmit data to and/or receive data from any of the other mobile components 110, the pedestal robots 122, and/or the control system 150.”).
Regarding claim 6, Lutz et al. in view of Thubert et al. and Loughran discloses The navigation
alignment system of claim 1,
Lutz et al. discloses wherein the navigation controller causes the housing of the navigation alignment system to autonomously maneuver into an alignment position with the trailer, turn on the lighting system, and turn on the communication system. (Figure 3D, [0042] “robotic truck unloader 116 may be used to load boxes onto delivery truck 118, which may be parked adjacent to the warehouse. In some examples, movements of delivery truck 118 (e.g., to deliver packages to another warehouse) may also be coordinated with robotic devices within the fleet.”, and [0046] “To coordinate actions of separate components, a control system 150, such as a remote, cloud-based server system, may communicate (e.g., by way of wireless communication interfaces) with some or all of the system components and/or with separate local control systems of individual components.”, [0050] “control system 150 may include a central planning system that assigns tasks to different robotic devices within fleet 100. The central planning system may employ various scheduling algorithms to determine which devices will complete which tasks at which times. For instance, an auction type system may be used in which individual robots bid on different tasks, and the central planning system may assign tasks to robots to minimize overall costs. In additional examples, the central planning system may optimize across one or more different resources, such as time, space, or energy utilization. In further examples, a planning or scheduling system may also incorporate particular aspects of the geometry and physics of box picking, packing, or storing.”, and see at least [0097] and figure 8)
Regarding claim 9, Lutz et al. in view of Thubert et al. and Loughran discloses The navigation
alignment system of claim 1,
Lutz et al. discloses wherein the navigation alignment system includes at least one autonomous navigation sensor configured to allow the housing to navigate autonomously in proximity to, into, out of, and/or within a second navigation area. ([0065] “a robotic truck unloader, according to an example embodiment. In some examples, a robotic truck unloader may include one or more sensors, one or more computers, and one or more robotic arms. The sensors may scan an environment containing one or more objects in order to capture visual data and/or three-dimensional (3D) depth information. Data from the scans may then be integrated into a representation of larger areas in order to provide digital environment reconstruction. In additional examples, the reconstructed environment may then be used for identifying objects to pick up, determining pick positions for objects, and/or planning collision-free trajectories for the one or more robotic arms and/or a mobile base.”, and [0067] “sensing system of robotic truck unloader 200 may use one or more sensors attached to a robotic arm 202, such as sensor 206 and sensor 208, which may be 2D sensors and/or 3D depth sensors that sense information about the environment as the robotic arm 202 moves. The sensing system may determine information about the environment that can be used by a control system (e.g., a computer running motion planning software) to pick and move boxes efficiently… and one or more sensors mounted on a robotic arm, such as sensor 206 and sensor 208, may be integrated to build up a digital model of the environment, including the sides, floor, ceiling, and/or front wall of a truck or other container. Using this information, the control system may cause the mobile base to navigate into a position for unloading or loading. In some examples, the sensor 208 may include a camera configured to capture images of the environment of the truck unloader 200 (including other robots).”, and see at least [0042])
Regarding claim 10, Lutz et al. in view of Thubert et al. and Loughran discloses The
navigation alignment system of claim 1,
Lutz et al. discloses wherein the material handling vehicle is configured to navigate a path upon achieving an alignment with the navigation alignment system. (Figure 8, and [0143] “the first truck unloader 200 and the second truck unloader 800 may approach each other as they navigate their own respective paths to complete respective tasks. In the case where the first priority status is higher than the second priority status, the second truck unloader 800 may yield to the first truck unloader 200 so that the first truck unloader 200 may move past the second truck unloader 800. In the case where the first priority status is lower than the second priority status, the first truck unloader 200 may yield to the second truck unloader 800 so that the second truck unloader 800 may move past the first truck unloader 200.”)
Regarding claim 18, Lutz et al. in view of Thubert et al. and Loughran discloses The
navigation alignment system of claim 16,
Lutz et al. discloses further comprising at least one imaging sensor, ([0051] “The control
system 150 may also include a camera (or be communicatively coupled to a camera) for capturing images of the environment of the control system 150.”, [0067] “sensing system of robotic truck unloader 200 may use one or more sensors attached to a robotic arm 202, such as sensor 206 and sensor 208, which may be 2D sensors and/or 3D depth sensors that sense information about the environment as the robotic arm 202 moves. The sensing system may determine information about the environment that can be used by a control system (e.g., a computer running motion planning software) to pick and move boxes efficiently… and one or more sensors mounted on a robotic arm, such as sensor 206 and sensor 208, may be integrated to build up a digital model of the environment, including the sides, floor, ceiling, and/or front wall of a truck or other container. Using this information, the control system may cause the mobile base to navigate into a position for unloading or loading. In some examples, the sensor 208 may include a camera configured to capture images of the environment of the truck unloader 200 (including other robots).”) wherein the navigation controller is configured to receive data from the at least one imaging sensor and align the housing of the navigation alignment system to a navigation area based on the data received from the at least one imaging sensor. ([0047] “The mobile components 110 may also each include a wireless communication interface (e.g., WIFI, Bluetooth, etc.) so that the mobile components 110 may transmit data to and/or receive data from any of the other mobile components 110, the pedestal robots 122, and/or the control system 150.”, [0116] “identifying the first robot by detecting the first optical identifier within the captured image.
For example, the control system 150 may use known image processing techniques to identify the first optical identifier 221a within the image, and then identify the truck unloader 200 by using the database to associate the first optical identifier 221a with the truck unloader 200.”, and [0117] “ the control system 150 may determine that the truck unloader 200 has deviated from its predetermined course because of navigational error. In response, the control system 150 may send a message to the truck unloader 200 that includes the current location of the truck unloader 200 and/or instructions for the truck unloader 200 to navigate back to the predetermined course…determining the state of the first robot (e.g., truck unloader 200) may include determining the state of the first robot based on the at least one captured image.”)
Regarding claim 19, Lutz et al. in view of Thubert et al. and Loughran discloses The
navigation alignment system of claim 18,
Lutz et al. discloses wherein the navigation area is a trailer and wherein the housing of the
navigation alignment system is aligned with a perimeter of the trailer based on the data received from the at least one imaging sensor. (Figure 8, and ([0047] “The mobile components 110 may also each include a wireless communication interface (e.g., WIFI, Bluetooth, etc.) so that the mobile components 110 may transmit data to and/or receive data from any of the other mobile components 110, the pedestal robots 122, and/or the control system 150.”, [0114] “performing the action may include retrieving a first item from a third robot after the second robot retrieves a second item from the third robot. In the case where the first priority status is higher than the second priority status, the first truck unloader 200 may proceed to load or unload items onto or from the delivery truck 118 before the second truck unloader 800 performs such actions. In the case where the first priority status is lower than the second priority status, the second truck unloader 800 may proceed to load or unload items onto or from the delivery truck 118 before the first truck unloader 200 performs such actions”, [0116] “identifying the first robot by detecting the first optical identifier within the captured image. For example, the control system 150 may use known image processing techniques to identify the first optical identifier 221a within the image, and then identify the truck unloader 200 by using the database to associate the first optical identifier 221a with the truck unloader 200.”, and [0117] “ the control system 150 may determine that the truck unloader 200 has deviated from its predetermined course because of navigational error. 
In response, the control system 150 may send a message to the truck unloader 200 that includes the current location of the truck unloader 200 and/or instructions for the truck unloader 200 to navigate back to the predetermined course…determining the state of the first robot (e.g., truck unloader 200) may include determining the state of the first robot based on the at least one captured image.”)
Regarding claim 20, Lutz et al. in view of Thubert et al. and Loughran discloses The
navigation alignment system of claim 16,
Lutz et al. discloses further comprising at least one presence detection sensor, wherein the
navigation controller is configured to receive data from the at least one presence detection sensor and determine at least one of an approach of the material handling vehicle to a navigation area or an exit of the material handling vehicle from the navigation area. ([0030] “Multiple robots may act in coordination to perform a variety of tasks, perhaps at the direction of a control system. In one such example, the control system may coordinate the robots as the robots move items within a warehouse. In this context, the control system may monitor and identify various robots within the warehouse to observe and aid in tasks being performed by the robots. One way for the control system to identify a given robot in the warehouse might be to use a camera or another optical sensor to detect and identify a “static” identifier (e.g., a two-dimensional matrix code) that appears on the exterior of the given robot.”, [0032] “The control system may detect the first robot's optical identifier and access the database to associate the detected optical identifier with the first robot. The control system may then observe the first robot to determine a state of the first robot, including one or more of a current location of the first robot, the identity of an item or another robot that the first robot is interacting with, or an operational status etc. Based on the state of the first robot and in accordance with overall system goals, the control system may send a command to the first robot to perform a particular function.”, and [0104] “The first robot may be represented by any of the AGVs 112, autonomous fork trucks 114, truck (un)loaders 116, delivery trucks 118, or pedestal robots 122.”)
Regarding claim 23, Lutz et al. in view of Thubert et al. and Loughran discloses The
navigation alignment system of claim 16,
Lutz et al. discloses wherein the navigation area is an enclosed space. ([0030] “Multiple
robots may act in coordination to perform a variety of tasks, perhaps at the direction of a control system. In one such example, the control system may coordinate the robots as the robots move items within a warehouse.”, [0040] “A further possible application includes cross-docking, which may involve transporting between shipping containers without storing anything (e.g., items may be moved from four 40-foot trailers and loaded into three lighter tractor trailers”, and [0042] “robotic truck unloader 116 may be used to load boxes onto delivery truck 118, which may be parked adjacent to the warehouse. In some examples, movements of delivery truck 118 (e.g., to deliver packages to another warehouse) may also be coordinated with robotic devices within the fleet.”)
Allowable Subject Matter
Claims 4, 7, and 21-22 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MISA HUYNH NGUYEN whose telephone number is (571)270-5604. The examiner can normally be reached Monday-Friday.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne Antonucci, can be reached at (313) 446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MISA H NGUYEN/Examiner, Art Unit 3666
/ANNE MARIE ANTONUCCI/Supervisory Patent Examiner, Art Unit 3666