DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments, see Remarks, filed 12/12/2025, with respect to the rejection(s) of claim(s) 1 and 11 under 35 U.S.C. 103 have been fully considered and are found persuasive. However, Applicant’s amendments changed the scope of the claims, thereby necessitating a new ground(s) of rejection.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim(s) 1, 2, 5-6, 9, 11-13, 15-16, and 22-27 is/are rejected under 35 U.S.C. 103 as being unpatentable over Drexler et al. (US 20170080850 A1, hereinafter Drexler) in view of Pinter et al. (US 20150088310 A1, hereinafter Pinter).
Regarding claim 1, Drexler discloses:
A mobile robot configured to perform one or more tasks in a warehouse (at least as in paragraph 0017, “a system 100 including an autonomous vehicle 104 for deployment in a facility, such as a manufacturing facility, warehouse or the like”), comprising:
a mobile base including an (at least as in paragraph 0023, “Vehicle 104 includes a chassis 200 containing or otherwise supporting various components, including one or more locomotive devices 204. Devices 204 in the present example are wheels. In other embodiments, however, any suitable locomotive device, or combination thereof, may be employed (e.g. tracks, propellers, and the like)”);
a navigation module configured to provide control instructions to the (at least as in paragraph 0020, “When vehicle 104 is assigned a task by computing device 108, vehicle 104 is configured to generate a path for completing the task (e.g. a path leading from the vehicle's current location to the end location of the task; the path may include one or more intermediate locations between the start location and the end location)”; at least as in paragraph 0041, “Trajectory data defines a trajectory of vehicle 104. Trajectory data can therefore include the current location and velocity of vehicle 104 (either received from computing device 108 or from onboard sensors such as a GPS sensor within chassis 200). Trajectory data can also include planned locations and velocities of vehicle 104, in the form of one or more sets of path data, each set identifying a sequence of locations (and, in some embodiments, accompanying vectors) to which vehicle 104 is to travel. The trajectory data can also include locomotion commands (e.g. defined as vectors consisting of a direction and a speed; or defined as power instructions to the motors driving wheels 204) generated by either or both of processor 250 and computing device 108 based on the path data”);
a plurality of lighting modules disposed at corners of the mobile base including a first lighting module disposed in a first corner of the mobile base and a second lighting module disposed in a second corner of the mobile base, the second corner being different from the first corner, wherein the first lighting module includes a first set of individually-controllable light sources and the second lighting module includes a second set of individually-controllable light sources (at least as in paragraph 0027, “vehicle 104 includes an illumination system 224. In general, illumination system 224 is configured to emit visible light from at least a portion of chassis 200. In the present embodiment, illumination system 224 includes an array of light-emitting components, such as light emitting diodes (LEDs) extending substantially entirely around the perimeter of chassis 200”); and
a controller configured to control an operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources based, at least in part, on navigation information received from the navigation module, the navigation information including a future travel direction of the mobile robot through the warehouse (at least as in paragraph 0038, “As will be apparent from Table 1, the lighting parameters can also include variable parameters whose values depend on other data available to processor 250. For example, the pattern “P-03” specifies that an arrow image is to be projected not in any predefined direction, but rather in the direction (either current or planned) of travel of vehicle 104”),
wherein the first set of individually-controllable light sources and the second set of individually-controllable light sources are controlled to indicate the future travel direction of the mobile robot through the warehouse by selecting, as the first lighting module, a lighting module arranged on a first side of the mobile robot corresponding to the future travel direction and selecting, as the second lighting module, a lighting module arranged on a second side of the mobile robot opposite of the future travel direction of the mobile robot (see at least Fig. 5A, 6B, 6C, and 7A, wherein the lights are explicitly shown on the corners, the front two sets are set as headlights and the rear sets are set as taillights to indicate the direction of travel, “a new direction of travel,” or “a planned direction of travel 704 (that is, a direction specified by path data), prior to actual movement of vehicle 104”; at least as in Fig. 5A-5C and 0064, wherein “a pattern defining headlights 500 on the array of LEDs (on the “forward” segment of the array, based on the direction of motion of vehicle 104, illustrated by arrow 502), and a second pattern defining tail lights 504 on the array, are illustrated. When the direction of travel of vehicle 104 changes (as illustrated by arrow 502′), the headlights can be repositioned (as shown by headlights 500′) in the new direction of travel”; at least as in paragraph 0068-0069 & Fig. 6B-6C, wherein “headlights 610 defined by a first pattern are illustrated in a forward direction (as indicated by arrow 612); tail lights 614 defined by a second pattern are illustrated in a rearward direction”; at least as in paragraph 0070 & Fig. 7A, “the array is controlled by processor 250 to activate a plurality of discrete sections 702 that gather in a planned direction of travel 704 (that is, a direction specified by path data), prior to actual movement of vehicle 104”).
However, Drexler does not explicitly disclose “omnidirectional.”
Pinter, in the same field of endeavor of an autonomous control system for a telepresence robot with lights for indicating the status of the robot, specifically teaches “omnidirectional” (at least as in paragraph 0018, “The robot 100 includes a base 102, an upper portion 104, and a head 106”; at least as in paragraph 0019, “The base 102 supports the robot 100 and may include a drive system for moving the robot 100 about a work area”; at least as in paragraph 0024, “In one embodiment, the drive system 202 is an omnidirectional drive system that allows the robot 100 to move in any direction”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Drexler to include Pinter’s teaching of an omnidirectional drive system, since Pinter teaches that an omnidirectional drive system allows the robot to move in any direction regardless of the orientation or angle of the base 102 relative to the motion, thus improving the mobility and safety of the robot and those around the robot.
Regarding claim 2, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The mobile robot of claim 1, wherein the first set of individually-controllable light sources and the second set of individually-controllable light sources include programmable light emitting diodes (LEDs) (at least as in paragraph 0027, “illumination system 224 includes an array of light-emitting components, such as light emitting diodes (LEDs) extending substantially entirely around the perimeter of chassis 200. In the present embodiment, the LEDs are individually addressable and each capable of emitting multiple colours (e.g. red, green and blue)”).
Regarding claim 5, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The mobile robot of claim 1, wherein controlling the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate the future travel direction of the mobile robot through the warehouse comprises controlling the first lighting module to display a first color and controlling the second lighting module to display a second color (at least as in paragraph 0034-0038, wherein Table 1 describes example lighting patterns including P-02 wherein arrays 1 and 2 illuminate in a white color in the direction of path; at least as in paragraph 0064 & Fig. 5A, “a pattern defining headlights 500 on the array of LEDs (on the “forward” segment of the array, based on the direction of motion of vehicle 104, illustrated by arrow 502), and a second pattern defining tail lights 504 on the array, are illustrated… headlights 500 can be defined by a pattern definition similar to P-02 shown in Table 1”).
Regarding claim 6, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The mobile robot of claim 1, wherein the navigation information received from the navigation module includes speed information for the mobile robot, and wherein controlling an operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources based, at least in part, on navigation information comprises controlling the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate the speed information for the mobile robot (at least as in paragraph 0040, “At block 310, processor 250 is configured to receive state data defining a current state of vehicle 104… the state data can include any one of, or any combination of, trajectory data, environmental data, and control data”; at least as in paragraph 0041, “Trajectory data can therefore include the current location and velocity of vehicle 104… The trajectory data can also include locomotion commands (e.g. defined as vectors consisting of a direction and a speed”; at least as in paragraph 0047, “Turning briefly to FIG. 4, an example implementation of block 315 of FIG. 3 is illustrated, based on the contents of Table 2”; at least as in paragraph 0048, “processor 250 is configured to determine, at blocks 415, 425, 435 and 445 respectively, whether the trajectory, objects, warnings and errors sub-states are active based on the state data and the conditions in Table 2”).
Regarding claim 9, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The mobile robot of claim 1, further comprising:
a status tracker module configured to determine status information associated with one or more operations of the mobile robot (at least as in paragraph 0021, “The presence or absence of objects, the task and movement commands, and the operational parameters mentioned above collectively define a current state of vehicle. As will be described herein, vehicle 104 includes an illumination system, and is configured to control the illumination system to signal its current state to outside viewers”; at least as in paragraph 0040, “At block 310, processor 250 is configured to receive state data defining a current state of vehicle 104. The state data can be received from various sources, including internal sensors and other systems housed within chassis 200, and computing device 108. In general, the state data can include any one of, or any combination of, trajectory data, environmental data, and control data”),
wherein the controller is further configured to control the operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate status information received from the status tracker module (at least as in paragraph 0036, “Each lighting pattern definition record also includes an indication of a state of vehicle 104 in which the pattern is to be used to control illumination system 224”; at least as in paragraph 0044, “Having received the state data, at block 315 processor 250 is configured to identify any active sub-states based on the state data. In the present example, the identification is performed by determining, at processor 250, whether each of a plurality of previously defined sub-states is active based on the state data. The sub-states are ranked in order of their importance to the control of illumination system 224”; at least as in paragraph 0051, “Having selected a lighting pattern definition, in some embodiments processor 250 can be configured to proceed directly to block 350, and control illumination system 224 according to the selected lighting pattern definition”).
Regarding claim 11, Drexler discloses:
A method of controlling a plurality of lighting modules disposed at corners of a mobile base of a mobile robot configured to perform one or more tasks in a warehouse (at least as in paragraph 0017, “a system 100 including an autonomous vehicle 104 for deployment in a facility, such as a manufacturing facility, warehouse or the like”), the mobile robot including an (at least as in paragraph 0023, “Vehicle 104 includes a chassis 200 containing or otherwise supporting various components, including one or more locomotive devices 204. Devices 204 in the present example are wheels. In other embodiments, however, any suitable locomotive device, or combination thereof, may be employed (e.g. tracks, propellers, and the like)”), a first lighting module disposed in a first corner of the mobile base, a second lighting module disposed in a second corner of the mobile base, the second corner being different from the first corner, wherein the first lighting module includes a first set of individually-controllable light sources and the second lighting module includes a second set of individually-controllable light sources (at least as in paragraph 0027, “vehicle 104 includes an illumination system 224. In general, illumination system 224 is configured to emit visible light from at least a portion of chassis 200. In the present embodiment, illumination system 224 includes an array of light-emitting components, such as light emitting diodes (LEDs) extending substantially entirely around the perimeter of chassis 200”), the method comprising:
receiving navigation information indicating a future travel direction of motion of the mobile robot through the warehouse (at least as in paragraph 0020, “When vehicle 104 is assigned a task by computing device 108, vehicle 104 is configured to generate a path for completing the task (e.g. a path leading from the vehicle's current location to the end location of the task; the path may include one or more intermediate locations between the start location and the end location)”; at least as in paragraph 0041, “Trajectory data defines a trajectory of vehicle 104. Trajectory data can therefore include the current location and velocity of vehicle 104 (either received from computing device 108 or from onboard sensors such as a GPS sensor within chassis 200). Trajectory data can also include planned locations and velocities of vehicle 104, in the form of one or more sets of path data, each set identifying a sequence of locations (and, in some embodiments, accompanying vectors) to which vehicle 104 is to travel. The trajectory data can also include locomotion commands (e.g. defined as vectors consisting of a direction and a speed; or defined as power instructions to the motors driving wheels 204) generated by either or both of processor 250 and computing device 108 based on the path data”); and
controlling, by at least one computing device, an operation of the first set of individually- controllable light sources and the second set of individually-controllable light sources to indicate the future travel direction of the mobile robot through the warehouse, wherein the first set of individually-controllable light sources and the second set of individually-controllable light sources are controlled to indicate the future travel direction of the mobile robot by selecting, as the first lighting module, a lighting module arranged on a first side of the mobile robot corresponding to the future travel direction and selecting, as the second lighting module, a lighting module arranged on a second side of the mobile robot opposite of the future travel direction of the mobile robot (at least as in paragraph 0038, “As will be apparent from Table 1, the lighting parameters can also include variable parameters whose values depend on other data available to processor 250. For example, the pattern “P-03” specifies that an arrow image is to be projected not in any predefined direction, but rather in the direction (either current or planned) of travel of vehicle 104”; see at least Fig. 5A, 6B, 6C, and 7A, wherein the lights are explicitly shown on the corners, the front two sets are set as headlights and the rear sets are set as taillights to indicate the direction of travel, “a new direction of travel,” or “a planned direction of travel 704 (that is, a direction specified by path data), prior to actual movement of vehicle 104”; at least as in Fig. 5A-5C and 0064, wherein “a pattern defining headlights 500 on the array of LEDs (on the “forward” segment of the array, based on the direction of motion of vehicle 104, illustrated by arrow 502), and a second pattern defining tail lights 504 on the array, are illustrated. 
When the direction of travel of vehicle 104 changes (as illustrated by arrow 502′), the headlights can be repositioned (as shown by headlights 500′) in the new direction of travel”; at least as in paragraph 0068-0069 & Fig. 6B-6C, wherein “headlights 610 defined by a first pattern are illustrated in a forward direction (as indicated by arrow 612); tail lights 614 defined by a second pattern are illustrated in a rearward direction”; at least as in paragraph 0070 & Fig. 7A, “the array is controlled by processor 250 to activate a plurality of discrete sections 702 that gather in a planned direction of travel 704 (that is, a direction specified by path data), prior to actual movement of vehicle 104”).
However, Drexler does not explicitly disclose “omnidirectional.”
Pinter, in the same field of endeavor of an autonomous control system for a telepresence robot with lights for indicating the status of the robot, specifically teaches “omnidirectional” (at least as in paragraph 0018, “The robot 100 includes a base 102, an upper portion 104, and a head 106”; at least as in paragraph 0019, “The base 102 supports the robot 100 and may include a drive system for moving the robot 100 about a work area”; at least as in paragraph 0024, “In one embodiment, the drive system 202 is an omnidirectional drive system that allows the robot 100 to move in any direction”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Drexler to include Pinter’s teaching of an omnidirectional drive system, since Pinter teaches that an omnidirectional drive system allows the robot to move in any direction regardless of the orientation or angle of the base 102 relative to the motion, thus improving the mobility and safety of the robot and those around the robot.
Regarding claim 12, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The method of claim 11, wherein controlling an operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate the future travel direction of the mobile robot through the warehouse comprises controlling at least some of the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate a current travel direction of the mobile robot (at least as in Fig. 5A-5C and 0064, wherein “a pattern defining headlights 500 on the array of LEDs (on the “forward” segment of the array, based on the direction of motion of vehicle 104, illustrated by arrow 502), and a second pattern defining tail lights 504 on the array, are illustrated. When the direction of travel of vehicle 104 changes (as illustrated by arrow 502′), the headlights can be repositioned (as shown by headlights 500′) in the new direction of travel”).
Regarding claim 13, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The method of claim 12, wherein controlling at least some of the first set of individually-controllable light sources and the second set of individually- controllable light sources to indicate the current travel direction of the mobile robot comprises controlling the first lighting module to display a first color and controlling the second lighting module to display a second color, wherein the first lighting module is located relative to the second lighting module in the current travel direction of the mobile robot (at least as in paragraph 0034-0038, wherein Table 1 describes example lighting patterns including P-02 wherein arrays 1 and 2 illuminate in a white color in the direction of path; at least as in paragraph 0064 & Fig. 5A, “a pattern defining headlights 500 on the array of LEDs (on the “forward” segment of the array, based on the direction of motion of vehicle 104, illustrated by arrow 502), and a second pattern defining tail lights 504 on the array, are illustrated… headlights 500 can be defined by a pattern definition similar to P-02 shown in Table 1”).
Regarding claim 15, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The method of claim 11, wherein controlling the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate the future travel direction of the mobile robot comprises:
controlling the first lighting module to display a first color and controlling the second lighting module to display a second color; and
controlling a third lighting module to display the first color and controlling a fourth lighting module to display the second color, wherein the third lighting module is different from the first lighting module and the fourth lighting module is different from the second lighting module (at least as in paragraph 0034-0038, wherein Table 1 describes example lighting patterns including P-02 wherein arrays 1 and 2 illuminate in a white color in the direction of path; at least as in paragraph 0064 & Fig. 5A, “a pattern defining headlights 500 on the array of LEDs (on the “forward” segment of the array, based on the direction of motion of vehicle 104, illustrated by arrow 502), and a second pattern defining tail lights 504 on the array, are illustrated… headlights 500 can be defined by a pattern definition similar to P-02 shown in Table 1”).
Regarding claim 16, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The method of claim 11, wherein the navigation information includes speed information for the mobile robot, and wherein the method further comprises:
controlling an operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate the speed information for the mobile robot (at least as in paragraph 0040, “At block 310, processor 250 is configured to receive state data defining a current state of vehicle 104… the state data can include any one of, or any combination of, trajectory data, environmental data, and control data”; at least as in paragraph 0041, “Trajectory data can therefore include the current location and velocity of vehicle 104… The trajectory data can also include locomotion commands (e.g. defined as vectors consisting of a direction and a speed”; at least as in paragraph 0047, “Turning briefly to FIG. 4, an example implementation of block 315 of FIG. 3 is illustrated, based on the contents of Table 2”; at least as in paragraph 0048, “processor 250 is configured to determine, at blocks 415, 425, 435 and 445 respectively, whether the trajectory, objects, warnings and errors sub-states are active based on the state data and the conditions in Table 2”).
Regarding claim 22, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The mobile robot of claim 1, wherein the plurality of lighting modules includes a continuous band of lighting,
the first set of individually-controllable light sources are located at a first location within the continuous band of lighting, and
the second set of individually-controllable light sources are located at a second location within the continuous band of lighting (at least as in paragraph 0027, “illumination system 224 is configured to emit visible light from at least a portion of chassis 200. In the present embodiment, illumination system 224 includes an array of light-emitting components, such as light emitting diodes (LEDs) extending substantially entirely around the perimeter of chassis 200”; see at least Fig. 5A, 6B, 6C, and 7A, wherein the lights are explicitly shown on the corners, the front two sets are set as headlights and the rear sets are set as taillights to indicate the direction of travel, “a new direction of travel,” or “a planned direction of travel 704 (that is, a direction specified by path data), prior to actual movement of vehicle 104”).
Regarding claim 23, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The method of claim 11, wherein the plurality of lighting modules includes a continuous band of lighting,
the first set of individually-controllable light sources are located at a first location within the continuous band of lighting, and
the second set of individually-controllable light sources are located at a second location within the continuous band of lighting (at least as in paragraph 0027, “illumination system 224 is configured to emit visible light from at least a portion of chassis 200. In the present embodiment, illumination system 224 includes an array of light-emitting components, such as light emitting diodes (LEDs) extending substantially entirely around the perimeter of chassis 200”; see at least Fig. 5A, 6B, 6C, and 7A, wherein the lights are explicitly shown on the corners, the front two sets are set as headlights and the rear sets are set as taillights to indicate the direction of travel, “a new direction of travel,” or “a planned direction of travel 704 (that is, a direction specified by path data), prior to actual movement of vehicle 104”).
Regarding claim 24, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The mobile robot of claim 1, wherein controlling the operation of the first set of individually-controllable light sources and the second set of individually- controllable light sources based, at least in part, on navigation information received from the navigation module comprises:
controlling the first lighting module to display a first color and controlling the second lighting module to display a second color based, at least in part, on a first current travel direction or a first future travel direction of the mobile robot (at least as in paragraph 0038, wherein the vehicle can indicate the current or planned direction of travel using lighting patterns; at least as in Fig. 5A-5C and 0064, wherein four corner lights defining headlights and taillights may be used to indicate the direction; at least as in paragraph 0048 and Fig. 4, wherein the control system continually updates its trajectory data; at least as in paragraph 0064, wherein a change in the travel direction is made),
wherein the controller is further configured to control the first lighting module to display the second color and control the second lighting module to display the first color based, at least in part, on navigation information indicating a second current travel direction or second future travel direction of the mobile robot different from the first current travel direction or the first future travel direction of the mobile robot (at least as in paragraph 0064, wherein the headlights and taillights may be repositioned in the new direction of travel when the direction of travel changes; at least as in paragraph 0077, wherein the planned change in trajectory of reversing the direction of travel is indicated).
Regarding claim 25, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The method of claim 11, wherein the future travel direction of the mobile robot corresponds to a first future travel direction and controlling the operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate the future travel direction of the mobile robot comprises:
controlling the first lighting module to display a first color and controlling the second lighting module to display a second color to indicate the first future travel direction (at least as in paragraph 0038, wherein the vehicle can indicate the current or planned direction of travel using lighting patterns; at least as in Fig. 5A-5C and 0064, wherein four corner lights defining headlights and taillights may be used to indicate the direction);
receiving navigation information indicating a second future travel direction of the mobile robot different from the first future travel direction (at least as in paragraph 0048 and Fig. 4, wherein the control system continually updates its trajectory data; at least as in paragraph 0064, wherein a change in the travel direction is made); and
controlling the first lighting module to display the second color and controlling the second lighting module to display the first color to indicate the second future travel direction (at least as in paragraph 0064, wherein the headlights and taillights may be repositioned in the new direction of travel when the direction of travel changes; at least as in paragraph 0077, wherein the planned change in trajectory of reversing the direction of travel is indicated).
Regarding claim 26, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The mobile robot of claim 5, wherein controlling the first lighting module to display the first color and controlling the second lighting module to display the second color comprises controlling the first lighting module to display a white color and controlling the second lighting module to display a red color (at least as in paragraphs 0034-0035, 0053, 0064, and Table 1, wherein the headlight array illuminates in white; at least as in paragraphs 0064, 0068, and 0069, wherein the rear lights are referred to as taillights; at least as in paragraph 0027, wherein the LEDs are addressable to red).
Regarding claim 27, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The method of claim 15, wherein controlling the first lighting module to display the first color and controlling the second lighting module to display the second color comprises controlling the first lighting module to display a white color and controlling the second lighting module to display a red color (at least as in paragraphs 0034-0035, 0053, 0064, and Table 1, wherein the headlight array illuminates in white; at least as in paragraphs 0064, 0068, and 0069, wherein the rear lights are referred to as taillights; at least as in paragraph 0027, wherein the LEDs are addressable to red).
Claim(s) 7-8, and 17-19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Drexler et al. (US 20170080850 A1, Drexler) in view of Pinter et al. (US 20150088310 A1, hereinafter Pinter), and further in view of White et al. (US 20220041098 A1, hereinafter White).
Regarding claim 7, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The mobile robot of claim 1, further comprising:
a mode determining component configured to determine whether the mobile robot is operating in an autonomous mode or a manual mode (at least as in paragraph 0017, “Vehicle 104 need not be entirely autonomous. That is, vehicle 104 can receive instructions from human operators, computing devices and the like, from time to time, and can act with varying degrees of autonomy in executing such instructions”; at least as in paragraph 0043, “the control data can include indications of any warning conditions that are active (e.g. a low battery warning), any error conditions that are active (e.g. an emergency stop error), and identifiers of any discrete operating modes currently active in vehicle 104”; at least as in paragraph 0048, wherein “At block 405, processor 250 is configured to determine whether the operational data of the state data received at block 310 indicates that a discrete operating mode (such as a docking mode) is active”),
wherein the controller is further configured to control the operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources (at least as in paragraph 0057, “When the determination at block 340 is negative—that is, when all segments have been processed—processor 250 is configured to proceed to block 350. At block 350, processor 250 is configured to control illumination system 224 according to the selected lighting pattern definition”).
However, Drexler does not explicitly disclose “based, at least in part, on whether the mobile robot is determined to be operating in the autonomous mode or the manual mode.”
White discloses an autonomous mobile robot including a light indicator system that generates a visual indication in the form of a pattern of illumination indicative of a status, service condition, etc. of the robot. White specifically teaches wherein “based, at least in part, on whether the mobile robot is determined to be operating in the autonomous mode or the manual mode” (at least as in paragraph 0113, wherein “the controller 302 operates the light sources to emit a certain colored light, such as a blue light, whenever the controller 302 intends to convey information pertaining to an operational status of the robot 100 . . . for example . . . manual drive behavior in which a user uses a remote computing device to manually control motion of the robot 100” and further wherein the light indicator system may indicate autonomous mode functions such as “tracking a progress of a cleaning mission . . . a particular cleaning mode of the robot 100 . . . execution of a robot behavior, for example, behavior to avoid a virtual barrier in response to detecting the virtual barrier, behavior to avoid a drop-off in response to detecting the drop-off, spot cleaning behavior in response to detecting a large amount of debris on a portion of a floor surface . . . and wall following behavior to clean a perimeter of an area”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Drexler to include White’s teaching of a light indicator system indicating whether the autonomous mobile robot is functioning in a manual drive behavior or an autonomous mode and indicating its orientation, since White teaches wherein the system may create an improved user experience by providing visual indications with consistent meanings across multiple devices, by providing a consistent aesthetic experience for the user, and by providing intuitive and easily accessible feedback regarding the robot’s status.
Regarding claim 8, in view of the above combination of Drexler, Pinter, and White, Drexler further discloses:
The mobile robot of claim 7, wherein
the controller is configured to control the operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate a movement intent of the mobile robot when it is determined that the mobile robot is operating in autonomous mode (at least as in paragraph 0017, wherein the vehicle functions under varying degrees of autonomy; see at least Fig. 5A, 6B, 6C, and 7A, wherein the lights are explicitly shown on the corners, the front two sets are set as headlights and the rear set as taillights to indicate the direction of travel, “a new direction of travel,” or “a planned direction of travel 704 (that is, a direction specified by path data), prior to actual movement of vehicle 104”), and
However, Drexler does not explicitly disclose “the controller is configured to control the operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate an orientation of the mobile robot relative to a reference frame when it is determined that the mobile robot is operating in manual mode.”
White discloses an autonomous mobile robot including a light indicator system that generates a visual indication in the form of a pattern of illumination indicative of a status, service condition, etc. of the robot. White specifically teaches wherein “the controller is configured to control the operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate an orientation of the mobile robot relative to a reference frame when it is determined that the mobile robot is operating in manual mode” (at least as in paragraph 0113, wherein “the controller 302 operates the light sources to emit a certain colored light, such as a blue light, whenever the controller 302 intends to convey information pertaining to an operational status of the robot 100 . . . for example . . . manual drive behavior in which a user uses a remote computing device to manually control motion of the robot 100” and further wherein the light indicator system may indicate autonomous mode functions such as “tracking a progress of a cleaning mission . . . a particular cleaning mode of the robot 100 . . . execution of a robot behavior, for example, behavior to avoid a virtual barrier in response to detecting the virtual barrier, behavior to avoid a drop-off in response to detecting the drop-off, spot cleaning behavior in response to detecting a large amount of debris on a portion of a floor surface . . . and wall following behavior to clean a perimeter of an area”; at least as in paragraphs 0084 and 0154-0155, wherein performance of a cleaning mission is an autonomous navigation function or mode; at least as in paragraph 0138, wherein “the light indicator system 102 is operated to indicate a direction of movement of a position of the robot 100 . . . the direction of movement, i.e., the rearward drive direction 206 . . . a targeted direction 208 for reorienting the front portion 211 of the robot 100 . . . 
the targeted direction 208 as the robot 100 turns in place, for example, by activating and deactivating the light sources proximate the targeted direction 208 as the robot 100 turns in place. . . the targeted direction of forward movement of the robot 100 while the robot 100 rotates and reorients itself”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Drexler to include White’s teaching of a light indicator system indicating whether the autonomous mobile robot is functioning in a manual drive behavior or an autonomous mode and indicating its orientation, since White teaches wherein the system may create an improved user experience by providing visual indications with consistent meanings across multiple devices, by providing a consistent aesthetic experience for the user, and by providing intuitive and easily accessible feedback regarding the robot’s status.
Regarding claim 17, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The method of claim 11, further comprising:
determining whether the mobile robot is operating in an autonomous mode or a manual mode (at least as in paragraph 0017, “Vehicle 104 need not be entirely autonomous. That is, vehicle 104 can receive instructions from human operators, computing devices and the like, from time to time, and can act with varying degrees of autonomy in executing such instructions”; at least as in paragraph 0043, “the control data can include indications of any warning conditions that are active (e.g. a low battery warning), any error conditions that are active (e.g. an emergency stop error), and identifiers of any discrete operating modes currently active in vehicle 104”; at least as in paragraph 0048, wherein “At block 405, processor 250 is configured to determine whether the operational data of the state data received at block 310 indicates that a discrete operating mode (such as a docking mode) is active”); and
controlling the operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources (at least as in paragraph 0057, “When the determination at block 340 is negative—that is, when all segments have been processed—processor 250 is configured to proceed to block 350. At block 350, processor 250 is configured to control illumination system 224 according to the selected lighting pattern definition”).
However, Drexler does not explicitly disclose “based, at least in part, on whether the mobile robot is determined to be operating in the autonomous mode or the manual mode.”
White discloses an autonomous mobile robot including a light indicator system that generates a visual indication in the form of a pattern of illumination indicative of a status, service condition, etc. of the robot. White specifically teaches wherein “based, at least in part, on whether the mobile robot is determined to be operating in the autonomous mode or the manual mode” (at least as in paragraph 0113, wherein “the controller 302 operates the light sources to emit a certain colored light, such as a blue light, whenever the controller 302 intends to convey information pertaining to an operational status of the robot 100 . . . for example . . . manual drive behavior in which a user uses a remote computing device to manually control motion of the robot 100” and further wherein the light indicator system may indicate autonomous mode functions such as “tracking a progress of a cleaning mission . . . a particular cleaning mode of the robot 100 . . . execution of a robot behavior, for example, behavior to avoid a virtual barrier in response to detecting the virtual barrier, behavior to avoid a drop-off in response to detecting the drop-off, spot cleaning behavior in response to detecting a large amount of debris on a portion of a floor surface . . . and wall following behavior to clean a perimeter of an area”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Drexler to include White’s teaching of a light indicator system indicating whether the autonomous mobile robot is functioning in a manual drive behavior or an autonomous mode and indicating its orientation, since White teaches wherein the system may create an improved user experience by providing visual indications with consistent meanings across multiple devices, by providing a consistent aesthetic experience for the user, and by providing intuitive and easily accessible feedback regarding the robot’s status.
Regarding claim 18, in view of the above combination of Drexler, Pinter, and White, Drexler further discloses:
The method of claim 17, wherein determining whether the mobile robot is operating in an autonomous mode or a manual mode (at least as in paragraph 0017, “vehicle 104 can receive instructions from human operators, computing devices and the like, from time to time, and can act with varying degrees of autonomy in executing such instructions”; at least as in paragraph 0019, “Computing device 108 can transmit instructions to vehicle 104, such as instructions to carry out tasks within the facility, to travel to certain locations in the facility, and the like”).
However, Drexler does not explicitly disclose “comprises determining that the mobile robot is operating in the manual mode when a pendant accessory is communicatively coupled to the mobile robot.”
Pinter, in the same field of endeavor of an autonomous control system for a telepresence robot with lights for indicating the status of the robot, specifically teaches “comprises determining that the mobile robot is operating in the manual mode when a pendant accessory is communicatively coupled to the mobile robot” (at least as in paragraph 0079, “The lights may also indicate if a robot 100 is being tele-operated or is autonomously navigating”; at least as in paragraph 0029, “In a semi-autonomous mode, the control system 204 may receive instructions from a user and then operate autonomously to accomplish the instructions. For example, a user may provide an instruction to navigate to a specific patient room. The control system 204 may then navigate to the patient room autonomously, accounting for objects, individuals, routes, or other information to arrive at the room in a timely and safe manner. The control system 204 may receive input from the other components 202, 206, 208, 210, 212, 214, and 216 to navigate in a social and safe manner”; at least as in paragraph 0030, “In a manual mode, the control system 204 may perform instructions as provided by a user. For example, a user may remotely drive the robot 100 using a joystick or other input device or method and the control system 204 may cause the drive system 202 to move the robot 100 in the manner defined by the user. Of course, some aspects of the operation of the robot 100 may still be automated and may not require explicit instruction from a user. In any of the manual, semi-autonomous, or autonomous modes, a user may be able to remotely operate (or tele-operate) and/or view information provided by the robot 100”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Drexler to include Pinter’s teaching of the control system receiving and utilizing user input for remote operation, since Pinter teaches the control system improves navigation in a social and safe manner by allowing for variable control methods that can adapt to multiple environments.
Regarding claim 19, in view of the above combination of Drexler, Pinter, and White, Drexler further discloses:
The method of claim 17, wherein controlling the operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources based, at least in part, on whether the mobile robot is determined to be operating in the autonomous mode or the manual mode comprises:
controlling the operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate a movement intent of the mobile robot when it is determined that the mobile robot is operating in autonomous mode (at least as in paragraph 0017, wherein the vehicle functions under varying degrees of autonomy; see at least Fig. 5A, 6B, 6C, and 7A, wherein the lights are explicitly shown on the corners, the front two sets are set as headlights and the rear set as taillights to indicate the direction of travel, “a new direction of travel,” or “a planned direction of travel 704 (that is, a direction specified by path data), prior to actual movement of vehicle 104”); and
However, Drexler does not explicitly disclose “controlling the operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate an orientation of the mobile robot relative to a reference frame when it is determined that the mobile robot is operating in manual mode.”
White discloses an autonomous mobile robot including a light indicator system that generates a visual indication in the form of a pattern of illumination indicative of a status, service condition, etc. of the robot. White specifically teaches wherein “controlling the operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate an orientation of the mobile robot relative to a reference frame when it is determined that the mobile robot is operating in manual mode” (at least as in paragraph 0113, wherein “the controller 302 operates the light sources to emit a certain colored light, such as a blue light, whenever the controller 302 intends to convey information pertaining to an operational status of the robot 100 . . . for example . . . manual drive behavior in which a user uses a remote computing device to manually control motion of the robot 100” and further wherein the light indicator system may indicate autonomous mode functions such as “tracking a progress of a cleaning mission . . . a particular cleaning mode of the robot 100 . . . execution of a robot behavior, for example, behavior to avoid a virtual barrier in response to detecting the virtual barrier, behavior to avoid a drop-off in response to detecting the drop-off, spot cleaning behavior in response to detecting a large amount of debris on a portion of a floor surface . . . and wall following behavior to clean a perimeter of an area”; at least as in paragraph 0084 and 0154-0155, wherein performance of a cleaning mission is an autonomous navigation function or mode; at least as in paragraph 0138, wherein “the light indicator system 102 is operated to indicate a direction of movement of a position of the robot 100 . . . the direction of movement, i.e., the rearward drive direction 206 . . . a targeted direction 208 for reorienting the front portion 211 of the robot 100 . . . 
the targeted direction 208 as the robot 100 turns in place, for example, by activating and deactivating the light sources proximate the targeted direction 208 as the robot 100 turns in place. . . the targeted direction of forward movement of the robot 100 while the robot 100 rotates and reorients itself”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Drexler to include White’s teaching of a light indicator system indicating whether the autonomous mobile robot is functioning in a manual drive behavior or an autonomous mode and indicating its orientation, since White teaches wherein the system may create an improved user experience by providing visual indications with consistent meanings across multiple devices, by providing a consistent aesthetic experience for the user, and by providing intuitive and easily accessible feedback regarding the robot’s status.
Claim(s) 10, 20, and 21 is/are rejected under 35 U.S.C. 103 as being unpatentable over Drexler et al. (US 20170080850 A1, Drexler) in view of Pinter et al. (US 20150088310 A1, hereinafter Pinter), and further in view of Gagne et al. (US 20200356094 A1, hereinafter Gagne).
Regarding claim 10, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The mobile robot of claim 9, wherein the controller is further configured to control the operation of (although, at least as in paragraphs 0068-0069 and Fig. 6B-6C, Drexler discloses wherein the illumination system may indicate both the direction of travel and a warning sub-state at the same time, the corner lights only indicate the direction of travel while the side lights show the warning sub-state).
However, Drexler does not explicitly disclose “the first set of individually-controllable light sources and the second set of individually-controllable light sources and.”
Gagne discloses a mobile robotic device with a plurality of light emitting modules that is configured to control the operation of the one or more light emitting modules to provide visual feedback about one or more machine states of the robotic device using a light pattern. Gagne specifically teaches “the first set of individually-controllable light sources and the second set of individually-controllable light sources and” (at least as in paragraph 0057, wherein “the lighting modules of the robotic device 100 may provide information about the status of different machine states simultaneously . . . For example, the first row 217 of the light emitting modules may provide information relating to navigation of the robotic device 100, the second row 218 may provide about sensor function, and the third row 219 may provide information about errors and/or collision alert”; at least as in paragraph 0069, wherein “the system may provide visual feedback about all the identified machine states of the robotic device at the same time (for example, using different light emitting modules)”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Drexler to include Gagne’s teaching of light emitting modules indicating numerous machine states and status information at the same time, since Gagne teaches wherein the modules can easily provide feedback and other important information regarding the robot’s status, thus improving the user experience.
Regarding claim 20, in view of the above combination of Drexler and Pinter, Drexler further discloses:
The method of claim 11, further comprising:
controlling the operation of (although, at least as in paragraphs 0068-0069 and Fig. 6B-6C, Drexler discloses wherein the illumination system may indicate both the direction of travel and a warning sub-state at the same time, the corner lights only indicate the direction of travel while the side lights show the warning sub-state).
However, Drexler does not explicitly disclose “the first set of individually-controllable light sources and the second set of individually-controllable light sources.”
Gagne discloses a mobile robotic device with a plurality of light emitting modules that is configured to control the operation of the one or more light emitting modules to provide visual feedback about one or more machine states of the robotic device using a light pattern. Gagne specifically teaches “the first set of individually-controllable light sources and the second set of individually-controllable light sources” (at least as in paragraph 0057, wherein “the lighting modules of the robotic device 100 may provide information about the status of different machine states simultaneously . . . For example, the first row 217 of the light emitting modules may provide information relating to navigation of the robotic device 100, the second row 218 may provide about sensor function, and the third row 219 may provide information about errors and/or collision alert”; at least as in paragraph 0069, wherein “the system may provide visual feedback about all the identified machine states of the robotic device at the same time (for example, using different light emitting modules)”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Drexler to include Gagne’s teaching of light emitting modules indicating numerous machine states and status information at the same time, since Gagne teaches wherein the modules can easily provide feedback and other important information regarding the robot’s status, thus improving the user experience.
Regarding claim 21, in view of the above combination of Drexler, Pinter, and Gagne, Drexler further discloses:
The method of claim 20, wherein controlling the operation of the first set of individually-controllable light sources and the second set of individually-controllable light sources to indicate the status information associated with the mobile robot and the future travel direction of the mobile robot at a same time comprises controlling the operation of (although, at least as in paragraphs 0068-0069 and Fig. 6B-6C, Drexler discloses wherein the illumination system may indicate both the direction of travel and a warning sub-state at the same time, the corner lights only indicate the direction of travel while the side lights show the warning sub-state).
However, Drexler does not explicitly disclose “the first set of individually-controllable light sources for one of the first lighting module or the second lighting module such that the first lighting module or the second lighting module.”
Gagne discloses a mobile robotic device with a plurality of light emitting modules that is configured to control the operation of the one or more light emitting modules to provide visual feedback about one or more machine states of the robotic device using a light pattern. Gagne specifically teaches “the first set of individually-controllable light sources for one of the first lighting module or the second lighting module such that the first lighting module or the second lighting module” (at least as in paragraph 0057, wherein “the lighting modules of the robotic device 100 may provide information about the status of different machine states simultaneously . . . For example, the first row 217 of the light emitting modules may provide information relating to navigation of the robotic device 100, the second row 218 may provide about sensor function, and the third row 219 may provide information about errors and/or collision alert”; at least as in paragraph 0069, wherein “the system may provide visual feedback about all the identified machine states of the robotic device at the same time (for example, using different light emitting modules)”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Drexler to include Gagne’s teaching of light emitting modules indicating numerous machine states and status information at the same time, since Gagne teaches wherein the modules can easily provide feedback and other important information regarding the robot’s status, thus improving the user experience.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to RICARDO ICHIKAWA VISCARRA whose telephone number is (571)270-0154. The examiner can normally be reached M-F 9-12 & 2-4 PST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Adam Mott can be reached on (571) 270-5376. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/RICARDO I VISCARRA/Examiner, Art Unit 3657
/ADAM R MOTT/Supervisory Patent Examiner, Art Unit 3657