DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-24 are rejected under 35 U.S.C. 102(a)(1) and 102(a)(2) as being anticipated by Russell '194, U.S. Patent No. 10,108,194 B1.
Regarding claims 1 & 13, Russell '194 discloses an autonomous mobile robot (AMR) (col.4 lines 15-25 (21) “An autonomous or semi-autonomous vehicle… the vehicle may be a forklift or a pallet jack working within a warehouse environment”) & a pose validation method, comprising:
a chassis and a manipulatable payload engagement portion (col.11 lines 10-30 (55) “the vehicle system 100 may include a chassis and/or an operator cabin, which may connect to or house components of the vehicle system 100.” & col.15 lines 30-40 (75) “The robotic truck unloader 300 may include a robotic arm 302 with a gripping component 304 for gripping objects within the environment.”);
sensors configured to acquire real-time sensor data (col.4 lines 25-50 (22) “The vehicle may be equipped with a sensor such as a camera, stereo camera, depth sensor, LIDAR, Radar, and/or infrared sensor, among other possibilities.” & col.9 lines 40-60 (47) “The sensor(s) 112 may provide sensor data to the processor(s) 102 (perhaps by way of data 107) to allow for interaction of the vehicle system 100 with its environment, as well as monitoring of the operation of the vehicle system 100. … The sensor(s) 112 may monitor the environment in real time”);
a pose validation system comprising computer program code executable by at least one processor to evaluate at least some of the sensor data to validate a pose of the AMR, including to: determine a pose of an object located on an infrastructure (the referenced paragraphs disclose evaluation of sensor data by a processor executing program code and related data. See col.29-30 lines 50-10 (140) “a portion of program code (including related data). The program code may include one or more instructions executable by a processor” & col.29 lines 15-25 (97) “verifying and/or validating that each respective point of the plurality of points is observable from at least a respective portion of the nominal path.” & col.4 lines 35-55 (23) “the portion of the environment observable by the sensor may depend on and/or be determined based on the position and orientation of the vehicle within the environment.” & col.4-5 lines 55-5 (24) “the vehicle may scan one or more portions of the environment or objects located therein. … the path planning operations may consider constraints imposed on the planned path by any sensor blind spots. … to verify that the target location is free of obstacles and other hazards.”);
process at least some of the sensor data to generate at least one exclusion region and/or volume and exclude sensor data from the at least one exclusion region and/or volume to determine whether the AMR will collide with the infrastructure if a pose of the AMR matches the goal pose (the sensor data processing used to determine collision status is disclosed at the referenced paragraphs. See col.4-5 lines 55-5 (24) “the path planning operations may consider constraints imposed on the planned path by any sensor blind spots. … the vehicle may use the sensors to scan the target location to verify that the target location is free of obstacles and other hazards. … the vehicle may scan the extent or volume of the drop-off location to verify that the drop-off location is free of other objects that the vehicle or object might collide with and/or that no humans are working or occupying the target location. Thus, given any sensor blind spots, the path planning operations may determine a path that allows the sensors on the vehicle to scan the entire extent or volume of the target location.” & col.9-10 lines 40-10 (47) “The sensor(s) 112 may provide sensor data to the processor(s) 102 to allow for interaction of the vehicle system 100 with its environment, as well as monitoring of the operation of the vehicle system 100. The sensor data may be used in evaluation of various factors … the sensor(s) 112 may capture data corresponding to the terrain of the environment, location and/or identity of nearby objects (e.g., pallets, environmental landmarks), which may assist with environment recognition and navigation.” & see also col.24 lines 15-30 (115) & col.25 lines 40-55 (121) “The path validation and verification operations … the respective point will be observable to the one or more sensors connected to the vehicle.”).
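For illustration of the exclusion-and-check mapping above, the following Python sketch (not taken from Russell '194 or the application; all names, shapes, and dimensions are hypothetical) shows one way sensor points inside an exclusion volume could be discarded before testing whether the AMR footprint at the goal pose intersects any remaining point:

```python
# Hypothetical sketch only: drop points inside an exclusion volume, then test
# the survivors against the AMR footprint placed at the goal pose.
from dataclasses import dataclass
import numpy as np

@dataclass
class Box:
    lo: np.ndarray  # (3,) minimum corner, metres
    hi: np.ndarray  # (3,) maximum corner, metres

    def contains(self, pts: np.ndarray) -> np.ndarray:
        """Boolean mask: which (N, 3) points fall inside this axis-aligned box."""
        return np.all((pts >= self.lo) & (pts <= self.hi), axis=1)

def validate_goal_pose(points: np.ndarray, exclusion: Box, amr_at_goal: Box) -> bool:
    """True if no non-excluded sensor point lies inside the AMR at the goal pose."""
    kept = points[~exclusion.contains(points)]   # exclude the exclusion region/volume
    return not amr_at_goal.contains(kept).any()  # any survivor inside => collision
```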
Regarding claims 2 & 14, Russell '194 discloses the AMR of claim 1 & the method of claim 13, wherein the AMR is configured to adjust a pose of the AMR if a potential collision with the infrastructure is determined (see col.4-5 lines 55-5 (24) “the path planning operations may consider constraints imposed on the planned path by any sensor blind spots. … the vehicle may use the sensors to scan the target location to verify that the target location is free of obstacles and other hazards.” & col.6 lines 10-30 (30) “the control system may determine to omit scanning the target location or specific points therein. Omitting to scan the target location might not pose a significant risk of collision … The time threshold may be dynamically determined”).
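As a purely hypothetical illustration of the claim 2/14 behavior (adjusting the pose when a potential collision is determined), a validity check such as the one sketched above could drive a simple search over perturbed goal poses; the search pattern and step size are invented:

```python
# Hypothetical sketch only: nudge the goal pose (x, y, heading) along x until a
# supplied validity check (e.g., a collision test) passes.
from typing import Callable, Optional, Tuple

Pose = Tuple[float, float, float]

def adjust_pose(goal: Pose, is_valid: Callable[[Pose], bool],
                max_tries: int = 20, step: float = 0.05) -> Optional[Pose]:
    for i in range(max_tries):
        candidate = (goal[0] + i * step, goal[1], goal[2])
        if is_valid(candidate):
            return candidate
    return None  # no collision-free pose found within the search budget
```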
Regarding claims 3 & 15, Russell '194 discloses the AMR of claim 1 & the method of claim 13, wherein the pose validation system is configured to process at least some of the sensor data from at least one sensor to generate at least one two-dimensional (2D) polygon and/or at least one three-dimensional (3D) volume between the AMR and the infrastructure to determine whether the AMR taking the goal pose will result in a collision with infrastructure (see col.15 lines 15-50 (74) “The sensors may move along a sensor trajectory to scan an environment containing one or more objects in order to capture visual data and/or three-dimensional (3D) depth information. Data from the scans may then be integrated into a representation of larger areas in order to provide digital environment reconstruction. In additional examples, the reconstructed environment may then be used for identifying objects to pick up, determining pick positions for objects, and/or planning collision-free trajectories for the one or more robotic arms and/or a mobile base.” & (76) “sensor 306 and sensor 308, which may be two-dimensional (2D) sensors and/or 3D depth sensors that sense information about the environment as the robotic arm 302 moves. Robotic arm 302 may be controlled to move sensor 306 to control a portion of the environment observable by the sensor 306.” & col.19 lines 40-50 (94) “The respective field of visibility to the respective point may define a two-dimensional (2D) or three-dimensional (3D) extent of the environment from within which the respective point may be observable. … a sensor placed within the respective field of visibility of the respective point may scan the point as well as portions of the environment surrounding the point.”).
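To make the 2D-polygon limitation concrete, a standard ray-casting point-in-polygon test could be used; this sketch is illustrative only, and the polygon vertices would in practice be derived from the AMR and infrastructure geometry, which the references do not specify:

```python
# Hypothetical sketch only: standard ray-casting test for membership of a point
# (x, y) in a simple polygon given as a list of (x, y) vertices.
def point_in_polygon(x: float, y: float, poly: list) -> bool:
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray through y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:  # crossing lies to the right of the point
                inside = not inside
    return inside
```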
Regarding claims 4 & 16, Russell '194 discloses the AMR of claim 1 & the method of claim 13, wherein the pose validation system is configured to process sensor data from at least one first sensor to generate a two-dimensional (2D) polygon around the goal pose and to exclude points from outside the 2D polygon to determine whether the AMR taking the goal pose will result in a collision with infrastructure (col.18 lines 15-45 (88) “The portion of the field of view of sensor 402 obstructed by object 408 may be referred to as a sensor blind spot.” & (89) “a sensor blind spot of a sensor connected to a vehicle may be an area or volume, defined relative to the vehicle, that is obstructed or excluded from the view of the sensor by portions of the vehicle, objects carried by the vehicle, and/or a mechanism connecting the sensor to the vehicle and defining the range of motion of the sensor relative to the vehicle. Sensor blind spot may be considered when planning a path for the vehicle to follow and/or when planning a sensor trajectory along which to move the sensor to scan points of interest within the environment as the vehicle moves along the planned path.” & col.21 lines 45-60 (104) “the plurality of points 614-620 may be points in a 2D plane defining the floor of the drop-off location 604. The control system may determine or assume that drop-off locations are sufficiently tall to fit any objects placed therein without risk of collision (e.g., pallets always fit on pallet racks). Thus, the sensors of the vehicle might scan only the floor of the drop-off location 604 to verify that the drop-off location is free of obstacles.”).
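Building on the point_in_polygon sketch above, the claim 4/16 step of excluding points from outside a 2D polygon around the goal pose could, hypothetically, reduce to a filter over projected sensor points; the rectangle half-dimensions below are invented for illustration:

```python
# Hypothetical sketch only: a rectangle around the goal pose, then a filter that
# keeps only projected (x, y) sensor points inside it for the collision test.
def goal_region(x: float, y: float, half_w: float = 0.6, half_l: float = 1.2) -> list:
    return [(x - half_w, y - half_l), (x + half_w, y - half_l),
            (x + half_w, y + half_l), (x - half_w, y + half_l)]

def points_near_goal(points_xy: list, polygon: list) -> list:
    # Reuses point_in_polygon() from the sketch under claims 3 & 15.
    return [p for p in points_xy if point_in_polygon(p[0], p[1], polygon)]
```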
Regarding claims 5 & 17, Russell '194 discloses the AMR of claim 1 & the method of claim 13, wherein the pose validation system is configured to process sensor data from at least one second sensor to generate a three-dimensional (3D) volume between the chassis and the payload engagement portion and to exclude points from the 3D volume to determine whether the AMR taking the goal pose will result in a collision with infrastructure (see col.15 lines 15-50 (74) “The sensors may move along a sensor trajectory to scan an environment containing one or more objects in order to capture visual data and/or three-dimensional (3D) depth information … determining pick positions for objects, and/or planning collision-free trajectories for the one or more robotic arms and/or a mobile base.” & (76) “sensor 306 and sensor 308, which may be two-dimensional (2D) sensors and/or 3D depth sensors that sense information about the environment as the robotic arm 302 moves.” & col.19 lines 40-50 (94) “The respective field of visibility to the respective point may define a two-dimensional (2D) or three-dimensional (3D) extent of the environment from within which the respective point may be observable.” & col.18 lines 15-45 (89) “a sensor blind spot of a sensor connected to a vehicle may be an area or volume, defined relative to the vehicle, that is obstructed or excluded from the view of the sensor” & col.21 lines 45-60 (104) “the plurality of points 614-620 may be points in 3D space defining a boundary of the volume to be occupied by the object 408 at the drop-off location 604. Additional points not illustrated in FIG. 6A may be determined in order to define a 3D space. When determining the plurality of points, a margin of error may be included to define a volume that is larger than the actual volume and/or dimensions of the object 408 to account for any error in sensing and/or vehicle control.”).
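The margin-of-error language of Russell '194 (104) can be illustrated, purely hypothetically, as a margin-inflated axis-aligned exclusion volume between the chassis and the payload engagement portion whose interior points are removed before collision checking; the 0.05 m margin and box dimensions are invented:

```python
# Hypothetical sketch only: remove points inside a margin-inflated load volume
# (e.g., the space between chassis and forks occupied by a carried pallet).
import numpy as np

def exclude_load_volume(points: np.ndarray, lo, hi, margin: float = 0.05) -> np.ndarray:
    lo = np.asarray(lo, dtype=float) - margin  # inflate the box by the margin
    hi = np.asarray(hi, dtype=float) + margin
    inside = np.all((points >= lo) & (points <= hi), axis=1)
    return points[~inside]  # keep only points outside the load volume
```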
Regarding claims 6 & 18, Russell '194 discloses the AMR of claim 5 & the method of claim 17, wherein the at least one first sensor includes a sensor different from the at least one second sensor (col.9 lines 25-35 (46) “The sensor(s) 112 may include … position sensors, proximity sensors, motion sensors, location sensors, … depth sensors, ultrasonic range sensors, infrared sensors, object sensors, and/or cameras” & (47) “sensor(s) 112 may include RADAR … LIDAR … SONAR … VICON® … one or more cameras (e.g., stereoscopic cameras for 3D vision)” & col.15 lines 45-65 (76) “a sensing system … such as sensor 306 and sensor 308, which may be two-dimensional (2D) sensors and/or 3D depth sensors that sense information about the environment as the robotic arm 302 moves.” & (77) “scans from one or more 2D or 3D sensors with fixed mounts on a mobile base, such as a front navigation sensor 316 and a rear navigation sensor 318, and one or more sensors mounted on a robotic arm, such as sensor 306 and sensor 308” & col.17 lines 1-20 (83) “position sensors, touch sensors, depth sensors, … radar, cameras, depth sensors, lasers, a light detection and ranging (LIDAR) device, a time-of-flight camera, a stereo camera, motion sensors”).
Regarding claims 7 & 19, Russell '194 discloses the AMR of claim 1 & the method of claim 17, wherein the payload engagement portion is a pair of forks and the chassis includes outriggers and the 3D volume is located between the forks and the outriggers (col.11 lines 10-30 (55) “the vehicle system 100 may include a chassis and/or an operator cabin, which may connect to or house components of the vehicle system 100. The structure of the chassis and/or cabin may vary within examples and may further depend on operations that a given vehicle may have been designed to perform. For example, a vehicle developed to carry large, heavy loads may have a wide, rigid chassis that enables placement of the load. Similarly, a vehicle designed to carry light loads at high speeds may have a narrow, small chassis that does not have substantial weight. Further, the chassis, cabin, and/or the other components may be developed using various types of materials, such as metals or plastics. Within other examples, a vehicle may have a chassis with a different structure or made of various types of materials.” & (56) “The chassis, cabin, and/or the other components may include or carry the sensor(s) 112. These sensors may be positioned in various locations on the vehicle system 100, such as on top of the chassis to provide a high vantage point for the sensor(s) 112.”).
Regarding claims 8 & 20, Russell '194 discloses the AMR of claim 7 & the method of claim 19, wherein one or more of the forks includes at least one LiDAR scanner (col.17 lines 1-20 (83) “light sensors … lasers, a light detection and ranging (LIDAR) device, a structured-light scanner”).
Regarding claims 9 & 21, Russell '194 discloses the AMR of claim 1 & the method of claim 13, wherein at least some of the sensor data includes point cloud data (see col.12 lines 55-65 (63) “To coordinate actions of separate components, a global control system 250, such as a remote, cloud-based server system, may communicate (e.g., through wireless communication) with some or all of the system components and/or with separate local control systems of individual components.” & col.14 lines 1-20 (68) “a cloud-based server system may incorporate data and information from individual vehicles/robots within the fleet and/or from external sources.” & (69) “as an automatic pallet jack passes and identifies an object, the pallet jack may send information up to a remote, cloud-based server system. Such information may be used to fix errors in central planning, help to localize robotic devices, or to identify lost objects.”).
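For context on the claim 9/21 “point cloud data” limitation, point clouds are commonly obtained by back-projecting a depth image through pinhole camera intrinsics; the sketch below is illustrative only and not drawn from the references (fx, fy, cx, cy are hypothetical camera parameters):

```python
# Hypothetical sketch only: back-project an HxW depth image (metres) into an
# (N, 3) point cloud using pinhole camera intrinsics.
import numpy as np

def depth_to_cloud(depth: np.ndarray, fx: float, fy: float,
                   cx: float, cy: float) -> np.ndarray:
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel column/row indices
    z = depth.ravel()
    x = (u.ravel() - cx) * z / fx
    y = (v.ravel() - cy) * z / fy
    return np.column_stack([x, y, z])
```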
Regarding claims 10 & 22, Russell '194 discloses the AMR of claim 1 & the method of claim 13, wherein the sensors include at least one 3D camera (col.9 lines 40-60 (47) “sensor(s) 112 may include … one or more cameras (e.g., stereoscopic cameras for 3D vision)”).
Regarding claims 11 & 23, Russell '194 discloses the AMR of claim 1 & the method of claim 13, wherein the sensors include at least one LiDAR scanner (col.17 lines 1-20 (83) “light sensors … lasers, a light detection and ranging (LIDAR) device, a structured-light scanner”).
Regarding claims 12 & 24, Russell '194 discloses the AMR of claim 1 & the method of claim 13, wherein the infrastructure includes a table and/or a shelf (col.19 lines 1-15 (91) “a target location may be determined for an object carried by a vehicle. ... The object may be, for example, a pallet, box, barrel, or storage container. Other types of objects not limited to units of storage are possible. The target location may be, for example, a drop-off location for the object. The drop-off location may be, for example, a storage shelf, a pallet rack, or an area within an environment designated for receiving and/or storing the object.”).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See Notice of References Cited.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jalal C CODUROGLU, whose telephone number is (408) 918-7527. The examiner can normally be reached Monday-Friday, 8-6 PT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hunter Lonsberry can be reached at 571-272-7298. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Jalal C CODUROGLU/Examiner, Art Unit 3665