DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-18 are rejected under 35 U.S.C. 103 as being unpatentable over Sakai (US 2018/0074201) in view of Lewis (US 2013/0054133).
Regarding claims 1 and 10, Sakai ‘201 discloses an unmanned vehicle management system (Abstract; [0001]: work machine management system; [0041]: unmanned vehicle) including:
Abstract
A control system is provided with a position detection device which detects a position of a work machine, a contactless sensor which detects an object around the work machine in a contactless manner, a position calculation unit which calculates a position of the work machine on the basis of at least map data indicating a position of the object and detection data of the contactless sensor, and a diagnosis unit which compares the position of the work machine derived from detection data of the position detection device and the position of the work machine calculated by the position calculation unit and diagnoses that there is an abnormality in either the detection data of the position detection device or a calculation result of the position calculation unit.
[0041] In the first embodiment, the dump truck 2 is a so-called unmanned dump truck which autonomously travels the travel route on the basis of a command signal from a management device 10. The autonomous travel of the dump truck 2 means travel based on the command signal from the management device 10 without depending on operation of the worker. Meanwhile, the dump truck 2 may also travel by the operation of the worker.
a first unmanned vehicle including a target position sensor that detects a relative position with respect to objects ([0041], [0061], [0063]: unmanned vehicle 2 includes a contactless sensor, such as a radar, for detecting the relative position of the vehicle with respect to objects);
[0061] As illustrated in FIG. 4, the dump truck 2 is provided with a vehicle body 21, a vessel 22, a wheel 23, the contactless sensor 24, and a control system 30. An engine 2E such as a diesel engine, a generator 2G operated by the engine 2E, and a motor 23M operated with electric power generated by the generator are provided in the vehicle body 21. The wheel 23 includes a front wheel 23F and a rear wheel 23R. The rear wheel 23R is driven by the motor 23M. Meanwhile, the power of the engine 2E may be transmitted to the rear wheel 23R through a transmission including a torque converter. A steering device 2S for steering the front wheel 23F is provided on the vehicle body 21. The vessel 22 is loaded by the loading machine. In the discharging operation, the vessel 22 is lifted, and the load is discharged from the vessel 22.
[0062] As illustrated in FIG. 4, the contactless sensor 24 is arranged in a front lower portion of the vehicle body 21. The contactless sensor 24 detects an object around the dump truck 2 in a contactless manner. The object around the dump truck 2 includes the object (bank BK, side wall and the like) present beside the travel route RP. The contactless sensor 24 serves as an obstacle sensor which detects an obstacle in front of the dump truck 2 in a contactless manner.
[0063] The contactless sensor 24 may detect a relative position of the object with respect to the contactless sensor 24 (dump truck 2). The contactless sensor 24 includes a radar 24A and the laser sensor 24B. Resolution of the laser sensor 24B is higher than the resolution of the radar 24A.
a second unmanned vehicle including a dump body onto which a load is loaded by the loader (Fig. 1: shows multiple vehicles 2 on the work site, including a vehicle with a dump body receiving a load from the loader); and
[Sakai FIG. 1 reproduced here: media_image1.png, 798 × 707, grayscale]
a management device that manages travel of each of the first unmanned vehicle and the second unmanned vehicle ([0041]: unmanned vehicles travel autonomously based on a command signal from the management device).
Sakai does not disclose the use of a target position sensor to detect the relative position of the dump body vehicle with respect to the loader.
Abstract (Lewis)
A system and method for navigating a first heavy equipment to a target destination is provided. A location of the target destination is retrieved from a distributed objects database. The location of the target destination is at least partially determined by a position of a second heavy equipment. A position sensor identifies a current position and orientation of the first heavy equipment, and a path from the current position of the first heavy equipment to the location of the target destination is calculated. The calculated path is selected to avoid hazards. A progress of the first heavy equipment along the calculated path is monitored using the position sensor. When the first heavy equipment deviates from the calculated path, a message is outputted to an operator of at least one of the first heavy equipment and the second heavy equipment.
Lewis ‘133 discloses maneuvering of autonomous vehicles on a worksite (Abstract; [0011]) and discloses the use of a target position sensor, such as a radar, to detect a relative position of the vehicle with respect to another piece of equipment, such as a loader ([0045]-[0046]).
[0045] In FIG. 3 system 300 includes a position sensor 302. The position sensor 302 detects the position of the vehicle, for example, by triangulating the vehicle's position in relation to fixed satellites, such as is known in GPS related art. The position sensor 302 might also determine the position of the vehicle by other means such as by triangulating the vehicle's position in relation to terrestrial transmitters located in a mining environment. In certain embodiments, WiFi or WiMax network transceivers with fixed, known positions may be used to provide terrestrial points of reference. The position sensor 302 optionally can use a combination of methods or systems to determine position, for example, by determining a rough position using GPS and performing error correction by terrestrial references, such as broadcasting beacons mounted in and around the mining environment or other terrestrial reference points. In alternative embodiments, position sensor 302 also takes data from conventional RFID, RADAR, optical or other proximity or collision warning systems. These conventional systems can provide a warning signal to the vehicle operator and/or the operator of equipment in proximity to the vehicle if a piece of equipment such as a mine haul truck comes within some predefined range of another piece of equipment. Position sensor 302 also includes one or more systems for determining an orientation of the vehicle. In some cases, orientation may be determined by an electronically-readable compass or other systems that uses the earth's magnetic poles to determine orientation. In other cases, the vehicle's orientation may be sensed using one or more terrestrial beacons or devices mounted in and around the mining environment. In other cases, the vehicle's orientation can be determined algorithmically, for example by tracking a movement of the vehicle over time, sensor 302 can make an accurate determination of the vehicle's orientation.
[0046] In other implementations, position sensor 302 is assisted by a number of external devices that are mounted around various objects in the mining environment to assist in determining a location and an orientation of the vehicle. For example, a number of radar, LIDAR, laser, or other object-detection systems could be installed at the entrance to a crusher bay or other equipment disposed around the mining environment. As a vehicle approaches the bay, object-detection systems can scan the entrance to the bay and communicate the results of their scan to the vehicle. The vehicle uses the information received from the externally-mounted object-detection systems to supplement the information retrieved from position sensor 302 to generate a more accurate description of the vehicle's current position and orientation. These object-detection systems can be used in any location of the mining environment, but may be particularly useful at bay entrances or at any location where a vehicle must navigate particularly accurately. These externally-mounted systems can be mounted on any equipment, features, or objects within the mining environment (e.g., shovels, buildings, crushers, etc.). The externally-mounted systems allow for peer-to-peer aggregation of vehicle and object positional data within the mining environment allowing for more accurate information that can be acquired from sensors mounted on a single vehicle. In one example externally-mounted sensor system, a particular shovel may have a mounted scanning laser to accurately determine the position of a truck relative to the shovel. The data collected by the shovel using the laser system can then be communicated to the truck. That additional data can then be used by the truck to refine its own positional data with respect to the shovel. The combination of positional data collected by the truck's sensors, as well as the shovel's sensors can then be used in navigating the truck into position beside the shovel, for example.
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to provide the device of Sakai ‘201 with a sensor on the dump body vehicle to detect the position of the loader, as taught by Lewis ‘133, in order to safely and autonomously load the dump body on a construction site.
Claim 1. An unmanned vehicle management system comprising: a first unmanned vehicle including a target position sensor that detects a relative position with respect to a loader; a second unmanned vehicle including a dump body onto which a load is loaded by the loader; and a management device that manages travel of each of the first unmanned vehicle and the second unmanned vehicle. (See above rejection.)
Claim 2. The unmanned vehicle management system according to claim 1, wherein the loader includes a self-position sensor that detects a position of the loader, and the management device includes a sensor data reception unit that receives detection data of the self-position sensor, a travel path generation unit that generates a travel path to a monitoring point defined around the loader when the sensor data reception unit cannot receive the detection data of the self-position sensor, and a travel path transmission unit that transmits, to the first unmanned vehicle, the travel path to the monitoring point. (See Sakai paragraphs 58, 82, and 84 with Lewis paragraphs 11 and 69-70, which teach the use of scan-match navigation when a GPS signal is poor (Sakai) to determine a target path to the loader (Lewis).)
Claim 3. The unmanned vehicle management system according to claim 1, wherein the loader includes a self-position sensor that detects a position of the loader, and the management device includes a sensor data reception unit that receives detection data of the self-position sensor, and an entry-prohibited area setting unit that sets, for the loader, an entry-prohibited area in which an unmanned vehicle is prohibited from entering, and the entry-prohibited area setting unit expands the entry-prohibited area when the sensor data reception unit cannot receive the detection data of the self-position sensor. (See Lewis paragraph 62, defining hazard zones and out-of-bounds areas.)
[0062] System 300 may also include proximity detector 314 that checks the vehicle's position against the location of objects defined in distributed objects database 304. The vehicle's position is typically checked against objects such as, for example, defined hazards, other vehicles, areas that have been defined as out-of-bounds or not on a defined route, or areas that are on a defined route but that only permit a particular direction of travel. In some cases, proximity detector 314 compares the vehicle's current position to a number of hazard zones that are defined around a particular object. Depending upon which (if any) hazard zones the vehicle currently occupies, proximity detector 314 can cause different levels of alarm to be sounded for the operator of the vehicle.
Claim 4. The unmanned vehicle management system according to claim 3, wherein the management device includes a position acquisition unit that acquires, from the first unmanned vehicle, a position of the loader calculated based on detection data of the target position sensor, and the entry-prohibited area setting unit stops expansion of the entry-prohibited area when the position acquisition unit acquires the position of the loader. (It is old and well known to limit access to danger areas on construction sites; this rejection follows from claim 3.)
Claim 5. The unmanned vehicle management system according to claim 4, wherein when the position acquisition unit acquires the position of the loader, the entry-prohibited area setting unit returns the entry-prohibited area to an entry-prohibited area that is set when the sensor data reception unit can receive the detection data of the self-position sensor. (It is old and well known to limit access to danger areas on construction sites.)
Claim 6. The unmanned vehicle management system according to claim 1, wherein the management device includes a position acquisition unit that acquires, from the first unmanned vehicle, a position of the loader calculated based on detection data of the target position sensor, a travel path generation unit that generates a travel path to a loading point based on the position of the loader acquired by the position acquisition unit, and a travel path transmission unit that transmits, to the second unmanned vehicle, the travel path to the loading point. (See Lewis paragraphs 46-49 and 53-57, using position information obtained from external and peer-to-peer vehicle sensors for path planning.)
[0047] When interacting with externally mounted object-detection systems, the external systems may only be able to observe a small portion of the vehicle. For example, when using LIDAR, or radar for example, the systems may only be able to communicate information regarding distance from the detection system to the side of the vehicle that is being presented to the object-detection system--the other sides of the vehicle will be obscured. In that case, though, the present system can use the information received from the object-detection system (including the location of the object-detection system itself) to supplement data retrieved from position sensor 302.
[0048] System 300 includes a number of databases storing information useful in providing the functionality of the present disclosure. Distributed objects database 304 stores a listing of objects that are present within the mining environment. Distributed objects database 304 can store listings of candidate target destinations (where each object in the database may be a target), the position of vehicles and hazards or boundaries within the mining environment. Additional objects stored in distributed objects database 304 can include roadways, parking areas, repair facilities, buildings or structures, dumping areas, or power lines.
[0049] For each object, distributed objects database 304 can store, in addition to the location information for each object, additional descriptive information that identifies characteristics of the object. For example, in the case of vehicles, the database can store information describing the type of vehicle, its size and capacity, its current status (e.g., loaded or unloaded, in use or not in use, etc.), weight, and velocity. For each vehicle, the database may also store information describing the operator of the vehicle (e.g., the operator's experience level, current assignment, shift status, etc.). In the case of hazards, the database can store information describing the severity of the hazards and may define a number of hazard zones around each hazard. In fact, for each object, the database may define a number of hazard zones around the object, with each zone (e.g., a circular area defined around the hazard) representing a different degree of danger. The database can also store information describing roadways and boundaries of the mining environment. In the case of roadways, the database can store information describing a weight limit for vehicles traversing the roadway. Additional information such as slope, consistency, and speed limit can be stored.
[0053] The remote application 300 includes a number of modules that act on data received from one or more of position sensor 302, distributed objects database 304, vehicle condition monitor 306, and configuration database 310.
[0054] The remote application 300 includes navigation aid 322 that is configured to one or more of position sensor 302, distributed objects database 304, vehicle condition monitor 306, and configuration database 310 to assist an operator of the vehicle to navigate to a particular target destination. To begin a navigation maneuver, navigation aid 322 is configured to access position sensor 302 and distributed objects database 304 to identify a listing of potential target destinations. The list of potential targets can be filtered by navigation aid 322 on a number of variables. For example, the listing can be ordered based upon proximity to the vehicle, with targets that are over a threshold distance away being filtered out. Also, based upon various attributes of the vehicle (the attributes can be retrieved from vehicle condition monitor 306 and/or distributed objects database 304) the targets can be filtered. If, for example the vehicle is a shovel, then targets that are only useful to haul trucks can be filtered out. Conversely, if the vehicle is a haul truck, only targets useful to haul trucks are used. Additionally, if the haul truck is fully loaded, for example, only targets that are useful to fully loaded haul trucks are included in the listing of potential targets.
[0055] After identifying the listing of potential targets, navigation aid 322 can display the listing via screen 320. A user interface (e.g., a touch screen, keyboard, voice input, or other user input system) allows an operator of the vehicle to select one of the targets. In other implementations, an automated system selects the target automatically and the selected target is displayed via the user interface.
[0056] After a particular target destination is selected, navigation aid 322 uses position sensor 302, distributed objects database 304, vehicle condition monitor 306 and configuration database 310 to identify a best path for the vehicle to follow in order to maneuver into position at the target.
[0057] After identifying a best path, navigation aid 322 verifies that the vehicle can begin moving using vehicle condition monitor 306 and configuration database 310. If so, navigation aid 322 constantly monitors the current position of the vehicle with respect to the selected path using position sensor 302. Using the vehicle's current position and orientation, navigation aid 322 uses screen 320 to provide feedback to the vehicle operator to assist the operator in maneuvering the vehicle along the selected path. As the vehicle begins to deviate from the selected path, for example, navigation aid 322 may use screen 320 to provide feedback to the operator instructing the operator to turn the vehicle to return to the selected path. Alternatively, feedback could be provided via other user interfaces 324. For example, navigation instruction could be provided by the vehicle's rear view mirrors. A number of light sources (e.g., LEDs) may be disposed around the housing of the rear view mirror. By illuminating various combinations or colors of the light sources, the vehicle operator can be instructed to maintain the current course, steer to the left by a small degree, steer to the right by a small degree, steer to the left by a large degree, or steer to the right by a large degree. The light sources may also indicate when no alignment with a defined route or path is possible. In other implementations, user interface 324 could include a heads-up display or virtual reality output for displaying a particular path, route, or other information for the vehicle operator. Additionally, voice instruction could assist an operator in navigating a particular path.
Claim 7. The unmanned vehicle management system according to claim 3, wherein the management device includes a position acquisition unit that acquires, from the first unmanned vehicle, a position of the loader calculated based on detection data of the target position sensor, a travel path generation unit that newly generates a travel path to a loading point based on the position of the loader acquired by the position acquisition unit when the sensor data reception unit cannot receive the detection data of the self-position sensor, and a travel path transmission unit that transmits, to the second unmanned vehicle, the travel path to the loading point. (See Lewis paragraphs 46-49 and 53-57 above, using position information obtained from external and peer-to-peer vehicle sensors for path planning.)
Claim 8. The unmanned vehicle management system according to claim 1, wherein the target position sensor includes a laser sensor, and the first unmanned vehicle includes a target position calculation unit that calculates a position of the loader based on detection data of the target position sensor. (See Lewis paragraphs 46-47.)
Claim 9. The unmanned vehicle management system according to claim 1, wherein the loader is an excavator including a turning body and working equipment supported by the turning body. (See Sakai paragraph 37 and Fig. 1 above.)
[0037] The mining machine 4 is a collective term of machinery used in various operations in a mine. The mining machine 4 includes at least one of a boring machine, an excavating machine, a loading machine, a transporting machine, a crusher, and a vehicle driven by a worker. The excavating machine is the mining machine for excavating the mine. The loading machine is the mining machine for loading the transporting machine. The loading machine includes at least one of a hydraulic excavator, an electric excavator, and a wheel loader. The transporting machine is the mining machine including a moving body such as a dump truck movable in the mine for transporting the load. The load includes at least one of earth and sand and minerals generated by excavating. The crusher crushes discharged earth input from the transporting machine.
Claim 10. An unmanned vehicle management method comprising: detecting a relative position with respect to a loader by a target position sensor included in a first unmanned vehicle; loading a load onto a dump body included in a second unmanned vehicle by the loader; and managing travel of each of the first unmanned vehicle and the second unmanned vehicle. (Follows from claim 1.)
Claim 11. The unmanned vehicle management method according to claim 10, comprising: receiving detection data of a self-position sensor that detects a position of the loader; generating a travel path to a monitoring point defined around the loader when the detection data of the self-position sensor cannot be received; and transmitting, to the first unmanned vehicle, the travel path to the monitoring point. (Follows from claim 2.)
Claim 12. The unmanned vehicle management method according to claim 10, comprising: receiving detection data of a self-position sensor that detects a position of the loader; setting, for the loader, an entry-prohibited area in which an unmanned vehicle is prohibited from entering; and expanding the entry-prohibited area when the detection data of the self-position sensor cannot be received. (Follows from claim 3.)
Claim 13. The unmanned vehicle management method according to claim 12, comprising: acquiring, from the first unmanned vehicle, a position of the loader calculated based on detection data of the target position sensor; and stopping expansion of the entry-prohibited area when the position of the loader is acquired. (Follows from claim 4.)
Claim 14. The unmanned vehicle management method according to claim 13, wherein when the position of the loader is acquired, the entry-prohibited area is returned to an entry-prohibited area that is set when the detection data of the self-position sensor can be received. (Follows from claim 5.)
Claim 15. The unmanned vehicle management method according to claim 10, comprising: acquiring, from the first unmanned vehicle, a position of the loader calculated based on detection data of the target position sensor; generating a travel path to a loading point based on the position of the loader; and transmitting, to the second unmanned vehicle, the travel path to the loading point. (Follows from claim 6.)
Claim 16. The unmanned vehicle management method according to claim 12, comprising: acquiring, from the first unmanned vehicle, a position of the loader calculated based on detection data of the target position sensor; newly generating a travel path to a loading point based on the position of the loader when the detection data of the self-position sensor cannot be received; and transmitting, to the second unmanned vehicle, the travel path to the loading point. (Follows from claim 7.)
Claim 17. The unmanned vehicle management method according to claim 10, wherein the target position sensor includes a laser sensor, and a position of the loader is calculated based on detection data of the target position sensor. (Follows from claim 8.)
Claim 18. The unmanned vehicle management method according to claim 10, wherein the loader is an excavator including a turning body and working equipment supported by the turning body. (Follows from claim 9.)
Any inquiry concerning this communication or earlier communications from the examiner should be directed to RICHARD M CAMBY whose telephone number is (571)272-6958. The examiner can normally be reached M - F flex.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Peter D Nolan, can be reached at (571) 270-7016. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/RICHARD M CAMBY/Primary Examiner, Art Unit 3661