Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
This is the first Office action on the merits. Claims 1-20 are currently pending and addressed below.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 08/16/2024 has been received. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
The information disclosure statement (IDS) submitted on 12/16/2024 has been received. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
The information disclosure statement (IDS) submitted on 09/26/2025 has been received. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Double Patenting
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Although the conflicting claims are not identical, they are not patentably distinct from each other because removing inherent and/or unnecessary limitation(s)/step(s), or adding an element and its function, would be within the level of one of ordinary skill in the art. It is well settled that the adding or deleting of an element and its function(s) in the claim of the present application is an obvious expedient if the remaining elements perform the same function as before. In re Karlson, 136 USPQ 184 (CCPA 1963). Also note Ex parte Rainu, 168 USPQ 375 (Bd. App. 1969). Omission of a referenced element or step whose function is not needed would be obvious to one of ordinary skill in the art. The examiner further notes that although the claims are not identical (the instant claims are slightly broader), they are commensurate in scope with the claim limitations provided in the issued U.S. Patent, which likewise would anticipate the currently presented claim limitations.
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. See In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the conflicting application or patent is shown to be commonly owned with this application. See 37 CFR 1.131(c). A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms that may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application will determine which form should be used. A web-based eTerminal Disclaimer may be filled out completely online using web screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to [1].
Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over the claims of United States Patent No. 12090661. Although the conflicting claims are not identical, they are not patentably distinct from each other because:
Table 1: Comparison of claims in instant application No. 18/807,758 vs. U.S. Patent No. 12090661
Instant Application No. 18/807,758
(Difference Emphasis Added)
U.S. Patent No. 12090661 B2
(Difference Emphasis Added)
1. A robotic system, comprising:
a communication interface; and
a processor coupled to the communication interface and configured to:
receive via the communication interface sensor data from one or more sensors deployed in a physical space in which a robot is located;
determine based at least in part on the sensor data an at least partly velocity-based trajectory along which to move an element comprising the robot including by determining a vector to intercept a moving item at an intercept location that is based on an initial location of the moving item and a velocity associated with the moving item; and
send to the robot, via the communication interface, a command to implement the at least partly velocity-based trajectory.
1. (Currently Amended) A robotic system, comprising:
a communication interface; and
a processor coupled to the communication interface and configured to:
receive via the communication interface sensor data from one or more sensors deployed in a physical space in which a robot is located;
determine based at least in part on the sensor data an at least partly velocity-based trajectory along which to move an element comprising the robot including by determining a vector to intercept a moving item located at a first location and a first distance away from the element or location, wherein the moving item is grasped by the element at a second location that is based on the a velocity associated with the moving item and the first distance, wherein the determined intercept vector matches [[a]] the velocity associated with the moving item or location;
send to the robot, via the communication interface, a command to implement the at least partly velocity-based trajectory;
compare an observed velocity of the element to a corresponding simulated velocity of the element according to a simulated operation of the robot;
determine a difference between the observed velocity of the element and the corresponding simulated velocity of the element; and
send to the robot, via the communication interface, a second command that causes the robot to reduce the difference between the observed velocity of the element and the corresponding simulated velocity of the element.
2. The system of claim 1, wherein the sensor data includes image sensor data.
2. (Previously Presented) The system of claim 1, wherein the sensor data includes image sensor data.
3. The system of claim 1, wherein the communication interface comprises a wireless interface.
3. (Original) The system of claim 1, wherein the communication interface comprises a wireless interface.
4. The system of claim 1, wherein the processor is further configured to simulate operation of the robot in a location.
4. (Currently Amended) The system of claim 1, wherein the processor is further configured to simulate operation of the robot in [[the]] a location.
5. The system of claim 4, wherein the processor is configured to determine the at least partly velocity-based trajectory at least in part by comparing an observed velocity of the element comprising the robot to a corresponding simulated velocity of the element according to the simulated operation of the robot.
See claim 1 above.
6. The system of claim 1, wherein the element comprising the robot comprises an end effector.
6. (Original) The system of claim 1, wherein the element comprising the robot comprises an end effector.
7. The system of claim 1, wherein the processor is further configured to determine based at least in part on an attribute of an object currently within a grasp of the element comprising the robot a set of limits including one or more of a velocity limit and an acceleration limit, and to enforce the set of limits in determining the at least partly velocity-based trajectory.
7. (Original) The system of claim 1, wherein the processor is further configured to determine based at least in part on an attribute of an object currently within a grasp of the element comprising the robot a set of limits including one or more of a velocity limit and an acceleration limit, and to enforce the set of limits in determining the at least partly velocity-based trajectory.
8. The system of claim 1, the command includes a torque-based command associated with a computed torque to be applied at a joint associated with the command to achieve the at least partly velocity-based trajectory.
8. (Original) The system of claim 1, the command includes a torque-based command associated with a computed torque to be applied at a joint associated with the command to achieve the at least partly velocity-based trajectory.
9. The system of claim 1, wherein the processor is further configured to determine the at least partly velocity-based trajectory based at least in part on an imputed repulsion force associated with an item or structure in a location.
9. (Currently Amended) The system of claim 1, wherein the processor is further configured to determine the at least partly velocity-based trajectory based at least in part on an imputed repulsion force associated with an item or structure in [[the]] a location.
10. The system of claim 9, wherein the item or structure includes one or more of a chassis or other structure comprising the robot, a rail or other structure on which one or more of the robot and the chassis are configured to ride, a second robot present in the location, and a fixed structure present in the location.
10. (Original) The system of claim 9, wherein the item or structure includes one or more of a chassis or other structure comprising the robot, a rail or other structure on which one or more of the robot and the chassis are configured to ride, a second robot present in the location, and a fixed structure present in the location.
11. The system of claim 1, wherein the processor is configured to receive an indication to divert from a first task associated with a first velocity-based trajectory to a second task, and to determine and implement a second velocity-based trajectory to perform the second task.
11. (Original) The system of claim 1, wherein the processor is configured to receive an indication to divert from a first task associated with a first velocity-based trajectory to a second task, and to determine and implement a second velocity-based trajectory to perform the second task.
12. The system of claim 11, wherein the processor is configured to include in the second velocity-based trajectory a velocity-based transition from moving the element in a first direction comprising the first velocity-based trajectory to a trajectory to a second direction associated with the second task.
12. (Original) The system of claim 11, wherein the processor is configured to include in the second velocity-based trajectory a velocity-based transition from moving the element in a first direction comprising the first velocity-based trajectory to a trajectory to a second direction associated with the second task.
13. The system of claim 1, wherein the robot comprises a first robot and the processor is configured to grasp an object using the first robot and a second robot, and to use velocity control to move the first robot and the second robot in synchronization to move the object to a destination position.
13. (Original) The system of claim 1, wherein the robot comprises a first robot and the processor is configured to grasp an object using the first robot and a second robot, and to use velocity control to move the first robot and the second robot in synchronization to move the object to a destination position.
14. The system of claim 1, wherein the processor is configured to determine and use a position error or difference between an expected position of the element and an observed position determined based at least in part on the sensor data to determine and implement an adjustment to the at least partly velocity-based trajectory.
14. (Previously Presented) The system of claim 1, wherein the processor is configured to determine and use a position error or difference between an expected position of the element and an observed position determined based at least in part on the sensor data to determine and implement an adjustment to the at least partly velocity-based trajectory.
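The patented claim recites comparing an observed velocity of the element to a corresponding simulated velocity and sending a command that reduces the difference, i.e., closed-loop correction against a simulated reference. Neither claim set recites a particular algorithm; the proportional gain and the direct-application loop in the sketch below are illustrative assumptions only:

```python
import numpy as np

def correction_command(observed_v, simulated_v, gain=0.5):
    """Proportional correction: oppose the observed-minus-simulated velocity
    difference so the element converges on the simulated profile.
    The gain value is an illustrative assumption, not from the record."""
    diff = np.asarray(observed_v, float) - np.asarray(simulated_v, float)
    return -gain * diff

# Toy closed loop: assume the commanded adjustment is applied directly.
v_sim = np.array([0.5, 0.0, 0.0])   # velocity from the simulated operation
v_obs = np.array([0.8, -0.1, 0.0])  # observed element velocity
for _ in range(20):
    v_obs = v_obs + correction_command(v_obs, v_sim)
# each cycle scales the difference by (1 - gain), so it shrinks geometrically
```

Under these assumptions the observed velocity converges to the simulated profile; the "second command" of the patented claim corresponds to the returned adjustment term.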
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-4, 6-7, 15-16, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Kanemoto et al. (US 20210094187 A1), hereinafter Kanemoto, in view of Gross et al. (US 20070272513 A1), hereinafter Gross, and Hudson et al. (US 10166676 B1), hereinafter Hudson.
Regarding claim 1, Kanemoto teaches:
1. A robotic system, comprising:
a communication interface; and
a processor coupled to the communication interface (Paragraphs 0189-0191, "The computer 3000 according to the present embodiment can include a CPU 3012, a RAM 3014, a graphic controller 3016, and a display device 3018, which are mutually connected by a host controller 3010. The computer 3000 can also include input/output units such as a communication interface 3022, a hard disk drive 3024, a DVD-ROM drive 3026 and an IC card drive, which are connected to the host controller 3010 via an input/output controller 3020. The computer can also include legacy input/output units such as a ROM 3030 and a keyboard 3042, which are connected to the input/output controller 3020 through an input/output chip 3040.
The CPU 3012 can operate according to programs stored in the ROM 3030 and the RAM 3014, thereby controlling each unit. The graphic controller 3016 can obtain image data generated by the CPU 3012 on a frame buffer or the like provided in the RAM 3014 or in itself, and cause the image data to be displayed on the display device 3018.
The communication interface 3022 can communicate with other electronic devices via a network. The hard disk drive 3024 can store programs and data used by the CPU 3012 within the computer 3000. The DVD-ROM drive 3026 can read the programs or the data from the DVD-ROM 3001, and provide the hard disk drive 3024 with the programs or the data via the RAM 3014. The IC card drive can read programs and data from an IC card, and/or write programs and data into the IC card.") and configured to:
receive via the communication interface sensor data from one or more sensors deployed in a physical space in which a robot is located; (Paragraph 0040, "In the present embodiment, the drive control unit 134 can control the operation of the robot arm 132 and the end effector 140. The drive control unit 134 may control the operation of the robot arm 132 and the end effector 140 in accordance with instructions from the transfer control apparatus 150. The drive control unit 134 may acquire the outputs of one or more sensors arranged on the robot arm 132. The drive control unit 134 may acquire the outputs of one or more sensors arranged on the end effector 140. The drive control unit 134 may transmit the outputs of the sensors described above to the transfer control apparatus 150." as well as Paragraph 0044, "The image capturing apparatus 160 may include a plurality of cameras or sensors that are each arranged at a different position. Each of these cameras or sensors may output, by itself, a two-dimensional image, three-dimensional image, or distance image (sometimes referred to as a point group) of a subject. The image capturing apparatus 160 may process the outputs of the plurality of cameras or sensors, and output a three-dimensional image or distance image (sometimes to referred to as a point group) of the subject. The image may be a still image or a moving image.")
determine based at least in part on the sensor data an at least partly velocity-based trajectory along which to move an element comprising the robot (Paragraph 0096, "In the present embodiment, the trajectory path planning unit 530 can plan at least one of a trajectory path of the distal end of the robot arm 132 and a trajectory path of the end effector 140. The trajectory path planning unit 530 can output information representing the content of the plan relating to the trajectory path described above (sometimes referred to as plan information) to at least one of the algorithm determining unit 542, the abnormality detecting unit 544, the changing unit 546, and the control signal output unit 552, for example. The trajectory path planning unit 530 may store the plan information in the workpiece information storage unit 454. The plan information may include information representing a plurality of elapsed times from when the workpiece passes a reference position in the trajectory path and information representing the angle of each of the plurality of joints included in the robot arm 132 at each elapsed time.") including by … ; and
send to the robot, via the communication interface, a command to implement the at least partly velocity-based trajectory. (Paragraph 0102, "In the present embodiment, the control signal output unit 552 can acquire the plan information from the trajectory path planning unit 530. The transport target specifying unit 522 can generate a control signal for controlling the operation of the robot 130, based on the plan information. The transport target specifying unit 522 can transmit the generated control signal to the drive control unit 134.")
Kanemoto does not specifically teach determining an intercept vector to intercept a moving object. However, Gross, in the same field of endeavor of robotics, teaches:
… determining a … to intercept a moving item at an intercept location (Paragraph 0038, "In the second calculation step, which is carried out immediately before the start of the approach movement toward the relevant article, only the determined position of the article 3 currently to be approached is taken into account with regard to the starting position GP, i.e. the approach position is shifted by a certain amount Δy in the movement direction of the conveyor belt 2 in relation to the starting position GP. In the second calculation step, therefore, based on the movement set that was calculated in the first calculation step and with the aid of the difference Δy, a modified, new movement set is calculated, which is based on an exact approach position so that during execution of the approach movement, the robotic arm 1 and the relevant article 3 both reach the approach position at the same time. In the second calculation step, the end point is shifted by Δy.") that is based on an initial location of the moving item and a velocity associated with the moving item (Paragraph 0035, "The current position of the article 3 on the conveyor belt 2 is determined with the aid of a position detection system 11 that determines the more precise absolute position of the article 3 on the conveyor belt 2. With the aid of the second calculation means 10, the respective movement set is calculated so that the approach position of the robotic arm 1 corresponds to the position of the article 3 to be approached at the time at which the grasping element 5 is expected to reach the approach position. Then the grasping element 5 of the robotic arm 1 reaches the approach position at precisely the same time that the article 3 to be approached arrives at the approach position." as well as Paragraph 0039, "The belt speed is added to the path segments of the movement set in a suitable fashion. 
In order to achieve as smooth as possible an acceleration and braking motion of the robotic arm 1, it is possible for the conveyor speed to be taken into account in a sinusoidal/quadratic fashion in the path segments of the precalculated movement set and for it to be added to the path segments.") …
While Gross teaches determining the intercept, Gross does not explicitly describe the determination as a vector. However, Hudson, in the same field of endeavor of robotics, teaches:
… vector (Column 7, Lines 28-38, "The pose of an end effector may be defined in various manners, such as in joint space and/or in Cartesian/configuration space. A joint space pose of an end effector may be a vector of values that define the states of each of the actuators that dictate the position of the end effector. A Cartesian space pose of an end effector may utilize coordinates or other values that define all six degrees of freedom of the end effector relative to a reference frame. It is noted that some robots may have kinematic redundancy and that more than one joint space pose of an end effector may map to the same Cartesian space pose of the end effector in those robots.") …
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the robotic system and operating methods taught by Kanemoto with the ability to determine a trajectory that intercepts a moving object, as taught by Gross, and to define the trajectory using a vector, as taught by Hudson. Kanemoto teaches determining a trajectory for the end effector but is silent on how to intercept the object as it moves. Combining these teachings would allow the system to operate in a greater variety of environments, making it more versatile and efficient.
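Neither the claims nor the cited references recite a specific algorithm for the intercept determination. Purely for illustration, an intercept location based on an initial item location and an item velocity can be found by solving a quadratic in time; the premise that the effector travels at a fixed maximum speed is an illustrative assumption, not drawn from the record:

```python
import numpy as np

def intercept_point(item_pos, item_vel, effector_pos, max_speed):
    """Earliest point where an effector travelling at max_speed can meet an
    item moving at constant velocity. The meeting condition
    |item_pos + item_vel*t - effector_pos| = max_speed*t expands to the
    quadratic (v.v - s^2) t^2 + 2 d.v t + d.d = 0 in time t.
    Returns None if the item cannot be reached."""
    d = np.asarray(item_pos, float) - np.asarray(effector_pos, float)
    v = np.asarray(item_vel, float)
    a = v.dot(v) - max_speed ** 2
    b = 2.0 * d.dot(v)
    c = d.dot(d)
    if abs(a) < 1e-12:                      # speeds equal: linear case
        if abs(b) < 1e-12:
            return None
        times = [-c / b]
    else:
        disc = b * b - 4.0 * a * c
        if disc < 0:
            return None
        times = [(-b - np.sqrt(disc)) / (2.0 * a),
                 (-b + np.sqrt(disc)) / (2.0 * a)]
    feasible = [t for t in times if t > 0]
    if not feasible:
        return None
    t = min(feasible)
    # intercept location: initial item location advanced by its velocity
    return np.asarray(item_pos, float) + v * t
```

The intercept vector of the instant claim would then run from the effector position to the returned point, its direction fixed by the item's initial location and velocity.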
Regarding claim 2, where all the limitations of claim 1 are discussed above, Kanemoto further teaches:
2. The system of claim 1, wherein the sensor data includes image sensor data. (Paragraph 0044, "The image capturing apparatus 160 may include a plurality of cameras or sensors that are each arranged at a different position. Each of these cameras or sensors may output, by itself, a two-dimensional image, three-dimensional image, or distance image (sometimes referred to as a point group) of a subject. The image capturing apparatus 160 may process the outputs of the plurality of cameras or sensors, and output a three-dimensional image or distance image (sometimes to referred to as a point group) of the subject. The image may be a still image or a moving image.")
Regarding claim 3, where all the limitations of claim 1 are discussed above, Kanemoto further teaches:
3. The system of claim 1, wherein the communication interface comprises a wireless interface. (Paragraphs 0189-0191, "The computer 3000 according to the present embodiment can include a CPU 3012, a RAM 3014, a graphic controller 3016, and a display device 3018, which are mutually connected by a host controller 3010. The computer 3000 can also include input/output units such as a communication interface 3022, a hard disk drive 3024, a DVD-ROM drive 3026 and an IC card drive, which are connected to the host controller 3010 via an input/output controller 3020. The computer can also include legacy input/output units such as a ROM 3030 and a keyboard 3042, which are connected to the input/output controller 3020 through an input/output chip 3040.
The CPU 3012 can operate according to programs stored in the ROM 3030 and the RAM 3014, thereby controlling each unit. The graphic controller 3016 can obtain image data generated by the CPU 3012 on a frame buffer or the like provided in the RAM 3014 or in itself, and cause the image data to be displayed on the display device 3018.
The communication interface 3022 can communicate with other electronic devices via a network. The hard disk drive 3024 can store programs and data used by the CPU 3012 within the computer 3000. The DVD-ROM drive 3026 can read the programs or the data from the DVD-ROM 3001, and provide the hard disk drive 3024 with the programs or the data via the RAM 3014. The IC card drive can read programs and data from an IC card, and/or write programs and data into the IC card.")
Regarding claim 4, where all the limitations of claim 1 are discussed above, Kanemoto further teaches:
4. The system of claim 1, wherein the processor is further configured to simulate operation of the robot in a location. (Paragraph 0123, "In the present embodiment, the output simulator 730 can simulate the output of the force sensation sensor 242, based on the plan information. For example, the output simulator 730 can access the product information storage unit 452 and acquire the information representing the mass of the package corresponding to the workpiece. Furthermore, as an example, the output simulator 730 can access the workpiece information storage unit 454 and acquire the plan information relating to the workpiece. Next, the output simulator 730 can estimate the magnitude of at least one of the force and the torque to be detected by the force sensation sensor 242 when the robot 130 transports the work piece, based on the information representing the mass described above and the plan information. In this way, the magnitude of at least one of the force and the torque that would be detected by the force sensation sensor 242 when the robot 130 transports the workpiece can be estimated. The output simulator 730 may estimate the magnitude and the direction of at least one of the force and torque to be detected by the force sensation sensor 242. In this way, the magnitude and direction of at least one of the force and the torque that would be detected by the force sensation sensor 242 can be estimated.")
Regarding claim 6, where all the limitations of claim 1 are discussed above, Kanemoto further teaches:
6. The system of claim 1, wherein the element comprising the robot comprises an end effector. (Paragraph 0096, "In the present embodiment, the trajectory path planning unit 530 can plan at least one of a trajectory path of the distal end of the robot arm 132 and a trajectory path of the end effector 140. The trajectory path planning unit 530 can output information representing the content of the plan relating to the trajectory path described above (sometimes referred to as plan information) to at least one of the algorithm determining unit 542, the abnormality detecting unit 544, the changing unit 546, and the control signal output unit 552, for example. The trajectory path planning unit 530 may store the plan information in the workpiece information storage unit 454. The plan information may include information representing a plurality of elapsed times from when the workpiece passes a reference position in the trajectory path and information representing the angle of each of the plurality of joints included in the robot arm 132 at each elapsed time.")
Regarding claim 7, where all the limitations of claim 1 are discussed above, Kanemoto further teaches:
7. The system of claim 1, wherein the processor is further configured to determine based at least in part on an attribute of an object currently within a grasp of the element comprising the robot (Paragraph 0025, " In the registration process, the transfer system 100 can register a characteristic of the package 102 in a database. Examples of the characteristic of the package 102 include dimensions, a shape, a feature of the outer appearance, mass, position of the center of mass, a grip position, a grip state, and the like. Examples of the feature of the outer appearance include a character, symbol, code, image, illustration, pattern, and the like applied to the outer appearance. The position of the center of mass of a package 102 may be a relative position between a reference position of the package 102 and the center of mass of the package 102." as well as Paragraph 0097, "As an example, the trajectory path planning unit 530 can access the workpiece information storage unit 454 and acquires at least one of the information representing the position of the workpiece, the information representing various characteristics relating to the workpiece, the information representing the grip position of the workpiece, and the information indicating the grip strength of the workpiece. Furthermore, the trajectory path planning unit 530 can access the model information storage unit 456 and acquires the three-dimensional model of the robot 130 and the three-dimensional models of one or more packages 102 mounted on the depalletizing platform 110. The trajectory path planning unit 530 can access the setting information storage unit 458 and acquires various types of information relating to the settings of the robot 130. 
The trajectory path planning unit 530 can plan the trajectory path 600 described above using the information described above.") a set of limits including one or more of a velocity limit and an acceleration limit, and to enforce the set of limits in determining the at least partly velocity-based trajectory. (Paragraph 0082, "The setting information storage unit 458 can store information indicating the content of various setting relating to each unit of the transfer system 100. The setting information storage unit 458 may store information relating to an amount of mass that is transportable by the end effector 140. This transportable mass may be a rated transportable mass of the end effector 140, or may be a maximum transportable mass possible within the prescribed range of a transport velocity or a transport acceleration. The setting information storage unit 458 may store information representing the rated output of the robot 130. The setting information storage unit 458 may store information representing a setting value relating to an upper limit of the output of the robot 130. The setting information storage unit 458 may store information representing a setting value relating to an upper limit of a transport velocity or a transport acceleration of the robot 130.")
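The limit enforcement of claim 7 can be illustrated with uniform time-scaling, a standard trajectory-retiming technique; the specific scaling rule below is an illustrative sketch and is not disclosed by the claim or by Kanemoto:

```python
import numpy as np

def enforce_limits(waypoints, dt, v_max, a_max):
    """Uniform time-scaling: stretch the time step of a waypoint trajectory
    until every segment respects the velocity and acceleration limits.
    Slowing time by a factor k scales velocity by 1/k and acceleration
    by 1/k^2. Returns the new time step."""
    pts = np.asarray(waypoints, float)
    vel = np.diff(pts, axis=0) / dt          # per-segment velocities
    acc = np.diff(vel, axis=0) / dt          # per-segment accelerations
    v_peak = np.max(np.linalg.norm(vel, axis=1)) if len(vel) else 0.0
    a_peak = np.max(np.linalg.norm(acc, axis=1)) if len(acc) else 0.0
    k = max(1.0, v_peak / v_max, np.sqrt(a_peak / a_max))
    return dt * k
```

Limits derived from an attribute of the grasped object (e.g., a heavier object dictating a lower acceleration limit) would enter as the v_max and a_max arguments.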
Regarding claim 15, Kanemoto teaches:
15. A method to control a robotic system, comprising:
receiving sensor data from one or more sensors deployed in a physical space in which a robot is located; (Paragraph 0040, "In the present embodiment, the drive control unit 134 can control the operation of the robot arm 132 and the end effector 140. The drive control unit 134 may control the operation of the robot arm 132 and the end effector 140 in accordance with instructions from the transfer control apparatus 150. The drive control unit 134 may acquire the outputs of one or more sensors arranged on the robot arm 132. The drive control unit 134 may acquire the outputs of one or more sensors arranged on the end effector 140. The drive control unit 134 may transmit the outputs of the sensors described above to the transfer control apparatus 150." as well as Paragraph 0044, "The image capturing apparatus 160 may include a plurality of cameras or sensors that are each arranged at a different position. Each of these cameras or sensors may output, by itself, a two-dimensional image, three-dimensional image, or distance image (sometimes referred to as a point group) of a subject. The image capturing apparatus 160 may process the outputs of the plurality of cameras or sensors, and output a three-dimensional image or distance image (sometimes to referred to as a point group) of the subject. The image may be a still image or a moving image.")
using a processor (Paragraphs 0189-0191, "The computer 3000 according to the present embodiment can include a CPU 3012, a RAM 3014, a graphic controller 3016, and a display device 3018, which are mutually connected by a host controller 3010. The computer 3000 can also include input/output units such as a communication interface 3022, a hard disk drive 3024, a DVD-ROM drive 3026 and an IC card drive, which are connected to the host controller 3010 via an input/output controller 3020. The computer can also include legacy input/output units such as a ROM 3030 and a keyboard 3042, which are connected to the input/output controller 3020 through an input/output chip 3040.
The CPU 3012 can operate according to programs stored in the ROM 3030 and the RAM 3014, thereby controlling each unit. The graphic controller 3016 can obtain image data generated by the CPU 3012 on a frame buffer or the like provided in the RAM 3014 or in itself, and cause the image data to be displayed on the display device 3018.
The communication interface 3022 can communicate with other electronic devices via a network. The hard disk drive 3024 can store programs and data used by the CPU 3012 within the computer 3000. The DVD-ROM drive 3026 can read the programs or the data from the DVD-ROM 3001, and provide the hard disk drive 3024 with the programs or the data via the RAM 3014. The IC card drive can read programs and data from an IC card, and/or write programs and data into the IC card.") to determine based at least in part on the sensor data an at least partly velocity-based trajectory along which to move an element comprising the robot (Paragraph 0096, "In the present embodiment, the trajectory path planning unit 530 can plan at least one of a trajectory path of the distal end of the robot arm 132 and a trajectory path of the end effector 140. The trajectory path planning unit 530 can output information representing the content of the plan relating to the trajectory path described above (sometimes referred to as plan information) to at least one of the algorithm determining unit 542, the abnormality detecting unit 544, the changing unit 546, and the control signal output unit 552, for example. The trajectory path planning unit 530 may store the plan information in the workpiece information storage unit 454. The plan information may include information representing a plurality of elapsed times from when the workpiece passes a reference position in the trajectory path and information representing the angle of each of the plurality of joints included in the robot arm 132 at each elapsed time.") including by … ; and
sending to the robot, via a communication interface, (Paragraphs 0189-0191, "The computer 3000 according to the present embodiment can include a CPU 3012, a RAM 3014, a graphic controller 3016, and a display device 3018, which are mutually connected by a host controller 3010. The computer 3000 can also include input/output units such as a communication interface 3022, a hard disk drive 3024, a DVD-ROM drive 3026 and an IC card drive, which are connected to the host controller 3010 via an input/output controller 3020. The computer can also include legacy input/output units such as a ROM 3030 and a keyboard 3042, which are connected to the input/output controller 3020 through an input/output chip 3040.
The CPU 3012 can operate according to programs stored in the ROM 3030 and the RAM 3014, thereby controlling each unit. The graphic controller 3016 can obtain image data generated by the CPU 3012 on a frame buffer or the like provided in the RAM 3014 or in itself, and cause the image data to be displayed on the display device 3018.
The communication interface 3022 can communicate with other electronic devices via a network. The hard disk drive 3024 can store programs and data used by the CPU 3012 within the computer 3000. The DVD-ROM drive 3026 can read the programs or the data from the DVD-ROM 3001, and provide the hard disk drive 3024 with the programs or the data via the RAM 3014. The IC card drive can read programs and data from an IC card, and/or write programs and data into the IC card.") a command to implement the at least partly velocity-based trajectory. (Paragraph 0102, "In the present embodiment, the control signal output unit 552 can acquire the plan information from the trajectory path planning unit 530. The transport target specifying unit 522 can generate a control signal for controlling the operation of the robot 130, based on the plan information. The transport target specifying unit 522 can transmit the generated control signal to the drive control unit 134.")
Kanemoto does not specifically teach determining an intercept vector to intercept a moving object. However, Gross, in the same field of endeavor of robotics, teaches:
… determining a … to intercept a moving item at an intercept location (Paragraph 0038, "In the second calculation step, which is carried out immediately before the start of the approach movement toward the relevant article, only the determined position of the article 3 currently to be approached is taken into account with regard to the starting position GP, i.e. the approach position is shifted by a certain amount Δy in the movement direction of the conveyor belt 2 in relation to the starting position GP. In the second calculation step, therefore, based on the movement set that was calculated in the first calculation step and with the aid of the difference Δy, a modified, new movement set is calculated, which is based on an exact approach position so that during execution of the approach movement, the robotic arm 1 and the relevant article 3 both reach the approach position at the same time. In the second calculation step, the end point is shifted by Δy.") that is based on an initial location of the moving item and a velocity associated with the moving item (Paragraph 0035, "The current position of the article 3 on the conveyor belt 2 is determined with the aid of a position detection system 11 that determines the more precise absolute position of the article 3 on the conveyor belt 2. With the aid of the second calculation means 10, the respective movement set is calculated so that the approach position of the robotic arm 1 corresponds to the position of the article 3 to be approached at the time at which the grasping element 5 is expected to reach the approach position. Then the grasping element 5 of the robotic arm 1 reaches the approach position at precisely the same time that the article 3 to be approached arrives at the approach position." as well as Paragraph 0039, "The belt speed is added to the path segments of the movement set in a suitable fashion. 
In order to achieve as smooth as possible an acceleration and braking motion of the robotic arm 1, it is possible for the conveyor speed to be taken into account in a sinusoidal/quadratic fashion in the path segments of the precalculated movement set and for it to be added to the path segments.") …
However, Hudson, in the same field of endeavor of robotics, teaches:
… vector (Column 7, Lines 28-38, "The pose of an end effector may be defined in various manners, such as in joint space and/or in Cartesian/configuration space. A joint space pose of an end effector may be a vector of values that define the states of each of the actuators that dictate the position of the end effector. A Cartesian space pose of an end effector may utilize coordinates or other values that define all six degrees of freedom of the end effector relative to a reference frame. It is noted that some robots may have kinematic redundancy and that more than one joint space pose of an end effector may map to the same Cartesian space pose of the end effector in those robots.") …
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the robotic system and operating methods as taught by Kanemoto with the ability to determine a trajectory that intercepts a moving object, as taught by Gross, and to define the trajectory using a vector, as taught by Hudson. Kanemoto teaches determining a trajectory for the end effector but is silent on the method of intercepting the object as it moves. Combining these teachings would allow the system to operate in a greater variety of environments, making it more versatile and efficient.
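Gross's teaching of shifting the approach position by an amount Δy in the belt's movement direction so that the arm and the article arrive simultaneously amounts to predicting the item's future position from its current position and velocity. The sketch below is a hypothetical illustration of that idea (the function names are the examiner's own, not from any cited reference), assuming a constant-velocity item and a known travel time for the end effector.

```python
# Illustrative sketch only (hypothetical names): predicting an intercept
# location for a constant-velocity item and the vector the end effector
# must traverse to meet it, analogous to Gross's shift by dy = v_belt * t.

def intercept_location(item_pos, item_vel, travel_time):
    """Predict where a constant-velocity item will be after travel_time,
    i.e. the point the end effector must reach at that same instant."""
    return tuple(p + v * travel_time for p, v in zip(item_pos, item_vel))

def intercept_vector(effector_pos, item_pos, item_vel, travel_time):
    """Vector from the effector's current position to the intercept
    location, defining the approach direction and distance."""
    target = intercept_location(item_pos, item_vel, travel_time)
    return tuple(t - e for t, e in zip(target, effector_pos))
```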
Regarding claim 16, where all the limitations of claim 15 are discussed above, Kanemoto further teaches:
16. The method of claim 15, further comprising using the processor to simulate operation of the robot in a location. (Paragraph 0123, "In the present embodiment, the output simulator 730 can simulate the output of the force sensation sensor 242, based on the plan information. For example, the output simulator 730 can access the product information storage unit 452 and acquire the information representing the mass of the package corresponding to the workpiece. Furthermore, as an example, the output simulator 730 can access the workpiece information storage unit 454 and acquire the plan information relating to the workpiece. Next, the output simulator 730 can estimate the magnitude of at least one of the force and the torque to be detected by the force sensation sensor 242 when the robot 130 transports the work piece, based on the information representing the mass described above and the plan information. In this way, the magnitude of at least one of the force and the torque that would be detected by the force sensation sensor 242 when the robot 130 transports the workpiece can be estimated. The output simulator 730 may estimate the magnitude and the direction of at least one of the force and torque to be detected by the force sensation sensor 242. In this way, the magnitude and direction of at least one of the force and the torque that would be detected by the force sensation sensor 242 can be estimated.")
Regarding claim 18, where all the limitations of claim 17 are discussed above, Kanemoto further teaches:
18. The method of claim 17, further comprising determining based at least in part on an attribute of an object currently within a grasp of the element comprising the robot (Paragraph 0025, " In the registration process, the transfer system 100 can register a characteristic of the package 102 in a database. Examples of the characteristic of the package 102 include dimensions, a shape, a feature of the outer appearance, mass, position of the center of mass, a grip position, a grip state, and the like. Examples of the feature of the outer appearance include a character, symbol, code, image, illustration, pattern, and the like applied to the outer appearance. The position of the center of mass of a package 102 may be a relative position between a reference position of the package 102 and the center of mass of the package 102." as well as Paragraph 0097, "As an example, the trajectory path planning unit 530 can access the workpiece information storage unit 454 and acquires at least one of the information representing the position of the workpiece, the information representing various characteristics relating to the workpiece, the information representing the grip position of the workpiece, and the information indicating the grip strength of the workpiece. Furthermore, the trajectory path planning unit 530 can access the model information storage unit 456 and acquires the three-dimensional model of the robot 130 and the three-dimensional models of one or more packages 102 mounted on the depalletizing platform 110. The trajectory path planning unit 530 can access the setting information storage unit 458 and acquires various types of information relating to the settings of the robot 130. 
The trajectory path planning unit 530 can plan the trajectory path 600 described above using the information described above.") a set of limits including one or more of a velocity limit and an acceleration limit, and using the processor to enforce the set of limits in determining the at least partly velocity-based trajectory. (Paragraph 0082, "The setting information storage unit 458 can store information indicating the content of various setting relating to each unit of the transfer system 100. The setting information storage unit 458 may store information relating to an amount of mass that is transportable by the end effector 140. This transportable mass may be a rated transportable mass of the end effector 140, or may be a maximum transportable mass possible within the prescribed range of a transport velocity or a transport acceleration. The setting information storage unit 458 may store information representing the rated output of the robot 130. The setting information storage unit 458 may store information representing a setting value relating to an upper limit of the output of the robot 130. The setting information storage unit 458 may store information representing a setting value relating to an upper limit of a transport velocity or a transport acceleration of the robot 130.")
Regarding claim 19, Kanemoto further teaches:
19. A computer program product embodied in a non-transitory computer readable medium and comprising computer instructions for:
receiving sensor data from one or more sensors deployed in a physical space in which a robot is located; (Paragraph 0040, "In the present embodiment, the drive control unit 134 can control the operation of the robot arm 132 and the end effector 140. The drive control unit 134 may control the operation of the robot arm 132 and the end effector 140 in accordance with instructions from the transfer control apparatus 150. The drive control unit 134 may acquire the outputs of one or more sensors arranged on the robot arm 132. The drive control unit 134 may acquire the outputs of one or more sensors arranged on the end effector 140. The drive control unit 134 may transmit the outputs of the sensors described above to the transfer control apparatus 150." as well as Paragraph 0044, "The image capturing apparatus 160 may include a plurality of cameras or sensors that are each arranged at a different position. Each of these cameras or sensors may output, by itself, a two-dimensional image, three-dimensional image, or distance image (sometimes referred to as a point group) of a subject. The image capturing apparatus 160 may process the outputs of the plurality of cameras or sensors, and output a three-dimensional image or distance image (sometimes to referred to as a point group) of the subject. The image may be a still image or a moving image.")
determining based at least in part on the sensor data an at least partly velocity-based trajectory along which to move an element comprising the robot (Paragraph 0096, "In the present embodiment, the trajectory path planning unit 530 can plan at least one of a trajectory path of the distal end of the robot arm 132 and a trajectory path of the end effector 140. The trajectory path planning unit 530 can output information representing the content of the plan relating to the trajectory path described above (sometimes referred to as plan information) to at least one of the algorithm determining unit 542, the abnormality detecting unit 544, the changing unit 546, and the control signal output unit 552, for example. The trajectory path planning unit 530 may store the plan information in the workpiece information storage unit 454. The plan information may include information representing a plurality of elapsed times from when the workpiece passes a reference position in the trajectory path and information representing the angle of each of the plurality of joints included in the robot arm 132 at each elapsed time.") including by … ; and
sending to the robot, via a communication interface, (Paragraphs 0189-0191, "The computer 3000 according to the present embodiment can include a CPU 3012, a RAM 3014, a graphic controller 3016, and a display device 3018, which are mutually connected by a host controller 3010. The computer 3000 can also include input/output units such as a communication interface 3022, a hard disk drive 3024, a DVD-ROM drive 3026 and an IC card drive, which are connected to the host controller 3010 via an input/output controller 3020. The computer can also include legacy input/output units such as a ROM 3030 and a keyboard 3042, which are connected to the input/output controller 3020 through an input/output chip 3040.
The CPU 3012 can operate according to programs stored in the ROM 3030 and the RAM 3014, thereby controlling each unit. The graphic controller 3016 can obtain image data generated by the CPU 3012 on a frame buffer or the like provided in the RAM 3014 or in itself, and cause the image data to be displayed on the display device 3018.
The communication interface 3022 can communicate with other electronic devices via a network. The hard disk drive 3024 can store programs and data used by the CPU 3012 within the computer 3000. The DVD-ROM drive 3026 can read the programs or the data from the DVD-ROM 3001, and provide the hard disk drive 3024 with the programs or the data via the RAM 3014. The IC card drive can read programs and data from an IC card, and/or write programs and data into the IC card.") a command to implement the at least partly velocity-based trajectory. (Paragraph 0102, "In the present embodiment, the control signal output unit 552 can acquire the plan information from the trajectory path planning unit 530. The transport target specifying unit 522 can generate a control signal for controlling the operation of the robot 130, based on the plan information. The transport target specifying unit 522 can transmit the generated control signal to the drive control unit 134.")
Kanemoto does not specifically teach determining an intercept vector to intercept a moving object. However, Gross, in the same field of endeavor of robotics, teaches:
… determining a … to intercept a moving item at an intercept location (Paragraph 0038, "In the second calculation step, which is carried out immediately before the start of the approach movement toward the relevant article, only the determined position of the article 3 currently to be approached is taken into account with regard to the starting position GP, i.e. the approach position is shifted by a certain amount Δy in the movement direction of the conveyor belt 2 in relation to the starting position GP. In the second calculation step, therefore, based on the movement set that was calculated in the first calculation step and with the aid of the difference Δy, a modified, new movement set is calculated, which is based on an exact approach position so that during execution of the approach movement, the robotic arm 1 and the relevant article 3 both reach the approach position at the same time. In the second calculation step, the end point is shifted by Δy.") that is based on an initial location of the moving item and a velocity associated with the moving item (Paragraph 0035, "The current position of the article 3 on the conveyor belt 2 is determined with the aid of a position detection system 11 that determines the more precise absolute position of the article 3 on the conveyor belt 2. With the aid of the second calculation means 10, the respective movement set is calculated so that the approach position of the robotic arm 1 corresponds to the position of the article 3 to be approached at the time at which the grasping element 5 is expected to reach the approach position. Then the grasping element 5 of the robotic arm 1 reaches the approach position at precisely the same time that the article 3 to be approached arrives at the approach position." as well as Paragraph 0039, "The belt speed is added to the path segments of the movement set in a suitable fashion. 
In order to achieve as smooth as possible an acceleration and braking motion of the robotic arm 1, it is possible for the conveyor speed to be taken into account in a sinusoidal/quadratic fashion in the path segments of the precalculated movement set and for it to be added to the path segments.") …
However, Hudson, in the same field of endeavor of robotics, teaches:
… vector (Column 7, Lines 28-38, "The pose of an end effector may be defined in various manners, such as in joint space and/or in Cartesian/configuration space. A joint space pose of an end effector may be a vector of values that define the states of each of the actuators that dictate the position of the end effector. A Cartesian space pose of an end effector may utilize coordinates or other values that define all six degrees of freedom of the end effector relative to a reference frame. It is noted that some robots may have kinematic redundancy and that more than one joint space pose of an end effector may map to the same Cartesian space pose of the end effector in those robots.") …
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the robotic system and operating methods as taught by Kanemoto with the ability to determine a trajectory that intercepts a moving object, as taught by Gross, and to define the trajectory using a vector, as taught by Hudson. Kanemoto teaches determining a trajectory for the end effector but is silent on the method of intercepting the object as it moves. Combining these teachings would allow the system to operate in a greater variety of environments, making it more versatile and efficient.
Regarding claim 20, where all the limitations of claim 19 are discussed above, Kanemoto further teaches:
20. The computer program product of claim 19, further comprising computer instructions to simulate operation of the robot in a location. (Paragraph 0123, "In the present embodiment, the output simulator 730 can simulate the output of the force sensation sensor 242, based on the plan information. For example, the output simulator 730 can access the product information storage unit 452 and acquire the information representing the mass of the package corresponding to the workpiece. Furthermore, as an example, the output simulator 730 can access the workpiece information storage unit 454 and acquire the plan information relating to the workpiece. Next, the output simulator 730 can estimate the magnitude of at least one of the force and the torque to be detected by the force sensation sensor 242 when the robot 130 transports the work piece, based on the information representing the mass described above and the plan information. In this way, the magnitude of at least one of the force and the torque that would be detected by the force sensation sensor 242 when the robot 130 transports the workpiece can be estimated. The output simulator 730 may estimate the magnitude and the direction of at least one of the force and torque to be detected by the force sensation sensor 242. In this way, the magnitude and direction of at least one of the force and the torque that would be detected by the force sensation sensor 242 can be estimated.")
Claims 5 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Kanemoto in view of Gross and Hudson, and further in view of Pandya et al. (US 20210205992 A1), hereinafter Pandya.
Regarding claim 5, where all the limitations of claim 4 are discussed above, Kanemoto further teaches:
5. The system of claim 4, wherein the processor is configured to determine the at least partly velocity-based trajectory (Paragraph 0096, "In the present embodiment, the trajectory path planning unit 530 can plan at least one of a trajectory path of the distal end of the robot arm 132 and a trajectory path of the end effector 140. The trajectory path planning unit 530 can output information representing the content of the plan relating to the trajectory path described above (sometimes referred to as plan information) to at least one of the algorithm determining unit 542, the abnormality detecting unit 544, the changing unit 546, and the control signal output unit 552, for example. The trajectory path planning unit 530 may store the plan information in the workpiece information storage unit 454. The plan information may include information representing a plurality of elapsed times from when the workpiece passes a reference position in the trajectory path and information representing the angle of each of the plurality of joints included in the robot arm 132 at each elapsed time.") … simulated (Paragraph 0123, "In the present embodiment, the output simulator 730 can simulate the output of the force sensation sensor 242, based on the plan information. For example, the output simulator 730 can access the product information storage unit 452 and acquire the information representing the mass of the package corresponding to the workpiece. Furthermore, as an example, the output simulator 730 can access the workpiece information storage unit 454 and acquire the plan information relating to the workpiece. Next, the output simulator 730 can estimate the magnitude of at least one of the force and the torque to be detected by the force sensation sensor 242 when the robot 130 transports the work piece, based on the information representing the mass described above and the plan information. 
In this way, the magnitude of at least one of the force and the torque that would be detected by the force sensation sensor 242 when the robot 130 transports the workpiece can be estimated. The output simulator 730 may estimate the magnitude and the direction of at least one of the force and torque to be detected by the force sensation sensor 242. In this way, the magnitude and direction of at least one of the force and the torque that would be detected by the force sensation sensor 242 can be estimated.") …
Kanemoto does not specifically teach monitoring the velocity and comparing against the simulation. However, Pandya, in the same field of endeavor of robotics, teaches:
… at least in part by comparing an observed velocity of the element comprising the robot to a corresponding …velocity of the element according to the simulated operation of the robot. (Paragraph 0062, "The robotic system 100 can derive one or more of the updated waypoints 532 within the feasibility region 530. The robotic system 100 can initiate the adjustment action at the current location 406 such that the tracked portion can complete the adjustment action at the next updated waypoint (e.g., the waypoint within the feasibility region 530. As an illustrative example, the robotic system 100 can stop the end-effector and/or the carried target object 112 at the next updated waypoint. Also, the robotic system 100 can achieve a targeted speed (e.g., an increase or a decrease in the movement speed in comparison to the planned speed) by the next updated waypoint. The robotic system 100 can use multiple updated waypoints 532 to achieve a desired end state, such as by iteratively increasing or decreasing the movement speed. In deriving the updated waypoints 532, the robotic system 100 can account for the updated movement speeds. The processing period 404 of FIG. 4 can remain constant, and the updated waypoints 532 can correspond to the updated movement speeds with respect to the constant processing period 404. For example, the distance/separation between the updated waypoints 532 can decrease in comparison to the planned waypoints 402 when the updated movement speeds are slower.")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the robotic system and methods of operation as taught by Kanemoto with the ability to monitor the velocity against the simulation as taught by Pandya. While Kanemoto teaches monitoring both the actual force and the simulated/anticipated force, it is silent on monitoring or comparing the velocity of the component. Pandya, however, teaches monitoring the actual speed with reference to the desired/anticipated speed. This combination would allow the system to efficiently monitor for operation errors and would increase the safety of the system.
Regarding claim 17, where all the limitations of claim 16 are discussed above, Kanemoto further teaches:
17. The method of claim 16, wherein the at least partly velocity-based trajectory is determined (Paragraph 0096, "In the present embodiment, the trajectory path planning unit 530 can plan at least one of a trajectory path of the distal end of the robot arm 132 and a trajectory path of the end effector 140. The trajectory path planning unit 530 can output information representing the content of the plan relating to the trajectory path described above (sometimes referred to as plan information) to at least one of the algorithm determining unit 542, the abnormality detecting unit 544, the changing unit 546, and the control signal output unit 552, for example. The trajectory path planning unit 530 may store the plan information in the workpiece information storage unit 454. The plan information may include information representing a plurality of elapsed times from when the workpiece passes a reference position in the trajectory path and information representing the angle of each of the plurality of joints included in the robot arm 132 at each elapsed time.") … simulated (Paragraph 0123, "In the present embodiment, the output simulator 730 can simulate the output of the force sensation sensor 242, based on the plan information. For example, the output simulator 730 can access the product information storage unit 452 and acquire the information representing the mass of the package corresponding to the workpiece. Furthermore, as an example, the output simulator 730 can access the workpiece information storage unit 454 and acquire the plan information relating to the workpiece. Next, the output simulator 730 can estimate the magnitude of at least one of the force and the torque to be detected by the force sensation sensor 242 when the robot 130 transports the work piece, based on the information representing the mass described above and the plan information. 
In this way, the magnitude of at least one of the force and the torque that would be detected by the force sensation sensor 242 when the robot 130 transports the workpiece can be estimated. The output simulator 730 may estimate the magnitude and the direction of at least one of the force and torque to be detected by the force sensation sensor 242. In this way, the magnitude and direction of at least one of the force and the torque that would be detected by the force sensation sensor 242 can be estimated.") …
Kanemoto does not specifically teach monitoring the velocity and comparing against the simulation. However, Pandya, in the same field of endeavor of robotics, teaches:
… at least in part by comparing an observed velocity of the element comprising the robot to a corresponding … velocity of the element according to the simulated operation of the robot. (Paragraph 0062, "The robotic system 100 can derive one or more of the updated waypoints 532 within the feasibility region 530. The robotic system 100 can initiate the adjustment action at the current location 406 such that the tracked portion can complete the adjustment action at the next updated waypoint (e.g., the waypoint within the feasibility region 530. As an illustrative example, the robotic system 100 can stop the end-effector and/or the carried target object 112 at the next updated waypoint. Also, the robotic system 100 can achieve a targeted speed (e.g., an increase or a decrease in the movement speed in comparison to the planned speed) by the next updated waypoint. The robotic system 100 can use multiple updated waypoints 532 to achieve a desired end state, such as by iteratively increasing or decreasing the movement speed. In deriving the updated waypoints 532, the robotic system 100 can account for the updated movement speeds. The processing period 404 of FIG. 4 can remain constant, and the updated waypoints 532 can correspond to the updated movement speeds with respect to the constant processing period 404. For example, the distance/separation between the updated waypoints 532 can decrease in comparison to the planned waypoints 402 when the updated movement speeds are slower.")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the robotic system and methods of operation as taught by Kanemoto with the ability to monitor the velocity via the simulation as taught by Pandya. While Kanemoto teaches monitoring the actual force as well as the simulated/anticipated force, it is silent on the monitoring/comparing of the velocity of the component. However, Pandya teaches monitoring the actual speed with reference to the desired/anticipated speed. This would allow the system to efficiently monitor for operation errors and increase the safety of the system.
Claim(s) 8 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kanemoto in view of Gross and Hudson and in further view of Masoud et al. (US 20150168939 A1), hereinafter Masoud.
Regarding claim 8, where all the limitations of claim 1 are discussed above, Kanemoto further teaches:
8. The system of claim 1, … to achieve the at least partly velocity-based trajectory. (Paragraph 0096, "In the present embodiment, the trajectory path planning unit 530 can plan at least one of a trajectory path of the distal end of the robot arm 132 and a trajectory path of the end effector 140. The trajectory path planning unit 530 can output information representing the content of the plan relating to the trajectory path described above (sometimes referred to as plan information) to at least one of the algorithm determining unit 542, the abnormality detecting unit 544, the changing unit 546, and the control signal output unit 552, for example. The trajectory path planning unit 530 may store the plan information in the workpiece information storage unit 454. The plan information may include information representing a plurality of elapsed times from when the workpiece passes a reference position in the trajectory path and information representing the angle of each of the plurality of joints included in the robot arm 132 at each elapsed time.")
Kanemoto does not specifically discuss the use of torque-based commands. However, Masoud, in the same field of endeavor of robotics, teaches:
… the command includes a torque-based command associated with a computed torque to be applied at a joint associated with the command (Paragraph 0010, "A mobile robot may be provided with at least one wheel and a controller including a processor and memory. Joint planning and control of a position of the mobile robot may be achieved through a velocity control approach and/or a torque control approach. In providing a plan to the mobile robot, a control signal may be calculated and generated to synchronize the actual velocity and/or actual torque of the mobile robot. The ability to perform the complex task of joint planning and control at a servo-level, in a provably-correct manner, increases the speed of operation, reduces energy consumption, and leads to a high quality response.") …
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the robotic system and methods as taught by Kanemoto with the torque-based command system as taught by Masoud. It is well known in the field of robotics to use torque-based commands to control the operation of a robotic system. This allows for efficient and accurate system operation.
Claim(s) 9-10 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kanemoto in view of Gross and Hudson and in further view of Nakaya et al. (US 20170348856 A1), hereinafter Nakaya.
Regarding claim 9, where all the limitations of claim 1 are discussed above, Kanemoto further teaches:
9. The system of claim 1, wherein the processor is further configured to determine the at least partly velocity-based trajectory (Paragraph 0096, "In the present embodiment, the trajectory path planning unit 530 can plan at least one of a trajectory path of the distal end of the robot arm 132 and a trajectory path of the end effector 140. The trajectory path planning unit 530 can output information representing the content of the plan relating to the trajectory path described above (sometimes referred to as plan information) to at least one of the algorithm determining unit 542, the abnormality detecting unit 544, the changing unit 546, and the control signal output unit 552, for example. The trajectory path planning unit 530 may store the plan information in the workpiece information storage unit 454. The plan information may include information representing a plurality of elapsed times from when the workpiece passes a reference position in the trajectory path and information representing the angle of each of the plurality of joints included in the robot arm 132 at each elapsed time.") …
Kanemoto does not specifically teach using a repulsive force to determine a trajectory for the end effector. However, Nakaya, in the same field of endeavor of robotics, teaches:
… based at least in part on an imputed repulsion force associated with an item or structure in a location. (Paragraph 0022, "In order to solve the above problems, according to one aspect of the present invention, a control device of an arm robot having a robotic arm to which a plurality of links are coupled by joints and to which a hand is provided in a tip-end part thereof is provided, which includes a geometric model expresser configured to model the robot so as to have a geometric shape and express the model as a geometric model, an area setter configured to set a no-entry area into which the geometric model is not to enter and an operating area, that is defined by the no-entry area and where the geometric model operates, a final posture decider configured to decide a final posture of the robot, an initial route decider configured to decide an initial route of the hand when the robot changes from the current posture toward the final posture, a virtual posture calculator configured to calculate a virtual posture of the robot corresponding to a given point on the initial route, an interference determinator configured to determine whether the geometric model in the virtual posture interferes with the no-entry area, a way-point posture decider configured to decide a way-point posture, and an updated route decider. 
The way-point posture decider decides, when the interference determinator determines that the geometric model does not interfere with the no-entry area, the virtual posture as the way-point posture, and when the interference determinator determines that the geometric model interferes with the no-entry area, the way-point posture decider virtually generates a repulsive force for relatively repelling an interfering portion of the geometric model from an interfering portion of the no-entry area, and the way-point posture decider calculates a posture in a state where the interfering portion of the geometric model is pushed out from the no-entry area into the operating area by the virtual repulsive force, and decides the calculated posture as the way-point posture. The updated route decider decides a route of the hand when the posture changes from the current posture to the final posture via the way-point posture, as an updated route. The initial route decider, the virtual posture calculator, the interference determinator, the way-point posture decider, and the updated route decider process repeatedly under an assumption of the latest way-point posture decided by the way-point posture decider being a current posture of the initial route decider.")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the robotic system and methods of controlling same as taught by Kanemoto with the functionality of a repulsive force guiding the end effector to avoid certain areas associated with a variety of objects/structures as taught by Nakaya. This would ensure that the system avoids collisions and thereby reduces the likelihood of damaging the robot, the workpiece, or the environment.
Regarding claim 10, where all the limitations of claim 9 are discussed above, Kanemoto does not specifically teach utilizing a repulsive force associated with items/structures within the environment to control the trajectory. However, Nakaya, in the same field of endeavor of robotics, teaches:
10. The system of claim 9, wherein the item or structure includes one or more of a chassis or other structure comprising the robot, a rail or other structure on which one or more of the robot and the chassis are configured to ride, a second robot present in the location, and a fixed structure present in the location. (Paragraph 0022, "In order to solve the above problems, according to one aspect of the present invention, a control device of an arm robot having a robotic arm to which a plurality of links are coupled by joints and to which a hand is provided in a tip-end part thereof is provided, which includes a geometric model expresser configured to model the robot so as to have a geometric shape and express the model as a geometric model, an area setter configured to set a no-entry area into which the geometric model is not to enter and an operating area, that is defined by the no-entry area and where the geometric model operates, a final posture decider configured to decide a final posture of the robot, an initial route decider configured to decide an initial route of the hand when the robot changes from the current posture toward the final posture, a virtual posture calculator configured to calculate a virtual posture of the robot corresponding to a given point on the initial route, an interference determinator configured to determine whether the geometric model in the virtual posture interferes with the no-entry area, a way-point posture decider configured to decide a way-point posture, and an updated route decider. 
The way-point posture decider decides, when the interference determinator determines that the geometric model does not interfere with the no-entry area, the virtual posture as the way-point posture, and when the interference determinator determines that the geometric model interferes with the no-entry area, the way-point posture decider virtually generates a repulsive force for relatively repelling an interfering portion of the geometric model from an interfering portion of the no-entry area, and the way-point posture decider calculates a posture in a state where the interfering portion of the geometric model is pushed out from the no-entry area into the operating area by the virtual repulsive force, and decides the calculated posture as the way-point posture. The updated route decider decides a route of the hand when the posture changes from the current posture to the final posture via the way-point posture, as an updated route. The initial route decider, the virtual posture calculator, the interference determinator, the way-point posture decider, and the updated route decider process repeatedly under an assumption of the latest way-point posture decided by the way-point posture decider being a current posture of the initial route decider.")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the robotic system and methods of controlling same as taught by Kanemoto with the functionality of a repulsive force guiding the end effector to avoid certain areas associated with a variety of objects/structures as taught by Nakaya. This would ensure that the system avoids collisions and thereby reduces the likelihood of damaging the robot, the workpiece, or the environment.
Claim(s) 11-12 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kanemoto in view of Gross and Hudson and in further view of Herbach et al. (US 10901415 B1), hereinafter Herbach.
Regarding claim 11, where all the limitations of claim 1 are discussed above, Kanemoto further teaches:
11. The system of claim 1, wherein the processor is configured to … and to determine and implement a second velocity-based trajectory to perform the second task. (Paragraph 0103, "When change information is acquired from the changing unit 546, the control signal output unit 552 can generate the control signal for controlling the operation of the robot 130, based on this change information. The control signal output unit 552 can transmit the generated control signal to the drive control unit 134.")
Kanemoto does not specifically teach providing a higher priority task which overrides the current operation. However, Herbach, in the same field of endeavor of robotics and autonomous systems, teaches:
… receive an indication to divert from a first task associated with a first velocity-based trajectory to a second task, (Col 11, Lines 14-36, "A malformed task request may be one sent with a trip request where the priority level of the task request would effectively cancel the trip before the trip has been executed. For example, a trip request may be sent with a non-passenger task request for parking nearby and waiting for the next request. In this situation, if the non-passenger task of parking nearby and waiting for the next request has a higher priority than the trip request, the trip would never be executed. Additionally, a malformed task request may be one that overrides a currently executed trip request without addressing the presence of passengers. If passengers have already been picked up in the execution of a trip request, an incoming non-passenger task request that overrides the trip request is malformed. If the task request were to override the trip request at that moment, the vehicle would be diverted with its passengers to a different location than what was requested. Also, examples of a task request that may be rejected because the vehicle is not capable of performing the task may include a charging task sent to a gas powered vehicle or a parking task specifying a parking location that is inaccessible by an autonomous vehicle. There may be other rejection reasons that are included in the system that are not listed here.") …
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the robotic system and operating methods as taught by Kanemoto with the ability to indicate higher priority tasks which override the current operation as taught by Herbach. This, combined with the functionality which Kanemoto discusses of providing a change to the trajectory, would allow the system to smoothly transition between tasks to fulfill the most pressing operations without decreasing efficiency more than necessary.
Regarding claim 12, where all the limitations of claim 11 are discussed above, Kanemoto further teaches:
12. The system of claim 11, wherein the processor is configured to include in the second velocity-based trajectory a velocity-based transition from moving the element in a first direction comprising the first velocity-based trajectory to a trajectory to a second direction associated with the second task. (Paragraph 0183, "If the abnormality detecting unit 544 has detected an abnormality (S936: No), the plan change process can be performed at S940. Specifically, the changing unit 546 can make a determination to change the plan of the trajectory path planning unit 530, update the product information storage unit 452, and the like. If the changing unit 546 has made a determination to change the plan of the trajectory path planning unit 530, the changing unit 546 can transmit information representing the content of this change to the control signal output unit 552. The control signal output unit 552 can generate a control signal causing compliance with the content of this change. The control signal output unit 552 can transmit the generated signal to the drive control unit 134. When the plan change process is completed, the process of S952 can be performed.")
Claim(s) 13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kanemoto in view of Gross and Hudson and in further view of Kawasaki et al. (JP 2002018752 A), hereinafter Kawasaki.
Regarding claim 13, where all the limitations of claim 1 are discussed above, Kanemoto further teaches:
13. The system of claim 1, wherein the robot comprises a first robot (Paragraph 0040, "In the present embodiment, the drive control unit 134 can control the operation of the robot arm 132 and the end effector 140. The drive control unit 134 may control the operation of the robot arm 132 and the end effector 140 in accordance with instructions from the transfer control apparatus 150. The drive control unit 134 may acquire the outputs of one or more sensors arranged on the robot arm 132. The drive control unit 134 may acquire the outputs of one or more sensors arranged on the end effector 140. The drive control unit 134 may transmit the outputs of the sensors described above to the transfer control apparatus 150.") and the processor is configured to grasp an object using the first robot (Paragraph 0041, "In the present embodiment, the end effector 140 can grip and release the packages 102. For example, the end effector 140 can grip a package 102 arranged on the depalletizing platform 110. The end effector 140 can grip the package 102 until the package 102 has been transported to a predetermined position above the reception platform 120. After this, the end effector 140 can release the package 102.") and …
Kanemoto does not specifically teach a second robot which works cooperatively with a first robot to grasp a work object. However, Kawasaki, in the same field of endeavor of robotics, teaches:
… a second robot, and to use velocity control to move the first robot and the second robot in synchronization to move the object to a destination position. (Page 2, Paragraph 6, "In order to achieve the above object, in a cooperative control method in which a plurality of robots cooperate to grasp and work on a target object, a dynamic parameter of the target object is estimated. A target force to be applied to the target object that causes the target object to move in the target trajectory given in the task coordinate system is obtained using the estimated target object dynamic parameters and dynamic model. Calculating the target trajectory of each robot's force from the target force to be applied to the obtained target object, and obtaining the target trajectory of each robot's motion from the target trajectory of the target object given in the task coordinate system. A target trajectory of force and motion for each robot is distributed to each robot, and each robot controls a motion from a given target trajectory of force and motion. The control method includes a robot target trajectory generation unit and a robot control unit corresponding to each of the plurality of robots, and can be implemented in a cooperative control device that cooperatively grasps a target object and performs work. The control of the movement of each robot includes estimating a dynamic parameter of a specific robot, performing feedforward compensation based on the estimated dynamic parameter of the specific robot, a dynamic model of the specific robot, and a target trajectory of force, and Feedback compensation based on the trajectory error of the robot motion can be performed.")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the robotic system and control methods as taught by Kanemoto with the functionality of having a plurality of robots operate cooperatively to grasp and perform work operations as taught by Kawasaki. This would allow the robots to perform work of which a single robot is not capable, for example, lifting and operating on an object which is too large and/or heavy for a single robot.
Claim(s) 14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kanemoto in view of Gross and Hudson and in further view of Lokhorst et al. (US 20220168895 A1), hereinafter Lokhorst.
Regarding claim 14, where all the limitations of claim 1 are discussed above, Kanemoto further teaches:
14. The system of claim 1, wherein the processor is configured to … determined based at least in part on the sensor data (Paragraph 0163, "The abnormality detecting unit 544 may be an example of a first detecting unit, a second detecting unit, and a third detecting unit. The registration data comparing unit 720 may be an example of a mass information acquiring unit and a center of mass identifying unit. The output simulator 730 may be an example of a mass information acquiring unit, a plan information acquiring unit, and a force sensation estimating unit. The mass abnormality detecting unit 762 may be an example of a first detecting unit. The center of mass abnormality detecting unit 764 may be an example of a second detecting unit and a third detecting unit. The magnitude of at least one of the force and the torque detected by the force sensation sensor 242 may be an example of the magnitude of the at least one of the force and the torque detected at the distal end of the manipulator. The abnormality relating to the mass detected by the mass abnormality detecting unit 762 may be an example of an abnormality relating to the mass of a target item represented by the force sensation information. The abnormality relating to the center of mass detected by the center of mass abnormality detecting unit 764 may be an example of an abnormality relating to the center of mass of a target item represented by the force sensation information. The abnormality in the pressure of the depressurization chamber 312 may be an example of an abnormality relating to the pressure represented by the depressurization information.") to determine and implement an adjustment to the at least partly velocity-based trajectory. (Paragraph 0183, "If the abnormality detecting unit 544 has detected an abnormality (S936: No), the plan change process can be performed at S940. 
Specifically, the changing unit 546 can make a determination to change the plan of the trajectory path planning unit 530, update the product information storage unit 452, and the like. If the changing unit 546 has made a determination to change the plan of the trajectory path planning unit 530, the changing unit 546 can transmit information representing the content of this change to the control signal output unit 552. The control signal output unit 552 can generate a control signal causing compliance with the content of this change. The control signal output unit 552 can transmit the generated signal to the drive control unit 134. When the plan change process is completed, the process of S952 can be performed.")
Kanemoto does not specifically discuss using positional error based on a difference between the actual position and the expected position to determine trajectory changes. However, Lokhorst, in the same field of endeavor of robotics, teaches:
… determine and use a position error or difference between an expected position of the element and an observed position (Paragraph 0016, "The robotic arm condition may comprise at least one of: a position of the robotic arm, a velocity of the robotic arm, and an acceleration of the robotic arm. Since these values are relatively easy to measure, and can be reliably measured, this may improve the reliability of the method." as well as Paragraph 0060, "The measured position is then output to the comparison module 112, which compares the measured and expected position at step 208. If the measured and expected positions are the same or differ by less than a predetermined threshold, then the arm will continue to move at step 204 and the process will be repeated until the path has been completed. Otherwise, if the measured and expected positions differ by a value greater than the predetermined threshold, then movement is ceased, and the brake 110 will be activated at step 210.") …
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the robotic system and operating methods as taught by Kanemoto with the positional error comparison methods as taught by Lokhorst. While Kanemoto teaches detecting abnormalities and issues with the transport of the workpiece as well as adjusting the operation based on this detection, it is silent on the detection of a positional error between the expected and observed positions. However, Lokhorst teaches monitoring the actual versus the expected position and controlling the system based on the results. This allows for detection and mitigation of collisions and prevents the arm from continuing a movement that deviates from the desired movement and possibly damaging a workpiece or the system.
Conclusion
The Examiner has cited particular paragraphs or columns and line numbers in the references applied to the claims above for the convenience of the Applicant. Although the specified citations are representative of the teachings of the art and are applied to specific limitations within the individual claim, other passages and figures may apply as well. It is respectfully requested of the Applicant, in preparing responses, to fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner. See MPEP 2141.02 [R-07.2015] VI. A prior art reference must be considered in its entirety, i.e., as a whole, including portions that would lead away from the claimed invention. W.L. Gore & Associates, Inc. v. Garlock, Inc., 721 F.2d 1540, 220 USPQ 303 (Fed. Cir. 1983), cert. denied, 469 U.S. 851 (1984). See also MPEP § 2123.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HEATHER KENIRY whose telephone number is (571)270-5468. The examiner can normally be reached M-F 7:30-5:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Adam Mott, can be reached at (571) 270-5376. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/H.J.K./Examiner, Art Unit 3657
/ADAM R MOTT/Supervisory Patent Examiner, Art Unit 3657