Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
2. This communication is responsive to Application No. 18/489,629 and the amendments filed on 2/4/2026.
3. Claims 21, 25, 27, 29-41, and 46 are presented for examination.
Information Disclosure Statement
4. The information disclosure statement (IDS) submitted on 10/18/2023 has been fully considered by the Examiner.
Response to Arguments
5. Applicant’s arguments with respect to the rejection of claim(s) 21, 25, 27-41, and 46 under 35 U.S.C. 103 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Regarding independent claim 21, the Examiner agrees that the combination of US 9592608 B1 to Bingham and US 20180319013 A1 to Shimodaira fails to teach all of the amended limitations of the claim. However, in light of the amendments and the Applicant’s remarks, an updated search was conducted, and a new ground of rejection for claim 21 is set forth below.
Regarding independent claims 38 and 40, because both of these claims contain limitations similar to those of claim 21, they remain rejected for reasons similar to those for claim 21, as set forth below.
Regarding dependent claims 25, 27, 29-37, 39, 41, and 46, because each of these claims depends from claim 21, 38, or 40, they remain rejected, as set forth below.
Regarding dependent claim 28, this claim has been cancelled and is therefore withdrawn from further consideration.
Claim Objections
6. Claim 33 is objected to because of the following informalities:
Regarding Claim 33, the term “wherein in a case where” recited in line 10 of claim 33 should read “wherein when” to avoid a conditional limitation.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
7. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
8. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
9. Claim(s) 21, 27, 29, and 38-41 is/are rejected under 35 U.S.C. 103 as being unpatentable over Bingham et al. (US 9592608 B1 hereinafter Bingham) in view of Fudaba et al. (US 20140172143 A1 hereinafter Fudaba) and Shimodaira (US 20180319013 A1 hereinafter Shimodaira).
Regarding Claim 21, Bingham teaches a controller of a robot system (Col. 3 lines 58-61, where “The robotic system 100 is shown to include processor(s) 102, data storage 104, program instructions 106, controller 108, ...”),
the robot system comprising a robot having a predetermined portion (Col. 5 lines 13-17, where “Additionally, the robotic arm 200 includes joints 204A-204F each coupled to one or more of the actuator(s) 114. The actuators in joints 204A-204F may operate to cause movement of various movable component(s) 116 such as … end effector 208.”) movable by direct teaching performed by a user (Col. 14 lines 9-16, where “After initiation of teach mode, an external force may be applied by the user 210 that guides one or more movable component(s) 116 of the robotic system 100. For instance, as shown in FIG. 5B, the user 210 guides the end effector 208 along the straight line path 502. The robotic arm 200 may detect the movement by analyzing trajectory of the end effector 208, such as by determining position data (e.g., using position and/or proximity sensors) in the environment.”), (Note: The Examiner interprets the end effector 208 of Bingham as the predetermined portion.),
a first sensor configured to acquire first information related to a force acting on the predetermined portion, a second sensor configured to acquire second information related to a position of the predetermined portion (Col. 5 lines 25-27, where “In yet another example, the end effector may include sensors such as force sensors, location sensors, and/or proximity sensors.”),
the controller comprising a processing part (Col. 3 lines 58-61, where “The robotic system 100 is shown to include processor(s) 102, ...”), (Note: The Examiner interprets the processor(s) 102 as the processing part.),
wherein the processing part is configured to acquire the first information and the second information when the user moves the predetermined portion by direct teaching (Col. 5 lines 29-57, where “In an example implementation, a robotic system 100, such as robotic arm 200, may be capable of operating in a teach mode as illustrated by FIG. 2B. In particular, teach mode may be an operating mode of the robotic arm 200 that allows a user 210 … to physically interact with and guide the robotic arm 200 towards carrying out and recording various movements. In a teaching mode, an external force is applied (e.g., by the user) to the robotic system 100 based on a teaching input that is intended to teach the robotic system regarding how to carry out a specific task. … Such data may relate to a plurality of configurations of the movable component(s) 116, joint position data, velocity data, acceleration data, torque data, force data, and power data, among other possibilities. For example, during teach mode the user 210 may grasp onto any part of the robotic arm 200 and provide an external force by physically moving the robotic arm 200. In particular, the user 210 may guide the robotic arm 200 towards grasping onto an object and then moving the object from a first location to a second location. As the user 210 guides the robotic arm 200 during teach mode, the system may obtain and record data related to the movement such that the robotic arm 200 may be configured to independently carry out the task at a future time during independent operation ….”),
wherein the processing part is configured to acquire information relating to a contact force based on the first information, the contact force being generated in a case where the predetermined portion contacts a surrounding object, or in a case where an object held by the predetermined portion contacts the surrounding object (Col. 9 lines 9-32, where “To illustrate, refer to FIG. 3B again depicting the scenario where the user 210 seeks to instruct the robotic arm 200 as to how to wipe the surface. This may specifically involve guiding the robotic arm
200 to apply specific forces onto the surface. In some cases, the user 210 may guide the robotic arm 200 to apply forces onto the surface that exceed the torques (e.g., in one or more of the actuators) that the robotic arm 200 is actually capable of independently generating in order to apply these forces onto the surface. For example, user 210 is shown to move the robotic arm 200 such that a force 308 is applied onto the surface. … As such, the robotic arm
200 may detect, during teach mode, a taught movement involving application of force 308
that exceeds a corresponding torque limit of the actuator in joint 204D. Such detection may be carried out using sensor data from force sensors positioned at the end effector 208, among other possibilities.”),
wherein the processing part is configured to acquire a trajectory of the predetermined portion based on the second information (Col. 5 lines 31-57, where “In particular, teach mode may be an operating mode of the robotic arm 200 that allows a user 210 … to physically interact with and guide the robotic arm 200 towards carrying out and recording various movements. … The robotic arm 200 may thus obtain data regarding how to carry out the specific task based on instructions and guidance from the user 210. Such data may relate to a plurality of configurations of the movable component(s) 116, joint position data, … among other possibilities. … In particular, the user 210 may guide the robotic arm 200 towards grasping onto an object and then moving the object from a first location to a second location. As the user 210 guides the robotic arm 200 during teach mode, the system may obtain and record data related to the movement such that the robotic arm 200 may be configured to independently carry out the task at a future time during independent operation ….”), (Col. 13 lines 61-67, where “As illustrated, the snap-to template may be a template having a corresponding straight line path 502 in the environment of the robotic system 100. In particular, this straight line path 502 is shown as arranged along the surface. In this manner, the robotic arm 200 can record movement (i.e., during teach mode) of the end effector 208 along a straight line path 502 for the purpose of wiping the surface.”).
Bingham is silent on wherein the processing part is configured to automatically set a section in which the contact force is not generated as a section in which position control is performed on the robot by determining whether the contact force is present while the user moves the predetermined portion by direct teaching, and simplify the trajectory in the section in which the position control is performed on the robot by interpolating between a start point and an end point of the trajectory in the section in which the position control is performed on the robot, and wherein the processing part is configured to generate position control data for the position control of the robot based on the simplified trajectory.
However, Fudaba teaches wherein the processing part is configured to automatically set a section in which the contact force is not generated as a section in which position control is performed on the robot by determining whether the contact force is present while the user moves the predetermined portion by direct teaching ([0274] via “In the "teach mode", a person teaches the robot arm 102 a motion, and the robot arm 102 is moved in accordance with manipulation by the person. In the "replay mode", the robot arm 102 automatically replays a motion taught by a person in the "teach mode", and the robot arm 102 automatically moves without manipulation by a person.”), ([0304] via “FIG. 10A illustrates the time point where the person's hand 1001 grips the robot arm 102 and the person starts teaching the robot arm 102. In this time point, the flexible board 1002 gripped by the hand 701 of the robot arm 102 and the connector 1003 have not yet come into contact with each other. The person's hand 1001 thus receives no reactive force to be generated upon contact between the flexible board 1002 and the connector 1003.”), ([0305] via “Next, FIG. 10C illustrates the time point where the distal end of the flexible board 1002 is in contact with an inlet port of the connector 1003. After elapse of a reaction time from this time point, the person's hand 1001 receives a reactive force generated upon contact, and the person changes his or her behavior. … The force sensor 716 detects the magnitude of the reactive force during this inserting task.”), ([0317] via “FIG. 13 indicates a value of the force sensor 716 and positional information on the robot arm 102 during the replaying motion illustrated in FIGS. 12A to 12J. The positional information on the robot arm 102 corresponds to positional information that is acquired by the motion information acquiring section 106. A first solid line 313 in graph indicates the value detected by the force sensor 716. 
A second solid line 314 in graph indicates the position of the robot arm 102”), (Note: See Figures 10-13 of Fudaba as well.).
Further, Shimodaira teaches to simplify the trajectory in the section in which the position control is performed on the robot by interpolating between a start point and an end point of the trajectory in the section in which the position control is performed on the robot ([0065] via “In a graph G1, the force Fz is zero at the position where the z coordinate value is 0 (second teaching point TP2). When the first object OB1 advances in the −z direction and the two objects OB1 and OB2 are brought into contact, the force Fz increases as the first object OB1 moves in the −z direction, the force Fz temporarily decreases after reaching a peak value Fpk, and then the force Fz increases again.”), ([0079] via “FIG. 14 is an explanatory diagram showing an example of the movement of the end effector 160 from the first teaching point TP1 to the second teaching point TP2 according to teaching data. … Position control in the movement from the first teaching point TP1 to the second teaching point TP2 is performed by, for example, Continuous Path control (CP control). The CP control is a control method in which two points are continuously interpolated so that the movement path between two points of the end effector 160 follows a certain trajectory. In the example of
FIG. 14, the movement path between the first teaching point TP1 and the second teaching point TP2 is formed so as to follow a linear trajectory. In this way, it is possible to reduce the possibility that the first object OB1 and the end effector 160 physically interfere with other objects. After reaching the second teaching point TP2, the end effector 160 is moved in the −z direction, and the end effector 160 is moved until the force detected by the force detector 150 reaches the second force F2 (FIG. 7) so as to fit the objects OB1 and OB2 with each other.”), (Note: See Figures 4-6 and 14 of Shimodaira. Specifically, where teaching point TP2 of Shimodaira does not result in the generation of contact force between the gripped object OB1 and the surrounding object OB2.), and
wherein the processing part is configured to generate position control data for the position control of the robot based on the simplified trajectory ([0078] via “In step S160, the teaching data 234 is generated using the input of the received teaching point, and stored in the nonvolatile memory 230 of the controller 200. The generation of the teaching data 234 is executed by the teaching data generation unit 214 of the controller 200. As is well known, the teaching data 234 includes a plurality of teaching points, and a description of control modes (position control and force control) to be executed upon movement between respective teaching points. … When the teaching data 234 is completed in this manner, in step S20 of
FIG. 9, in order to manufacture the actual product, the fitting work according to the teaching data is executed for a plurality of sets of objects OB1 and OB2.”), ([0079] via “FIG. 14 is an explanatory diagram showing an example of the movement of the end effector 160 from the first teaching point TP1 to the second teaching point TP2 according to teaching data. … Position control in the movement from the first teaching point TP1 to the second teaching point TP2 is performed by, for example, Continuous Path control (CP control). The CP control is a control method in which two points are continuously interpolated so that the movement path between two points of the end effector 160 follows a certain trajectory. In the example of FIG. 14, the movement path between the first teaching point TP1 and the second teaching point TP2 is formed so as to follow a linear trajectory. In this way, it is possible to reduce the possibility that the first object OB1 and the end effector 160 physically interfere with other objects. After reaching the second teaching point TP2, the end effector 160 is moved in the −z direction, and the end effector 160 is moved until the force detected by the force detector 150 reaches the second force F2 (FIG. 7) so as to fit the objects OB1 and OB2 with each other.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Fudaba wherein the processing part is configured to automatically set a section in which the contact force is not generated as a section in which position control is performed on the robot by determining whether the contact force is present while the user moves the predetermined portion by direct teaching. Doing so allows the robot to autonomously perform a specific task after initially learning how to perform that task through direct teaching, as stated by Fudaba ([0323] via “As described above, the feedback rule generating section 111 generates a feedback rule on the basis of taught data, and the controller 114 replays in accordance with the feedback rule thus generated and motion information generated by the motion generating section 112. It is thus possible to replay a motion similar to the teaching motion even in a case where the environment is varied.”).
In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Shimodaira wherein the processing part is configured to simplify the trajectory in the section in which the position control is performed on the robot by interpolating between a start point and an end point of the trajectory in the section in which the position control is performed on the robot, and wherein the processing part is configured to generate position control data for the position control of the robot based on the simplified trajectory. Doing so reduces the chance that the predetermined portion makes unwanted contact with external objects when the trajectory is executed, as stated above by Shimodaira in [0079].
Regarding Claim 27, modified reference Bingham teaches the controller according to claim 21, but is silent on wherein the processing part is configured to perform a position control and not to perform a force control in the section for simplifying the trajectory of the predetermined portion.
However, Shimodaira teaches wherein the processing part is configured to perform a position control and not to perform a force control in the section for simplifying the trajectory of the predetermined portion ([0079] via “Position control in the movement from the first teaching point TP1 to the second teaching point TP2 is performed by, for example, Continuous Path control (CP control). The CP control is a control method in which two points are continuously interpolated so that the movement path between two points of the end effector 160 follows a certain trajectory. In the example of FIG. 14, the movement path between the first teaching point TP1 and the second teaching point TP2 is formed so as to follow a linear trajectory. In this way, it is possible to reduce the possibility that the first object OB1 and the end effector 160 physically interfere with other objects.”), (Note: See explanation of Shimodaira in claim 21 above as well.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Shimodaira wherein the processing part is configured to perform a position control and not to perform a force control in the section for simplifying the trajectory of the predetermined portion. Doing so reduces the chance that the predetermined portion makes unwanted contact with external objects when the trajectory is executed, as stated above by Shimodaira.
Regarding Claim 29, modified reference Bingham teaches the controller according to claim 21, wherein the processing part is configured to set a second section in which to perform a force control in the information related to the trajectory based on the information related to the contact force (Col. 5 lines 36-45, where “In a teaching mode, an external force is applied (e.g., by the user) to the robotic system 100 based on a teaching input that is intended to teach the robotic system regarding how to carry out a specific task. The robotic arm 200 may thus obtain data regarding how to carry out the specific task based on instructions and guidance from the user 210. Such data may relate to a plurality of configurations of the movable component(s) 116, joint position data, velocity data, acceleration data, torque data, force data, and power data, among other possibilities.”), (Col. 13 lines 22-36, where “In some cases, the other received information may include selection of the movable component(s) 116 that should be associated with the selected template. For example, user-input may be received selecting the end effector as the component associated with the path in the environment. Upon such selection, the robotic system 100 may be configured to evaluate position (e.g., coordinates) of the end effector in the environment with the position (e.g., coordinates) of the path in the environment. As shown by block 404, method 400 next involves initiating, by the robotic system, a recording process (e.g., operation in teach mode) for storing data related to motion of the at least one component in the environment. Note that, in some cases, receiving information related to the path may take place during the recording process.”).
Regarding Claim 38, Bingham teaches a robot system (Col. 3 lines 58-61, where “The robotic system 100 is shown to include processor(s) 102, data storage 104, program instructions 106, controller 108, sensor(s) 110, power source(s) 112, actuator(s) 114, and movable component(s) 116.”) comprising:
a robot having a predetermined portion (Col. 5 lines 13-17, where “Additionally, the robotic arm 200 includes joints 204A-204F each coupled to one or more of the actuator(s)
114. The actuators in joints 204A-204F may operate to cause movement of various movable component(s) 116 such as … end effector 208.”) movable by direct teaching performed by a user (Col. 14 lines 9-16, where “After initiation of teach mode, an external force may be applied by the user 210 that guides one or more movable component(s) 116 of the robotic system 100. For instance, as shown in FIG. 5B, the user 210 guides the end effector 208 along the straight line path 502. The robotic arm 200 may detect the movement by analyzing trajectory of the end effector 208, such as by determining position data (e.g., using position and/or proximity sensors) in the environment.”), (Note: The Examiner interprets the end effector 208 of Bingham as the predetermined portion.);
a first sensor configured to acquire first information related to a force acting on the predetermined portion of the robot; a second sensor configured to acquire second information related to a position of the predetermined portion of the robot (Col. 5 lines 25-27, where “In yet another example, the end effector may include sensors such as force sensors, location sensors, and/or proximity sensors.”); and
a controller (Col. 3 lines 58-61, where “The robotic system 100 is shown to include …, controller 108, …”),
wherein the controller is configured to acquire the first information and the second information when the user moves the predetermined portion by direct teaching (Col. 5 lines 29-57, where “In an example implementation, a robotic system 100, such as robotic arm 200, may be capable of operating in a teach mode as illustrated by FIG. 2B. In particular, teach mode may be an operating mode of the robotic arm 200 that allows a user 210 … to physically interact with and guide the robotic arm 200 towards carrying out and recording various movements. In a teaching mode, an external force is applied (e.g., by the user) to the robotic system 100 based on a teaching input that is intended to teach the robotic system regarding how to carry out a specific task. … Such data may relate to a plurality of configurations of the movable component(s) 116, joint position data, velocity data, acceleration data, torque data, force data, and power data, among other possibilities. For example, during teach mode the user 210 may grasp onto any part of the robotic arm 200 and provide an external force by physically moving the robotic arm 200. In particular, the user 210 may guide the robotic arm 200 towards grasping onto an object and then moving the object from a first location to a second location. As the user 210 guides the robotic arm 200 during teach mode, the system may obtain and record data related to the movement such that the robotic arm 200 may be configured to independently carry out the task at a future time during independent operation ….”),
wherein the controller is configured to acquire information relating to a contact force based on the first information, the contact force being generated in a case where the predetermined portion contacts a surrounding object, or in a case where an object held by the predetermined portion contacts the surrounding object (Col. 9 lines 9-32, where “To illustrate, refer to FIG. 3B again depicting the scenario where the user 210 seeks to instruct the robotic arm 200 as to how to wipe the surface. This may specifically involve guiding the robotic arm
200 to apply specific forces onto the surface. In some cases, the user 210 may guide the robotic arm 200 to apply forces onto the surface that exceed the torques (e.g., in one or more of the actuators) that the robotic arm 200 is actually capable of independently generating in order to apply these forces onto the surface. For example, user 210 is shown to move the robotic arm 200 such that a force 308 is applied onto the surface. … As such, the robotic arm
200 may detect, during teach mode, a taught movement involving application of force 308
that exceeds a corresponding torque limit of the actuator in joint 204D. Such detection may be carried out using sensor data from force sensors positioned at the end effector 208, among other possibilities.”),
wherein the controller is configured to acquire a trajectory of the predetermined portion based on the second information (Col. 5 lines 31-57, where “In particular, teach mode may be an operating mode of the robotic arm 200 that allows a user 210 … to physically interact with and guide the robotic arm 200 towards carrying out and recording various movements. … The robotic arm 200 may thus obtain data regarding how to carry out the specific task based on instructions and guidance from the user 210. Such data may relate to a plurality of configurations of the movable component(s) 116, joint position data, … among other possibilities. … In particular, the user 210 may guide the robotic arm 200 towards grasping onto an object and then moving the object from a first location to a second location. As the user 210 guides the robotic arm 200 during teach mode, the system may obtain and record data related to the movement such that the robotic arm 200 may be configured to independently carry out the task at a future time during independent operation ….”), (Col. 13 lines 61-67, where “As illustrated, the snap-to template may be a template having a corresponding straight line path 502 in the environment of the robotic system 100. In particular, this straight line path 502 is shown as arranged along the surface. In this manner, the robotic arm 200 can record movement (i.e., during teach mode) of the end effector 208
along a straight line path 502 for the purpose of wiping the surface.”).
Bingham is silent on wherein the controller is configured to automatically set a section in which the contact force is not generated as a section in which position control is performed on the robot by determining whether the contact force is present while the user moves the predetermined portion by direct teaching, and simplify the trajectory in the section in which the position control is performed on the robot by interpolating between a start point and an end point of the trajectory in the section in which the position control is performed on the robot, and wherein the controller is configured to generate position control data for the position control of the robot based on the simplified trajectory.
However, Fudaba teaches wherein the controller is configured to automatically set a section in which the contact force is not generated as a section in which position control is performed on the robot by determining whether the contact force is present while the user moves the predetermined portion by direct teaching ([0274] via “In the "teach mode", a person teaches the robot arm 102 a motion, and the robot arm 102 is moved in accordance with manipulation by the person. In the "replay mode", the robot arm 102 automatically replays a motion taught by a person in the "teach mode", and the robot arm 102 automatically moves without manipulation by a person.”), ([0304] via “FIG. 10A illustrates the time point where the person's hand 1001 grips the robot arm 102 and the person starts teaching the robot arm 102. In this time point, the flexible board 1002 gripped by the hand 701 of the robot arm 102 and the connector 1003 have not yet come into contact with each other. The person's hand 1001 thus receives no reactive force to be generated upon contact between the flexible board 1002 and the connector 1003.”), ([0305] via “Next, FIG. 10C illustrates the time point where the distal end of the flexible board 1002 is in contact with an inlet port of the connector 1003. After elapse of a reaction time from this time point, the person's hand 1001 receives a reactive force generated upon contact, and the person changes his or her behavior. … The force sensor 716 detects the magnitude of the reactive force during this inserting task.”), ([0317] via “FIG. 13 indicates a value of the force sensor 716 and positional information on the robot arm 102 during the replaying motion illustrated in FIGS. 12A to 12J. The positional information on the robot arm 102 corresponds to positional information that is acquired by the motion information acquiring section 106. A first solid line 313 in graph indicates the value detected by the force sensor 716. 
A second solid line 314 in graph indicates the position of the robot arm 102”), (Note: See Figures 10-13 of Fudaba as well.).
Further, Shimodaira teaches to simplify the trajectory in the section in which the position control is performed on the robot by interpolating between a start point and an end point of the trajectory in the section in which the position control is performed on the robot ([0065] via “In a graph G1, the force Fz is zero at the position where the z coordinate value is 0 (second teaching point TP2). When the first object OB1 advances in the −z direction and the two objects OB1 and OB2 are brought into contact, the force Fz increases as the first object OB1 moves in the −z direction, the force Fz temporarily decreases after reaching a peak value Fpk, and then the force Fz increases again.”), ([0079] via “FIG. 14 is an explanatory diagram showing an example of the movement of the end effector 160 from the first teaching point TP1 to the second teaching point TP2 according to teaching data. … Position control in the movement from the first teaching point TP1 to the second teaching point TP2 is performed by, for example, Continuous Path control (CP control). The CP control is a control method in which two points are continuously interpolated so that the movement path between two points of the end effector 160 follows a certain trajectory. In the example of
FIG. 14, the movement path between the first teaching point TP1 and the second teaching point TP2 is formed so as to follow a linear trajectory. In this way, it is possible to reduce the possibility that the first object OB1 and the end effector 160 physically interfere with other objects. After reaching the second teaching point TP2, the end effector 160 is moved in the −z direction, and the end effector 160 is moved until the force detected by the force detector 150 reaches the second force F2 (FIG. 7) so as to fit the objects OB1 and OB2 with each other.”), (Note: See Figures 4-6 and 14 of Shimodaira. Specifically, where teaching point TP2 of Shimodaira does not result in the generation of contact force between the gripped object OB1 and the surrounding object OB2.), and
wherein the controller is configured to generate position control data for the position control of the robot based on the simplified trajectory ([0078] via “In step S160, the teaching data 234 is generated using the input of the received teaching point, and stored in the nonvolatile memory 230 of the controller 200. The generation of the teaching data 234 is executed by the teaching data generation unit 214 of the controller 200. As is well known, the teaching data 234 includes a plurality of teaching points, and a description of control modes (position control and force control) to be executed upon movement between respective teaching points. … When the teaching data 234 is completed in this manner, in step S20 of
FIG. 9, in order to manufacture the actual product, the fitting work according to the teaching data is executed for a plurality of sets of objects OB1 and OB2.”), ([0079] via “FIG. 14 is an explanatory diagram showing an example of the movement of the end effector 160 from the first teaching point TP1 to the second teaching point TP2 according to teaching data. … Position control in the movement from the first teaching point TP1 to the second teaching point TP2 is performed by, for example, Continuous Path control (CP control). The CP control is a control method in which two points are continuously interpolated so that the movement path between two points of the end effector 160 follows a certain trajectory. In the example of FIG. 14, the movement path between the first teaching point TP1 and the second teaching point TP2 is formed so as to follow a linear trajectory. In this way, it is possible to reduce the possibility that the first object OB1 and the end effector 160 physically interfere with other objects. After reaching the second teaching point TP2, the end effector 160 is moved in the −z direction, and the end effector 160 is moved until the force detected by the force detector 150 reaches the second force F2 (FIG. 7) so as to fit the objects OB1 and OB2 with each other.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Fudaba wherein the controller is configured to automatically set a section in which the contact force is not generated as a section in which position control is performed on the robot by determining whether the contact force is present while the user moves the predetermined portion by direct teaching. Doing so allows the robot to autonomously perform specific tasks after initially learning how to perform the specific task from the direct teaching, as stated by Fudaba ([0323] via “As described above, the feedback rule generating section 111 generates a feedback rule on the basis of taught data, and the controller 114 replays in accordance with the feedback rule thus generated and motion information generated by the motion generating section 112. It is thus possible to replay a motion similar to the teaching motion even in a case where the environment is varied.”).
In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Shimodaira wherein the controller is configured to simplify the trajectory in the section in which the position control is performed on the robot by interpolating between a start point and an end point of the trajectory in the section in which the position control is performed on the robot, and wherein the controller is configured to generate position control data for the position control of the robot based on the simplified trajectory. Doing so reduces the chances that the predetermined portion comes into unwanted contact with external objects when executing the trajectory, as stated above by Shimodaira in [0079].
Regarding Claim 39, modified reference Bingham teaches a product manufacturing method manufacturing a product by using the robot system as set forth in claim 38 (Col. 17 lines 5-10, where “To illustrate, consider FIGS. 6A-6D depicting an example scenario for teaching the robotic arm 200 a task involving installation of bolts in a circular bolt-hole pattern 602. In this example scenario, the robotic arm 200 may receive information such as selection of a snap-to template corresponding to a circular path 604 in the physical space.”).
Regarding Claim 40, Bingham teaches a control method of a robot system comprising a robot having a predetermined portion (Col. 5 lines 13-17, where “Additionally, the robotic arm 200 includes joints 204A-204F each coupled to one or more of the actuator(s) 114. The actuators in joints 204A-204F may operate to cause movement of various movable component(s) 116 such as … end effector 208.”) movable by direct teaching performed by a user (Col. 14 lines 9-16, where “After initiation of teach mode, an external force may be applied by the user 210 that guides one or more movable component(s) 116 of the robotic system 100. For instance, as shown in FIG. 5B, the user 210 guides the end effector 208 along the straight line path 502. The robotic arm 200 may detect the movement by analyzing trajectory of the end effector 208, such as by determining position data (e.g., using position and/or proximity sensors) in the environment.”), (Note: The Examiner interprets the end effector 208 of Bingham as the predetermined portion.),
a first sensor configured to acquire first information related to a force acting on the predetermined portion of the robot, and a second sensor configured to acquire second information related to a position of the predetermined portion of the robot (Col. 5 lines 25-27, where “In yet another example, the end effector may include sensors such as force sensors, location sensors, and/or proximity sensors.”),
the control method comprising: acquiring the first information and the second information when the user moves the predetermined portion by direct teaching (Col. 5 lines 29-57, where “In an example implementation, a robotic system 100, such as robotic arm 200, may be capable of operating in a teach mode as illustrated by FIG. 2B. In particular, teach mode may be an operating mode of the robotic arm 200 that allows a user 210 … to physically interact with and guide the robotic arm 200 towards carrying out and recording various movements. In a teaching mode, an external force is applied (e.g., by the user) to the robotic system 100 based on a teaching input that is intended to teach the robotic system regarding how to carry out a specific task. … Such data may relate to a plurality of configurations of the movable component(s) 116, joint position data, velocity data, acceleration data, torque data, force data, and power data, among other possibilities. For example, during teach mode the user 210 may grasp onto any part of the robotic arm 200 and provide an external force by physically moving the robotic arm 200. In particular, the user 210 may guide the robotic arm 200 towards grasping onto an object and then moving the object from a first location to a second location. As the user 210 guides the robotic arm 200 during teach mode, the system may obtain and record data related to the movement such that the robotic arm 200 may be configured to independently carry out the task at a future time during independent operation ….”),
acquiring information relating to a contact force based on the first information, the contact force being generated in a case where the predetermined portion contacts a surrounding object, or in a case where an object held by the predetermined portion contacts the surrounding object (Col. 9 lines 9-32, where “To illustrate, refer to FIG. 3B again depicting the scenario where the user 210 seeks to instruct the robotic arm 200 as to how to wipe the surface. This may specifically involve guiding the robotic arm 200 to apply specific forces onto the surface. In some cases, the user 210 may guide the robotic arm 200 to apply forces onto the surface that exceed the torques (e.g., in one or more of the actuators) that the robotic arm 200 is actually capable of independently generating in order to apply these forces onto the surface. For example, user 210 is shown to move the robotic arm 200 such that a force 308 is applied onto the surface. … As such, the robotic arm 200 may detect, during teach mode, a taught movement involving application of force 308 that exceeds a corresponding torque limit of the actuator in joint 204D. Such detection may be carried out using sensor data from force sensors positioned at the end effector 208, among other possibilities.”), and
acquiring a trajectory of the predetermined portion based on the second information (Col. 5 lines 31-57, where “In particular, teach mode may be an operating mode of the robotic arm 200 that allows a user 210 … to physically interact with and guide the robotic arm 200
towards carrying out and recording various movements. … The robotic arm 200 may thus obtain data regarding how to carry out the specific task based on instructions and guidance from the user 210. Such data may relate to a plurality of configurations of the movable component(s) 116, joint position data, … among other possibilities. … In particular, the user
210 may guide the robotic arm 200 towards grasping onto an object and then moving the object from a first location to a second location. As the user 210 guides the robotic arm 200
during teach mode, the system may obtain and record data related to the movement such that the robotic arm 200 may be configured to independently carry out the task at a future time during independent operation ….”), (Col. 13 lines 61-67, where “As illustrated, the snap-to template may be a template having a corresponding straight line path 502 in the environment of the robotic system 100. In particular, this straight line path 502 is shown as arranged along the surface. In this manner, the robotic arm 200 can record movement (i.e., during teach mode) of the end effector 208 along a straight line path 502 for the purpose of wiping the surface.”).
Bingham is silent on automatically setting a section in which the contact force is not generated as a section in which position control is performed on the robot by determining whether the contact force is present while the user moves the predetermined portion by direct teaching, simplifying the trajectory in the section in which the position control is performed on the robot by interpolating between a start point and an end point of the trajectory in the section in which the position control is performed on the robot, and generating position control data for the position control of the robot based on the simplified trajectory.
However, Fudaba teaches setting a section automatically in which the contact force is not generated as a section in which position control is performed on the robot by determining whether the contact force is present while the user moves the predetermined portion by direct teaching ([0274] via “In the "teach mode", a person teaches the robot arm 102 a motion, and the robot arm 102 is moved in accordance with manipulation by the person. In the "replay mode", the robot arm 102 automatically replays a motion taught by a person in the "teach mode", and the robot arm 102 automatically moves without manipulation by a person.”), ([0304] via “FIG. 10A illustrates the time point where the person's hand 1001 grips the robot arm 102 and the person starts teaching the robot arm 102. In this time point, the flexible board 1002 gripped by the hand 701 of the robot arm 102 and the connector 1003 have not yet come into contact with each other. The person's hand 1001 thus receives no reactive force to be generated upon contact between the flexible board 1002 and the connector 1003.”), ([0305] via “Next, FIG. 10C illustrates the time point where the distal end of the flexible board 1002 is in contact with an inlet port of the connector 1003. After elapse of a reaction time from this time point, the person's hand 1001 receives a reactive force generated upon contact, and the person changes his or her behavior. … The force sensor 716 detects the magnitude of the reactive force during this inserting task.”), ([0317] via “FIG. 13 indicates a value of the force sensor 716 and positional information on the robot arm 102 during the replaying motion illustrated in FIGS. 12A to 12J. The positional information on the robot arm 102 corresponds to positional information that is acquired by the motion information acquiring section 106. A first solid line 313 in graph indicates the value detected by the force sensor 716. 
A second solid line 314 in graph indicates the position of the robot arm 102”), (Note: See Figures 10-13 of Fudaba as well.).
Further, Shimodaira teaches simplifying the trajectory in the section in which the position control is performed on the robot by interpolating between a start point and an end point of the trajectory in the section in which the position control is performed on the robot ([0065] via “In a graph G1, the force Fz is zero at the position where the z coordinate value is 0 (second teaching point TP2). When the first object OB1 advances in the −z direction and the two objects OB1 and OB2 are brought into contact, the force Fz increases as the first object OB1 moves in the −z direction, the force Fz temporarily decreases after reaching a peak value Fpk, and then the force Fz increases again.”), ([0079] via “FIG. 14 is an explanatory diagram showing an example of the movement of the end effector 160 from the first teaching point TP1 to the second teaching point TP2 according to teaching data. … Position control in the movement from the first teaching point TP1 to the second teaching point TP2 is performed by, for example, Continuous Path control (CP control). The CP control is a control method in which two points are continuously interpolated so that the movement path between two points of the end effector 160 follows a certain trajectory. In the example of
FIG. 14, the movement path between the first teaching point TP1 and the second teaching point TP2 is formed so as to follow a linear trajectory. In this way, it is possible to reduce the possibility that the first object OB1 and the end effector 160 physically interfere with other objects. After reaching the second teaching point TP2, the end effector 160 is moved in the −z direction, and the end effector 160 is moved until the force detected by the force detector 150 reaches the second force F2 (FIG. 7) so as to fit the objects OB1 and OB2 with each other.”), (Note: See Figures 4-6 and 14 of Shimodaira. Specifically, where teaching point TP2 of Shimodaira does not result in the generation of contact force between the gripped object OB1 and the surrounding object OB2.), and
generating position control data for the position control of the robot based on the simplified trajectory ([0078] via “In step S160, the teaching data 234 is generated using the input of the received teaching point, and stored in the nonvolatile memory 230 of the controller 200. The generation of the teaching data 234 is executed by the teaching data generation unit 214 of the controller 200. As is well known, the teaching data 234 includes a plurality of teaching points, and a description of control modes (position control and force control) to be executed upon movement between respective teaching points. … When the teaching data 234 is completed in this manner, in step S20 of FIG. 9, in order to manufacture the actual product, the fitting work according to the teaching data is executed for a plurality of sets of objects OB1 and OB2.”), ([0079] via “FIG. 14 is an explanatory diagram showing an example of the movement of the end effector 160 from the first teaching point TP1 to the second teaching point TP2 according to teaching data. … Position control in the movement from the first teaching point TP1 to the second teaching point TP2 is performed by, for example, Continuous Path control (CP control). The CP control is a control method in which two points are continuously interpolated so that the movement path between two points of the end effector 160 follows a certain trajectory. In the example of FIG. 14, the movement path between the first teaching point TP1 and the second teaching point TP2 is formed so as to follow a linear trajectory. In this way, it is possible to reduce the possibility that the first object OB1 and the end effector 160 physically interfere with other objects. After reaching the second teaching point TP2, the end effector 160 is moved in the −z direction, and the end effector 160 is moved until the force detected by the force detector 150 reaches the second force F2 (FIG. 7) so as to fit the objects OB1 and OB2 with each other.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Fudaba wherein the control method comprises: automatically setting a section in which the contact force is not generated as a section in which position control is performed on the robot by determining whether the contact force is present while the user moves the predetermined portion by direct teaching. Doing so allows the robot to autonomously perform specific tasks after initially learning how to perform the specific task from the direct teaching, as stated by Fudaba ([0323] via “As described above, the feedback rule generating section 111 generates a feedback rule on the basis of taught data, and the controller 114 replays in accordance with the feedback rule thus generated and motion information generated by the motion generating section 112. It is thus possible to replay a motion similar to the teaching motion even in a case where the environment is varied.”).
In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Shimodaira wherein the control method comprises: simplifying the trajectory in the section in which the position control is performed on the robot by interpolating between a start point and an end point of the trajectory in the section in which the position control is performed on the robot, and generating position control data for the position control of the robot based on the simplified trajectory. Doing so reduces the chances that the predetermined portion comes into unwanted contact with external objects when executing the trajectory, as stated above by Shimodaira in [0079].
Regarding Claim 41, modified reference Bingham teaches a non-transitory computer readable medium storing a program causing a computer to execute the control method as set forth in claim 40 (Col. 1 lines 45-50, where “In another aspect, a non-transitory computer readable medium is provided. The non-transitory computer readable medium has stored therein instructions executable by one or more processors to cause a robotic system to perform functions, the robotic system including a plurality of components.”).
10. Claim 25 is rejected under 35 U.S.C. 103 as being unpatentable over Bingham et al. (US 9592608 B1 hereinafter Bingham) in view of Fudaba et al. (US 20140172143 A1 hereinafter Fudaba) and Shimodaira (US 20180319013 A1 hereinafter Shimodaira), and further in view of Yokoi (US 20160263747 A1 hereinafter Yokoi).
Regarding Claim 25, modified reference Bingham teaches the controller according to claim 21, but is silent on wherein the processing part is configured to perform linear interpolation or joint interpolation as a predetermined interpolation method.
However, Yokoi teaches wherein the processing part is configured to perform linear interpolation or joint interpolation as a predetermined interpolation method ([0070] via “The trajectory calculating unit 331 generates a path of the robot arm 201, which connects a plurality of set teaching points, according to a predetermined interpolation method (for instance, linear interpolation, circular interpolation, joint interpolation or the like). Then, the trajectory calculating unit 331 generates a trajectory of the robot arm 201 from the generated path of the robot arm 201.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Yokoi wherein the processing part is configured to perform linear interpolation or joint interpolation as a predetermined interpolation method. In KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 415-421, 82 USPQ2d 1385, 1395-97 (2007), the Court identified a number of rationales that support a conclusion of obviousness. The rationale that pertains to the present invention is rationale B: Simple Substitution of One Known Element for Another to Obtain Predictable Results. Specifically, in this case item 3 of rationale B is satisfied: a finding that one of ordinary skill in the art could have substituted one known element for another, and the results of the substitution would have been predictable. Linear and joint interpolation are common and well-known methods of interpolating between data points. Although Shimodaira teaches interpolating between two points without specifying that the interpolation is linear or joint interpolation, the invention would still produce the same outcomes, and therefore the simple substitution of linear interpolation or joint interpolation as the interpolation method would have been obvious to implement.
11. Claims 30, 31, 32, and 34 are rejected under 35 U.S.C. 103 as being unpatentable over Bingham et al. (US 9592608 B1 hereinafter Bingham) in view of Fudaba et al. (US 20140172143 A1 hereinafter Fudaba) and Shimodaira (US 20180319013 A1 hereinafter Shimodaira), and further in view of Sato et al. (US 20170285625 A1 hereinafter Sato).
Regarding Claim 30, modified reference Bingham teaches the controller according to claim 29, but is silent on wherein the robot system includes a handling portion handled by the user, and wherein the processing part is configured to move the predetermined portion by the user handling the handling portion.
However, Sato teaches wherein the robot system includes a handling portion handled by the user ([0025] via “The robot 12 is a vertical articulated robot, and includes a robot base
18, a revolving drum 20, a robot arm 22, a robot hand 24, and a handling part 38.”), ([0028] via “The handling part 38 is a handle having a shape easy to grip for the operator A, and fixed to the adapter 34.”), and
wherein the processing part is configured to move the predetermined portion by the user handling the handling portion ([0056] via “Then, the robot controller 14 sends the generated speed command to each servo motor 32 so as to move the robot hand 24 in the direction of the handling force HF applied by the operator A. Consequently, the robot 12
operates in accordance with the handling force HF applied to the handling part 38 by the operator A.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Sato wherein the robot system includes a handling portion handled by the user, and wherein the processing part is configured to move the predetermined portion by the user handling the handling portion. Doing so controls the robot in accordance with how the user manipulates the handling portion, as stated by Sato ([0057] via “Thus, in this embodiment, the robot controller functions as an operation controller 50 (FIG. 2) configured to operate the robot 12 in accordance with the handling force HF.”).
Regarding Claim 31, modified reference Bingham teaches the controller according to claim 30, but is silent on wherein the robot system includes a third sensor configured to acquire third information related to a force acting on the handling portion, wherein the processing part is configured to acquire third information when the user moves the predetermined portion, and wherein the processing part is configured to acquire information related to a handling force given to the handling portion by handling of the user when the predetermined portion is moved by the user based on the first information and the third information.
However, Sato teaches wherein the robot system includes a third sensor configured to acquire third information related to a force acting on the handling portion ([0040] via “The second force sensor 42 is interposed between the handling part 38 and the adapter 34. The second force sensor 42 is composed of a 6-axis force sensor, and transmits to the robot controller 14 an output signal corresponding to a strain generated at the second force sensor
42.”),
wherein the processing part is configured to acquire third information when the user moves the predetermined portion ([0044] via “The robot controller 14 respectively calculates forces in the x-axis direction, y-axis direction, and z-axis direction of the second sensor-coordinate system shown in FIG. 4, and moments about the x-axis direction, y-axis direction, z-axis direction of the second sensor-coordinate system, on the basis of the output signal from the second force sensor 42. In this manner, the robot controller 14 calculates the handling force HF applied to the handling part 38.”), and
wherein the processing part is configured to acquire information related to a handling force given to the handling portion by handling of the user when the predetermined portion is moved by the user based on the first information and the third information ([0058] via “At step S5, the robot controller 14 calculates the contact force CF applied to the robot 12. First, the robot controller 14 calculates a force HF′ acting on the first force sensor 40 due to the handling force HF, from the most-recently detected handling force HF.”), ([0060] via “Then, the robot controller 14 subtracts the thus-calculated force HF′ from the most-recently detected external force EF. As a result, the component of the handling force HF is eliminated from the external force EF detected by the first force sensor 40, thereby it is possible to calculate the contact force CF applied from an external object to the robot 12 when a portion of the robot 12 contacts the object.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Sato wherein the robot system includes a third sensor configured to acquire third information related to a force acting on the handling portion, wherein the processing part is configured to acquire third information when the user moves the predetermined portion, and wherein the processing part is configured to acquire information related to a handling force given to the handling portion by handling of the user when the predetermined portion is moved by the user based on the first information and the third information. Doing so allows the controller to calculate the contact force applied to an external object when the handling force is also applied to the robot, as stated above by Sato in paragraph [0060].
Regarding Claim 32, modified reference Bingham teaches the controller according to claim 31, but is silent on wherein the processing part is configured to acquire force control data to perform the force control of the robot in the second section based on the information related to the handling force.
However, Sato teaches wherein the processing part is configured to acquire force control data to perform the force control of the robot in the second section based on the information related to the handling force ([0058] via “At step S5, the robot controller 14
calculates the contact force CF applied to the robot 12. First, the robot controller 14
calculates a force HF′ acting on the first force sensor 40 due to the handling force HF, from the most-recently detected handling force HF.”), ([0060] via “Then, the robot controller 14
subtracts the thus-calculated force HF′ from the most-recently detected external force EF. As a result, the component of the handling force HF is eliminated from the external force EF detected by the first force sensor 40, thereby it is possible to calculate the contact force CF applied from an external object to the robot 12 when a portion of the robot 12 contacts the object.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Sato wherein the processing part is configured to acquire force control data to perform the force control of the robot in the second section based on the information related to the handling force. Doing so allows the controller to calculate the contact force applied to an external object when the handling force is also applied to the robot, as stated above by Sato in paragraph [0060].
Regarding Claim 34, modified reference Bingham teaches the controller according to claim 31, but is silent on wherein the processing part is configured to acquire force control data to perform the force control of the robot in the second section based on the information related to the handling force, and wherein the processing part is configured to smooth the information related to the handling force and acquire the force control data based on the information related to a handling force smoothed.
However, Sato teaches wherein the processing part is configured to acquire force control data to perform the force control of the robot in the second section based on the information related to the handling force ([0058] via “At step S5, the robot controller 14
calculates the contact force CF applied to the robot 12. First, the robot controller 14
calculates a force HF′ acting on the first force sensor 40 due to the handling force HF, from the most-recently detected handling force HF.”), ([0060] via “Then, the robot controller 14
subtracts the thus-calculated force HF′ from the most-recently detected external force EF. As a result, the component of the handling force HF is eliminated from the external force EF detected by the first force sensor 40, thereby it is possible to calculate the contact force CF applied from an external object to the robot 12 when a portion of the robot 12 contacts the object.”), and
wherein the processing part is configured to smooth the information related to the handling force and acquire the force control data based on the information related to a handling force smoothed ([0041] via “The robot controller 14 filters the output signal from the second force sensor 42 by using a means, such as low-pass filtering, arithmetic averaging, weighted averaging, FIR filtering, or IIR filtering, so as to remove a noise component from the output signal.”), ([0058] via “At step S5, the robot controller 14 calculates the contact force CF applied to the robot 12. First, the robot controller 14 calculates a force HF′ acting on the first force sensor 40 due to the handling force HF, from the most-recently detected handling force HF.”), ([0060] via “Then, the robot controller 14 subtracts the thus-calculated force HF′ from the most-recently detected external force EF. As a result, the component of the handling force HF is eliminated from the external force EF detected by the first force sensor 40, thereby it is possible to calculate the contact force CF applied from an external object to the robot 12 when a portion of the robot 12 contacts the object.”), (Note: The Examiner interprets averaging the force as the smoothing process, as this process is described in paragraphs [0078] and [0094] of the specification of the instant application.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Sato wherein the processing part is configured to acquire force control data to perform the force control of the robot in the second section based on the information related to the handling force, and wherein the processing part is configured to smooth the information related to the handling force and acquire the force control data based on the information related to a handling force smoothed. Doing so filters the force data to then be incorporated to calculate the contact force applied to an external object when the handling force is also applied to the robot, as stated above by Sato in paragraphs [0041] and [0060].
12. Claim(s) 33 is/are rejected under 35 U.S.C. 103 as being unpatentable over Bingham et al. (US 9592608 B1 hereinafter Bingham) in view of Fudaba et al. (US 20140172143 A1 hereinafter Fudaba) and Shimodaira (US 20180319013 A1 hereinafter Shimodaira), and further in view of Sato et al. (US 20170285625 A1 hereinafter Sato) and Ishii (US 20180200881 A1 hereinafter Ishii).
Regarding Claim 33, modified reference Bingham teaches the controller according to claim 29, but is silent on wherein the robot system includes a handling portion handled by the user, wherein the robot system includes a third sensor configured to acquire third information related to a force acting on the handling portion, wherein the processing part is configured to acquire third information when the user moves the predetermined portion, wherein the processing part is configured to acquire information related to a handling force given to the handling portion by handling of the user when the predetermined portion is moved by the user based on the first information and the third information, and wherein in a case where although the information related to the handling force continuously fluctuates but the contact force does not change or a number of changes is a predetermined number or less in a section, the processing part is configured to determine the section as a section of an exploring operation.
However, Sato teaches wherein the robot system includes a handling portion handled by the user ([0025] via “The robot 12 is a vertical articulated robot, and includes a robot base
18, a revolving drum 20, a robot arm 22, a robot hand 24, and a handling part 38.”), ([0028] via “The handling part 38 is a handle having a shape easy to grip for the operator A, and fixed to the adapter 34.”),
wherein the robot system includes a third sensor configured to acquire third information related to a force acting on the handling portion ([0040] via “The second force sensor 42 is interposed between the handling part 38 and the adapter 34. The second force sensor 42 is composed of a 6-axis force sensor, and transmits to the robot controller 14 an output signal corresponding to a strain generated at the second force sensor 42.”),
wherein the processing part is configured to acquire third information when the user moves the predetermined portion ([0044] via “The robot controller 14 respectively calculates forces in the x-axis direction, y-axis direction, and z-axis direction of the second sensor-coordinate system shown in FIG. 4, and moments about the x-axis direction, y-axis direction, z-axis direction of the second sensor-coordinate system, on the basis of the output signal from the second force sensor 42. In this manner, the robot controller 14 calculates the handling force HF applied to the handling part 38.”),
wherein the processing part is configured to acquire information related to a handling force given to the handling portion by handling of the user when the predetermined portion is moved by the user based on the first information and the third information ([0058] via “At step S5, the robot controller 14 calculates the contact force CF applied to the robot 12. First, the robot controller 14 calculates a force HF′ acting on the first force sensor 40 due to the handling force HF, from the most-recently detected handling force HF.”), ([0060] via “Then, the robot controller 14 subtracts the thus-calculated force HF′ from the most-recently detected external force EF. As a result, the component of the handling force HF is eliminated from the external force EF detected by the first force sensor 40, thereby it is possible to calculate the contact force CF applied from an external object to the robot 12 when a portion of the robot 12 contacts the object.”).
Further, Ishii teaches wherein in a case where although the information related to the handling force continuously fluctuates but the contact force does not change or a number of changes is a predetermined number or less in a section, the processing part is configured to determine the section as a section of an exploring operation ([0024] via “The control device 20
further includes a safety assurance operation command section 21 that compares the external force detected by the first force detection section S1 with a predetermined threshold value, and in a case where the external force exceeds a predetermined threshold value A, commands a safety assurance operation of causing the robot 10 to move in a direction that reduces the external force or causes the robot 10 to stop. The predetermined threshold value A is a value determined in advance through experiments, etc., and is stored in the storage section 22.”), (Note: See Figure 2 of Ishii, wherein the task differs depending on whether the handling force meets a predetermined threshold. While not explicitly stated, the Examiner submits that, whether or not the task is an exploring operation, the claimed functionality would remain the same.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Sato wherein the robot system includes a handling portion handled by the user, wherein the robot system includes a third sensor configured to acquire third information related to a force acting on the handling portion, wherein the processing part is configured to acquire third information when the user moves the predetermined portion, wherein the processing part is configured to acquire information related to a handling force given to the handling portion by handling of the user when the predetermined portion is moved by the user based on the first information and the third information. Doing so allows the controller to calculate the contact force applied from an external object to the robot when the handling force is also applied to the robot, as stated above by Sato in paragraph [0060].
In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Ishii wherein in a case where although the information related to the handling force continuously fluctuates but the contact force does not change or a number of changes is a predetermined number or less in a section, the processing part is configured to determine the section as a section of an exploring operation. Doing so controls the robot to perform different tasks according to the applied force, as stated above by Ishii.
13. Claim(s) 35 is/are rejected under 35 U.S.C. 103 as being unpatentable over Bingham et al. (US 9592608 B1 hereinafter Bingham) in view of Fudaba et al. (US 20140172143 A1 hereinafter Fudaba) and Shimodaira (US 20180319013 A1 hereinafter Shimodaira), further in view of Sato et al. (US 20170285625 A1 hereinafter Sato), and further in view of Maeda et al. (JP 2015085492 A hereinafter Maeda).
Regarding Claim 35, modified reference Bingham teaches the controller according to claim 31, but is silent on wherein the processing part is configured to acquire force control data to perform the force control of the robot in the second section based on the information related to the handling force, and wherein the processing part is configured to set one section in which the processing part acquires the information related to the handling force as the force control data by correcting based on change of the information related to the handling force, and another section in which the processing part acquires the information related to the handling force as the force control data without the correction.
However, Sato teaches wherein the processing part is configured to acquire force control data to perform the force control of the robot in the second section based on the information related to the handling force ([0058] via “At step S5, the robot controller 14
calculates the contact force CF applied to the robot 12. First, the robot controller 14 calculates a force HF′ acting on the first force sensor 40 due to the handling force HF, from the most-recently detected handling force HF.”), ([0060] via “Then, the robot controller 14
subtracts the thus-calculated force HF′ from the most-recently detected external force EF. As a result, the component of the handling force HF is eliminated from the external force EF detected by the first force sensor 40, thereby it is possible to calculate the contact force CF applied from an external object to the robot 12 when a portion of the robot 12 contacts the object.”).
Further, Maeda teaches wherein the processing part is configured to set one section in which the processing part acquires the information related to the handling force as the force control data by correcting based on change of the information related to the handling force, and another section in which the processing part acquires the information related to the handling force as the force control data without the correction (Page 3 paragraph 3 via “Specifically, the robot 50 according to the present embodiment includes the arm 20 including the end effector 30 and the force sensor 10. Then, after any of the gripped objects, the end effector 30 and the arm 20 that are gripped by the end effector 30 comes into contact with the object (for example, corresponding to FIG. 7B), the arm 20 moves and stops (for example, corresponding to FIG. 7C), the position and orientation of the arm 20 are returned to the position and orientation of the arm at the time of contact with the object (for example, corresponding to FIG. 7D).”), (Page 9 paragraph 3 via “FIG. 10 shows sensor information S1 before the smoothing process and sensor information S2 after the smoothing process. As can be seen from S1 in FIG. 10, unless smoothing processing is performed, the value of sensor information varies greatly at high frequencies, and stop determination processing that performs comparison processing with a threshold value cannot be performed with high accuracy. On the other hand, by performing the smoothing process, as shown in S2, the sensor information is suppressed from fluctuations at high frequencies, and fluctuations at relatively low frequencies can be easily identified.
In general, a force generated by contact or the like contributes to a fluctuation at a low frequency, and therefore, it is possible to perform a determination using a force sensor value with high accuracy by performing a smoothing process.”), (Note: The Examiner interprets the smoothing process of Maeda as the correcting recited in the claim. Also see Figure 10 of Maeda (reproduced below).).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Sato wherein the processing part is configured to acquire force control data to perform the force control of the robot in the second section based on the information related to the handling force. Doing so allows the controller to calculate the contact force applied from an external object to the robot when the handling force is also applied to the robot, as stated above by Sato in paragraph [0060].
In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Maeda wherein the processing part is configured to set one section in which the processing part acquires the information related to the handling force as the force control data by correcting based on change of the information related to the handling force, and another section in which the processing part acquires the information related to the handling force as the force control data without the correction. Doing so incorporates a highly accurate force sensing value when performing robot control, as stated above by Maeda on page 9 paragraph 3.
Figure 10 of Maeda
14. Claim(s) 36 is/are rejected under 35 U.S.C. 103 as being unpatentable over Bingham et al. (US 9592608 B1 hereinafter Bingham) in view of Fudaba et al. (US 20140172143 A1 hereinafter Fudaba) and Shimodaira (US 20180319013 A1 hereinafter Shimodaira), further in view of Sato et al. (US 20170285625 A1 hereinafter Sato), and further in view of Uchida (JP H10249767 A hereinafter Uchida).
Regarding Claim 36, modified reference Bingham teaches the controller according to claim 31, but is silent on wherein the processing part is configured to acquire force control data to perform the force control of the robot in the second section based on the information related to the handling force, and wherein the processing part is configured to determine whether a retry operation is made in teaching the robot based on the first information, the second information, and the third information, and in a case where a determination is made such that the retry operation has been made, the processing part is configured to acquire the force control data so as not to regenerate failed operations.
However, Sato teaches wherein the processing part is configured to acquire force control data to perform the force control of the robot in the second section based on the information related to the handling force ([0058] via “At step S5, the robot controller 14
calculates the contact force CF applied to the robot 12. First, the robot controller 14
calculates a force HF′ acting on the first force sensor 40 due to the handling force HF, from the most-recently detected handling force HF.”), ([0060] via “Then, the robot controller 14
subtracts the thus-calculated force HF′ from the most-recently detected external force EF. As a result, the component of the handling force HF is eliminated from the external force EF detected by the first force sensor 40, thereby it is possible to calculate the contact force CF applied from an external object to the robot 12 when a portion of the robot 12 contacts the object.”), and
wherein the processing part is configured to determine whether a retry operation is made in teaching the robot based on the third information, and in a case where a determination is made such that the retry operation has been made, the processing part is configured to acquire the force control data so as not to regenerate failed operations ([0075] via “When the robot controller 14 determines that the direction of the handling force HF coincides with the allowable motion direction (i.e., determines YES), the robot controller 14
proceeds to step S4 in FIG. 5, and carries out the hand-guide operation again in accordance with the handling force HF. On the other hand, when the robot controller 14 determines that the direction of the handling force HF does not coincide with the allowable motion direction (i.e., determines NO), the robot controller 14 proceeds to step S16.”), ([0078] via “When the robot controller 14 determines that the switch 44 is turned off (i.e., determines YES), the robot controller 14 ends step S7 shown in FIG. 6, and thereby ends the flow shown in FIG. 5. On the other hand, when the robot controller 14 determines that the switch 44 is turned on (i.e., determines NO), it returns to step S14.”), (Note: The Examiner interprets determining whether the handling force HF of Sato is within or outside an allowable motion direction as determining whether a retry operation is made, and the continuation of the operation (YES at step S15) would not regenerate failed operations. See Figures 5 and 6 of Sato as well.).
Further, Uchida teaches wherein the processing part is configured to determine whether a retry operation is made in teaching the robot based on the first information and the second information, and in a case where a determination is made such that the retry operation has been made, the processing part is configured to acquire the force control data so as not to regenerate failed operations (Page 4 paragraph 2 via “From this process, the robot hand 1 is controlled by two control systems. One is a position control system that aims at the deformation rate of the object to be grasped. The other is a force control system in which a target reaction force received by the target gripping structure 10 of the robot hand 1 from the target to be gripped is set as a control target value. The control computer 4 controls the robot hand 1 while feeding back the outputs of the position sensor 8 and the force sensor 9 until one of these two control systems is settled at the control target value.”), (Page 4 paragraph 5 via “The robot hand 1 is moved to the relative posture, and attempts to grasp the object again. This process is repeated until the gripping is finally achieved while correcting the control target value and the control system parameters as described above.”), (Note: The Examiner interprets repeating the process until the operation succeeds as not regenerating failed operations.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Sato wherein the processing part is configured to acquire force control data to perform the force control of the robot in the second section based on the information related to the handling force, and wherein the processing part is configured to determine whether a retry operation is made in teaching the robot based on the third information, and in a case where a determination is made such that the retry operation has been made, the processing part is configured to acquire the force control data so as not to regenerate failed operations. Doing so prevents the robot from being operated in a way that is outside an allowable specification range, as stated above by Sato in paragraph [0075].
In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Uchida wherein the processing part is configured to determine whether a retry operation is made in teaching the robot based on the first information and the second information, and in a case where a determination is made such that the retry operation has been made, the processing part is configured to acquire the force control data so as not to regenerate failed operations. Doing so iterates the force and position control parameters to achieve a successful grasp of an object by the robot, as stated above by Uchida on page 4 paragraph 5.
15. Claim(s) 37 is/are rejected under 35 U.S.C. 103 as being unpatentable over Bingham et al. (US 9592608 B1 hereinafter Bingham) in view of Fudaba et al. (US 20140172143 A1 hereinafter Fudaba) and Shimodaira (US 20180319013 A1 hereinafter Shimodaira), and further in view of Yasuda et al. (JP 2008134903 A hereinafter Yasuda).
Regarding Claim 37, modified reference Bingham teaches the controller according to claim 27, wherein the processing part is configured to set a second section in which to perform a force control in the information related to the trajectory based on the information related to the contact force (Col. 5 lines 36-45, where “In a teaching mode, an external force is applied (e.g., by the user) to the robotic system 100 based on a teaching input that is intended to teach the robotic system regarding how to carry out a specific task. The robotic arm 200 may thus obtain data regarding how to carry out the specific task based on instructions and guidance from the user 210. Such data may relate to a plurality of configurations of the movable component(s) 116, joint position data, velocity data, acceleration data, torque data, force data, and power data, among other possibilities.”), (Col. 13 lines 22-36, where “In some cases, the other received information may include selection of the movable component(s) 116 that should be associated with the selected template. For example, user-input may be received selecting the end effector as the component associated with the path in the environment. Upon such selection, the robotic system 100 may be configured to evaluate position (e.g., coordinates) of the end effector in the environment with the position (e.g., coordinates) of the path in the environment. As shown by block 404, method 400 next involves initiating, by the robotic system, a recording process (e.g., operation in teach mode) for storing data related to motion of the at least one component in the environment. Note that, in some cases, receiving information related to the path may take place during the recording process.”).
Bingham is silent on wherein the controller includes an output portion configured to output display information corresponding to a first section in which the position control is performed and the second section in which the force control is performed on a display unit, and wherein the output portion is configured to display the display information as time series information such that the first section and the second section are discernible on the display unit.
However, Yasuda teaches wherein the controller includes an output portion configured to output display information corresponding to a first section in which the position control is performed and the second section in which the force control is performed on a display unit, and wherein the output portion is configured to display the display information as time series information such that the first section and the second section are discernible on the display unit (Page 6 paragraph 4 via “In the present invention, when teaching from the approach point (P1) to the insertion completion point (P2) in the peg insertion operation of FIG. 2, the time-series data storage means 6 of FIG. . In FIG. 2, 101 is a tip of the robot arm, 102 is a force torque sensor attached to the end, 103 is a gripper, 201 is a component 1 (peg) gripped by the gripper 101, and 202 is a component 1 inserted. Part 2. An example of the time series data is shown in FIG. FIG. 3A is an example of time-series data of the position in the insertion direction among the position and orientation of the robot tip. FIG. 3B is an example of time-series data of the force in the insertion direction detected by the force torque sensor 102.”), (Note: See figures 3A and 3B (reproduced below) where the first and second sections are discernible on the display as separate graphs.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Yasuda wherein the controller includes an output portion configured to output display information corresponding to a first section in which the position control is performed and the second section in which the force control is performed on a display unit, and wherein the output portion is configured to display the display information as time series information such that the first section and the second section are discernible on the display unit. Doing so allows for simultaneous position and force teaching of the robot, while still allowing both to be taught independently, as stated by Yasuda (Page 6 paragraph 5 via “In this way, teaching of position and orientation and teaching of force torque can be performed at the same time, and by converting time-series data into force torque data associated with the position and orientation, the travel time and initial position associated with the work It can be used as force teaching data independent of posture. Further, by removing unnecessary points and extracting patterns, the time series data when the robot is not moving becomes only one force torque data corresponding to the position, so that the data amount can be made very small.”).
Figures 3A and 3B of Yasuda
16. Claim(s) 46 is/are rejected under 35 U.S.C. 103 as being unpatentable over Bingham et al. (US 9592608 B1 hereinafter Bingham) in view of Fudaba et al. (US 20140172143 A1 hereinafter Fudaba) and Shimodaira (US 20180319013 A1 hereinafter Shimodaira), and further in view of Brogardh (US 20060181236 A1 hereinafter Brogardh).
Regarding Claim 46, modified reference Bingham teaches the controller according to claim 21, but is silent on wherein the processing part is configured to automatically set a section in which the contact force is not generated as the section for simplifying a trajectory of the predetermined portion based on the contact force.
However, Brogardh teaches wherein the processing part is configured to automatically set a section in which the contact force is not generated as the section for simplifying a trajectory of the predetermined portion based on the contact force ([0112] via “During the measuring, the positions of the measuring points are determined relative to the base coordinate system of the robot. The surface scanning program does not only generate the measuring points, but also generates the path to be followed by the robot between the measuring points. The shape of this path depends on which type of sensor is used. … If a sensor for contactless measuring is used instead, the programmed movement can be performed parallel to the surface of the object during the measuring. FIG. 17 shows an example of an automatically generated robot path for measuring points on the surface of the object with a sensor 230 for contactless measuring. The sensor 230 is a distance measuring laser probe of a triangulation type.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Brogardh wherein the processing part is configured to automatically set a section in which the contact force is not generated as the section for simplifying a trajectory of the predetermined portion based on the contact force. Doing so improves the speed at which the trajectory is set relative between the predetermined portion and a surrounding object, as stated by Brogardh ([0111] via “A disadvantage with the use of search stop is that the measuring method will become slow, due to the fact that the robot has to keep a low velocity when it approaches the surface of the object. An alternative is instead to use a sensor measuring the distance between a part of the robot, preferably the tool holder of the robot, and the surface of the object. A suitable sensor is for example a LVDT sensor. During measuring with a LVDT sensor, contact between the surface of the object and the sensor is required during the measuring. It is also possible to use a sensor adapted for contactless measuring, for example sensors based on laser, ultra sonic, eddy current, induction, micro wave, air flow and capacitance measuring.”).
Examiner’s Note
17. The Examiner has cited particular paragraphs or columns and line numbers in the
references applied to the claims above for the convenience of the Applicant. Although the
specified citations are representative of the teachings of the art and are applied to specific
limitations within the individual claim, other passages and figures may apply as well. It is
respectfully requested of the Applicant in preparing responses, to fully consider the references
in their entirety as potentially teaching all or part of the claimed invention, as well as the
context of the passage as taught by the prior art or disclosed by the Examiner. See MPEP
2141.02 [R-07.2015] VI. A prior art reference must be considered in its entirety, i.e., as a whole,
including portions that would lead away from the claimed invention. W.L. Gore & Associates,
Inc. v. Garlock, Inc., 721 F.2d 1540, 220 USPQ 303 (Fed. Cir. 1983), cert. denied, 469 U.S. 851 (1984). See also MPEP §2123.
Conclusion
18. Any inquiry concerning this communication or earlier communications from the
examiner should be directed to BYRON X KASPER whose telephone number is (571)272-3895.
The examiner can normally be reached Monday - Friday 8 am - 5 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing
using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is
encouraged to use the USPTO Automated Interview Request (AIR) at
http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s
supervisor, Adam Mott can be reached on (571) 270-5376. The fax phone number for the
organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be
obtained from Patent Center. Unpublished application information in Patent Center is available
to registered users. To file and manage patent submissions in Patent Center, visit:
https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for
more information about Patent Center and https://www.uspto.gov/patents/docx for
information about filing in DOCX format. For additional questions, contact the Electronic
Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO
Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BYRON XAVIER KASPER/Examiner, Art Unit 3657
/ADAM R MOTT/Supervisory Patent Examiner, Art Unit 3657