DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 19 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Claim 19 recites “The method of Claim 19.” A dependent claim cannot depend from itself; it is therefore unclear which claim is the intended parent claim. For examination purposes, claim 19 is interpreted as depending from claim 15.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-2, 8-10, 12, 15-16, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Natarajan (US 20190275671 A1) in view of Sato (IDS: US 20080312769 A1).
Regarding Claim 1,
Natarajan teaches
A robotic system, comprising: a communication interface; and a processor coupled to the communication interface and configured to: (“FIG. 3 is a block diagram of the example robot 102 of FIGS. 1 and 2 constructed in accordance with teachings of this disclosure. … an example communication bus 354.” See at least [0042-0043]; “FIGS. 1-3 and 5-8 could be implemented by one or more analog or digital circuit(s), logic circuit(s), programmable processor(s), programmable controller(s), graphics processing unit(s) (GPU(s)), digital signal processor(s) (DSP(s)), application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)) and/or field programmable logic device(s) (FPLD(s)).” See at least [0097])
receive a manifest or other data indicating a high-level objective to move a plurality of items from a source location to a destination location; (“The robot 102 of FIG. 1 is configured to obtain an image (e.g., image data) of an assembly of objects. … The robot 102 of FIG. 1 decomposes and/or deconstructs the obtained image into a plurality of constituent objects having object location goals (e.g., target object locations) and associated assembly goals (e.g., target object assembly parameters such as on a shelf, under a shelf, on another object, under another object, in front of another object, behind another object, etc.).” See at least [0038-0039])
utilize the manifest or other data to generate a plan to pick and place the plurality of items from the source location to the destination location in a particular order and manner; (“Based on the object location goals and the associated assembly goals, the robot 102 of FIG. 1 determines an object placement sequence to be implemented, invoked and/or executed by and/or at the robot 102 to sequentially place the constituent objects within a space.” See at least [0040]; “The robot 102 of FIGS. 1 and 2 constructs the complex assembly 200 of FIG. 2 by moving respective ones of physical objects (e.g., respective ones onto the milk jugs 202, the juice bottles 204, the soda bottles 206, and the water bottles 208) into and/or onto the shelving unit 124 and/or shelving 106 based on a object placement sequence and a plurality of action primitive sequences, as described above. In the illustrated example of FIG. 2, the robot 102 is in the process of moving certain ones of the juice bottles 204 onto the second shelf 116 of the shelving unit 124 and/or the shelving 106 in accordance with the object placement sequence and one or more action primitive sequence(s) implemented, invoked and/or executed by the robot 102.” See at least [0041]; Also see at least figs. 13-14)
move a first item of the plurality of items to a first location at the destination location as indicated by the manifest or other data using a robotic arm having an end effector; (“In the illustrated example of FIG. 2, the robot 102 is in the process of moving certain ones of the juice bottles 204 onto the second shelf 116 of the shelving unit 124 and/or the shelving 106 in accordance with the object placement sequence and one or more action primitive sequence(s) implemented, invoked and/or executed by the robot 102.” See at least [0041] and fig. 2; “The example movement manager 312 of FIG. 3 manages and/or controls the motor(s) 302 and/or movement(s) of the robot 102. In some examples, the movement manager 312 commands the robot 102 to construct an assembly of objects based on the sequence(s) of RL action primitives determined by the construction manager 310 of FIG. 3.” See at least [0084])
Natarajan does not explicitly teach, but Sato teaches
receive via the communication interface force sensor information generated by a force sensor; (“Once force detector 14 detects force F and moment M received by workpiece W2, controller 16 controls the operation of robot arm 12 and gripper 20 such that force F and moment M detected by force detector 14 approach target force Fd and target moment Md, respectively.” [0036])
use the force sensor information to align a structure comprising the first item with an opening associated with the first location; (“Next, X- and Y-axis forces F.sub.X, F.sub.Y and moments M.sub.X, M.sub.Y around X- and Y-axes detected by force detector 14 are compared with predetermined threshold values TF.sub.X, TF.sub.Y, TM.sub.X, TM.sub.Y, respectively, in every control cycle, and if any of the former values is larger than a predetermined corresponding threshold value, the process of step S104 is repeated in every control cycle and the correction of the position and orientation of gripper 20 and workpiece W2 is continued until all of F.sub.X, F.sub.Y, M.sub.X, M.sub.Y satisfy following Equation (6) (step S106). … On the other hand, if all of forces F.sub.X, F.sub.Y and moments M.sub.X, M.sub.Y detected by force detector 14 become less than or equal to predetermined threshold values TF.sub.X, TF.sub.Y, TM.sub.X, TM.sub.Y, respectively, and Equation (6) comes to be satisfied, it is judged that the error correction of the position and orientation of gripper 20 and workpiece W2 has been completed (step S108). When the correction of the position and orientation of gripper 20 and workpiece W2 has been completed, axis 38 of protrusion 24 of workpiece W2 held by gripper 20 is aligned with center axis 28 of fitting hole 26 of workpiece W1 fixed to table 18.” [0043-0044])
and insert the first item into the opening associated with the first location. (“Therefore, by moving gripper 20 and workpiece W2 in the fitting direction while keeping this position and orientation, protrusion 24 of workpiece W2 held by gripper 20 can be smoothly inserted into fitting hole 26 of workpiece W1 fixed to table 18, as shown in FIG. 4D, thereby completing the fitting operation (step S110).” [0044])
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of Natarajan to further include the teachings of Sato with a reasonable expectation of success to facilitate a “quick, stable fitting operation.” (See at least [0015])
Regarding Claim 2,
Natarajan further teaches
wherein the processor is further configured to grasp the first item from the source location. (“The object pick-and-place action primitive 500 of FIG. 5 involves the robot 102 picking up an example object 502 from an example first location 504, and moving and/or placing the object 502 to and/or in an example second location 506.” See at least [0048]; See at least figs. 13 and 14 for the source location being the first location of the objects.)
Regarding Claim 8,
Natarajan further teaches
wherein the destination location includes the opening associated with the first location and one or more other openings associated with one or more other items. (See at least [0041] and fig. 2 (provided below) for shelves to hold different items, wherein shelves are interpreted as openings.)
[media_image1.png: Natarajan Fig. 2 (greyscale)]
Regarding Claim 9,
Natarajan further teaches
wherein one or more of the other openings associated with one or more other items has a shape or dimension that differs from the opening associated with the first location. (“The example shelving 106 of FIG. 1 can be implemented by and/or as any number (1, 2, 4, 10, 20, etc.), type, size and/or shape of shelves arranged and/or configured in any manner within the environment 100.” See at least [0137]; Also see at least [0073] and fig. 12 (provided below) for different shaped shelves)
[media_image2.png: Natarajan Fig. 12 (greyscale)]
Regarding Claim 10,
Natarajan does not explicitly teach, but Sato teaches
wherein the processor is configured to invoke a force control primitive to use the force sensor information to align the structure comprising the first item with the opening associated with the first location. (“Robot arm 12 further moves workpiece W2 in the fitting direction (i.e. Z-axis direction) parallel to center axis 28 of fitting hole 26 of workpiece W1 to contact it with workpiece W1 on table 18 (step S100). Please note that the operation commands include a velocity command .upsilon.(.upsilon..sub.x, .upsilon..sub.y, .upsilon..sub.z) for translating gripper 20 along the directions of the X-, Y- and Z-axes and an angular velocity command .omega.(.omega..sub.W, .omega..sub.P, .omega..sub.R) for rotating gripper 20 around the X-, Y- and Z-axes and that the components of velocity command .upsilon. and angular velocity command .omega. other than the fitting direction component of the velocity command become zero, i.e. .upsilon..sub.x=0, .upsilon..sub.y=0, .omega..sub.W=0, .omega..sub.P=0 and .omega..sub.R=0.” [0034]; “When workpiece W2 held by gripper 20 comes into contact with workpiece W1 fixed to table 18 in such a case, some force F and moment M exert on workpiece W2 held by gripper 20 (step S102). For example, in the case where, as shown in FIG. 4A, axis 38 of protrusion 24 of workpiece W2 held by gripper 20 is inclined with respect to center axis 28 of fitting hole 26 of workpiece W1 fixed to table 18, when two workpieces W1, W2 come into contact with each other, forces F.sub.X and F.sub.Y in the directions perpendicular to the fitting direction and moments M.sub.X, M.sub.Y around the axes perpendicular to the fitting direction exert on workpiece W2 held by gripper 20, as shown in FIG. 4B. 
Once force detector 14 detects force F and moment M received by workpiece W2, controller 16 controls the operation of robot arm 12 and gripper 20 such that force F and moment M detected by force detector 14 approach target force Fd and target moment Md, respectively.” [0036]; Also see fig. 3; Examiner Interpretation: An invoked force control primitive is S100 where velocity of the robot is controlled so as to contact a surface in the proximity of the slot to receive force sensor information useful for aligning the item with the slot.)
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of Natarajan to further include the teachings of Sato with a reasonable expectation of success to facilitate a “quick, stable fitting operation.” (See at least [0015])
Regarding Claim 12,
Natarajan does not explicitly teach, but Sato teaches
wherein the processor is configured to detect based at least in part on the force sensor information that the opening associated with the first location is at a detected orientation that is different than an expected orientation. (“In the case where workpiece W1 and workpiece W2 are arranged such that center axis 28 of fitting hole 26 of workpiece W1 fixed to table 18 is accurately aligned with the axis of protrusion 24 of workpiece W2 held by gripper, 20, protrusion 24 of workpiece W2 will be smoothly inserted into fitting hole 26 of workpiece W1. Specifically, the components of force F(F.sub.X, F.sub.Y, F.sub.Z) and moment M(M.sub.X, M.sub.Y, M.sub.Z) exerting on workpiece W2 during the fitting operation other than the fitting direction component become zero (i.e. F.sub.X=0, F.sub.Y=0, M.sub.X=0, M.sub.Y=0, M.sub.Z=0). … However, position and orientation errors occur when workpiece W1 is fixed to table 18 and workpiece W2 is held by gripper 20, and the fitting operation is often performed while center axis 28 of fitting hole 26 of workpiece W1 fixed to table 18 is not accurately aligned with the axis of protrusion 24 of workpiece W2 held by gripper 20. When workpiece W2 held by gripper 20 comes into contact with workpiece W1 fixed to table 18 in such a case, some force F and moment M exert on workpiece W2 held by gripper 20 (step S102)” [0035-0036]; Examiner Interpretation: It’s interpreted that the existence of a position and orientation error of the hole (opening) can be determined based on the force and moment exerted on the workpiece held by the gripper. Position and orientation errors are differences from an expected position and orientation.)
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of Natarajan to further include the teachings of Sato with a reasonable expectation of success to facilitate a “quick, stable fitting operation.” (See at least [0015])
Regarding Claim 15,
Natarajan teaches
A method, comprising: (“This disclosure relates generally to autonomous robots and, more specifically, to methods and apparatus for complex assembly via autonomous robots using reinforcement learning action primitives.” See at least [0001])
receiving a manifest or other data indicating a high-level objective to move a plurality of items from a source location to a destination location; (“The robot 102 of FIG. 1 is configured to obtain an image (e.g., image data) of an assembly of objects. … The robot 102 of FIG. 1 decomposes and/or deconstructs the obtained image into a plurality of constituent objects having object location goals (e.g., target object locations) and associated assembly goals (e.g., target object assembly parameters such as on a shelf, under a shelf, on another object, under another object, in front of another object, behind another object, etc.).” See at least [0038-0039])
utilizing the manifest or other data to generate a plan to pick and place the plurality of items from the source location to the destination location in a particular order and manner; (“Based on the object location goals and the associated assembly goals, the robot 102 of FIG. 1 determines an object placement sequence to be implemented, invoked and/or executed by and/or at the robot 102 to sequentially place the constituent objects within a space.” See at least [0040]; “The robot 102 of FIGS. 1 and 2 constructs the complex assembly 200 of FIG. 2 by moving respective ones of physical objects (e.g., respective ones onto the milk jugs 202, the juice bottles 204, the soda bottles 206, and the water bottles 208) into and/or onto the shelving unit 124 and/or shelving 106 based on a object placement sequence and a plurality of action primitive sequences, as described above. In the illustrated example of FIG. 2, the robot 102 is in the process of moving certain ones of the juice bottles 204 onto the second shelf 116 of the shelving unit 124 and/or the shelving 106 in accordance with the object placement sequence and one or more action primitive sequence(s) implemented, invoked and/or executed by the robot 102.” See at least [0041]; Also see at least figs. 13-14)
moving a first item of the plurality of items to a first location at the destination location as indicated by the manifest or other data using a robotic arm having an end effector; (“In the illustrated example of FIG. 2, the robot 102 is in the process of moving certain ones of the juice bottles 204 onto the second shelf 116 of the shelving unit 124 and/or the shelving 106 in accordance with the object placement sequence and one or more action primitive sequence(s) implemented, invoked and/or executed by the robot 102.” See at least [0041] and fig. 2; “The example movement manager 312 of FIG. 3 manages and/or controls the motor(s) 302 and/or movement(s) of the robot 102. In some examples, the movement manager 312 commands the robot 102 to construct an assembly of objects based on the sequence(s) of RL action primitives determined by the construction manager 310 of FIG. 3.” See at least [0084])
Natarajan does not explicitly teach, but Sato teaches
receiving force sensor information generated by a force sensor; (“Once force detector 14 detects force F and moment M received by workpiece W2, controller 16 controls the operation of robot arm 12 and gripper 20 such that force F and moment M detected by force detector 14 approach target force Fd and target moment Md, respectively.” [0036])
using the force sensor information to align a structure comprising the first item with an opening associated with the first location; (“Next, X- and Y-axis forces F.sub.X, F.sub.Y and moments M.sub.X, M.sub.Y around X- and Y-axes detected by force detector 14 are compared with predetermined threshold values TF.sub.X, TF.sub.Y, TM.sub.X, TM.sub.Y, respectively, in every control cycle, and if any of the former values is larger than a predetermined corresponding threshold value, the process of step S104 is repeated in every control cycle and the correction of the position and orientation of gripper 20 and workpiece W2 is continued until all of F.sub.X, F.sub.Y, M.sub.X, M.sub.Y satisfy following Equation (6) (step S106). … On the other hand, if all of forces F.sub.X, F.sub.Y and moments M.sub.X, M.sub.Y detected by force detector 14 become less than or equal to predetermined threshold values TF.sub.X, TF.sub.Y, TM.sub.X, TM.sub.Y, respectively, and Equation (6) comes to be satisfied, it is judged that the error correction of the position and orientation of gripper 20 and workpiece W2 has been completed (step S108). When the correction of the position and orientation of gripper 20 and workpiece W2 has been completed, axis 38 of protrusion 24 of workpiece W2 held by gripper 20 is aligned with center axis 28 of fitting hole 26 of workpiece W1 fixed to table 18.” [0043-0044])
and inserting the first item into the opening associated with the first location. (“Therefore, by moving gripper 20 and workpiece W2 in the fitting direction while keeping this position and orientation, protrusion 24 of workpiece W2 held by gripper 20 can be smoothly inserted into fitting hole 26 of workpiece W1 fixed to table 18, as shown in FIG. 4D, thereby completing the fitting operation (step S110).” [0044])
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of Natarajan to further include the teachings of Sato with a reasonable expectation of success to facilitate a “quick, stable fitting operation.” (See at least [0015])
Regarding Claim 16,
Natarajan further teaches
further comprising grasping the first item from the source location. (“The object pick-and-place action primitive 500 of FIG. 5 involves the robot 102 picking up an example object 502 from an example first location 504, and moving and/or placing the object 502 to and/or in an example second location 506.” See at least [0048]; See at least figs. 13 and 14 for the source location being the first location of the objects.)
Regarding Claim 18,
Natarajan further teaches
wherein the destination location includes the opening associated with the first location and one or more other openings associated with one or more other items. (See at least [0041] and fig. 2 for shelves to hold different items, wherein shelves are interpreted as openings.)
Regarding Claim 19,
Natarajan further teaches
wherein one or more of the other openings associated with one or more other items has a shape or dimension that differs from the opening associated with the first location. (“The example shelving 106 of FIG. 1 can be implemented by and/or as any number (1, 2, 4, 10, 20, etc.), type, size and/or shape of shelves arranged and/or configured in any manner within the environment 100.” See at least [0137]; Also see at least [0073] and fig. 12 for different shaped shelves)
Regarding Claim 20,
Natarajan teaches
A computer program product embodied in a non-transitory computer readable medium, comprising computer instructions for: (“a non-transitory computer-readable storage medium including instructions is disclosed. In some disclosed examples, the instructions, when executed, cause one or more processors of a robot to determine sequences of reinforcement learning (RL) action primitives based on object location goals and associated assembly goals determined for respective ones of objects depicted in an imaged assembly of objects. In some disclosed examples, the instructions, when executed, cause the one or more processors to command the robot to construct a physical assembly of objects based on the sequences of RL action primitives.” See at least [00151])
receiving a manifest or other data indicating a high-level objective to move a plurality of items from a source location to a destination location; (“The robot 102 of FIG. 1 is configured to obtain an image (e.g., image data) of an assembly of objects. … The robot 102 of FIG. 1 decomposes and/or deconstructs the obtained image into a plurality of constituent objects having object location goals (e.g., target object locations) and associated assembly goals (e.g., target object assembly parameters such as on a shelf, under a shelf, on another object, under another object, in front of another object, behind another object, etc.).” See at least [0038-0039])
utilizing the manifest or other data to generate a plan to pick and place the plurality of items from the source location to the destination location in a particular order and manner; (“Based on the object location goals and the associated assembly goals, the robot 102 of FIG. 1 determines an object placement sequence to be implemented, invoked and/or executed by and/or at the robot 102 to sequentially place the constituent objects within a space.” See at least [0040]; “The robot 102 of FIGS. 1 and 2 constructs the complex assembly 200 of FIG. 2 by moving respective ones of physical objects (e.g., respective ones onto the milk jugs 202, the juice bottles 204, the soda bottles 206, and the water bottles 208) into and/or onto the shelving unit 124 and/or shelving 106 based on a object placement sequence and a plurality of action primitive sequences, as described above. In the illustrated example of FIG. 2, the robot 102 is in the process of moving certain ones of the juice bottles 204 onto the second shelf 116 of the shelving unit 124 and/or the shelving 106 in accordance with the object placement sequence and one or more action primitive sequence(s) implemented, invoked and/or executed by the robot 102.” See at least [0041]; Also see at least figs. 13-14)
moving a first item of the plurality of items to a first location at the destination location as indicated by the manifest or other data using a robotic arm having an end effector; (“In the illustrated example of FIG. 2, the robot 102 is in the process of moving certain ones of the juice bottles 204 onto the second shelf 116 of the shelving unit 124 and/or the shelving 106 in accordance with the object placement sequence and one or more action primitive sequence(s) implemented, invoked and/or executed by the robot 102.” See at least [0041] and fig. 2; “The example movement manager 312 of FIG. 3 manages and/or controls the motor(s) 302 and/or movement(s) of the robot 102. In some examples, the movement manager 312 commands the robot 102 to construct an assembly of objects based on the sequence(s) of RL action primitives determined by the construction manager 310 of FIG. 3.” See at least [0084])
Natarajan does not explicitly teach, but Sato teaches
receiving force sensor information generated by a force sensor; (“Once force detector 14 detects force F and moment M received by workpiece W2, controller 16 controls the operation of robot arm 12 and gripper 20 such that force F and moment M detected by force detector 14 approach target force Fd and target moment Md, respectively.” [0036])
using the force sensor information to align a structure comprising the first item with an opening associated with the first location; (“Next, X- and Y-axis forces F.sub.X, F.sub.Y and moments M.sub.X, M.sub.Y around X- and Y-axes detected by force detector 14 are compared with predetermined threshold values TF.sub.X, TF.sub.Y, TM.sub.X, TM.sub.Y, respectively, in every control cycle, and if any of the former values is larger than a predetermined corresponding threshold value, the process of step S104 is repeated in every control cycle and the correction of the position and orientation of gripper 20 and workpiece W2 is continued until all of F.sub.X, F.sub.Y, M.sub.X, M.sub.Y satisfy following Equation (6) (step S106). … On the other hand, if all of forces F.sub.X, F.sub.Y and moments M.sub.X, M.sub.Y detected by force detector 14 become less than or equal to predetermined threshold values TF.sub.X, TF.sub.Y, TM.sub.X, TM.sub.Y, respectively, and Equation (6) comes to be satisfied, it is judged that the error correction of the position and orientation of gripper 20 and workpiece W2 has been completed (step S108). When the correction of the position and orientation of gripper 20 and workpiece W2 has been completed, axis 38 of protrusion 24 of workpiece W2 held by gripper 20 is aligned with center axis 28 of fitting hole 26 of workpiece W1 fixed to table 18.” [0043-0044])
and inserting the first item into the opening associated with the first location. (“Therefore, by moving gripper 20 and workpiece W2 in the fitting direction while keeping this position and orientation, protrusion 24 of workpiece W2 held by gripper 20 can be smoothly inserted into fitting hole 26 of workpiece W1 fixed to table 18, as shown in FIG. 4D, thereby completing the fitting operation (step S110).” [0044])
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of Natarajan to further include the teachings of Sato with a reasonable expectation of success to facilitate a “quick, stable fitting operation.” (See at least [0015])
Claims 3 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Natarajan (US 20190275671 A1) in view of Sato (IDS: US 20080312769 A1) and Jeon (IDS: US 5457773 A).
Regarding Claim 3,
Modified Natarajan does not explicitly teach, but Jeon teaches
wherein the processor is configured to move the first item to the first location using position control. (“Hereinafter, the robot actuator position control method will be described in reference to FIG. 5.” Col. 3, lines 36-37; Fig. 5 (shown below) shows position control with S3 and the following determination of whether to reposition.)
[media_image3.png and media_image4.png: Jeon Fig. 5 (greyscale)]
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Natarajan to further include the teachings of Jeon to resolve positioning errors that occur due to robot defects. (See at least Col. 1, lines 35-50)
Regarding Claim 17,
Modified Natarajan does not explicitly teach, but Jeon teaches
wherein the first item is moved to the first location using position control. (“Hereinafter, the robot actuator position control method will be described in reference to FIG. 5.” Col. 3, lines 36-37; Fig. 5 shows position control with S3 and the following determination of whether to reposition.)
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Natarajan to further include the teachings of Jeon to resolve positioning errors that occur due to robot defects. (See at least Col. 1, lines 35-50)
Claims 4-7 are rejected under 35 U.S.C. 103 as being unpatentable over Natarajan (US 20190275671 A1) in view of Sato (IDS: US 20080312769 A1) and Ikeda (US 20200189097 A1).
Regarding Claim 4,
Natarajan further teaches
wherein the (“The example camera 304 of FIG. 3 is mounted to the robot 102 of FIGS. 1-3. The camera 304 is configured and/or positioned to capture images of objects located within a field of view of the camera 304.” See at least [0045])
Modified Natarajan does not explicitly teach, but Ikeda teaches
wherein the end effector includes a camera. (“a hand camera 141 provided at a tip of the robot arm 130.” See at least [0030] and fig. 1)
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Natarajan to further include the teachings of Ikeda with a reasonable expectation of success “to appropriately recognize a grasping position of a target object to be grasped and then increase a success rate of grasping, and reduce a time required for the grasping.” (See at least [0013])
Regarding Claim 5,
Natarajan further teaches
wherein data obtained from the camera is utilized to determine one or more attributes associated with the first item. (“the robot 102 of FIG. 1 includes a camera configured to capture images. In some examples, the camera may capture an image of an assembly of objects located within the field of view of the camera. … The robot 102 of FIG. 1 decomposes and/or deconstructs the obtained image into a plurality of constituent objects having object location goals (e.g., target object locations) and associated assembly goals (e.g., target object assembly parameters such as on a shelf, under a shelf, on another object, under another object, in front of another object, behind another object, etc.). For example, the robot 102 of FIG. 1 may decompose and/or deconstruct (e.g., using one or more decomposition algorithm(s)) the assembly picture 126 of FIG. 1 into a plurality of milk jugs, juice bottles, soda bottles and water bottles, with each jug and/or bottle having an object location goal and an associated assembly goal.” See at least [0038-0039])
Regarding Claim 6,
Natarajan further teaches
wherein the processor is configured to receive data associated with the camera (“In some examples, the camera may capture an image of an assembly of objects located within the field of view of the camera. … The robot 102 of FIG. 1 decomposes and/or deconstructs the obtained image into a plurality of constituent objects having object location goals (e.g., target object locations) and associated assembly goals.” See at least [0038-0039])
Modified Natarajan does not explicitly teach, but Ikeda teaches
receive data associated with the camera and data associated with one or more other cameras. (“The image-pickup acquisition unit 250 includes, for example, an environmental camera 121 provided at a position where an environmental space including moving ranges of the robot arm 130 and the robot hand 140 in the main-body part 120 can be observed, and a hand camera 141 provided at a tip of the robot arm 130. The environmental camera 121 and the hand camera 141 include an image pickup device which is, for example, a CMOS image sensor and an image data generating unit. The environmental camera 121 outputs image data generated by shooting an environmental space in front of it. The hand camera 141 outputs image data generated by shooting a space in front of the robot hand 140.” See at least [0030] and fig. 1; “The environmental camera 121 passes the generated image data to the control unit 200. As described above, the hand camera 141 is used for observing a space in front of the robot hand 140, and performs shooting in accordance with a shooting instruction from the control unit 200. The hand camera 141 passes the generated image data to the control unit 200.” See at least [0034])
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Natarajan to further include the teachings of Ikeda with a reasonable expectation of success “to appropriately recognize a grasping position of a target object to be grasped and then increase a success rate of grasping, and reduce a time required for the grasping.” (See at least [0013])
Regarding Claim 7,
Modified Natarajan does not explicitly teach, but Ikeda teaches
wherein the processor is configured to generate a three dimensional view of a work area associated with the robotic system based on one or more of the data associated with the camera and the data associated with the one or more other cameras. (“the information may be image data on a three-dimensional (3D) image created by compositing a plurality of image-pickup images of the target object to be grasped acquired by the image-pickup acquisition unit.” See at least [0012]; “The image-pickup acquisition unit 250 includes, for example, an environmental camera 121 provided at a position where an environmental space including moving ranges of the robot arm 130 and the robot hand 140 in the main-body part 120 can be observed, and a hand camera 141 provided at a tip of the robot arm 130.” See at least [0030])
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Natarajan to further include the teachings of Ikeda with a reasonable expectation of success “to appropriately recognize a grasping position of a target object to be grasped and then increase a success rate of grasping, and reduce a time required for the grasping.” (See at least [0013])
Claim(s) 11 is/are rejected under 35 U.S.C. 103 as being unpatentable over Natarajan (US 20190275671 A1) in view of Sato (IDS: US 20080312769 A1) and Kobayashi (IDS: US 20170151666 A1).
Regarding Claim 11,
Modified Natarajan does not explicitly teach, but Kobayashi teaches
wherein the processor is configured to use the force sensor information to align the structure comprising the first item with the opening associated with the first location in part by repositioning the first item to a location determined by a search algorithm and applying a downward vertical force. (“When performing the contact of the first object to a position different from the insertion portion of the second object and the separation of the first object and the second object at twice or more, the robot 20 helically moves at least one of the first object and the second object as seen in a direction in which the first object and the second object are closer to each other. Thereby, the robot 20 may seek a position in which the first object can be inserted into the insertion portion of the second object while helically changing the relative position between the first object and the insertion portion of the second object to start moving the object in the direction in which the first object and the second object are closer to each other.” [0205]; Also see at least [0130-0131]; See fig. 3 for the helical search trajectory and fig. 9 for downward vertical force (both provided below); Examiner Interpretation: The helical trajectory is the search algorithm and at each point until the hole is found, the pin is moved downward with a force to determine if the pin enters the hole. A measured force is used to determine whether the hole was found ([0123]))
[Kobayashi fig. 3 (media_image5.png): helical search trajectory]
[Kobayashi fig. 9 (media_image6.png): downward vertical force applied during the search]
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Natarajan to further include the teachings of Kobayashi with a reasonable expectation of success to facilitate finding the opening as to prevent continuous insertion failures that could damage the component. (See at least [0194-0195])
Claim(s) 13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Natarajan (US 20190275671 A1) in view of Sato (IDS: US 20080312769 A1) and Luce (IDS: US 20100204824 A1).
Regarding Claim 13,
Modified Natarajan does not explicitly teach, but Luce teaches
wherein the processor is further configured to remap position information for the opening associated with the first location and one or more other openings associated with the destination location based at least in part on detecting that the destination receptacle is at the detected orientation that is different than an expected orientation. (“With continued reference to FIG. 10B, in act 423, the control system 250 may generate one or more text files that include one or more computer programs for causing the one or more sensors 240 to inspect a bit body 102 carried by the positioner 212 to identify differences between the intended locations and orientations of the cutting element pockets 112.” [0082]; “In act 424, the control system 250 may use these generated text files to cause the one or more sensors 240 (e.g., a vision system) to inspect a bit body 102 carried by the positioner 212 to identify differences between the intended locations and orientations of the cutting element pockets 112, as set forth in the design of the bit body 102, and the actual locations and orientations of the cutting element pockets 112 in the as-manufactured bit body 102. In act 425, the data or information acquired by the sensors 240 relating to the actual positions and orientations of the cutting element pockets 112 may be used to modify the motion programs that determine the paths to be followed by the positioner 212, the robot 222, and the robot 232 for a particular drill bit 100 or other tool to be processed using the cutting element attachment system 200.” [0084]; Examiner Interpretation: The cutting element’s pockets are interpreted as the openings. The sensor data of the actual position and orientation of the pockets is the remapped position information.)
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Natarajan to further include the teachings of Luce with a reasonable expectation of success to improve adaptability of the system such that the motion of the robot can be adjusted to account for the unexpected differences in slot orientation. (See at least [0084])
Allowable Subject Matter
Claim 14 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The relevant prior art does not disclose detecting an orientation approximately 180 degrees different than the expected orientation, withdrawing the item, rotating 180 degrees, and moving it to a remapped location as disclosed by the applicant.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Kerrick (US 20200147794 A1) is pertinent because it discusses receiving a CAD assembly and accordingly generating an assembly sequence for a robot.
Chirol (US 20210354917 A1) is pertinent because it discusses determining a palletizing plan specifying an order based on a given batch of objects.
The above-mentioned art, evaluated separately and in combination, does not disclose the entirety of the limitations of dependent claim 14, since the references do not describe detecting an orientation approximately 180 degrees different than the expected orientation, withdrawing the item, rotating it 180 degrees, and moving it to a remapped location as disclosed by the applicant. No prior art has been found at the time of writing this Office action to reject pending claim 14 under 35 U.S.C. 102 or 103.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Karston G Evans whose telephone number is (571)272-8480. The examiner can normally be reached Mon-Fri 9:00-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Lin, can be reached at (571)270-3976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KARSTON G. EVANS/Examiner, Art Unit 3657