DETAILED ACTION
The amendments filed 2/17/2026 have been entered. Claims 1-6, 8-12, and 16-21 are pending.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-6, 8-10, 12, 19, and 20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Stubbs et al. (US Patent No. 9,694,494).
Stubbs teaches:
Re claim 1. A robot control module comprising at least one non-transitory processor-readable storage medium storing a library of three-dimensional shapes, a library of grasp primitives, and processor-executable instructions or data that, when executed by at least one processor of a robot system (grasp management service 102, Fig. 1-3), cause the robot system to:
access, by the at least one processor, a work objective of the robot system (column 21, lines 6-18: “the one or more contact points associated with the grasp that has the highest probability of success may be selected. In some examples, the selection of the one or more contact points may be constrained by some other factor in place of or in addition to probability of success. For example, if certain contact points would result in lower energy use, would require less manipulation of the item before and/or after picking the item up, or would be executed more quickly or efficiently, this may impact the selection of the one or more contact points.”);
capture, by at least one sensor carried by a robot body of the robot system, sensor data about an object (column 5, lines 39-34: “The robotic manipulator 110 may include any suitable type and number of sensors disposed throughout the robotic manipulator 110 (e.g., sensors in the base, in the arm, in joints in the arm, in the end of arm tool 126, or in any other suitable location).” And column 14, lines 33-42: three-dimensional scans of an item may be captured by an imaging device such as a camera.);
access, by the at least one processor, a platonic representation of the object comprising a set of at least one three-dimensional shape from the library of the three-dimensional shapes, the platonic representation of the object based at least in part on the sensor data (column 15, lines 3-9: “The primitive model 604 may be generated based at least in part on the model 602. The primitive model 604 may include one or more primitive shapes representative of the model 602 and the item. Thus, as illustrated, the primitive model 604 may include a combination of a cuboid as a torso, four cylinders as arms and legs, four spheres as hands and feet, a sphere as a head, and two cylinders as ears.”);
select, by the at least one processor and from the library of grasp primitives, a grasp primitive based at least in part on both the work objective and at least one three-dimensional shape in the platonic representation of the object (Column 15, lines 9-15: “Using the primitive model 604 and information from a grasp database that includes grasps based on similar primitive shapes, a set of predictably successful grasps may be generated for picking up the item at any of the locations represented by primitive shapes (e.g., ears, head, torso, arms and legs, or hands and feet).” Column 21, lines 6-18: “the one or more contact points associated with the grasp that has the highest probability of success may be selected. In some examples, the selection of the one or more contact points may be constrained by some other factor in place of or in addition to probability of success. For example, if certain contact points would result in lower energy use, would require less manipulation of the item before and/or after picking the item up, or would be executed more quickly or efficiently, this may impact the selection of the one or more contact points.”);
select, by the at least one processor, a grasp location at the at least one three-dimensional shape upon which the selection of the grasp primitive is at least partially based, the grasp location suited to achieve the work objective (Column 15, lines 9-15: “Using the primitive model 604 and information from a grasp database that includes grasps based on similar primitive shapes, a set of predictably successful grasps may be generated for picking up the item at any of the locations represented by primitive shapes (e.g., ears, head, torso, arms and legs, or hands and feet).” Column 21, lines 6-18: “At 1208, the process 1200 selects one or more contact points from the set of contact points. For example, the one or more contact points associated with the grasp that has the highest probability of success may be selected. In some examples, the selection of the one or more contact points may be constrained by some other factor in place of or in addition to probability of success. For example, if certain contact points would result in lower energy use, would require less manipulation of the item before and/or after picking the item up, or would be executed more quickly or efficiently, this may impact the selection of the one or more contact points.”);
control, by the at least one processor, an end effector of the robot body to apply the grasp primitive to grasp the object at a grasp location (column 15, lines 15-17: “These predictably successful grasps may be shared with a robotic manipulator for validation and execution.”);
as the end effector is controlled to apply the grasp primitive, capture further sensor data indicative of engagement between the end effector and the object being different from expected engagement between the end effector and the at least one three-dimensional shape upon which the selection of the grasp primitive is at least partially based (Column 5, lines 43-57: “The sensors can include sensors configured to detect pressure, force, weight, light, objects, slippage, and any other information that may be used to control and/or monitor the operation of the robotic manipulator 110, including the end of arm tool 126. … The sensing information may also be used as feedback to adjust the grasps used by the end of arm tool 126, and to generate new grasps, to validate grasps, and to determine quality values for grasps, which may be a numerical value based at least in part on one or more objective factors.” Column 21, lines 19-38: “At 1210, the process 1200 receives feedback from a robotic arm that grasps the item at the one or more contact points. This may be the one or more selected contact points. The feedback may come in any suitable form. For example, the feedback may include sensing information detected by any suitable sensor associated with the robotic arm (e.g., slippage, force exerted by suction, fingers, etc., whether the item was even capable of being picked up, and any other suitable information). At 1212, the process 1200 updates the set of contact points and the probabilities of success. This may include updating based at least in part on the feedback. For example, the feedback may indicate that the one or more contact points, while selected with high probabilities of computed, actual, or simulated success, are not ideal for grasping the item. This may be because the robotic manipulator was unable to grasp the item, the item slipped in the grasp, or any other indicator of success. 
This real-world feedback can be used to reduce (or increase) the probability of success for the one or more contact points.”); and
optimize, based on the further sensor data, actuation of at least one member of the end effector to increase grasp effectiveness (Column 5, lines 52-54: “The sensing information may also be used as feedback to adjust the grasps used by the end of arm tool 126, and to generate new grasps”. Column 21, lines 6-18: “At 1208, the process 1200 selects one or more contact points from the set of contact points. For example, the one or more contact points associated with the grasp that has the highest probability of success may be selected.”. Column 21, lines 28-38: “At 1212, the process 1200 updates the set of contact points and the probabilities of success. This may include updating based at least in part on the feedback. For example, the feedback may indicate that the one or more contact points, while selected with high probabilities of computed, actual, or simulated success, are not ideal for grasping the item. This may be because the robotic manipulator was unable to grasp the item, the item slipped in the grasp, or any other indicator of success. This real-world feedback can be used to reduce (or increase) the probability of success for the one or more contact points.”).
Re claim 2. Wherein:
the processor-executable instructions or data further cause the at least one processor to: identify, by the at least one processor, the object (column 9, lines 37-39: “the item identification module 302 receives sensing information captured by a robotic arm and identifies an item based on the sensing information.”); and
the processor-executable instructions or data which cause the at least one processor to access the platonic representation of the object cause the at least one processor to: access a three-dimensional model of the object from a database, the three-dimensional model including the platonic representation of the object (column 7, lines 51-62: “For each entry that corresponds to an item, the item database 232 may include an item identifier (e.g., a unique product identifier), a description of the item, one or more stock images of the item, a surface model of the item or a link to the surface model of the item, a primitive shape model of the item or a link to the primitive shape model, a bounding box representation of the item, one or more actual images of the item (e.g., taken as it entered a facility), dimensions of the item (e.g., height, width, length), a location of a center of mass, a total weight, and any other suitable of information related to the item.” And column 15, lines 9-14: “Using the primitive model 604 and information from a grasp database that includes grasps based on similar primitive shapes, a set of predictably successful grasps may be generated for picking up the item at any of the locations represented by primitive shapes”).
Re claim 3. Wherein the processor-executable instructions or data which cause the at least one processor to access the platonic representation of the object cause the at least one processor to:
generate the at least one platonic representation of the object, by approximating the object with the set of at least one three-dimensional shape (column 15, lines 3-9: “The primitive model 604 may be generated based at least in part on the model 602. The primitive model 604 may include one or more primitive shapes representative of the model 602 and the item.”).
Re claim 4. Wherein the processor-executable instructions or data which cause the at least one processor to generate the at least one platonic representation of the object, cause the at least one processor to:
identify at least one portion of the object suitable for representation by respective three-dimensional shapes (column 15, lines 3-17: “Thus, as illustrated, the primitive model 604 may include a combination of a cuboid as a torso, four cylinders as arms and legs, four spheres as hands and feet, a sphere as a head, and two cylinders as ears.”); and
for each portion of the at least one portion:
access a geometric three-dimensional shape model which is similar in shape to the portion (column 12, line 62 through column 13, line 1: “At the state 506, the grasping surfaces, which correspond to the shape of the feature 514, may be bounded by a primitive shape 516. The primitive shape may be selected from a set of primitive shapes and in a manner that attempts to most closely approximate the grasping surfaces, which, in this example, correspond to the shape of the feature 514.”); and
transform the accessed geometric three-dimensional shape model to fit the portion (column 13, lines 20-24: “in some examples, a second primitive shape may be selected and combined with the primitive shape 516 to approximate the feature 514. If more than one primitive shape is used to approximate the feature 514, the spatial relationship between the two primitive shapes may be saved.” And column 13, lines 27-40: “At the state 508, the size of the primitive shape 516 is expanded and contracted to create different versions 516(1)-516(N) of the primitive shape 516. These different versions may be considered altered primitive shapes.”).
Re claim 5. Wherein the processor-executable instructions or data which cause the at least one processor to, for each portion of the at least one portion, transform the accessed three-dimensional geometric shape model to fit the portion, cause the at least one processor to:
transform a size of the geometric three-dimensional shape model in at least one dimension to fit the size of the geometric three-dimensional shape model to the portion (column 13, lines 27-40: “At the state 508, the size of the primitive shape 516 is expanded and contracted to create different versions 516(1)-516(N) of the primitive shape 516. These different versions may be considered altered primitive shapes.”);
transform a position of the geometric three-dimensional shape model to align with a position of the portion (column 13, lines 20-24: “in some examples, a second primitive shape may be selected and combined with the primitive shape 516 to approximate the feature 514. If more than one primitive shape is used to approximate the feature 514, the spatial relationship between the two primitive shapes may be saved.”); or
rotate the geometric three-dimensional shape model to fit the geometric model to an orientation of the portion (Fig. 6).
Re claim 6. Wherein the processor-executable instructions or data further cause the at least one processor to select the grasp location of the object (column 15, lines 9-15: “Using the primitive model 604 and information from a grasp database that includes grasps based on similar primitive shapes, a set of predictably successful grasps may be generated for picking up the item at any of the locations represented by primitive shapes (e.g., ears, head, torso, arms and legs, or hands and feet).”).
Re claim 8. Wherein the processor-executable instructions or data further cause the at least one processor to:
identify, based on the sensor data, at least one graspable feature of the object (column 15, lines 9-15: “Using the primitive model 604 and information from a grasp database that includes grasps based on similar primitive shapes, a set of predictably successful grasps may be generated for picking up the item at any of the locations represented by primitive shapes (e.g., ears, head, torso, arms and legs, or hands and feet).”); and
select one or more of the at least one graspable feature as the grasp location of the object (column 15, lines 9-15).
Re claim 9. Wherein the processor-executable instructions or data further cause the at least one processor to:
evaluate grasp-effectiveness for a plurality of grasp primitive-location pairs, each grasp primitive-location pair including a respective three-dimensional shape in the platonic representation of the object and a respective grasp primitive from the library of grasp primitives (column 15, lines 9-15: “Using the primitive model 604 and information from a grasp database that includes grasps based on similar primitive shapes, a set of predictably successful grasps may be generated for picking up the item at any of the locations represented by primitive shapes (e.g., ears, head, torso, arms and legs, or hands and feet).”); and
select the grasp location as a location, suited to achieve the work objective, of the three-dimensional shape in a grasp primitive-location pair having a grasp-effectiveness which exceeds a threshold (column 15, lines 9-15: successful; and column 21, lines 6-18),
wherein the processor-executable instructions or data which cause the at least one processor to select the grasp primitive cause the at least one processor to select the grasp primitive as a grasp primitive in the primitive-location pair having the highest grasp-effectiveness (column 20, line 57 through column 21, line 9: “At 1206, the process 1200 determines a probability of success for the grasp at individual contact points of the set of contact points using the grasping function. This may include determining how likely an end of arm tool utilizing the grasping function and executing the grasp will be able to pick up the item. In some examples, probability of success may be computed for each end of arm tool and for all possible grasps on the item that can be executed by the end of arm tools. In this manner, different grasps may be compared to determine what grasp has the highest probability of being successful. In some examples, validation grasps as described herein may increase probabilities of success. For example, if a particular grasp using particular contact points has been successfully executed multiple times in either simulation, actual conditions, or both, the probability of success of that grasp may be high. At 1208, the process 1200 selects one or more contact points from the set of contact points. For example, the one or more contact points associated with the grasp that has the highest probability of success may be selected.”).
Re claim 10. Wherein the processor-executable instructions or data which cause the at least one processor to evaluate grasp-effectiveness for a plurality of grasp primitive-location pairs cause the at least one processor to, for each grasp primitive-location pair:
simulate grasping of the respective three-dimensional shape in the platonic representation of the object, by applying the respective grasp primitive (column 20, line 57 through column 21, line 5: “if a particular grasp using particular contact points has been successfully executed multiple times in either simulation, actual conditions, or both, the probability of success of that grasp may be high.”); and
generate a grasp-effectiveness score indicative of effectiveness of simulated grasping (column 20, line 57 through column 21, line 5).
Re claim 12. Wherein the processor-executable instructions which cause the at least one sensor to capture sensor data about the object cause the at least one sensor to capture sensor data selected from a group of sensor data consisting of:
image data (column 14, lines 40-42: camera);
audio data;
tactile data;
haptic data;
actuator data indicating a state of a corresponding actuator;
inertial data;
proprioceptive data indicating a position, movement, or force applied for a corresponding actuatable member of the robot body; and
position data about at least one joint or appendage of the robot body.
Re claim 19. The robot control module of claim 1 wherein:
the at least one sensor includes a tactile sensor carried by the end effector (column 5, lines 43-47: “The sensors can include sensors configured to detect pressure, force, weight, light, objects, slippage, and any other information that may be used to control and/or monitor the operation of the robotic manipulator 110, including the end of arm tool 126.”);
the processor-executable instructions that, when executed by the at least one processor, cause the robot system to capture further sensor data indicative of engagement between the end effector and the object being different from expected engagement between the end effector and the at least one three-dimensional shape upon which the selection of the grasp primitive is at least partially based, cause the tactile sensor to capture tactile data indicative of engagement between the end effector and the object being different from expected engagement between the end effector and the at least one three-dimensional shape upon which the selection of the grasp primitive is at least partially based (Column 5, lines 43-57: “The sensors can include sensors configured to detect pressure, force, weight, light, objects, slippage, and any other information that may be used to control and/or monitor the operation of the robotic manipulator 110, including the end of arm tool 126. … The sensing information may also be used as feedback to adjust the grasps used by the end of arm tool 126, and to generate new grasps, to validate grasps, and to determine quality values for grasps, which may be a numerical value based at least in part on one or more objective factors.” Column 21, lines 19-38: “At 1210, the process 1200 receives feedback from a robotic arm that grasps the item at the one or more contact points. This may be the one or more selected contact points. The feedback may come in any suitable form. For example, the feedback may include sensing information detected by any suitable sensor associated with the robotic arm (e.g., slippage, force exerted by suction, fingers, etc., whether the item was even capable of being picked up, and any other suitable information). At 1212, the process 1200 updates the set of contact points and the probabilities of success. This may include updating based at least in part on the feedback. 
For example, the feedback may indicate that the one or more contact points, while selected with high probabilities of computed, actual, or simulated success, are not ideal for grasping the item. This may be because the robotic manipulator was unable to grasp the item, the item slipped in the grasp, or any other indicator of success. This real-world feedback can be used to reduce (or increase) the probability of success for the one or more contact points.”); and
the processor-executable instructions that, when executed by the at least one processor, cause the robot system to optimize, based on the further sensor data, actuation of at least one member of the end effector to increase grasp effectiveness, cause the robot system to optimize, based on the tactile data, actuation of at least one member of the end effector to increase grasp effectiveness on the object (Column 5, lines 52-54: “The sensing information may also be used as feedback to adjust the grasps used by the end of arm tool 126, and to generate new grasps”. Column 21, lines 6-18: “At 1208, the process 1200 selects one or more contact points from the set of contact points. For example, the one or more contact points associated with the grasp that has the highest probability of success may be selected.”. Column 21, lines 28-38: “At 1212, the process 1200 updates the set of contact points and the probabilities of success. This may include updating based at least in part on the feedback. For example, the feedback may indicate that the one or more contact points, while selected with high probabilities of computed, actual, or simulated success, are not ideal for grasping the item. This may be because the robotic manipulator was unable to grasp the item, the item slipped in the grasp, or any other indicator of success. This real-world feedback can be used to reduce (or increase) the probability of success for the one or more contact points.”).
Re claim 20. The robot control module of claim 19 wherein:
the at least one sensor further includes an image sensor (column 14, lines 33-42: three-dimensional scans of an item may be captured by an imaging device such as a camera.);
the processor-executable instructions that, when executed by the at least one processor, cause the robot system to capture, by at least one sensor, sensor data about an object, cause the image sensor to capture image data about the object (column 14, lines 33-42: three-dimensional scans of an item may be captured by an imaging device such as a camera.); and
the processor-executable instructions that, when executed by the at least one processor, cause the robot system to access, by the robot controller, the platonic representation of the object based at least in part on the sensor data, cause the robot system to access, by the robot controller, the platonic representation of the object based on the image data (Column 14, lines 32-35: “the grasp management engine 220 may function to generate the model 602 and/or the primitive model 604. For example, the model 602 may be based on one or more three-dimensional scans of the item.”. Column 15, lines 3-9: “The primitive model 604 may be generated based at least in part on the model 602. The primitive model 604 may include one or more primitive shapes representative of the model 602 and the item. Thus, as illustrated, the primitive model 604 may include a combination of a cuboid as a torso, four cylinders as arms and legs, four spheres as hands and feet, a sphere as a head, and two cylinders as ears.”).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Stubbs et al. (US Patent No. 9,694,494) as applied to claim 1 above, and further in view of Lian et al. (US Publication No. 2022/0402128).
The teachings of Stubbs have been discussed above. Stubbs further teaches:
Re claim 11. Wherein the processor-executable instructions or data which cause the at least one processor to select the grasp primitive cause the at least one processor to select the grasp primitive based on the at least one three-dimensional shape in the platonic representation of the object which at least approximately corresponds to the grasp location (column 15, lines 9-15: “Using the primitive model 604 and information from a grasp database that includes grasps based on similar primitive shapes, a set of predictably successful grasps may be generated for picking up the item at any of the locations represented by primitive shapes (e.g., ears, head, torso, arms and legs, or hands and feet).”).
Stubbs fails to specifically teach: (re claim 11)
access a grasp heatmap for the object, the grasp heatmap indicative of grasp areas of the object; and
select the grasp location as a grasp area of the object.
Lian teaches, at the abstract, Fig. 4, and paragraph [0048], generating a canonical representation for objects belonging to an object category, generating a grasping area heatmap for these canonical representations for objects, and using this heatmap to select a category-level grasping area. This allows such systems to analyze a group of objects ahead of time, so as to quickly determine the desired grasp area for a type of object during operation.
In view of Lian’s teachings, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to include, with the module as taught by Stubbs, (re claim 11) access a grasp heatmap for the object, the grasp heatmap indicative of grasp areas of the object; and select the grasp location as a grasp area of the object, with a reasonable expectation of success, since Lian teaches generating a canonical representation for objects belonging to an object category, generating a grasping area heatmap for these canonical representations for objects, and using this heatmap to select a category-level grasping area. This allows such systems to analyze a group of objects ahead of time, so as to quickly determine the desired grasp area for a type of object during operation.
Claims 16-18 are rejected under 35 U.S.C. 103 as being unpatentable over Stubbs et al. (US Patent No. 9,694,494) as applied to claim 1 above.
The teachings of Stubbs have been discussed above. Stubbs further teaches:
Re claim 17. Wherein:
the robot body carries the at least one sensor (column 5, lines 39-34: “The robotic manipulator 110 may include any suitable type and number of sensors disposed throughout the robotic manipulator 110 (e.g., sensors in the base, in the arm, in joints in the arm, in the end of arm tool 126, or in any other suitable location).”).
Stubbs fails to specifically teach: (re claim 16) wherein:
the robot body carries the at least one processor; and
the processor-executable instructions or data which cause the robot system to capture the sensor data, access the platonic representation of the object, select a grasp primitive, and control the end effector, are executed at the robot body;
(re claim 17) a remote device remote from the robot body includes the at least one processor;
the processor-executable instructions or data further cause the robot system to transmit, by a communication interface between the robot body and the remote device, the sensor data from the robot body to the remote device; and
the processor-executable instructions or data which cause the at least one processor to control the end effector cause the at least one processor to prepare and send control instructions to the robot body via the communication interface; and
(re claim 18) wherein:
the robot body carries the at least one sensor, a first processor of the at least one processor, and a first non-transitory processor-readable storage medium of the at least one non-transitory processor-readable storage medium;
a remote device remote from the robot body includes a second processor of the at least one processor and a second non-transitory processor-readable storage medium of the at least one non-transitory processor-readable storage medium;
the processor-executable instructions or data include first processor-executable instructions or data stored at the first non-transitory processor-readable storage medium that when executed cause the robot system to:
capture the sensor data by the at least one sensor;
transmit, via a communication interface between the robot body and the remote device, the sensor data from the robot body to the remote device; and
control, by the first at least one processor, the end effector to apply the grasp primitive to grasp the object; and
the processor-executable instructions or data include second processor-executable instructions or data stored at the second non-transitory processor-readable storage medium that when executed cause the robot system to:
access, from the second non-transitory processor-readable storage medium, the platonic representation of the object;
select, by the second processor, the grasp primitive; and
transmit, via the communication interface, data indicating the grasp primitive and the platonic representation of the object to the robot body.
Stubbs teaches, at column 6, line 4 through column 7, line 2; and column 23, line 62 through column 24, line 16, that computer hardware may be located local to or remote from the other components of the system and may be distributed throughout more than one location. The location of the computer hardware does not affect the functioning of the system, as data may be transferred to or from the robot body as necessary to allow flexibility in the location of the processing components.
In view of Stubbs’ teachings, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to include, with the module as taught by Stubbs, (re claim 16) wherein: the robot body carries the at least one processor; and the processor-executable instructions or data which cause the robot system to capture the sensor data, access the platonic representation of the object, select a grasp primitive, and control the end effector, are executed at the robot body; (re claim 17) a remote device remote from the robot body includes the at least one processor; the processor-executable instructions or data further cause the robot system to transmit, by a communication interface between the robot body and the remote device, the sensor data from the robot body to the remote device; and the processor-executable instructions or data which cause the at least one processor to control the end effector cause the at least one processor to prepare and send control instructions to the robot body via the communication interface; and (re claim 18) wherein: the robot body carries the at least one sensor, a first processor of the at least one processor, and a first non-transitory processor-readable storage medium of the at least one non-transitory processor-readable storage medium; a remote device remote from the robot body includes a second processor of the at least one processor and a second non-transitory processor-readable storage medium of the at least one non-transitory processor-readable storage medium; the processor-executable instructions or data include first processor-executable instructions or data stored at the first non-transitory processor-readable storage medium that when executed cause the robot system to: capture the sensor data by the at least one sensor; transmit, via a communication interface between the robot body and the remote device, the sensor data from the robot body to the remote device; and control, by the first at
least one processor, the end effector to apply the grasp primitive to grasp the object; and the processor-executable instructions or data include second processor-executable instructions or data stored at the second non-transitory processor-readable storage medium that when executed cause the robot system to: access, from the second non-transitory processor-readable storage medium, the platonic representation of the object; select, by the second processor, the grasp primitive; and transmit, via the communication interface, data indicating the grasp primitive and the platonic representation of the object to the robot body, with a reasonable expectation of success, since Stubbs teaches computer hardware may be located local or remote to the other components of the system. The location of the computer hardware does not affect the functioning of the system as data may be transferred to or from the robot body as necessary to allow flexibility in the location of the processing components.
Claim 21 is rejected under 35 U.S.C. 103 as being unpatentable over Stubbs et al. (US Patent No. 9,694,494) as applied to claim 1 above, and further in view of Matsuda (US Publication No. 2024/0100695).
The teachings of Stubbs have been discussed above. Stubbs fails to specifically teach: (re claim 21) wherein the processor-executable instructions that, when executed by the at least one processor, cause the robot system to optimize, based on the further sensor data, actuation of at least one member of the end effector to increase grasp effectiveness, cause the robot system to optimize, based on the further sensor data, actuation of at least one member of the end effector to grasp the object more tightly.
Matsuda teaches, at paragraph [0170], controlling a grasping force of a robotic hand in accordance with the amount of displacement measured by a slip detection unit. Matsuda further teaches, at paragraph [0008], increasing grasping force is an appropriate response to an object slipping. This would allow such robotic grasping systems to increase a grasping force when a grasped object is slipping so as to hold the object more securely and prevent further slipping.
In view of Matsuda’s teachings, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to include, with the apparatus as taught by Stubbs, (re claim 21) wherein the processor-executable instructions that, when executed by the at least one processor, cause the robot system to optimize, based on the further sensor data, actuation of at least one member of the end effector to increase grasp effectiveness, cause the robot system to optimize, based on the further sensor data, actuation of at least one member of the end effector to grasp the object more tightly, with a reasonable expectation of success, since Matsuda teaches controlling a grasping force of a robotic hand in accordance with the amount of displacement measured by a slip detection unit, and increasing grasping force is an appropriate response to an object slipping. This would allow such robotic grasping systems to increase a grasping force when a grasped object is slipping so as to hold the object more securely and prevent further slipping.
Response to Arguments
Applicant’s arguments, see page 9, filed 2/17/2026, with respect to the double patenting rejections and the claim objections have been fully considered and are persuasive. The double patenting rejections and the claim objections have been withdrawn.
Applicant's arguments filed 2/17/2026 have been fully considered but they are not persuasive.
Applicant remarks, on page 10,
In its rejection of claim 7, the Examiner does not point to any identical disclosure (or teaching or suggestion) of "select[ing] the grasp location as a location of the object relevant to the work objective". By way of this Response, Applicant has copied the features of original claim 7 into independent claim 1 and canceled claim 7. Additionally, Applicant has added greater emphasis on the grasp selection being relevant to the work objective by amending independent claim 1 to recite "select[ing], by the robot controller and from the library of grasp primitives, a grasp primitive based at least in part on both the work objective and at least one three-dimensional shape in the platonic representation of the object." This amendment also necessitated an amendment to claim 9 for consistency.
Stubbs teaches, at column 15, lines 9-15, generating predictably successful grasps based on the primitive shapes that are representing the item. Stubbs further teaches, at column 21, lines 6-18, selecting contact points for the selected grasp based on one or more factors that are being prioritized by the system, such as probability of success, lower energy use, less manipulation of the item being required before and/or after picking the item up, or speed or efficiency of execution.
Applicant remarks, on page 10,
In its rejection of claim 15, the Examiner does not point to any identical
disclosure (or teaching or suggestion) of "adjust[ing] control of the end effector based on the further sensor data [to] cause the robot controller to optimize actuation of at least one member of the end effector to increase grasp effectiveness". By way of this Response, Applicant has copied the features of original claim 15 into independent claim 1 and canceled claim 15. And to add further context to such features, Applicant has also copied in the features of original claim 14 and canceled claim 14.
Stubbs teaches, at column 5, lines 43-57, “The sensing information may also be used as feedback to adjust the grasps used by the end of arm tool 126”. Stubbs further teaches, at column 21, lines 19-38, “At 1212, the process 1200 updates the set of contact points and the probabilities of success. This may include updating based at least in part on the feedback. For example, the feedback may indicate that the one or more contact points, while selected with high probabilities of computed, actual, or simulated success, are not ideal for grasping the item. This may be because the robotic manipulator was unable to grasp the item, the item slipped in the grasp, or any other indicator of success. This real-world feedback can be used to reduce (or increase) the probability of success for the one or more contact points.” The modified probability of success is then used in future iterations at step 1208 of Fig. 12, described at column 21, lines 6-18, to select the contact points determined to have the highest probability of success. This iterative process of determining and using the grasps with the highest probability of success optimizes the grasps over multiple iterations.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SPENCER D PATTON whose telephone number is (571)270-5771. The examiner can normally be reached Monday to Friday 9:00-5:00 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi Tran can be reached at (571)272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SPENCER D PATTON/Primary Examiner, Art Unit 3656