Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
2. This communication is responsive to Application No. 18/794,652 and the claims filed on 8/5/2024.
3. Claims 1-22 are presented for examination.
Information Disclosure Statement
4. The information disclosure statements (IDS) submitted on 8/5/2024, 9/19/2024, 1/16/2025, and 3/24/2025 have been fully considered by the Examiner.
Claim Objections
5. Claims 3 and 17 are objected to because of the following informalities:
Regarding Claim 3, the term “the threshold size” recited in line 2 of claim 3 should read “a threshold size” to avoid a lack of antecedent basis.
Regarding Claim 17, the term “has picked up the number of items greater than the expected number of items” recited in lines 1-2 of claim 17 should read “has picked up a number of items greater than an expected number of items” to avoid a lack of antecedent basis.
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
6. Claims 11, 12, 20, and 21 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding Claim 11, the term “the active measure includes determining to pull, drag, or push the item” recited in lines 2-3 of claim 11 is deemed to be indefinite for failing to particularly point out and distinctly claim the subject matter of the invention. It is unclear to the Examiner on which item the active measure of claim 11 is performed. Dependent claim 11 introduces an item whose weight is determined to exceed a threshold. However, independent claim 1, from which claim 11 depends, discloses the active measure involving a first item and a second item in some capacity, where claim 1 recites “determining to implement an active measure based at least in part on a detected state or condition associated with the one or more items in the workspace, comprising: detecting … that a location adjacent to the destination location comprises a second item having a characteristic satisfying a predefined criteria … wherein performing the active measure comprises: selecting a new destination location at which to place the first item,” recited in lines 13-21 of claim 1. It is unclear to the Examiner whether the item introduced within claim 11 is the first item or the second item previously introduced in claim 1, or a brand-new third item. Said differently, it is unknown on which item the active measure of claim 11 is performed. Claim 1 narrows the items to a first item and a second item, but dependent claim 11 broadens this by referring to “an item” and “the item.” Therefore, claim 11 has been determined to be indefinite. For the sake of compact prosecution, the Examiner interprets the active measure of claim 11 as being performed on the “third” item introduced within claim 11. Support for this interpretation is found in that the active measure of claim 11 is still performed on an item.
Regarding Claim 12, the term “wherein the item is pulled, dragged, or pushed into an exception handling area,” recited in lines 1-2 of claim 12, is deemed to be indefinite for failing to particularly point out and distinctly claim the subject matter of the invention. It is unclear to the Examiner to which item claim 12 refers. Independent claim 1, from which claim 12 depends, introduces a first item and a second item when performing the active measure. However, dependent claim 9, from which claim 12 also depends, introduces items in a logjam. It is unclear to the Examiner to which of these items “the item” of claim 12 refers. Therefore, claim 12 is deemed to be indefinite. For the sake of compact prosecution, the Examiner interprets “the item” of claim 12 to refer to the first item introduced in claim 1. Support for this interpretation is found in that the first item of claim 1 is picked and handled by the first robotic arm.
Regarding Claim 20, the term “a potential collision between the first robotic arm and the second robotic arm” recited in lines 1-2 of claim 20, is deemed to be indefinite for failing to particularly point out and distinctly claim the subject matter of the invention. There is a lack of antecedent basis for the term “the second robotic arm.” Claims 1 and 17, from which claim 20 depends, do not properly introduce a second robotic arm. Thus, it is unclear how the second robotic arm functions within the inventions claimed in claims 1 and 17. Further, dependent claim 19 properly introduces a second robotic arm, and it is further unclear whether there is an error within the claim dependency tree of claim 20 (i.e., whether claim 20 should depend on claim 17 or not).
Regarding Claim 21, the term “a determination that a next operation of the second robotic arm” recited in lines 1-2 of claim 21, is deemed to be indefinite for failing to particularly point out and distinctly claim the subject matter of the invention. There is a lack of antecedent basis for the term “the second robotic arm.” Claims 1 and 18, from which claim 21 depends, do not properly introduce a second robotic arm. Thus, it is unclear how the second robotic arm functions within the inventions claimed in claims 1 and 18. Further, dependent claim 19 properly introduces a second robotic arm, and it is further unclear whether there is an error within the claim dependency tree of claim 21 (i.e., whether claim 21 should depend on claim 18 or not).
Regarding Claim 21, the term “the second robotic arm is to grasp a first item that is further away from the singulation conveyance structure than a second item to be grasped” recited in lines 2-3 of claim 21, is deemed to be indefinite for failing to particularly point out and distinctly claim the subject matter of the invention. Claim 21 introduces a first item and a second item to be grasped. However, independent claim 1, from which claim 21 depends, also introduces a first item and a second item. It is unclear to the Examiner whether the first and second items recited in claim 21 are the same first and second items recited in claim 1, or whether the first and second items of claim 21 are new first and second items different from those in claim 1. Therefore, claim 21 is further deemed to be indefinite. For the sake of compact prosecution, the Examiner interprets the first and second items of claim 21 to be the same first and second items of claim 1.
Claim Rejections - 35 USC § 103
7. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
8. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
9. Claims 1, 3, 4, 7, 14, 16, and 22 are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1, hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1, hereinafter Diankov).
Regarding Claim 1, Mizoguchi teaches a system, comprising: a communication interface ([0039] via “The communication devices 206 can include circuits configured to communicate with external or remote devices via a network.”);
a first robotic arm ([0049] via “The robotic arm system 132 can include a robotic arm assembly 139 and an end effector 140, which includes a vision sensor device 143 and a multi-gripper assembly 141 (“gripper assembly 141”).”) comprising a suction-based end effector ([0055] via “The gripper assembly 141 can include addressable vacuum zones or regions 117a,
117b, 117c (collectively “vacuum regions 117”) defining a gripping zone 125.”), ([0056] via “Referring now to FIG. 5, the vacuum regions 117 can include a group or bank of suction elements 151 (one identified in FIG. 5) through which air is drawn.”); and
a processor coupled to the communication interface ([0035] via “In some embodiments, for example, the robotic system 100 (e.g., at one or more of the units or assemblies and/or robots described above) can include electronic/electrical devices, such as one or more processors 202, … one or more communication devices 206, … or a combination thereof.”), (Note: See Figure 2 of Mizoguchi as well.) and configured to:
receive sensor data via the communication interface, the sensor data including image data associated with a plurality of items present in a workspace ([0023] via “Target object(s) are identified based on captured image data.”); and
use the sensor data to determine and implement a plan to autonomously operate the first robotic arm to pick one or more items from the workspace and place each item singly at a corresponding location ([0025] via “A method for operating a transport robot includes receiving image data representative of a group of objects (e.g., a stack or pile of objects). One or more target objects are identified in the group based on the received image data. Addressable vacuum regions are selected based on the identified one or more target objects. The transport robot is command to cause the selected vacuum regions to hold and transport the identified one or more target objects.”), ([0032] via “An imaging system 160 can provide image data used to monitor operation of components, identify target objects, track objects, or otherwise perform tasks. … A controller 109 can communicate with the imaging system 160 and other components of the robotic system 100. The controller 109 can generate transport plans that include a sequence for picking up and dropping off objects (e.g., illustrated as stable containers), positioning information, order information for picking up objects, order information for dropping off objects, stacking plans (e.g., plans for stacking objects at the drop off zone), re-stacking plans (e.g., plans for re-stacking at least some of the containers at the pickup zone), or combinations thereof.”), wherein determining the plan comprises:
selecting a destination location at which to place a first item ([0045] via “Referring now to FIGS. 1 and 2, the robotic system 100 (via, e.g., the processors 202) can process image data and/or the point cloud to identify the target package 112 of FIG. 1, the start location 114 of
FIG. 1, the task location 116 of FIG. 1, a pose of the target package 112 of FIG. 1, or a combination thereof. … Similarly, the robotic system 100 can capture and analyze an image of another designated area (e.g., a drop location for placing objects on the conveyor belt, a location for placing objects inside the container, or a location on the pallet for stacking purposes) to identify the task location 116.”).
Mizoguchi is silent on wherein implementing the plan comprises: determining to implement an active measure based at least in part on a detected state or condition associated with the one or more items in the workspace, comprising: detecting, based at least in part on the sensor data, that a location adjacent to the destination location comprises a second item having a characteristic satisfying a predefined criteria; and in response to detecting that the adjacent location comprises the second item having the characteristic satisfying the predefined criteria, determining to perform the active measure, wherein performing the active measure comprises: selecting a new destination location at which to place the first item; and updating the plan to operate the first robotic arm to place the first item at the new destination location.
However, Diankov teaches wherein implementing the plan comprises: determining to implement an active measure based at least in part on a detected state or condition associated with the one or more items in the workspace, comprising: detecting, based at least in part on the sensor data, that a location adjacent to the destination location comprises a second item having a characteristic satisfying a predefined criteria ([0056] via “Dynamically deriving the placement location 350 of an object provides increased flexibility and reduced human labor for shipping/packaging environments. The robotic system 100 can use discretized real-time images/depth maps of objects and the pallet (i.e., including the already-placed objects) to test and evaluate different placement locations and/or orientation.”), ([0061] via “When the candidate positions 360 overlap one or more objects already placed at the task location 116, the robotic system 100 can calculate and evaluate a measure of support provided by the already-placed objects. To calculate and evaluate the measure of support, the robotic system
100 can determine heights/contour for the placement area 340 of FIG. 3B in real-time using one or more of the imaging devices 222 of FIG. 2. … Because a vertical position of the ground and/or of the platform (e.g., pallet) surface (e.g., a height of the platform surface above the facility ground surface) is known, the robotic system 100 can use the depth measure to calculate the heights/contour of the exposed top surface(s) of the platform, the placed objects, or a combination thereof.”), (Note: The Examiner interprets the already-placed objects of Diankov as the second item(s) and the task location 116 of Diankov as the destination location. Further, the Examiner interprets the dimensions of the second items as the characteristics.); and
in response to detecting that the adjacent location comprises the second item having the characteristic satisfying the predefined criteria, determining to perform the active measure ([0023] via “In contrast to the traditional systems, the robotic system described herein can (i) identify real-time conditions and/or deviations in received packages and/or other unanticipated errors and (ii) dynamically (e.g., as one or more objects arrive or are identified and/or after initially starting one or more operations, such as a packing operation) derive placement locations of the objects during system operation. In some embodiments, the robotic system can initiate/implement the dynamic derivation of the placement based on a triggering event, such as identification of one or more packaging/manipulation errors (e.g., a collision event or a lost piece event), an unrecognized object (e.g., at the source and/or at the destination), a change in locations/orientations of already-placed packages, and/or occurrence of other dynamic conditions.”),
wherein performing the active measure comprises: selecting a new destination location at which to place the first item ([0062] via “In some embodiments, as illustrated in FIG. 4A, the robotic system 100 can update the discretized platform model 304 to include height measures 402. The robotic system 100 can determine the height measures 402 according to each of the discretized pixels (e.g., the unit pixels 310) in the discretized platform model
304.”), ([0063] via “For each of the candidate positions 360 that overlap one or more of the already-placed objects, the robotic system 100 can evaluate the placement possibility based on the height measures 402. In some embodiments, the robotic system 100 can evaluate the placement possibility based on identifying the highest value of the height measures 402
overlapped in each of the candidate positions 360. The robotic system 100 can further identify other height measures 402 located in each of the candidate positions 360 with the height measures 402 within a limit of a difference threshold relative to the highest measure of the height measures 402. The qualifying cells/pixels can represent locations that can provide support for the stacked object such that the stacked object rests essentially flat/horizontal.”), ([0136] via “If the dimensions and/or the discretized data do not match or differ by measures exceeding the threshold range, the robotic system 100 can abandon the existing packing plan and re-derive the packing plan according to the updated master data and the current conditions (e.g., the remaining packages and/or the initial heights of the placement surface). … In some embodiments, the robotic system 100 can access and adjust the previously determined package groupings according to the current conditions (by, e.g., removing the already-placed objects), retain the previously determined processing order, and re-derive the 2D and 3D placement plans accordingly. For example, the robotic system 100
can use the current condition at the task location 116 as an updated placement surface (e.g., instead of the discretized platform model 304) or identify the current condition as an existing portion of the plan, such as results of previous planning iterations described above. Alternatively, when the remaining number of packages are under a threshold limit, the robotic system 100 can dynamically derive the placement locations as described in detail below.”); and
updating the plan to operate the first robotic arm to place the first item at the new destination location ([0118] via “At block 716, the robotic system 100 can implement the stacking plan for placing the available packages on the platform.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Diankov wherein implementing the plan comprises: determining to implement an active measure based at least in part on a detected state or condition associated with the one or more items in the workspace, comprising: detecting, based at least in part on the sensor data, that a location adjacent to the destination location comprises a second item having a characteristic satisfying a predefined criteria; and in response to detecting that the adjacent location comprises the second item having the characteristic satisfying the predefined criteria, determining to perform the active measure, wherein performing the active measure comprises: selecting a new destination location at which to place the first item; and updating the plan to operate the first robotic arm to place the first item at the new destination location. Doing so dynamically optimizes the placement plan of the first item such that its placement satisfies multiple rules of the destination, as stated by Diankov ([0059] via “The robotic system 100 can evaluate each of the candidate positions 360 according to various parameters/conditions, such as support measure/condition, supported weight in comparison to fragility ratings (e.g., maximum supported weight, such as for packages stacked thereon) of the supporting objects, space/packing implications, or a combination thereof. The robotic system 100 can further evaluate the candidate positions 360 using one or more placement rules, such as collision free requirement, stack stability, customer-specified rules/priorities, package separation requirements or the absence thereof, maximization of total loaded packages, or a combination thereof.”).
Regarding Claim 3, modified reference Mizoguchi teaches the system of claim 1, but is silent on wherein the characteristic of the second item is an item size, and the predefined criteria is the item size being larger than the threshold size.
However, Diankov teaches wherein the characteristic of the second item is an item size, and the predefined criteria is the item size being larger than the threshold size ([0063] via “For each of the candidate positions 360 that overlap one or more of the already-placed objects, the robotic system 100 can evaluate the placement possibility based on the height measures 402. In some embodiments, the robotic system 100 can evaluate the placement possibility based on identifying the highest value of the height measures 402 overlapped in each of the candidate positions 360. The robotic system 100 can further identify other height measures 402 located in each of the candidate positions 360 with the height measures 402
within a limit of a difference threshold relative to the highest measure of the height measures 402. The qualifying cells/pixels can represent locations that can provide support for the stacked object such that the stacked object rests essentially flat/horizontal.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Diankov wherein the characteristic of the second item is an item size, and the predefined criteria is the item size being larger than the threshold size. Doing so ensures that objects stacked on top of each other have sufficient support, as stated above by Diankov.
Regarding Claim 4, modified reference Mizoguchi teaches the system of claim 1, wherein the destination location is a singulation conveyance structure ([0031] via “In some embodiments, the task can include manipulation (e.g., moving and/or reorienting) of a target object or package 112 (e.g., boxes, cases, cages, pallets, etc.) from a start location 114 to a task location 116. For example, the unloading unit 102 (e.g., a devanning robot) can be configured to transfer the target package 112 from a location in a carrier (e.g., a truck) to a location on a conveyor belt. The transfer assembly 104 (e.g., a palletizing robot assembly) can be configured to load packages 112 onto the transport unit 106 or conveyor 120.”), (Note: The Examiner interprets the conveyor of Mizoguchi as the singulation conveyance structure.).
Regarding Claim 7, modified reference Mizoguchi teaches the system of claim 4, wherein the first robotic arm is used to nudge or pull at least one item in the workspace ([0050] via “In some embodiments, the gripper assembly 141 can have addressable regions each selectively capable of drawing in air for providing a vacuum grip. In some modes of operation, only addressable regions proximate to the targeted object(s) draw in air to provide a pressure differential directly between the vacuum gripper device and the targeted object(s). This allows only selected packages (i.e., targeted packages) to be pulled or otherwise secured against the gripper assembly 141 even though other gripping portions of the gripper assembly 141 are adjacent to or contact other packages.”).
Regarding Claim 14, modified reference Mizoguchi teaches the system of claim 1, wherein the processor is configured to determine and implement the plan at least in part by: determining to change one or both of a position and an orientation of at least one of the one or more items based at least in part on three dimensional image data associated with the workspace ([0025] via “A method for operating a transport robot includes receiving image data representative of a group of objects (e.g., a stack or pile of objects). One or more target objects are identified in the group based on the received image data. … The transport robot is command to cause the selected vacuum regions to hold and transport the identified one or more target objects.”), ([0044] via “In some embodiments, for example, the sensors 216 can include one or more imaging devices 222 (e.g., 2-dimensional and/or 3-dimensional imaging devices) configured to detect the surrounding environment.”); and
operating the first robotic arm according to the plan ([0102] via “The robotic system
100 can begin executing the base motion plan based on operating the actuation devices 212
according to the sequence of commands or settings or combination thereof. The robotic system 100 can execute a first set of actions in the base motion plan. For example, the robotic system 100 can operate the actuation devices 212 to place the end-effector 140 at a calculated location and/or orientation about the start location 114 for gripping the target package 112 as illustrated in block 752.”).
Regarding Claim 16, modified reference Mizoguchi teaches the system of claim 1, wherein: the suction-based end effector comprises one or more suction cups ([0078] via “The grippers can include suction elements (e.g., suction tubes, suction cups, sealing member, etc.), sealing member, valve plates, gripper mechanisms, and other fluidic components for providing gripping capability.”); and
the processor is configured to perform a diagnostic test, including by: operating the first robotic arm to move the end effector to a predetermined surface ([0102] via “The robotic system 100 can execute a first set of actions in the base motion plan. For example, the robotic system 100 can operate the actuation devices 212 to place the end-effector 140 at a calculated location and/or orientation about the start location 114 for gripping the target package 112 as illustrated in block 752.”);
causing the first robotic arm to grasp the predetermined surface ([0103] via “The robotic system 100 can operate the actuation devices 212 and vacuum source 221 (FIG. 7) to have the end-effector 140 engage and grip the target package 112.”);
measuring a pressure affected by the one or more suction cups when engaging the predetermined surface ([0047] via “The contact sensors 226 can measure the characteristic that corresponds to a grip of the end-effector (e.g., the gripper) on the target package 112. … For example, the contact measurement can include one or more force, pressure, or torque readings associated with forces associated with gripping the target package 112 by the end-effector. In some embodiments, the contact measurement can include both (1) pressure readings associated with vacuum gripping and (2) force readings (e.g., moment readings) associated with carrying object(s).”); and
comparing the pressure affected by the one or more suction cups when engaging the predetermined surface with a preset threshold pressure value ([0047] via “The contact sensors
226 can measure the characteristic that corresponds to a grip of the end-effector (e.g., the gripper) on the target package 112. Accordingly, the contact sensors 226 can output a contact measurement that represents a quantified measurement (e.g., a measured force, torque, position, etc.) corresponding to physical contact, a degree of contact or attachment between the gripper and the target package 112, or other contact characteristics. For example, the contact measurement can include one or more force, pressure, or torque readings associated with forces associated with gripping the target package 112 by the end-effector. In some embodiments, the contact measurement can include both (1) pressure readings associated with vacuum gripping and (2) force readings (e.g., moment readings) associated with carrying object(s).”), ([0105] via “At decision block 712, the robotic system 100 can compare the measured grip to a threshold (e.g., an initial grip threshold). For example, the robotic system 100 can compare the contact or force measurement to a predetermined threshold. … Accordingly, the robotic system 100 can determine whether the contact/grip is sufficient to continue manipulating (e.g., lifting, transferring, and/or reorienting) the target package(s)
112.”).
Regarding Claim 22, Mizoguchi teaches a method, comprising: receiving, by one or more processors ([0036] via “The processors 202 can include data processors (e.g., central processing units (CPUs), special-purpose computers, and/or onboard servers) configured to execute instructions (e.g., software instructions) stored on the storage devices 204 (e.g., computer memory).”), sensor data, the sensor data including image data associated with a plurality of items present in a workspace ([0023] via “Target object(s) are identified based on captured image data.”); and
using the sensor data to determine and implement a plan to autonomously operate a first robotic arm to pick one or more items from the workspace and place each item singly at a corresponding location ([0025] via “A method for operating a transport robot includes receiving image data representative of a group of objects (e.g., a stack or pile of objects). One or more target objects are identified in the group based on the received image data. Addressable vacuum regions are selected based on the identified one or more target objects. The transport robot is command to cause the selected vacuum regions to hold and transport the identified one or more target objects.”), ([0032] via “An imaging system 160 can provide image data used to monitor operation of components, identify target objects, track objects, or otherwise perform tasks. … A controller 109 can communicate with the imaging system 160 and other components of the robotic system 100. The controller 109 can generate transport plans that include a sequence for picking up and dropping off objects (e.g., illustrated as stable containers), positioning information, order information for picking up objects, order information for dropping off objects, stacking plans (e.g., plans for stacking objects at the drop off zone), re-stacking plans (e.g., plans for re-stacking at least some of the containers at the pickup zone), or combinations thereof.”), wherein determining the plan comprises:
selecting a destination location at which to place a first item ([0045] via “Referring now to FIGS. 1 and 2, the robotic system 100 (via, e.g., the processors 202) can process image data and/or the point cloud to identify the target package 112 of FIG. 1, the start location 114 of
FIG. 1, the task location 116 of FIG. 1, a pose of the target package 112 of FIG. 1, or a combination thereof. … Similarly, the robotic system 100 can capture and analyze an image of another designated area (e.g., a drop location for placing objects on the conveyor belt, a location for placing objects inside the container, or a location on the pallet for stacking purposes) to identify the task location 116.”).
Mizoguchi is silent on wherein implementing the plan comprises: determining to implement an active measure based at least in part on a detected state or condition associated with the one or more items in the workspace, comprising: detecting, based at least in part on the sensor data, that a location adjacent to the destination location comprises a second item having a characteristic satisfying a predefined criteria; and in response to detecting that the adjacent location comprises the second item having the characteristic satisfying the predefined criteria, determining to perform the active measure, wherein performing the active measure comprises: selecting a new destination location at which to place the first item; and updating the plan to operate the first robotic arm to place the first item at the new destination location.
However, Diankov teaches wherein implementing the plan comprises: determining to implement an active measure based at least in part on a detected state or condition associated with the one or more items in the workspace, comprising: detecting, based at least in part on the sensor data, that a location adjacent to the destination location comprises a second item having a characteristic satisfying a predefined criteria ([0056] via “Dynamically deriving the placement location 350 of an object provides increased flexibility and reduced human labor for shipping/packaging environments. The robotic system 100 can use discretized real-time images/depth maps of objects and the pallet (i.e., including the already-placed objects) to test and evaluate different placement locations and/or orientation.”), ([0061] via “When the candidate positions 360 overlap one or more objects already placed at the task location 116, the robotic system 100 can calculate and evaluate a measure of support provided by the already-placed objects. To calculate and evaluate the measure of support, the robotic system
100 can determine heights/contour for the placement area 340 of FIG. 3B in real-time using one or more of the imaging devices 222 of FIG. 2. … Because a vertical position of the ground and/or of the platform (e.g., pallet) surface (e.g., a height of the platform surface above the facility ground surface) is known, the robotic system 100 can use the depth measure to calculate the heights/contour of the exposed top surface(s) of the platform, the placed objects, or a combination thereof.”), (Note: The Examiner interprets the already-placed objects of Diankov as the second item(s) and the task location 116 of Diankov as the destination location. Further, the Examiner interprets the dimensions of the second items as the characteristics.); and
in response to detecting that the adjacent location comprises the second item having the characteristic satisfying the predefined criteria, determining to perform the active measure ([0023] via “In contrast to the traditional systems, the robotic system described herein can (i) identify real-time conditions and/or deviations in received packages and/or other unanticipated errors and (ii) dynamically (e.g., as one or more objects arrive or are identified and/or after initially starting one or more operations, such as a packing operation) derive placement locations of the objects during system operation. In some embodiments, the robotic system can initiate/implement the dynamic derivation of the placement based on a triggering event, such as identification of one or more packaging/manipulation errors (e.g., a collision event or a lost piece event), an unrecognized object (e.g., at the source and/or at the destination), a change in locations/orientations of already-placed packages, and/or occurrence of other dynamic conditions.”),
wherein performing the active measure comprises: selecting a new destination location at which to place the first item ([0062] via “In some embodiments, as illustrated in FIG. 4A, the robotic system 100 can update the discretized platform model 304 to include height measures 402. The robotic system 100 can determine the height measures 402 according to each of the discretized pixels (e.g., the unit pixels 310) in the discretized platform model
304.”), ([0063] via “For each of the candidate positions 360 that overlap one or more of the already-placed objects, the robotic system 100 can evaluate the placement possibility based on the height measures 402. In some embodiments, the robotic system 100 can evaluate the placement possibility based on identifying the highest value of the height measures 402
overlapped in each of the candidate positions 360. The robotic system 100 can further identify other height measures 402 located in each of the candidate positions 360 with the height measures 402 within a limit of a difference threshold relative to the highest measure of the height measures 402. The qualifying cells/pixels can represent locations that can provide support for the stacked object such that the stacked object rests essentially flat/horizontal.”), ([0136] via “If the dimensions and/or the discretized data do not match or differ by measures exceeding the threshold range, the robotic system 100 can abandon the existing packing plan and re-derive the packing plan according to the updated master data and the current conditions (e.g., the remaining packages and/or the initial heights of the placement surface). … In some embodiments, the robotic system 100 can access and adjust the previously determined package groupings according to the current conditions (by, e.g., removing the already-placed objects), retain the previously determined processing order, and re-derive the 2D and 3D placement plans accordingly. For example, the robotic system 100
can use the current condition at the task location 116 as an updated placement surface (e.g., instead of the discretized platform model 304) or identify the current condition as an existing portion of the plan, such as results of previous planning iterations described above. Alternatively, when the remaining number of packages are under a threshold limit, the robotic system 100 can dynamically derive the placement locations as described in detail below.”); and
updating the plan to operate the first robotic arm to place the first item at the new destination location ([0118] via “At block 716, the robotic system 100 can implement the stacking plan for placing the available packages on the platform.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Diankov wherein implementing the plan comprises: determining to implement an active measure based at least in part on a detected state or condition associated with the one or more items in the workspace, comprising: detecting, based at least in part on the sensor data, that a location adjacent to the destination location comprises a second item having a characteristic satisfying a predefined criteria; and in response to detecting that the adjacent location comprises the second item having the characteristic satisfying the predefined criteria, determining to perform the active measure, wherein performing the active measure comprises: selecting a new destination location at which to place the first item; and updating the plan to operate the first robotic arm to place the first item at the new destination location. Doing so dynamically optimizes the placement plan of the first item such that its placement satisfies multiple rules of the destination, as stated by Diankov ([0059] via “The robotic system 100 can evaluate each of the candidate positions 360 according to various parameters/conditions, such as support measure/condition, supported weight in comparison to fragility ratings (e.g., maximum supported weight, such as for packages stacked thereon) of the supporting objects, space/packing implications, or a combination thereof. The robotic system 100 can further evaluate the candidate positions 360 using one or more placement rules, such as collision free requirement, stack stability, customer-specified rules/priorities, package separation requirements or the absence thereof, maximization of total loaded packages, or a combination thereof.”).
10. Claim(s) 2 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), and further in view of Li et al. (WO 2020078224 A1 hereinafter Li).
Regarding Claim 2, modified reference Mizoguchi teaches the system of claim 1, but is silent on wherein the updating the plan to operate the first robotic arm to place the first item at the new destination location comprises: updating a distributed data structure to associate the new destination location with the first item, wherein the distributed data structure is used to manage reservations for items placed in a placement area by a plurality of robotic arms.
However, Li teaches wherein the updating the plan to operate the first robotic arm to place the first item at the new destination location comprises: updating a distributed data structure to associate the new destination location with the first item, wherein the distributed data structure is used to manage reservations for items placed in a placement area by a plurality of robotic arms (Page 11 paragraph 13 via “When the container is full, you can trigger the pre-set sensor, the sensor transmits the information to the background scheduling system, thereby closing the grid; and open other grids as a spare replacement grid, and notify the container to package and clear Remove, at the same time, when the container is full, the handling equipment can wait in the nearest area, and then go to the abnormal grid after timeout.”), (Page 12 paragraph 11 via “H. The handling equipment detects whether it reaches the target address through navigation; if it does not reach the destination, it will switch to the above E process; the planned path can be varied, and it can be changed and adjusted according to the real-time cloud scheduling situation.”), (Note: The Examiner interprets the container being full as being reserved, as the term reservation is used in at least paragraph [0067] of the specification of the instant application.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Li wherein the updating the plan to operate the first robotic arm to place the first item at the new destination location comprises: updating a distributed data structure to associate the new destination location with the first item, wherein the distributed data structure is used to manage reservations for items placed in a placement area by a plurality of robotic arms. Doing so provides communication when containers are full such that the robots do not continue to fill it, as stated above by Li on page 11 paragraph 13.
11. Claim(s) 5 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), and further in view of Ban et al. (US 20050075752 A1 hereinafter Ban).
Regarding Claim 5, modified reference Mizoguchi teaches the system of claim 1, but is silent on wherein the processor is further configured to: detect the detected state or condition, including by using the sensor data to detect that an observed flow of items through the workspace has deviated from a predicted flow predicted based on a model of item flow through the workspace; determine that the detected state or condition impedes implementation of a current plan to autonomously operate the first robotic arm to pick the one or more items from the workspace and place each item singly in a corresponding location in the singulation conveyance structure; and in response to the determination, operate to implement the active measure to improve the flow of the items through the workspace.
However, Ban teaches to detect the detected state or condition, including by using the sensor data to detect that an observed flow of items through the workspace has deviated from a predicted flow predicted based on a model of item flow through the workspace ([0046] via “FIG. 6 is a flowchart of the conveyor speed adjusting process executed by the CPU 6a of the robot controller. Firstly, the initial value of the conveyor speed is set and output to the motor drive unit of the motor for driving the feeding conveyor (step 300). A timer for counting a preset time is reset and started (step 301). Until the timer counts and the preset time is up, the numbers M1, M2 of the workpieces 10 processed by the robots RB1, RB2 are counted (steps 302 and 303). When the preset time for the timer is up, the counted numbers M1, M2 of the handled workpieces are compared with each other (step 304). If the number M1 of the workpieces processed by the upstream-side robot RB1 is larger than the number M2 of the workpieces processed by the downstream-side robot RB2, the conveyor speed is increased by a predetermined constant (step 305). If the number of workpieces processed by the downstream-side robot RB2 is larger than the number of workpieces processed by the upstream-side robot RB1, in contrast, it is judged that the conveyor speed is too high, and the conveyor speed is reduced by a predetermined constant (step 306).”);
determine that the detected state or condition impedes implementation of a current plan to autonomously operate the first robotic arm to pick the one or more items from the workspace and place each item singly in a corresponding location in the singulation conveyance structure ([0044] via “When the feeding speed of the feeding conveyor 3 is low as compared with the speed of the workpiece picking-up operation of the robots RB1, RB2, the workpieces 10 conveyed by the conveyor 3 are gripped and picked up by the robot RB1 arranged more upstream and the number of workpieces picked up by the robot RB2 arranged more downstream is decreased. On the other hand, when the feeding speed of the feeding conveyor 3 is increased, the number of workpieces 10 that cannot be picked up by the robot RB1 is increased.”), (Note: See paragraph [0046] of Ban cited above and Figure 6 of Ban as well.); and
in response to the determination, operate to implement the active measure to improve the flow of the items through the workspace ([0045] via “In such a case, it is necessary, therefore, to adjust the feeding speed of the feeding conveyor 3 to be suitable to the operating speed of the robots, thereby to reduce the waiting time of the robots RB1, RB2 and to achieve the optimum feeding speed of the feeding conveyor at which all or most of the workpieces can be picked up. For this purpose, in this embodiment, the robot control unit 4 is also adapted to control the motor speed of the drive source 2 for driving the feeding conveyor 3. … The CPU 6a of the robot controller 6 executes the robot distribution process and the conveyor speed adjusting process using the multitask function thereof.”), ([0046] via “After the conveyor speed command has been output, a signal is evaluated for indicating whether or not the conveyor speed adjusting process is to be completed (step 307). If the process end signal has not been on, the process returns to step 301, and the process described above continues to be executed. On the other hand, if the process end signal has been on, the process exit the above process loop.”), (Note: See Figure 6 of Ban as well.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Ban wherein the processor is further configured to: detect the detected state or condition, including by using the sensor data to detect that an observed flow of items through the workspace has deviated from a predicted flow predicted based on a model of item flow through the workspace; determine that the detected state or condition impedes implementation of a current plan to autonomously operate the first robotic arm to pick the one or more items from the workspace and place each item singly in a corresponding location in the singulation conveyance structure; and in response to the determination, operate to implement the active measure to improve the flow of the items through the workspace. Doing so optimizes the flow of the items such that the robotic arms are able to pick all items present while minimizing wait time, as stated above by Ban in paragraph [0045].
12. Claim(s) 6 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), and further in view of Lei (US 20200290825 A1 hereinafter Lei) and Sekiguchi et al. (US 20200391379 A1 hereinafter Sekiguchi).
Regarding Claim 6, modified reference Mizoguchi teaches the system of claim 4, but is silent on wherein the sensor data includes data associated with an infrared beam break detector and the detected state or condition is detected based at least in part on a determination that an infrared beam of the infrared beam break detector has not been broken at an expected time.
However, Lei teaches wherein the sensor data includes data associated with a beam break detector and the detected state or condition is detected based at least in part on a determination that a beam of the beam break detector has not been broken at an expected time ([0130] via “As a more specific example, the first position sensor 510 can operate with the second position sensor 512 to determine if both are below the object top 318. In other words, the first position sensor 510 and the second position sensor 512 can operate such that one is generating an optical beam while the other is receiving the optical beam. While this optical beam is not broken, the first position reading 518, the second position reading 520, or a combination thereof can indicate that the first position sensor 510, the second position sensor 512, or a combination thereof are above the object top 318. When the optical beam is broken, the first position reading 518, the second position reading 520, or a combination thereof can indicate that the first position sensor 510, the second position sensor 512, or a combination thereof are below the object top 318.”), (Note: See paragraph [0131] of Lei as well.).
Further, Sekiguchi teaches wherein the beam break detector is an infrared beam break detector ([0031] via “As the distance detector 5, a laser range finder (LRF) configured to oscillate and radiate laser light, and perceive the laser light bouncing back from the object 21
is applied as an example. Although it is sufficient if the laser light serving as the incident light 50 is infrared laser light, it is also sufficient if the incident light 50 is laser light such as visible light, ultraviolet light, X-ray, and the like.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Lei wherein the sensor data includes data associated with a beam break detector and the detected state or condition is detected based at least in part on a determination that a beam of the beam break detector has not been broken at an expected time. Doing so locates the sensors relative to the item such that the dimensions of the item can be determined, as stated above by Lei.
In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Sekiguchi wherein the beam break detector is an infrared beam break detector. The courts have set forth, under KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 415-421, 82 USPQ2d 1385, 1395-97 (2007), a number of rationales under which obviousness may be concluded. The rationale that pertains to the present invention is rationale B: Simple Substitution of One Known Element for Another to Obtain Predictable Results. Specifically, in this case item 3 of rationale B is satisfied: a finding that one of ordinary skill in the art could have substituted one known element for another, and that the results of the substitution would have been predictable. Infrared beams are common and well known within the art, particularly given the limited number of light wavelength ranges available to choose from. While the invention of modified reference Mizoguchi in view of Diankov and Lei includes an optical beam sensor that is not expressly described as an infrared beam, the invention would still function in the same manner and produce the same outcomes, and therefore the simple substitution of an infrared beam would have been obvious to implement.
13. Claim(s) 8 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), and further in view of Holopainen et al. (US 20210237260 A1 hereinafter Holopainen).
Regarding Claim 8, modified reference Mizoguchi teaches the system of claim 4, but is silent on wherein the first robotic arm comprises a robotically controlled air blower, and the active measure includes using the air blower to clear a logjam by blowing air on the one or more items in the workspace.
However, Holopainen teaches wherein the first robotic arm comprises a robotically controlled air blower, and the active measure includes using the air blower to clear a logjam by blowing air on the one or more items in the workspace ([0127] via “In the second mode, or the “blow mode”, the air flow through the suction gripper 132 is reversed. Indeed, FIG. 5 shows the blow component 502 in operation and the air flow flowing from the blow component 502
to the suction cup 400. Air is drawn in from the first air inlet 506 and flows through the blow tube 504 to the suction tube 414 and exits at the air hole 412 of the suction cup 400. The positive air pressure exerts a force on a blocking object 514 causing a blockage in the suction tube 414. The force of the positive air flow can push the blocking object 514 out from the suction cup.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Holopainen wherein the first robotic arm comprises a robotically controlled air blower, and the active measure includes using the air blower to clear a logjam by blowing air on the one or more items in the workspace. Doing so uses the structure already within the suction-based end effector to be able to clear blocking items in the workspace, as stated above by Holopainen.
14. Claim(s) 9 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), and further in view of Nemati et al. (US 20190315570 A1 hereinafter Nemati).
Regarding Claim 9, modified reference Mizoguchi teaches the system of claim 4, but is silent on wherein the active measure includes using the first robotic arm to clear a logjam of items impeding the flow of items through the workspace.
However, Nemati teaches wherein the active measure includes using the first robotic arm to clear a logjam of items impeding the flow of items through the workspace ([0103] via “In another example, the arm 1906 partially extends to block the item 1908 for a predetermined time-period which provides sufficient separation distance between the item 1904 and the item 1908. When the item 1904 reaches the scan location or the predetermined time-period after item 1904 passes the arm 1906 expires, the arm 1906
partially retracts to unblock item 1908. Once item 1908 is clear, the arm 1906 then extends fully across the conveyor belt 204 to block the item 1910 from moving past the arm 1906 until the second item 1908 either reaches the scan location or the predetermined time-period after unblocking item 1908 expires. In this non-limiting example, the arm 1912 remains retracted and only arm 1906 extends and retracts to sort/align the items moving down the conveyor belt. In other words, the item alignment arms retract and extend in a given sequence and/or timing to alternatively block and un-block items as they flow down a moving conveyor belt to prevent two items from entering the scan location at the same time.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Nemati wherein the active measure includes using the first robotic arm to clear a logjam of items impeding the flow of items through the workspace. The first robotic arm intervenes as necessary to prevent multiple items from unintentionally being handled at once, improving item flow through the system, as stated above by Nemati.
15. Claim(s) 10 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), and further in view of Kondo et al. (US 20240157404 A1 hereinafter Kondo).
Regarding Claim 10, modified reference Mizoguchi teaches the system of claim 4, but is silent on wherein the active measure includes: operating the first robotic arm to clear debris impeding flow through the workspace.
However, Kondo teaches operating the first robotic arm to clear debris impeding flow through the workspace ([0026] via “Waste material processing system 30 is a system that places waste material 12 on a conveyance surface of conveyance section 22, conveys waste material 12 in the conveyance line, removes foreign matter 14, and recycles waste material
12. As illustrated in FIG. 2, waste material processing system 30 includes conveyance section 22, recognition section 32, detection section 37, removal section 40, and control device 50.”), ([0029] via “Removal section 40 is a device that is disposed in removal area 35
and that executes work for removing foreign matter 14 included in waste material 12. Removal section 40 may include arm robot 41.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Kondo wherein the active measure includes: operating the first robotic arm to clear debris impeding flow through the workspace. Doing so incorporates a system to appropriately remove any unwanted foreign matter within the item flow, as stated by Kondo ([0010] via “In this waste material processing system, it is possible to execute removal of foreign matters more appropriately when recycling the waste material.”).
16. Claim(s) 11 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), and further in view of Wagner et al. '011 (US 20200160011 A1 hereinafter Wagner '011).
Regarding Claim 11, modified reference Mizoguchi teaches the system of claim 1, but is silent on wherein the detected state or condition includes a determination that an item exceeds a prescribed threshold weight and the active measure includes determining to pull, drag, or push the item through at least a portion of the workspace to the corresponding destination location.
However, Wagner ‘011 teaches wherein the detected state or condition includes a determination that an item exceeds a prescribed threshold weight and the active measure includes determining to pull, drag, or push the item through at least a portion of the workspace to the corresponding destination location ([0177] via “In accordance with further aspects of the invention for example, induction systems may be used that may discriminate between objects by passing objects by an air blower that pushes lighter packages from a stream of packages, leaving the heavier packages. The heavier packages' larger inertia overcomes the air resistance arising from the blown air. For lighter packages, the air resistance exceeds the lighter packages' lower inertia.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Wagner ‘011 wherein the detected state or condition includes a determination that an item exceeds a prescribed threshold weight and the active measure includes determining to pull, drag, or push the item through at least a portion of the workspace to the corresponding destination location. Doing so discriminates between certain types of packages such that each package type is transported to its appropriate handling area, as stated above by Wagner ‘011.
17. Claim(s) 12 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), further in view of Nemati et al. (US 20190315570 A1 hereinafter Nemati), and further in view of Blasdel et al. (US 20160257001 A1 hereinafter Blasdel).
Regarding Claim 12, modified reference Mizoguchi teaches the system of claim 9, but is silent on wherein the item is pulled, dragged, or pushed into an exception handling area.
However, Blasdel teaches wherein the item is pulled, dragged, or pushed into an exception handling area ([0042] via “The robotic arm may move 710 across the surface such that it sweeps across the surface and pushes an object (such as the target object) towards the target area, as described in greater detail herein.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Blasdel wherein the item is pulled, dragged, or pushed into an exception handling area. Pushing the item into the exception handling area may accurately place it within the specified target area, as stated by Blasdel ([0038] via “Once the target object 505 has been pushed by the first arm 100a to the first vector 510 defining a first mapped target area 515, the first arm may retract and move away such that a second robotic arm 100b may move in direction D.sub.3 to push the target object to a second vector 512. The second vector 512, along with the first vector 510, the first edge
502a, and the second edge 502b, may define a second mapped target area 520 that is generally smaller in size than the first mapped target area 515. Because the second mapped target area 520 is smaller, it increases the accuracy of the robotic hand in gripping the target object 505. In some embodiments, particularly embodiments where a sensor on the first robotic arm 100a can sense an exact location of contact, the precise location of the target object 505 may be obtained by determining the paths that the target object travels while being pushed by each of the robotic arms 100a, 100b. Such a precise location may ensure a more accurate grip by the robotic hand.”).
18. Claim(s) 13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), and further in view of Iino et al. (US 20210107750 A1 hereinafter Iino).
Regarding Claim 13, modified reference Mizoguchi teaches the system of claim 1, but is silent on wherein performing the active measure includes dragging an item along a top surface of other items present in the workspace.
However, Iino teaches wherein performing the active measure includes dragging an item along a top surface of other items present in the workspace ([0051] via “Then, based on the size information and the position information read from the package identification codes
12a-12f, the tip side of the support plate 310 of the robot hand 31 is positioned slightly below the lower edge of the package 5C, and the suction pad 311 sucks while closely coming in contact with the package 5C. In this way, as shown in FIG. 6, the package 5C is unloaded while pulled out and picked up successively from the top of the stacked packages. Thus, even when the package 5C is stacked tightly and the gaps on top and/or side(s) of the package 5C are small, it is possible to smoothly pull out and unload the package 5C with simple configuration without cost increase.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Iino wherein performing the active measure includes dragging an item along a top surface of other items present in the workspace. Doing so allows for the gripping and singulation of items by the robotic arm even when the items are initially piled in close proximity to each other, as stated above by Iino.
19. Claim(s) 15 and 18 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), and further in view of Lee et al. (US 11478942 B1 hereinafter Lee).
Regarding Claim 15, modified reference Mizoguchi teaches the system of claim 16, wherein: the first robotic arm comprises an end effector ([0030] via “FIG. 1 is an illustration of an example environment in which a robotic system 100 transports objects. The robotic system 100 can include an unloading unit 102, a transfer unit or assembly 104 (“transfer assembly 104”), a transport unit 106, a loading unit 108, or a combination thereof in a warehouse or a distribution/shipping hub.”), ([0031] via “The transfer assembly 104 can include a robotic end effector 140 (“end effector 140”) with vacuum grippers (or vacuum regions) each individually operated to pick up and carry object(s) 112.”); and
the end effector comprises one or more suction cups ([0078] via “The grippers can include suction elements (e.g., suction tubes, suction cups, sealing member, etc.), sealing member, valve plates, gripper mechanisms, and other fluidic components for providing gripping capability.”).
Mizoguchi is silent on wherein the active measure comprises changing one or both of the position and the orientation of the at least one of the one or more items by blowing air out of at least one of the one or more suction cups.
However, Lee teaches wherein the active measure comprises changing one or both of the position and the orientation of the at least one of the one or more items by blowing air out of at least one of the one or more suction cups (Col. 6 lines 3-12, where “The picking assembly
210 may be coupled to a vacuum suction system that may provide vacuum flow or negative air pressure to the individual suction cup assemblies. The negative air pressure may flow through the suction cups coupled to the individual suction cup assemblies, which may provide a force that can be used to grasp and lift items out of a container, off of a conveyor, or from another location. To release an item, for example onto a conveyor belt, the negative air pressure may be reduced and/or positive air pressure may be applied.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Lee wherein the active measure comprises changing one or both of the position and the orientation of the at least one of the one or more items by blowing air out of at least one of the one or more suction cups. Doing so places the item from the suction-based end effector to the appropriate placement location, such as a conveyor belt, as stated above by Lee.
Regarding Claim 18, modified reference Mizoguchi teaches the system of claim 1, wherein the suction-based end effector comprises a plurality of independently actuated sets of suction cups, each set comprising one or more suction cups ([0078] via “The grippers can include suction elements (e.g., suction tubes, suction cups, sealing member, etc.), sealing member, valve plates, gripper mechanisms, and other fluidic components for providing gripping capability.”), ([0131] via “FIG. 20 shows air being sucked into the vacuum regions
117a, 117b, as indicated by arrows, to hold the target objects 812a, 812b against the gripper assembly 141 without drawing a vacuum (or a substantial vacuum) at the other vacuum region 117c. The vacuum level can be increased or decreased to increase or decrease the compression of the compliant panel(s) 412 (one identified).”).
Mizoguchi is silent on wherein the active measure includes operating the respective independently actuated sets of suction cups in a staggered manner to release each item in the grasp singly at a corresponding destination location.
However, Lee teaches wherein the active measure includes operating the respective independently actuated sets of suction cups in a staggered manner to release each item in the grasp singly at a corresponding destination location (Col. 6 lines 37-49, where “At block 278, an order of release may be determined. For example, the controller may determine that a first item is to be released or dropped at a first location, a second item is to be released second (to the same location or a different location), and so forth. At block 280, individual items may be caused to be released in the order of release. For example, the controller may cause the picking assembly 210 to move from a first position to a second position, and may cause the picking assembly 210 to release the first item and the second item in the order of release. In some instances, positive air pressure may be applied to specific suction cup assemblies to release a particular item.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Lee wherein the active measure includes operating the respective independently actuated sets of suction cups in a staggered manner to release each item in the grasp singly at a corresponding destination location. Doing so individually releases each item grabbed by the suction-based end effector to the appropriate placement locations, as stated above by Lee.
20. Claim(s) 17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), and further in view of Wagner et al. '935 (US 20200130935 A1 hereinafter Wagner '935).
Regarding Claim 17, modified reference Mizoguchi teaches the system of claim 1, but is silent on wherein the determination that the first robotic arm has picked up the number of items greater than the expected number of items is further based on one or more of image sensor data, weight data, and tactile sensor data.
However, Wagner ‘935 teaches wherein the determination that the first robotic arm has picked up the number of items greater than the expected number of items is further based on one or more of image sensor data, weight data, and tactile sensor data ([0055] via “In accordance with certain embodiments, systems of the invention provide certain information that is determined to be absolutely correct, or ground truth (GT) information, regarding a variety of parameters through such feedback learning. For example, weight sensors may be used to determine or confirm a multi-object pick occurrence as follows. The weight sensors may be used to determine that more than one object has been grasped, for example, if an item is identified yet the experienced weight is much greater than that of the identified object. … Such noted changes (or events of not having changed) will then be correlated with performance (a successful grasp and acquisition, or a not successful grasp and acquisition), and may be further detailed with regard to whether such gasps repeated over time are generally successful and/or whether the movement (acquisition) repeated over time is generally successful.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Wagner ‘935 wherein the determination that the first robotic arm has picked up the number of items greater than the expected number of items is further based on one or more of image sensor data, weight data, and tactile sensor data. Doing so incorporates a method of detecting whether the end effector picked up more items than expected, and the system can be trained over time to identify these weight differences, resulting in more successful grasps, as stated above by Wagner ‘935.
21. Claim(s) 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), and further in view of Lykkegaard et al. (US 20160199884 A1 hereinafter Lykkegaard).
Regarding Claim 19, modified reference Mizoguchi teaches the system of claim 1, but is silent on wherein: the system includes a second robotic arm and the first robotic arm and the second robotic arm are operated autonomously and independently from one another, each to pick and place items from the workspace and place them singly each in a corresponding single destination location in a manner that achieves combined throughput while avoiding collisions or other interference between the first robotic arm and the second robotic arm.
However, Lykkegaard teaches wherein: the system includes a second robotic arm and the first robotic arm and the second robotic arm are operated autonomously and independently from one another ([0086] via “FIG. 2 illustrates basically the same sorter system as in FIG. 1, and thus the above description of the single parts apply as well for FIG. 2. However, in the embodiment of FIG. 2, there are two robots R1, R2 controlled by respective outputs O1, O2 by the processor PRC.”), each to pick and place items from the workspace and place them singly each in a corresponding single destination location ([0075] via “The processor PRC processes these inputs I_1, I_A in the control algorithm C_A, and in response, the control algorithm C_A generates an output O1 indicating where to place an item picked up by the robot R1. Accordingly, the robot R1 is controlled to pick up an item from the feeding conveyor FC, and to place the item in accordance with the output O1 of the control algorithm.”), ([0088] via “Especially, the two robots R1, R2 may be identical robots. … E.g. the first robot R1 can only handle items up to a certain size, while the second robot R2 can handle larger items, thus in such case, the control algorithm C_A can be arranged to control the robots R1, R2 such that an item larger than a predefined size is skipped by the first robot R1, such that it can be handled by the second robot R2.”) in a manner that achieves combined throughput while avoiding collisions or other interference between the first robotic arm and the second robotic arm ([0087] via “By having only one control algorithm C_A for controlling both robots R1, R2, it is ensured that the control algorithm C_A can take into account the activity of both robots R1, R2, thus ensuring that a collision will not happen, even though the reaching ranges of the two robots R1, R2 overlap. 
Even further, it may be possible to define a control algorithm C_A such that it is possible to provide a synergistic effect between the two robots R1, R2. E.g. the control algorithm C_A may decide that the first robot R1 should singulate incoming bulk items instead of picking and moving them, and thus allow the second R2 to pick and move at a higher speed, since it receives singulated items instead of bulk items.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Lykkegaard wherein: the system includes a second robotic arm and the first robotic arm and the second robotic arm are operated autonomously and independently from one another, each to pick and place items from the workspace and place them singly each in a corresponding single destination location in a manner that achieves combined throughput while avoiding collisions or other interference between the first robotic arm and the second robotic arm. Doing so incorporates communication between the first and second robotic arms such that they are able to effectively plan movements of each arm to ensure collision avoidance between the two arms, as stated above by Lykkegaard in paragraph [0087].
22. Claim(s) 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), further in view of Wagner et al. '935 (US 20200130935 A1 hereinafter Wagner '935), and further in view of Kobayashi et al. (US 20190366541 A1 hereinafter Kobayashi).
Regarding Claim 20, modified reference Mizoguchi teaches the system of claim 17, but is silent on wherein the detected condition or state comprises a potential collision between the first robotic arm and the second robotic arm and the active measure includes delaying a next operation of the first robotic arm until after a movement of the second robotic arm.
However, Kobayashi teaches wherein the detected condition or state comprises a potential collision between the first robotic arm and the second robotic arm ([0032] via “In this embodiment, the ranges of motion of the arms R1 and R2 partially overlap each other because the arms R1 and R2 may sometimes carry out the task in collaboration. Such an overlapping zone constitutes the “interference zone” where the arms R1 and R2 are likely to interfere with each other while the ranges of the motion of the arms other than the interference zone constitute the “non-interference zone”. In this embodiment, while one of the arms performs the one-arm task (the task which is not the collaborative task) in the interference zone, the other arm is supposed to perform a non-interference action or to stand by in the non-interference zone in order to avoid a collision of the arms.”) and the active measure includes delaying a next operation of the first robotic arm until after a movement of the second robotic arm ([0058] via “In the examples of FIGS. 8A and 8B, it is possible to regard the target actions A2 and A3 assigned to the arm R1 as a first task and the target actions B2 and B3 assigned to the arm R2 as a second task. Step S30 may be regarded as the processing to compare the task completion time in the case of delaying the first task (the case of a first task order) for the purpose of avoiding an overlap of the site to carry out the first task (the target actions A2 and A3) and the site to carry out the second task (the target actions B2 and B3) with the task completion time in the case of delaying the second task (the case of a second task order) for the same purpose, and then to adopt the task order that has the earlier completion time.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Kobayashi wherein the detected condition or state comprises a potential collision between the first robotic arm and the second robotic arm and the active measure includes delaying a next operation of the first robotic arm until after a movement of the second robotic arm. Doing so optimizes the scheduling of each robotic arm’s tasks such that the risk for collision is minimized while still allowing each robot to perform its commanded tasks, as stated above by Kobayashi in paragraph [0058].
23. Claim(s) 21 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mizoguchi et al. (US 20210053230 A1 hereinafter Mizoguchi) in view of Diankov et al. (US 20200376670 A1 hereinafter Diankov), further in view of Lee et al. (US 11478942 B1 hereinafter Lee), and further in view of Lykkegaard et al. (US 20160199884 A1 hereinafter Lykkegaard).
Regarding Claim 21, modified reference Mizoguchi teaches the system of claim 18, but is silent on wherein the active measure is based at least in part on a determination that a next operation of the second robotic arm is to grasp a first item that is further away from the singulation conveyance structure than a second item to be grasped next by the first robotic arm.
However, Lykkegaard teaches wherein the active measure is based at least in part on a determination that a next operation of the second robotic arm is to grasp a first item that is further away from the singulation conveyance structure than a second item to be grasped next by the first robotic arm ([0036] via “The method may comprise providing a second robot arranged to pick up an item from the feeding conveyor in accordance with a second output from the control algorithm. Especially, the control algorithm may be arranged to generate the second output in response to an activity of the first robot. This can provide effective cooperation between two robots, e.g. to ensure that the robots do not collide in case their reaching ranges overlap, and e.g. to ensure that the robots do not try to pick up the same item or try to place an item on one place simultaneously.”), ([0087] via “By having only one control algorithm C_A for controlling both robots R1, R2, it is ensured that the control algorithm C_A can take into account the activity of both robots R1, R2, thus ensuring that a collision will not happen, even though the reaching ranges of the two robots R1, R2 overlap. Even further, it may be possible to define a control algorithm C_A such that it is possible to provide a synergistic effect between the two robots R1, R2. E.g. the control algorithm C_A may decide that the first robot R1 should singulate incoming bulk items instead of picking and moving them, and thus allow the second R2 to pick and move at a higher speed, since it receives singulated items instead of bulk items.”), (Note: See Figure 2 of Lykkegaard as well.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Lykkegaard wherein the active measure is based at least in part on a determination that a next operation of the second robotic arm is to grasp a first item that is further away from the singulation conveyance structure than a second item to be grasped next by the first robotic arm. Doing so incorporates communication between the first and second robotic arms such that they are able to effectively plan movements of each arm within a shared, overlapping area to ensure collision avoidance between the two arms, as stated above by Lykkegaard in paragraph [0087].
Examiner’s Note
24. The Examiner has cited particular paragraphs or columns and line numbers in the
references applied to the claims above for the convenience of the Applicant. Although the
specified citations are representative of the teachings of the art and are applied to specific
limitations within the individual claim, other passages and figures may apply as well. It is
respectfully requested that the Applicant, in preparing responses, fully consider the references
in their entirety as potentially teaching all or part of the claimed invention, as well as the
context of the passage as taught by the prior art or disclosed by the Examiner. See MPEP
2141.02 [R-07.2015] VI. A prior art reference must be considered in its entirety, i.e., as a whole,
including portions that would lead away from the claimed invention. W.L. Gore & Associates,
Inc. v. Garlock, Inc., 721 F.2d 1540, 220 USPQ 303 (Fed. Cir. 1983), cert. denied, 469 U.S. 851
(1984). See also MPEP §2123.
Conclusion
25. Any inquiry concerning this communication or earlier communications from the
examiner should be directed to BYRON X KASPER whose telephone number is (571)272-3895.
The examiner can normally be reached Monday - Friday 8 am - 5 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing
using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is
encouraged to use the USPTO Automated Interview Request (AIR) at
http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s
supervisor, Adam Mott can be reached on (571) 270-5376. The fax phone number for the
organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be
obtained from Patent Center. Unpublished application information in Patent Center is available
to registered users. To file and manage patent submissions in Patent Center, visit:
https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for
more information about Patent Center and https://www.uspto.gov/patents/docx for
information about filing in DOCX format. For additional questions, contact the Electronic
Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO
Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BYRON XAVIER KASPER/Examiner, Art Unit 3657
/ADAM R MOTT/Supervisory Patent Examiner, Art Unit 3657