DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Pursuant to communications filed on 05 November 2025, amendments and/or arguments have been submitted and placed in the application file. Claims 2, 7, 12, 13, 27 and 29 have been cancelled; therefore, claims 1, 3-6, 8-11, 14-26, 28 and 30-33 are pending in the instant application.
Response to Arguments
Applicant’s arguments with respect to claims 1-33 have been considered but are moot because the new grounds of rejection set forth below rely on newly found prior art, the application of which was necessitated by Applicant’s amendments to the claims.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1, 3-4 and 8-9 are rejected under 35 U.S.C. 103 as being unpatentable over Moreno et al. (US 2022/0297958 A1, hereinafter Moreno).
Regarding claim 1, Moreno teaches an object processing system (Figure 4C, kitting system 450) comprising:
an input area for receiving an input container (Figure 4C, kitting shelf system 454) that contains a plurality of objects (Figure 4C, object(s) 451) to be processed, the input area including a conveyor section (Figure 4C, conveyor(s) 466, 468) (Figures 1 & 4B-4C; at least as in paragraph 0142, wherein “kitting system 450 includes a kitting shelf system 454 and a robotic arm 452. In some embodiments, a kitting system includes a plurality of kitting shelf systems and/or a plurality of robotic arms 454. The robotic arm may operate autonomously to pick an item (or an object from within an item) from a kitting shelf system and place the item (or object) to a predetermined location. In some embodiments, a robotic arm picks and places one or more items to a predetermined location based at least in part on a plan such as a plan for kitting the one or more items (e.g., to assemble a kit based on an order etc.)”, and further at least as in paragraph 0169, wherein “While a “kitting” operation is shown in FIG. 1 and described herein with reference to FIG. 4C and other Figures, in various embodiments kitting systems (and kitting shelf systems) and integrated systems as disclosed herein may be used to perform the reverse operation, e.g., by stocking shelves, bins, and/or kitting machines with items removed from an initially full or partly full box of items” and additionally as in at least paragraph 0170, wherein “In some embodiments, items on a kitting shelf system, such as kitting shelf system 454, or on a shelf accessed by, or comprised in, a kitting system as disclosed herein, may be bins or trays that comprise objects that are to be “kitted”.”);
an output area including a plurality of destination containers (Figure 4C, receptacle(s) 470, 472) for receiving any of the plurality of objects (Figures 1 & 4B-4C; at least as in paragraphs 0144 wherein “motive force is applied via a controller (not shown in FIG. 4C) to move the carriage 462 and attached robotic arm 454 along the rail or guide 464 to facilitate the automated retrieval of items from one or more kitting shelf systems, and the placement of items (e.g., object 451) in a receptacle (e.g., a box, a tray, etc.), such as receptacle 470 or receptacle 472, as the receptacle is moved along the corresponding conveyor”, and further as in paragraph 0151, wherein “Control computer 482 controls robotic arm 452 to retrieve the item(s) from the corresponding pickup zone(s) and places the item(s) in the receptacle (e.g., receptacle 472 or receptacle 470, as applicable)”);
a programmable motion device (Figure 4C, robotic arm 452) proximate the input area and the output area, the programmable motion device including an end-effector for grasping a selected object of the plurality of objects from the input container (Figures 1 & 4B-4C; at least as in paragraph 0142, wherein “The robotic arm may operate autonomously to pick an item (or an object from within an item) from a kitting shelf system and place the item (or object) to a predetermined location”, and further as in paragraph 0144, wherein “motive force is applied via a controller (not shown in FIG. 4C) to move the carriage 462 and attached robotic arm 454 along the rail or guide 464 to facilitate the automated retrieval of items from one or more kitting shelf systems, and the placement of items (e.g., object 451) in a receptacle (e.g., a box, a tray, etc.), such as receptacle 470 or receptacle 472, as the receptacle is moved along the corresponding conveyor”, and further as in paragraph 0149, wherein “robotic arm 452 has an end effector corresponding to a two-digit gripper. In various embodiments, robotic arm 452 includes one or more other and/or different types of end effectors/retrieval tool, including without limitation a gripper having three or more digits; a gripper having digits with different attributes than as shown, e.g., cushioned digits, smaller digits, larger digits, etc.; and/or a retrieval tool that is not a gripper, such as one configured to pick up items using suction, friction, electrostatic force, magnetic force, etc.”); and
a perception system (Figure 4C, camera 480) for detecting whether any of the plurality of objects that is not associated with the end-effector of the programmable motion device is located on the conveyor section of the input area outside of the input container (Figures 1 & 4B-4C; at least as in paragraph 0156, wherein “kitting system 450 includes a camera 480 (e.g., a video camera) configured to capture images (e.g., video images) of the elements comprised in kiting system 450. Camera 480 may be one of a plurality of sensors that obtains information pertaining to the workspace (e.g., the workspace corresponding to kitting system 100)…camera 480 and/or other cameras and/or other sensors may be used to facilitate robotic arm 452 picking up an item and/or placing the item in its receptacle (e.g., box)”, and further as in paragraph 0179, wherein “In response to a determination that an item is misplaced or dropped, the system assigns a robot or, if needed, a human worker to pick the misplaced item up and place the item back in the applicable kitting shelf system (e.g., on a shelf such as via the feeder portion) or, if available or more optimal, on a receptacle on the conveyor.”);
wherein the programmable motion device is configured to retrieve the detected object from the conveyor section of the input area (at least as in paragraph 0179, wherein “In response to a determination that an item is misplaced or dropped, the system assigns a robot or, if needed, a human worker to pick the misplaced item up and place the item back in the applicable kitting shelf system (e.g., on a shelf such as via the feeder portion) or, if available or more optimal, on a receptacle on the conveyor.”). Examiner notes that Moreno is silent regarding detecting an object explicitly “on a conveyor section” and “outside of the input container”. However, Moreno teaches that computer vision techniques are utilized to “obtain[] information pertaining to the workspace (e.g., the workspace corresponding to kitting system 100)” (wherein the workspace includes at least the kitting shelf section, conveyors and receptacles) and that said computer vision techniques are employed to determine “that an item is misplaced or dropped”. It would therefore have been obvious, if not inherent, to identify one or more misplaced or dropped items anywhere in said workspace, including on the one or more conveyors and inside of or external to one or more input containers (i.e., receptacles), since Moreno teaches that said computer vision techniques are utilized with said robotic kitting system to maximize collective throughput, thereby optimizing the productivity of the robotic fulfillment system.
Regarding claim 3, Moreno further teaches wherein the conveyor section of the input area includes at least a portion of a roller conveyor system (Figures 1 & 4B-4C; at least as in paragraph 0152, wherein “kitting shelf system may comprise any one of a plurality of structures and mechanisms to supply items to an associated pick zone, including without limitation a gravity type conveyor having a plurality of adjacent rollers, a ramp, a conveyor belt, a set of revolving bins, etc.”).
Regarding claim 4, Moreno further teaches wherein the conveyor section of the input area includes at least a portion of a belted conveyor system (Figures 1 & 4B-4C; at least as in paragraph 0152, wherein “kitting shelf system may comprise any one of a plurality of structures and mechanisms to supply items to an associated pick zone, including without limitation a gravity type conveyor having a plurality of adjacent rollers, a ramp, a conveyor belt, a set of revolving bins, etc.”).
Regarding claim 8, Moreno further teaches wherein the perception system further includes any of a camera system and a scanning system for detecting any identifying indicia on any of the plurality of objects that fall toward a portion of a floor of the object processing system (Figures 1, 4C & 8B; at least as in paragraph 0157, wherein “the one or more sensors may comprise a scanner that obtains data pertaining to a label or identifier on the object as the object is moved from the item to the predetermined structure” and further as in paragraph 0179, wherein “In response to a determination that an item is misplaced or dropped, the system assigns a robot or, if needed, a human worker to pick the misplaced item up and place the item back in the applicable kitting shelf system (e.g., on a shelf such as via the feeder portion) or, if available or more optimal, on a receptacle on the conveyor.” and additionally wherein “the identifier may be a barcode, a QR code, a near field communication label, etc., one or more sensors in the workspace such as a sensor (e.g., scanner) at an input to the system may obtain the information pertaining to the identifier”). As noted above with respect to claim 1, Moreno teaches that computer vision techniques are utilized to “obtain[] information pertaining to the workspace (e.g., the workspace corresponding to kitting system 100)” (wherein the workspace includes at least the kitting shelf section, conveyors and receptacles) and that said computer vision techniques are employed to determine “that an item is misplaced or dropped”. Examiner therefore contends that it would have been obvious, if not inherent, to one of ordinary skill in the art at the effective filing date of the instant invention to apply Moreno’s teachings of utilizing one or more sensors to identify an identifier associated with one or more objects within the workspace, including “misplaced or dropped” items, since Moreno teaches that said computer vision techniques are utilized with said robotic kitting system to maximize collective throughput, thereby optimizing the productivity of the robotic fulfillment system.
Regarding claim 9, Moreno further teaches wherein the perception system further includes any of a camera system and a scanning system for detecting a number of any of the plurality of objects that fall toward a floor of the portion of the object processing system (Figures 1, 4C & 8B; at least as in paragraph 0157, wherein “the one or more sensors may comprise a scanner that obtains data pertaining to a label or identifier on the object as the object is moved from the item to the predetermined structure” and further as in paragraph 0179, wherein “In response to a determination that an item is misplaced or dropped, the system assigns a robot or, if needed, a human worker to pick the misplaced item up and place the item back in the applicable kitting shelf system (e.g., on a shelf such as via the feeder portion) or, if available or more optimal, on a receptacle on the conveyor.” and additionally wherein “the identifier may be a barcode, a QR code, a near field communication label, etc., one or more sensors in the workspace such as a sensor (e.g., scanner) at an input to the system may obtain the information pertaining to the identifier”). As noted above with respect to claim 1, Moreno teaches that computer vision techniques are utilized to “obtain[] information pertaining to the workspace (e.g., the workspace corresponding to kitting system 100)” (wherein the workspace includes at least the kitting shelf section, conveyors and receptacles) and that said computer vision techniques are employed to determine “that an item is misplaced or dropped”. Examiner therefore contends that it would have been obvious, if not inherent, to one of ordinary skill in the art at the effective filing date of the instant invention to apply Moreno’s teachings of utilizing one or more sensors to identify an identifier associated with one or more objects within the workspace, including “misplaced or dropped” items, since Moreno teaches that said computer vision techniques are utilized with said robotic kitting system to maximize collective throughput, thereby optimizing the productivity of the robotic fulfillment system.
Claims 5, 10-11, 14-17, 26, 28 and 30-33 are rejected under 35 U.S.C. 103 as being unpatentable over Moreno et al. (US 2022/0297958 A1, hereinafter Moreno) in view of Wagner et al. (US 2020/0160011 A1, hereinafter Wagner).
The teachings of Moreno have been discussed above.
Regarding claim 5, Moreno teaches wherein the kitting system may include weight sensors to determine the weight of one or more items; however, Moreno is silent specifically regarding wherein “the conveyor section” of the input area includes a weight sensing conveyor section on which the input container is received. Wagner, in the same field of endeavor, teaches an object processing system comprising: an input area for receiving a plurality of objects to be processed, the input area including an input weight sensing conveyor section and the plurality of objects being provided within at least one input container (Figures 1, 9, 23-25, 39A-43C and 63-69; at least as in paragraphs 0095-0098, 0124-0131, 0158 and 0185-0191, specifically wherein the system includes one or more weight-sensing conveyor sections); an output area (Figures 23-25, destination sections 228) including a plurality of destination containers (Figures 1, 9 and 23-25, destination bins 230) for receiving any of the plurality of objects; a programmable motion device (Figures 1, 9 and 23-25, programmable motion device 210) proximate the input area and the output area, the programmable motion device including an end-effector for grasping any of the plurality of objects (Figures 1, 9 and 23-26; at least as in paragraphs 0124-0131, wherein “[t]he programmable motion device 210 may include a robotic arm equipped with sensors and computing, that when combined is assumed herein to exhibit the following capabilities: (a) it is able to pick objects up from a singulated stream of objects using, for example, and end effector”); and a perception system (Figures 23-26, perception system 208, 212) for detecting a plurality of objects on the weight sensing conveyor section (Figures 1, 9, 23-26, 39A-43C and 63-69; at least as in paragraphs 0095-0098, 0124-0131, 0158 and 0185-0191, specifically wherein the system includes one or more weight-sensing conveyor sections).
Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Moreno to include Wagner’s teachings of providing a “weight-sensing conveyor section”, since Wagner teaches that providing a “weight-sensing conveyor section” is useful in accurately identifying one or more items to be handled by the programmable motion device, thereby improving the performance and throughput of the object processing system.
Regarding claim 10, Moreno teaches an object processing system comprising:
an input area for receiving a plurality of objects (Figure 4C, object(s) 451) to be processed, the input area including an input weight sensing conveyor section (Figure 4C, conveyor(s) 466, 468) and the plurality of objects being provided within at least one input container (Figure 4C, kitting shelf system 454) (Figures 1 & 4B-4C; at least as in paragraph 0142, wherein “kitting system 450 includes a kitting shelf system 454 and a robotic arm 452. In some embodiments, a kitting system includes a plurality of kitting shelf systems and/or a plurality of robotic arms 454. The robotic arm may operate autonomously to pick an item (or an object from within an item) from a kitting shelf system and place the item (or object) to a predetermined location. In some embodiments, a robotic arm picks and places one or more items to a predetermined location based at least in part on a plan such as a plan for kitting the one or more items (e.g., to assemble a kit based on an order etc.)”, and further at least as in paragraph 0169, wherein “While a “kitting” operation is shown in FIG. 1 and described herein with reference to FIG. 4C and other Figures, in various embodiments kitting systems (and kitting shelf systems) and integrated systems as disclosed herein may be used to perform the reverse operation, e.g., by stocking shelves, bins, and/or kitting machines with items removed from an initially full or partly full box of items” and additionally as in at least paragraph 0170, wherein “In some embodiments, items on a kitting shelf system, such as kitting shelf system 454, or on a shelf accessed by, or comprised in, a kitting system as disclosed herein, may be bins or trays that comprise objects that are to be “kitted”.”);
an output area including a plurality of destination containers (Figure 4C, receptacle(s) 470, 472) for receiving any of the plurality of objects (Figures 1 & 4B-4C; at least as in paragraphs 0144 wherein “motive force is applied via a controller (not shown in FIG. 4C) to move the carriage 462 and attached robotic arm 454 along the rail or guide 464 to facilitate the automated retrieval of items from one or more kitting shelf systems, and the placement of items (e.g., object 451) in a receptacle (e.g., a box, a tray, etc.), such as receptacle 470 or receptacle 472, as the receptacle is moved along the corresponding conveyor”, and further as in paragraph 0151, wherein “Control computer 482 controls robotic arm 452 to retrieve the item(s) from the corresponding pickup zone(s) and places the item(s) in the receptacle (e.g., receptacle 472 or receptacle 470, as applicable)”);
a programmable motion device (Figure 4C, robotic arm 452) proximate the input area and the output area, the programmable motion device including an end-effector for grasping any of the plurality of objects (Figures 1 & 4B-4C; at least as in paragraph 0142, wherein “The robotic arm may operate autonomously to pick an item (or an object from within an item) from a kitting shelf system and place the item (or object) to a predetermined location”, and further as in paragraph 0144, wherein “motive force is applied via a controller (not shown in FIG. 4C) to move the carriage 462 and attached robotic arm 454 along the rail or guide 464 to facilitate the automated retrieval of items from one or more kitting shelf systems, and the placement of items (e.g., object 451) in a receptacle (e.g., a box, a tray, etc.), such as receptacle 470 or receptacle 472, as the receptacle is moved along the corresponding conveyor”, and further as in paragraph 0149, wherein “robotic arm 452 has an end effector corresponding to a two-digit gripper. In various embodiments, robotic arm 452 includes one or more other and/or different types of end effectors/retrieval tool, including without limitation a gripper having three or more digits; a gripper having digits with different attributes than as shown, e.g., cushioned digits, smaller digits, larger digits, etc.; and/or a retrieval tool that is not a gripper, such as one configured to pick up items using suction, friction, electrostatic force, magnetic force, etc.”); and
a perception system (Figure 4C, camera 480) for detecting whether any of the plurality of objects on the weight sensing conveyor section is located outside of the input container (Figures 1 & 4B-4C; at least as in paragraph 0156, wherein “kitting system 450 includes a camera 480 (e.g., a video camera) configured to capture images (e.g., video images) of the elements comprised in kiting system 450. Camera 480 may be one of a plurality of sensors that obtains information pertaining to the workspace (e.g., the workspace corresponding to kitting system 100)…camera 480 and/or other cameras and/or other sensors may be used to facilitate robotic arm 452 picking up an item and/or placing the item in its receptacle (e.g., box)”, and further as in paragraph 0179, wherein “In response to a determination that an item is misplaced or dropped, the system assigns a robot or, if needed, a human worker to pick the misplaced item up and place the item back in the applicable kitting shelf system (e.g., on a shelf such as via the feeder portion) or, if available or more optimal, on a receptacle on the conveyor.”);
wherein the programmable motion device is configured to retrieve the detected object from the weight sensing conveyor section (at least as in paragraph 0179, wherein “In response to a determination that an item is misplaced or dropped, the system assigns a robot or, if needed, a human worker to pick the misplaced item up and place the item back in the applicable kitting shelf system (e.g., on a shelf such as via the feeder portion) or, if available or more optimal, on a receptacle on the conveyor.”). Examiner notes that Moreno is silent regarding detecting an object explicitly “on a conveyor section” and “outside of the input container”. However, Moreno teaches that computer vision techniques are utilized to “obtain[] information pertaining to the workspace (e.g., the workspace corresponding to kitting system 100)” (wherein the workspace includes at least the kitting shelf section, conveyors and receptacles) and that said computer vision techniques are employed to determine “that an item is misplaced or dropped”. It would therefore have been obvious, if not inherent, to identify one or more misplaced or dropped items anywhere in said workspace, including on the one or more conveyors and inside of or external to one or more input containers (i.e., receptacles), since Moreno teaches that said computer vision techniques are utilized with said robotic kitting system to maximize collective throughput, thereby optimizing the productivity of the robotic fulfillment system. That said, Moreno is silent regarding wherein the conveyor section specifically includes a “weight sensing conveyor section”.
Wagner, in the same field of endeavor, teaches an object processing system comprising: an input area for receiving a plurality of objects to be processed, the input area including an input weight sensing conveyor section and the plurality of objects being provided within at least one input container (Figures 1, 9, 23-25, 39A-43C and 63-69; at least as in paragraphs 0095-0098, 0124-0131, 0158 and 0185-0191, specifically wherein the system includes one or more weight-sensing conveyor sections); an output area (Figures 23-25, destination sections 228) including a plurality of destination containers (Figures 1, 9 and 23-25, destination bins 230) for receiving any of the plurality of objects; a programmable motion device (Figures 1, 9 and 23-25, programmable motion device 210) proximate the input area and the output area, the programmable motion device including an end-effector for grasping any of the plurality of objects (Figures 1, 9 and 23-26; at least as in paragraphs 0124-0131, wherein “[t]he programmable motion device 210 may include a robotic arm equipped with sensors and computing, that when combined is assumed herein to exhibit the following capabilities: (a) it is able to pick objects up from a singulated stream of objects using, for example, and end effector”); and a perception system (Figures 23-26, perception system 208, 212) for detecting a plurality of objects on the weight sensing conveyor section (Figures 1, 9, 23-26, 39A-43C and 63-69; at least as in paragraphs 0095-0098, 0124-0131, 0158 and 0185-0191, specifically wherein the system includes one or more weight-sensing conveyor sections).
Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Moreno to include Wagner’s teachings of providing a “weight-sensing conveyor section”, since Wagner teaches that providing a “weight-sensing conveyor section” is useful in accurately identifying one or more items to be handled by the programmable motion device, thereby improving the performance and throughput of the object processing system.
Regarding claim 11, in view of the above combination of Moreno and Wagner, Wagner further teaches wherein the output area includes an output weight sensing conveyor section (Figures 1, 9, 23-26, 39A-43C and 63-69; at least as in paragraphs 0095-0098, 0124-0131, 0158 and 0185-0191, specifically wherein the system includes one or more weight-sensing conveyor sections).
Regarding claim 14, in view of the above combination of Moreno and Wagner, Moreno further teaches wherein at least one of the input weight sensing conveyor section and the output weight sensing conveyor section includes at least a portion of a roller conveyor system (Figures 1 & 4B-4C; at least as in paragraph 0152, wherein “kitting shelf system may comprise any one of a plurality of structures and mechanisms to supply items to an associated pick zone, including without limitation a gravity type conveyor having a plurality of adjacent rollers, a ramp, a conveyor belt, a set of revolving bins, etc.”).
Regarding claim 15, in view of the above combination of Moreno and Wagner, Moreno further teaches wherein at least one of the input weight sensing conveyor section and the output weight sensing conveyor section includes at least a portion of a belted conveyor system (Figures 1 & 4B-4C; at least as in paragraph 0152, wherein “kitting shelf system may comprise any one of a plurality of structures and mechanisms to supply items to an associated pick zone, including without limitation a gravity type conveyor having a plurality of adjacent rollers, a ramp, a conveyor belt, a set of revolving bins, etc.”).
Regarding claim 16, in view of the above combination of Moreno and Wagner, Moreno further teaches wherein the perception system further includes any of a camera system and a scanning system for detecting any identifying indicia on any of the plurality of objects that fall toward a portion of a floor of the object processing system (Figures 1, 4C & 8B; at least as in paragraph 0157, wherein “the one or more sensors may comprise a scanner that obtains data pertaining to a label or identifier on the object as the object is moved from the item to the predetermined structure” and further as in paragraph 0179, wherein “In response to a determination that an item is misplaced or dropped, the system assigns a robot or, if needed, a human worker to pick the misplaced item up and place the item back in the applicable kitting shelf system (e.g., on a shelf such as via the feeder portion) or, if available or more optimal, on a receptacle on the conveyor.” and additionally wherein “the identifier may be a barcode, a QR code, a near field communication label, etc., one or more sensors in the workspace such as a sensor (e.g., scanner) at an input to the system may obtain the information pertaining to the identifier”). As noted above with respect to claim 1, Moreno teaches that computer vision techniques are utilized to “obtain[] information pertaining to the workspace (e.g., the workspace corresponding to kitting system 100)” (wherein the workspace includes at least the kitting shelf section, conveyors and receptacles) and that said computer vision techniques are employed to determine “that an item is misplaced or dropped”. Examiner therefore contends that it would have been obvious, if not inherent, to one of ordinary skill in the art at the effective filing date of the instant invention to apply Moreno’s teachings of utilizing one or more sensors to identify an identifier associated with one or more objects within the workspace, including “misplaced or dropped” items, since Moreno teaches that said computer vision techniques are utilized with said robotic kitting system to maximize collective throughput, thereby optimizing the productivity of the robotic fulfillment system.
Regarding claim 17, in view of the above combination of Moreno and Wagner, Moreno further teaches wherein the perception system further includes any of a camera system and a scanning system for detecting a number of any of the plurality of objects that fall toward the floor of the portion of the object processing system (Figures 1, 4C & 8B; at least as in paragraph 0157, wherein “the one or more sensors may comprise a scanner that obtains data pertaining to a label or identifier on the object as the object is moved from the item to the predetermined structure” and further as in paragraph 0179, wherein “In response to a determination that an item is misplaced or dropped, the system assigns a robot or, if needed, a human worker to pick the misplaced item up and place the item back in the applicable kitting shelf system (e.g., on a shelf such as via the feeder portion) or, if available or more optimal, on a receptacle on the conveyor.” and additionally wherein “the identifier may be a barcode, a QR code, a near field communication label, etc., one or more sensors in the workspace such as a sensor (e.g., scanner) at an input to the system may obtain the information pertaining to the identifier”). As noted above with respect to claim 1, Moreno teaches that computer vision techniques are utilized to “obtain[] information pertaining to the workspace (e.g., the workspace corresponding to kitting system 100)” (wherein the workspace includes at least the kitting shelf section, conveyors and receptacles) and that said computer vision techniques are employed to determine “that an item is misplaced or dropped”. Examiner therefore contends that it would have been obvious, if not inherent, to one of ordinary skill in the art at the effective filing date of the instant invention to apply Moreno’s teachings of utilizing one or more sensors to identify an identifier associated with one or more objects within the workspace, including “misplaced or dropped” items, since Moreno teaches that said computer vision techniques are utilized with said robotic kitting system to maximize collective throughput, thereby optimizing the productivity of the robotic fulfillment system.
Regarding claim 26, Moreno teaches a method of processing objects comprising:
providing a plurality of objects (Figure 4C, object(s) 451) in an input container (Figure 4C, kitting shelf system 454) on a first weight sensing conveyor section (Figures 1 & 4B-4C; at least as in paragraph 0142, wherein “kitting system 450 includes a kitting shelf system 454 and a robotic arm 452. In some embodiments, a kitting system includes a plurality of kitting shelf systems and/or a plurality of robotic arms 454. The robotic arm may operate autonomously to pick an item (or an object from within an item) from a kitting shelf system and place the item (or object) to a predetermined location. In some embodiments, a robotic arm picks and places one or more items to a predetermined location based at least in part on a plan such as a plan for kitting the one or more items (e.g., to assemble a kit based on an order etc.)”, and further at least as in paragraph 0169, wherein “While a “kitting” operation is shown in FIG. 1 and described herein with reference to FIG. 4C and other Figures, in various embodiments kitting systems (and kitting shelf systems) and integrated systems as disclosed herein may be used to perform the reverse operation, e.g., by stocking shelves, bins, and/or kitting machines with items removed from an initially full or partly full box of items” and additionally as in at least paragraph 0170, wherein “In some embodiments, items on a kitting shelf system, such as kitting shelf system 454, or on a shelf accessed by, or comprised in, a kitting system as disclosed herein, may be bins or trays that comprise objects that are to be “kitted”.”);
grasping a selected object of the plurality of objects for movement to a destination container (Figure 4C, receptacle(s) 470, 472) using an end-effector of a programmable motion device (Figure 4C, robotic arm 452) (Figures 1 & 4B-4C; at least as in paragraph 0142, wherein “The robotic arm may operate autonomously to pick an item (or an object from within an item) from a kitting shelf system and place the item (or object) to a predetermined location”, and further as in paragraph 0144, wherein “motive force is applied via a controller (not shown in FIG. 4C) to move the carriage 462 and attached robotic arm 454 along the rail or guide 464 to facilitate the automated retrieval of items from one or more kitting shelf systems, and the placement of items (e.g., object 451) in a receptacle (e.g., a box, a tray, etc.), such as receptacle 470 or receptacle 472, as the receptacle is moved along the corresponding conveyor”, and further as in paragraph 0149, wherein “robotic arm 452 has an end effector corresponding to a two-digit gripper. In various embodiments, robotic arm 452 includes one or more other and/or different types of end effectors/retrieval tool, including without limitation a gripper having three or more digits; a gripper having digits with different attributes than as shown, e.g., cushioned digits, smaller digits, larger digits, etc.; and/or a retrieval tool that is not a gripper, such as one configured to pick up items using suction, friction, electrostatic force, magnetic force, etc.”);
monitoring whether any of the plurality of objects other than the selected object become dropped or displaced by using a perception system to detect whether any of the plurality of objects other than the selected object is located on the first weight sensing conveyor section (Figures 1 & 4B-4C; at least as in paragraph 0156, wherein “kitting system 450 includes a camera 480 (e.g., a video camera) configured to capture images (e.g., video images) of the elements comprised in kiting system 450. Camera 480 may be one of a plurality of sensors that obtains information pertaining to the workspace (e.g., the workspace corresponding to kitting system 100)…camera 480 and/or other cameras and/or other sensors may be used to facilitate robotic arm 452 picking up an item and/or placing the item in its receptacle (e.g., box)”, and further as in paragraph 0179, wherein “In response to a determination that an item is misplaced or dropped, the system assigns a robot or, if needed, a human worker to pick the misplaced item up and place the item back in the applicable kitting shelf system (e.g., on a shelf such as via the feeder portion) or, if available or more optimal, on a receptacle on the conveyor.”); and
retrieving the detected object from the first weight sensing conveyor section (at least as in paragraph 0179, wherein “In response to a determination that an item is misplaced or dropped, the system assigns a robot or, if needed, a human worker to pick the misplaced item up and place the item back in the applicable kitting shelf system (e.g., on a shelf such as via the feeder portion) or, if available or more optimal, on a receptacle on the conveyor.”). Examiner notes that Moreno is silent regarding an object being detected explicitly “on a conveyor section” and “outside of the input container”; however, gleaning from the teachings of Moreno, specifically wherein computer vision techniques are utilized to “obtains information pertaining to the workspace (e.g., the workspace corresponding to kitting system 100)” (wherein the workspace includes at least the kitting shelf section, conveyors, receptacles, etc.) and further wherein said computer vision techniques are employed to determine “that an item is misplaced or dropped”, it would have been obvious, if not inherent, to identify/determine one or more misplaced/dropped items in said workspace, including on the one or more conveyors and/or inside/external to one or more input containers (i.e. receptacles), since Moreno teaches wherein said computer vision techniques are utilized with said robotic kitting system to maximize collective throughput, thereby optimizing productivity of the robotic fulfillment system. That said, Moreno is silent regarding wherein the conveyor section specifically includes a “weight sensing conveyor section”.
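As an editorial illustration only (not part of the record; the function name, object identifiers, and data structures below are hypothetical), the monitoring and retrieval steps mapped above can be sketched as follows: the perception system reports which objects it sees on the conveyor section, and any object other than the selected object is flagged for retrieval.

```python
def find_displaced_objects(detected_on_conveyor: set[str],
                           selected_object: str) -> set[str]:
    """Objects observed by the perception system on the conveyor section,
    other than the selected object, are treated as dropped/displaced and
    flagged for retrieval (per Moreno para. 0179, by a robot or, if
    needed, a human worker)."""
    return detected_on_conveyor - {selected_object}

# Only "sku-2" is out of place; "sku-1" is the object being moved.
assert find_displaced_objects({"sku-1", "sku-2"}, "sku-1") == {"sku-2"}
# Nothing unexpected on the conveyor: no retrieval needed.
assert find_displaced_objects({"sku-1"}, "sku-1") == set()
```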
Wagner, in the same field of endeavor, teaches an object processing system comprising: an input area for receiving a plurality of objects to be processed, the input area including an input weight sensing conveyor section and the plurality of objects being provided within at least one input container (Figures 1, 9, 23-25, 39A-43C and 63-69; at least as in paragraph 0095-0098, 0124-0131, 0158 and 0185-0191, specifically wherein the system includes one or more weight-sensing conveyor sections); an output area (Figures 23-25, destination sections 228) including a plurality of destination containers (Figures 1, 9 and 23-25, destination bins 230) for receiving any of the plurality of objects; a programmable motion device (Figures 1, 9 and 23-25, programmable motion device 210) proximate the input area and the output area, the programmable motion device including an end-effector for grasping any of the plurality of objects (Figures 1, 9 and 23-26; at least as in paragraphs 0124-0131, wherein “[t]he programmable motion device 210 may include a robotic arm equipped with sensors and computing, that when combined is assumed herein to exhibit the following capabilities: (a) it is able to pick objects up from a singulated stream of objects using, for example, and end effector”); and a perception system (Figures 23-26, perception system 208, 212) for detecting a plurality of objects on the weight sensing conveyor section (Figures 1, 9, 23-26, 39A-43C and 63-69; at least as in paragraph 0095-0098, 0124-0131, 0158 and 0185-0191, specifically wherein the system includes one or more weight-sensing conveyor sections). 
Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Moreno to include Wagner’s teachings of providing a “weight-sensing conveyor section”, since Wagner teaches wherein providing a “weight-sensing conveyor section” is useful in accurately identifying one or more items to be handled by the programmable motion device, thereby improving the performance/throughput of the object processing system.
Regarding claim 28, in view of the above combination of Moreno and Wagner, Moreno further teaches wherein the first weight sensing conveyor section includes a belted conveyor section (Figures 1 & 4B-4C; at least as in paragraph 0152, wherein “kitting shelf system may comprise any one of a plurality of structures and mechanisms to supply items to an associated pick zone, including without limitation a gravity type conveyor having a plurality of adjacent rollers, a ramp, a conveyor belt, a set of revolving bins, etc.”).
Regarding claim 30, in view of the above combination of Moreno and Wagner, Moreno further teaches wherein the method further includes detecting any identifying indicia on any of the plurality of objects that fall toward a portion of a floor and for detecting a number of any of the plurality of objects that fall toward the floor (Figures 1, 4C & 8B; at least as in paragraph 0157, wherein “the one or more sensors may comprise a scanner that obtains data pertaining to a label or identifier on the object as the object is moved from the item to the predetermined structure” and further as in paragraph 0179, wherein “In response to a determination that an item is misplaced or dropped, the system assigns a robot or, if needed, a human worker to pick the misplaced item up and place the item back in the applicable kitting shelf system (e.g., on a shelf such as via the feeder portion) or, if available or more optimal, on a receptacle on the conveyor.” and additionally wherein “the identifier may be a barcode, a QR code, a near field communication label, etc., one or more sensors in the workspace such as a sensor (e.g., scanner) at an input to the system may obtain the information pertaining to the identifier”). As noted above with respect to claim 1, Moreno teaches wherein computer vision techniques are utilized to “obtains information pertaining to the workspace (e.g., the workspace corresponding to kitting system 100)” (wherein the workspace includes at least the kitting shelf section, conveyors, receptacles, etc.) 
and further wherein said computer vision techniques are employed to determine “that an item is misplaced or dropped”. Therefore, Examiner contends that it would have been obvious, if not inherent, to one of ordinary skill in the art at the effective filing date of the instant invention to apply Moreno’s teachings of utilizing one or more sensors to identify an identifier associated with one or more objects within a workspace, including “misplaced or dropped” items, since Moreno teaches wherein said computer vision techniques are utilized with said robotic kitting system to maximize collective throughput, thereby optimizing productivity of the robotic fulfillment system.
Regarding claim 31, in view of the above combination of Moreno and Wagner, Moreno further teaches wherein the method further includes detecting at least one characteristic regarding the selected object as the selected object continues to move through a pose-in-hand location (Figures 1, 4C & 8B; at least as in paragraph 0157, wherein “the one or more sensors may comprise a scanner that obtains data pertaining to a label or identifier on the object as the object is moved from the item to the predetermined structure” and further as in paragraph 0179, wherein “In response to a determination that an item is misplaced or dropped, the system assigns a robot or, if needed, a human worker to pick the misplaced item up and place the item back in the applicable kitting shelf system (e.g., on a shelf such as via the feeder portion) or, if available or more optimal, on a receptacle on the conveyor.” and additionally wherein “the identifier may be a barcode, a QR code, a near field communication label, etc., one or more sensors in the workspace such as a sensor (e.g., scanner) at an input to the system may obtain the information pertaining to the identifier”).
Regarding claim 32, in view of the above combination of Moreno and Wagner, Wagner further teaches wherein the method further includes detecting a first weight by the first weight sensing conveyor section, and wherein monitoring whether any of the plurality of objects other than the selected object become dropped or displaced further includes detecting a second weight by a second weight sensing conveyor section (Figures 1, 9, 39A-43C and 63-69; at least as in paragraph 0095-0098, 0158 and 0185-0191, specifically wherein the system includes one or more weight-sensing conveyor sections).
Regarding claim 33, in view of the above combination of Moreno and Wagner, Wagner further teaches wherein the method further includes determining whether any difference between a weight decrease at the first weight sensing conveyor section is within a tolerance of any weight increase at the second weight sensing conveyor section (Figures 1, 9, 39A-43C and 63-69; at least as in paragraph 0095-0098, 0158 and 0185-0191, specifically wherein the system includes one or more weight-sensing conveyor sections).
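As an editorial illustration only (not part of the record; the function name and tolerance value below are hypothetical), the weight-reconciliation logic of claims 32-33 — comparing the weight decrease at the first weight sensing conveyor section against the weight increase at the second, within a tolerance — can be sketched as:

```python
def transfer_reconciled(decrease_at_first: float,
                        increase_at_second: float,
                        tolerance: float = 0.005) -> bool:
    """Return True if the weight lost at the first weight sensing
    conveyor section matches, within a tolerance, the weight gained
    at the second section (i.e., no object was lost in transit)."""
    return abs(decrease_at_first - increase_at_second) <= tolerance

# A 0.250 kg object leaves the first section and arrives at the second.
assert transfer_reconciled(0.250, 0.249)       # within tolerance
assert not transfer_reconciled(0.250, 0.000)   # object lost in transit
```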
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Moreno et al (US 2022/0297958 A1, hereinafter Moreno) as modified by Wagner et al (US 2020/0160011 A1, hereinafter Wagner) above, and further in view of Prechtl et al (US 2021/0046646 A1, hereinafter Prechtl).
The teachings of Moreno and Wagner have been discussed above.
Regarding claim 6, Moreno, as modified by Wagner above, is silent regarding wherein the weight sensing conveyor section is configured to verify that the selected object is displaced from the input container by weighing the input container before and after the end-effector grasps the selected object and comparing a weight pick delta against an expected weight of the selected object.
Prechtl, in the same field of endeavor, teaches a robot system comprising a robot with an end effector, a controller and at least a sensor system for detecting a loading space for handling one or more goods (i.e. objects), and further wherein, when the controller determines via cameras of the sensor system that a good/object protrudes and/or falls beyond a loading space (i.e. overflows, falls towards a floor, falls outside of a loading space, etc.), the end effector of the robot is controlled to move the good/object back into said loading space. Prechtl further teaches wherein the loading space may further include a scale which is positioned below the goods carrier and which can measure the weight of the goods carrier along with the goods/objects located therein, and further identify if a good/object falls from the end effector and/or carrier based on the detected weight change (i.e. weight delta) (Figures 1, 3 & 9-12; at least as in paragraphs 0069, 0075, 0101-0105, 0107 and 0114-0118). Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Moreno as modified by Wagner above, to include Prechtl’s teachings of employing perception techniques to determine if a good/object protrudes and/or has fallen outside of a container when located in a designated loading space and further employing weight calculations to accurately identify one or more objects, since Prechtl teaches wherein such a robotic system employing such object identification techniques provides a more efficient and faster order-picking operation by said robotic system, thereby resulting in a more robust and enhanced robotic system for handling one or more objects.
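As an editorial illustration only (not part of the record; the function name, tolerance value, and example weights below are hypothetical), the pick-verification logic recited in claim 6 — weighing the input container before and after the grasp and comparing the weight pick delta against the expected object weight — can be sketched as:

```python
def pick_verified(weight_before: float,
                  weight_after: float,
                  expected_object_weight: float,
                  tolerance: float = 0.005) -> bool:
    """Weigh the input container before and after the end-effector
    grasps the selected object, and compare the resulting weight pick
    delta against the expected weight of the selected object."""
    pick_delta = weight_before - weight_after
    return abs(pick_delta - expected_object_weight) <= tolerance

# Container weight drops from 1.500 kg to 1.250 kg; expected object
# weight is 0.250 kg, so the pick is verified.
assert pick_verified(1.500, 1.250, 0.250)
# A delta of zero indicates the grasp failed (object not displaced).
assert not pick_verified(1.500, 1.500, 0.250)
```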
Allowable Subject Matter
Claims 18-25 are allowable over the prior art.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See attached PTO-892 – Notice of References Cited form. Examiner additionally notes the following references, which are in the same field of endeavor as the instant invention and also teach some of the currently provided claim limitations:
US 2019/0185267 A1, issued to Mattern et al, which is directed towards an automated control system for a picking cell for picking of goods that includes an object detection device for detecting goods in a supply bin, a gripper for adding/removing goods from said supply bin, and a cell control for evaluating data of the object detection device and for path planning and control of the gripper.
US 2022/0138214 A1, issued to Hattabaugh et al, which is directed towards systems and methods for providing analytics regarding a plurality of object processing systems.
US 2022/0134543 A1, issued to Amend, Jr. et al, which is directed towards an object induction system for assigning handling parameters to an object, including a perception system and a programmable motion device including an end effector for handling said one or more objects.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JONATHAN L SAMPLE whose telephone number is (571)270-5925. The examiner can normally be reached Monday-Friday 7:00am-4:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Adam Mott can be reached at (571)270-5376. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JONATHAN L SAMPLE/Primary Examiner, Art Unit 3657