Prosecution Insights
Last updated: April 19, 2026
Application No. 18/492,444

ROBOTIC SYSTEMS WITH DYNAMIC MOTION PLANNING FOR TRANSFERRING UNREGISTERED OBJECTS

Non-Final OA: §102, §103
Filed: Oct 23, 2023
Examiner: LE, TIEN MINH
Art Unit: 3656
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Mujin Inc.
OA Round: 1 (Non-Final)

Grant Probability: 68% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 12m
Grant Probability with Interview: 92%
Examiner Intelligence

Career Allow Rate: 68% (above average; 55 granted / 81 resolved; +15.9% vs TC avg)
Interview Lift: +23.8% for resolved cases with interview (strong)
Typical Timeline: 2y 12m avg prosecution; 30 currently pending
Career History: 111 total applications across all art units

Statute-Specific Performance

§101: 8.1% (-31.9% vs TC avg)
§103: 51.7% (+11.7% vs TC avg)
§102: 18.5% (-21.5% vs TC avg)
§112: 18.8% (-21.2% vs TC avg)

Tech Center averages are estimates. Based on career data from 81 resolved cases.

Office Action

Rejections under §102 and §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

1. Acknowledgement is made of applicant’s claim for priority to U.S. Provisional Application No. 63/418,637, filed on 10/24/2022.

Claim Rejections - 35 USC § 102

2. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

3. Claims 1-4, 7, 11, and 13-18 are rejected under 35 U.S.C. 102(a)(2)/(a)(1) as being anticipated by Diankov et al. (US 20200130963, hereinafter Diankov).
Regarding claim 1, Diankov teaches a method for operating a robotic system (see at least abstract: “A system and method for operating a robotic system to scan and register unrecognized objects is disclosed.”), the method comprising: receiving sensor data representing a distance between (i) a sensor of the robotic system and (ii) a target object engaged by an end-effector of the robotic system (see at least Figs. 3-4 and [0041]: “In some embodiments, the destination crossing sensor 316 can be used to measure a height of the target object 112 during transfer. For example, the robotic system 100 can determine a gripper height 322 (e.g., a vertical position/location/coordinate of the end-effector 304 relative to a reference point, such as the ground) at the time of an entry event as detected by the destination crossing sensor 316. The robotic system 100 can compare the gripper height 322 to a crossing reference height 324 (e.g., a known vertical position of the destination crossing sensor 316 and/or a reference line/plane thereof) to calculate an object height 320 of the target object 112 that is being transferred. In other words, the destination crossing sensor 316 can act as a trigger that indicates a time when a bottom portion of the target object 112 crosses the sensing line. Accordingly, the robotic system 100 can use the gripper height 322 at such time and the known height of the sensing line to calculate the object height 320 for the target object 112.”); determining a height of the target object based at least in part on the sensor data (see at least Figs. 3-4 and [0041]: “In some embodiments, the destination crossing sensor 316 can be used to measure a height of the target object 112 during transfer…The robotic system 100 can compare the gripper height 322 to a crossing reference height 324 (e.g., a known vertical position of the destination crossing sensor 316 and/or a reference line/plane thereof) to calculate an object height 320 of the target object 112 that is being transferred.”); and updating, based at least in part on the height of the target object, a motion plan for placing the target object at a destination location (see at least Figs. 3-4 and [0037]: “For example, the robotic arm 302 can be configured to pick the objects from the target stack 310 and place them on the conveyor 306 for transport to another destination/task.”; [0045]: “By using a crossing sensor (e.g., the destination crossing sensor 316) to determine the object height 320 during transfer, the robotic system 100 can accurately account (via, e.g., motion planning) for any changes in shapes/dimensions of the objects during transfer…In some embodiments, the robotic system 100 can adjust transport speed, transport acceleration, or a combination thereof according to the actual object height, such as to reduce swaying or pendulating motion of the transferred object. In some embodiments, the robotic system 100 can use the resting object height and/or the transfer object height to register the unrecognized objects.”), wherein the updated motion plan includes commands, settings, or a combination thereof for operating a robotic arm and the end-effector to (i) approach the destination location and (ii) disengage the target object for placing the target object at the destination location (see at least [0045]: “By using a crossing sensor (e.g., the destination crossing sensor 316) to determine the object height 320 during transfer, the robotic system 100 can accurately account (via, e.g., motion planning) for any changes in shapes/dimensions of the objects during transfer. Thus, the robotic system 100 can use the actual object height (e.g., height of the object when suspended instead of the resting height) in transferring the objects, thereby reducing/eliminating any collisions that may have occurred due to the changes in the shapes. In some embodiments, the robotic system 100 can adjust transport speed, transport acceleration, or a combination thereof according to the actual object height, such as to reduce swaying or pendulating motion of the transferred object. In some embodiments, the robotic system 100 can use the resting object height and/or the transfer object height to register the unrecognized objects.”; [0088]: “The robotic system 100 can derive the motion plans based on combining the actuator commands/settings in sequence to transfer the recognized objects from the start location 114 to the task location 116. The robotic system 100 can further derive the motion plans based on combining the transfer commands/settings with a predetermined sequence of movements and/or corresponding actuator commands/settings to grip and/or release objects.”).

Regarding claim 2, Diankov teaches the limitations of claim 1.
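The crossing-sensor height calculation quoted from Diankov [0041] above reduces to a single subtraction taken at the instant the bottom of the carried object trips the destination crossing sensor. A minimal sketch of that arithmetic; the function and parameter names are hypothetical, not from either specification:

```python
def object_height_at_crossing(gripper_height: float, sense_line_height: float) -> float:
    """Height of a carried object, computed at the crossing event.

    gripper_height: vertical coordinate of the end-effector when the
        object bottom crosses the sense line (Diankov's "gripper height 322").
    sense_line_height: known vertical position of the sensor's sense
        line (Diankov's "crossing reference height 324").
    """
    height = gripper_height - sense_line_height
    if height <= 0:
        # A crossing reported above the gripper cannot be a real object bottom.
        raise ValueError("crossing event above gripper; check sensor placement")
    return height

# Illustrative numbers: gripper at 0.85 m when the bottom crosses a
# sense line fixed at 0.40 m, so the object is about 0.45 m tall.
print(round(object_height_at_crossing(0.85, 0.40), 2))
```

The subtraction is the "difference between the two parameters" that [0049] describes for the height-calculation state.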
Diankov further teaches wherein determining the height of the target object based at least in part on the sensor data includes: determining a first distance between a location of the end-effector and the sensor (see at least Fig. 4B and [0041]: “The robotic system 100 can compare the gripper height 322 to a crossing reference height 324 (e.g., a known vertical position of the destination crossing sensor 316 and/or a reference line/plane thereof) to calculate an object height 320 of the target object 112 that is being transferred. In other words, the destination crossing sensor 316 can act as a trigger that indicates a time when a bottom portion of the target object 112 crosses the sensing line. Accordingly, the robotic system 100 can use the gripper height 322 at such time and the known height of the sensing line to calculate the object height 320 for the target object 112.”); determining, based at least in part on the sensor data, the distance between the sensor and the target object, wherein the distance between the sensor and the target object is a second distance (see at least [0041]: “In other words, the destination crossing sensor 316 can act as a trigger that indicates a time when a bottom portion of the target object 112 crosses the sensing line. Accordingly, the robotic system 100 can use the gripper height 322 at such time and the known height of the sensing line to calculate the object height 320 for the target object 112.”); and determining a difference between the first distance and the second distance (see at least [0041]: “In other words, the destination crossing sensor 316 can act as a trigger that indicates a time when a bottom portion of the target object 112 crosses the sensing line. Accordingly, the robotic system 100 can use the gripper height 322 at such time and the known height of the sensing line to calculate the object height 320 for the target object 112.”; [0049]: “FIG. 4B illustrates a height-calculation state 404 that corresponds to a bottom portion of the target object 112 entering or crossing a sense line 416 (e.g., a line traversed by the laser/optical signal sent/sensed by the destination crossing sensor 316). As described above, the robotic system 100 can obtain the gripper height 322 at the time of the crossing event. Using the gripper height 322 and the crossing reference height 324 of FIG. 3 (e.g., the height of the sense line 416), the robotic system 100 can calculate the object height 320 (e.g., as a difference between the two parameters). During the height-calculation state 404, the robotic system 100 may place the target object 112 outside of (e.g., above and/or below) a scanning zone associated with the scanning sensor 330. Accordingly, the scanning sensor 330 may remain inactive.”).

Regarding claim 3, Diankov teaches the limitations of claim 2. Diankov further teaches wherein determining the first distance includes determining or tracking the location of the end-effector (see at least Fig. 3 and [0041]: “For example, the robotic system 100 can determine a gripper height 322 (e.g., a vertical position/location/coordinate of the end-effector 304 relative to a reference point, such as the ground) at the time of an entry event as detected by the destination crossing sensor 316”).

Regarding claim 4, Diankov teaches the limitations of claim 1. Diankov further teaches wherein updating the motion plan includes determining a release height above the destination location at which the end-effector is to disengage the target object (see at least [0040]: “The height of the sensing line/plane (e.g., a release height) can be for safely releasing/dropping objects without damaging the objects. As an example, the height for the sensing line can be 10 cm or less above the placement location on the conveyor 306. Accordingly, the robotic system 100 can use the crossing event detected by the release point sensor 318 as a trigger to release the carried object from of the end-effector 304.”; [0088]: “In some embodiments, the robotic system 100 can derive the motion plans to release the object based on a triggering signal from the release point sensor 318 of FIG. 3 or 518 of FIG. 5. The release point sensor 318 can be configured to generate the triggering signal when the bottom portion of the transferred object crosses a sensing line/plane that corresponds to a safe release height above the placement surface.”).

Regarding claim 7, Diankov teaches the limitations of claim 1. Diankov further teaches wherein updating the motion plan includes determining a speed at which the robotic arm and the end-effector are to move the target object toward the destination location (see at least [0045]: “By using a crossing sensor (e.g., the destination crossing sensor 316) to determine the object height 320 during transfer, the robotic system 100 can accurately account (via, e.g., motion planning) for any changes in shapes/dimensions of the objects during transfer. Thus, the robotic system 100 can use the actual object height (e.g., height of the object when suspended instead of the resting height) in transferring the objects, thereby reducing/eliminating any collisions that may have occurred due to the changes in the shapes. In some embodiments, the robotic system 100 can adjust transport speed, transport acceleration, or a combination thereof according to the actual object height, such as to reduce swaying or pendulating motion of the transferred object. In some embodiments, the robotic system 100 can use the resting object height and/or the transfer object height to register the unrecognized objects.”).

Regarding claim 11, Diankov teaches the limitations of claim 1.
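The release trigger quoted from Diankov [0040] and [0088] above (release once the object bottom crosses a safe height above the placement surface) and the height-dependent speed adjustment from [0045] can be sketched together. All names, the 10 cm threshold's use as a constant, and the linear speed falloff are illustrative assumptions, not Diankov's implementation:

```python
SAFE_RELEASE_HEIGHT_M = 0.10  # [0040] gives "10 cm or less" as an example

def should_release(object_bottom_height_m: float) -> bool:
    """Mimics the release point sensor: trigger the release once the
    bottom of the carried object is at or below the safe height."""
    return object_bottom_height_m <= SAFE_RELEASE_HEIGHT_M

def transport_speed(object_height_m: float, base_speed: float = 1.0) -> float:
    """Height-dependent speed cap: taller suspended objects sway more,
    so slow the transfer. The linear falloff is an arbitrary choice
    for illustration only."""
    return base_speed / (1.0 + object_height_m)
```

For example, a 0.45 m tall box would be carried at roughly 69% of base speed and released only once its bottom edge drops to within 10 cm of the conveyor surface.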
Diankov further teaches wherein: the commands, the settings, or the combination thereof are first commands, first settings, or a first combination thereof (see at least [0106]: “The robotic system 100 can implement the scanning operations by generating/sending commands, settings, and/or motion plans that operate the robotic arm 302/502 and/or the end-effector 304/504 according to the scanning position(s). For example, the one or more processors 202 of FIG. 2 can determine and communicate commands, settings, and/or motion plans that, when executed by the robotic arm 302/502 and/or the end-effector 304/504, places the end-effector 304/504 at the scanning position. Accordingly, the robotic system 100 can present the target portion of the unrecognized object before the scanning sensor(s). Also, the robotic system 100 can implement the scanning maneuvers after placing the end-effector 304/504 at the scanning position.”); and the updated motion plan further includes second commands, second settings, or a second combination thereof for operating the robotic arm or the end-effector to return the end-effector to a start location directly from a location at which the end-effector disengages the target object for placing the target object at the destination location (see at least Fig. 7 and [0112]: “While or after registering the object, the robotic system 100 can complete the task. For example, the robotic system 100 can stop the scanning operation when the scanning sensor successfully returns the identifier value or the scanned image. The robotic system 100 can then continue implementing the remaining portions of the task and place the unrecognized object at the task location 116. In some embodiments, the robotic system 100 can obtain the scanning result and/or register the transferred object in parallel with the completion of the task.”; [0113]: “In some embodiments, the method 700 can iteratively transfer and register a group of unrecognized objects from one image. Accordingly, after transferring and registering one unknown object, the method 700 can determine a new registration target from amongst the remaining unrecognized objects as illustrated by a feedback path to block 710. In some embodiments, as illustrated by a feedback path to block 702, the method 700 can include reimaging the start location 114 after transferring and registering an unknown object.”).

Regarding claim 13, Diankov teaches the limitations of claim 1. Diankov further teaches wherein: the sensor data is first sensor data (see at least Figs. 3-4 and [0041]: “In some embodiments, the destination crossing sensor 316 can be used to measure a height of the target object 112 during transfer. For example, the robotic system 100 can determine a gripper height 322 (e.g., a vertical position/location/coordinate of the end-effector 304 relative to a reference point, such as the ground) at the time of an entry event as detected by the destination crossing sensor 316.”); and the method further comprises: receiving, while the end-effector approaches the destination location in accordance with the commands, the settings, or the combination thereof, second sensor data representing a second distance between (i) the sensor and (ii) the target object (see at least Figs. 3-4 and [0041]: “In some embodiments, the destination crossing sensor 316 can be used to measure a height of the target object 112 during transfer…The robotic system 100 can compare the gripper height 322 to a crossing reference height 324 (e.g., a known vertical position of the destination crossing sensor 316 and/or a reference line/plane thereof) to calculate an object height 320 of the target object 112 that is being transferred.”); and determining the second distance based at least in part on the second sensor data (see at least [0043]: “Based on the relative locations/arrangements of the destination crossing sensor 316 and the scanning sensor 330, the robotic system 100 can operate the scanning sensor 330 according to information from or associated with the destination crossing sensor 316. For the example illustrated in FIG. 3, the scanning sensor 330 can be located above the destination crossing sensor 316 at a known height and be positioned to scan a region above the task location 116.”; [0045]: “The destination crossing sensor 316 and the scanning sensor 330 (e.g., horizontally facing cameras or ID scanners) can obtain additional data for the unrecognized objects during transfer. As described above, the destination crossing sensor 316 can be used to calculate the object height 320 of the transferred object without any additional maneuvers/movements in transferring the object.”; [0049]: “FIG. 4B illustrates a height-calculation state 404 that corresponds to a bottom portion of the target object 112 entering or crossing a sense line 416 (e.g., a line traversed by the laser/optical signal sent/sensed by the destination crossing sensor 316). As described above, the robotic system 100 can obtain the gripper height 322 at the time of the crossing event. Using the gripper height 322 and the crossing reference height 324 of FIG. 3 (e.g., the height of the sense line 416), the robotic system 100 can calculate the object height 320 (e.g., as a difference between the two parameters). During the height-calculation state 404, the robotic system 100 may place the target object 112 outside of (e.g., above and/or below) a scanning zone associated with the scanning sensor 330. Accordingly, the scanning sensor 330 may remain inactive.”).

Regarding claim 14, Diankov teaches the limitations of claim 1. Diankov further teaches deriving the motion plan, wherein the motion plan includes second commands, second settings, or a second combination thereof for operating the robotic arm and the end-effector to position the target object within a field of view of the sensor such that (i) the target object is positioned above the sensor and (ii) the end-effector is positioned on a side of the target object opposite the sensor (see at least Figs. 3-4 and [0048]: “FIGS. 4A-4D illustrate a sequence (e.g., various example processing states) for height calculation and object scanning in accordance with one or more embodiments introduced here. FIG. 4A illustrates a height-unknown state 402. For the height-unknown state 402, the robotic system 100 of FIG. 1 can operate or execute instructions for operating the robotic arm 302 of FIG. 3 to place the end-effector 304 and the target object 112 horizontally overlapping and above the task location 116. The height-unknown state 402 can precede calculation of the object height 320. Accordingly, the robotic system 100 can lower the target object 112 toward the task location 116 (i.e. the conveyor 306). During the height-unknown state 402, the robotic system 100 may place the target object 112 outside of (e.g., above and/or below) a scanning zone associated with the scanning sensor 330. Accordingly, the scanning sensor 330 may remain inactive.”).

Regarding claim 15, Diankov teaches the limitations of claim 1.
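The processing-state sequence quoted from Diankov [0048]-[0049] above (a height-unknown state while the object is lowered, a height-calculation state once the bottom crosses the sense line, and a scanning sensor that stays inactive until later) can be summarized as a small state machine. This is a sketch under stated assumptions; the state names and transition function are illustrative, not from the specification:

```python
from enum import Enum, auto

class TransferState(Enum):
    HEIGHT_UNKNOWN = auto()      # Fig. 4A: object held above the task location
    HEIGHT_CALCULATION = auto()  # Fig. 4B: bottom crosses the sense line
    SCANNING = auto()            # later states: scanning sensor becomes active

def next_state(state: TransferState, crossed_sense_line: bool,
               height_known: bool) -> TransferState:
    """Advance the transfer sequence. Scanning does not begin until
    the object height has been calculated, mirroring the quoted
    'the scanning sensor 330 may remain inactive' behavior."""
    if state is TransferState.HEIGHT_UNKNOWN and crossed_sense_line:
        return TransferState.HEIGHT_CALCULATION
    if state is TransferState.HEIGHT_CALCULATION and height_known:
        return TransferState.SCANNING
    return state
```

Lowering the object without a crossing event leaves the system in HEIGHT_UNKNOWN; only the crossing event, then a computed height, unlocks scanning.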
Diankov further teaches wherein the target object is an unregistered object having a height initially unknown to the robotic system prior to determining the height of the target object based at least in part on the sensor data (see at least [0011]: “A robotic system (e.g., an integrated system of devices that executes one or more designated tasks) configured in accordance with some embodiments provides enhanced usability and flexibility by autonomously (e.g., automatically with little or no human-operator inputs) scanning and registering previously unknown or unrecognized objects (e.g., packages, boxes, cases, etc.).”; Fig. 7 and [0094]: “The robotic system 100 can calculate the object heights during transfer of the unrecognized objects as described above.”).

Regarding claim 16, Diankov teaches a non-transitory, computer-readable medium having processor instructions stored thereon that, when executed by one or more processors of a robotic system, cause the robotic system to perform a method (see at least Fig. 2 and [0027]: “The processors 202 can include data processors (e.g., central processing units (CPUs), special-purpose computers, and/or onboard servers) configured to execute instructions (e.g. software instructions) stored on the storage devices 204 (e.g., computer memory).”; [0028]: “The storage devices 204 can include non-transitory computer-readable mediums having stored thereon program instructions (e.g., software).”), the method comprising implementing instructions for: determining, based at least in part on sensor data representing a distance between a sensor and a target object engaged by an end-effector of the robotic system, a height of the target object (see at least Figs. 3-4 and [0041]: “In some embodiments, the destination crossing sensor 316 can be used to measure a height of the target object 112 during transfer. For example, the robotic system 100 can determine a gripper height 322 (e.g., a vertical position/location/coordinate of the end-effector 304 relative to a reference point, such as the ground) at the time of an entry event as detected by the destination crossing sensor 316. The robotic system 100 can compare the gripper height 322 to a crossing reference height 324 (e.g., a known vertical position of the destination crossing sensor 316 and/or a reference line/plane thereof) to calculate an object height 320 of the target object 112 that is being transferred. In other words, the destination crossing sensor 316 can act as a trigger that indicates a time when a bottom portion of the target object 112 crosses the sensing line. Accordingly, the robotic system 100 can use the gripper height 322 at such time and the known height of the sensing line to calculate the object height 320 for the target object 112.”); and updating, based at least in part on the height of the target object, a motion plan for placing the target object at a destination location (see at least Figs. 3-4 and [0037]: “For example, the robotic arm 302 can be configured to pick the objects from the target stack 310 and place them on the conveyor 306 for transport to another destination/task.”; [0045]: “By using a crossing sensor (e.g., the destination crossing sensor 316) to determine the object height 320 during transfer, the robotic system 100 can accurately account (via, e.g., motion planning) for any changes in shapes/dimensions of the objects during transfer…In some embodiments, the robotic system 100 can adjust transport speed, transport acceleration, or a combination thereof according to the actual object height, such as to reduce swaying or pendulating motion of the transferred object. In some embodiments, the robotic system 100 can use the resting object height and/or the transfer object height to register the unrecognized objects.”), the updated motion plan including commands, settings, or a combination thereof for operating a robotic arm and the end-effector to (i) approach the destination location and (ii) disengage the target object for placing the target object at the destination location (see at least [0045]: “By using a crossing sensor (e.g., the destination crossing sensor 316) to determine the object height 320 during transfer, the robotic system 100 can accurately account (via, e.g., motion planning) for any changes in shapes/dimensions of the objects during transfer. Thus, the robotic system 100 can use the actual object height (e.g., height of the object when suspended instead of the resting height) in transferring the objects, thereby reducing/eliminating any collisions that may have occurred due to the changes in the shapes. In some embodiments, the robotic system 100 can adjust transport speed, transport acceleration, or a combination thereof according to the actual object height, such as to reduce swaying or pendulating motion of the transferred object. In some embodiments, the robotic system 100 can use the resting object height and/or the transfer object height to register the unrecognized objects.”; [0088]: “The robotic system 100 can derive the motion plans based on combining the actuator commands/settings in sequence to transfer the recognized objects from the start location 114 to the task location 116. The robotic system 100 can further derive the motion plans based on combining the transfer commands/settings with a predetermined sequence of movements and/or corresponding actuator commands/settings to grip and/or release objects.”).

Regarding claim 17, Diankov teaches a robotic system (see at least Fig. 1), comprising: a robotic arm (see at least Fig. 3, [0037]: “The robotic system 100 can include a robotic arm 302 (e.g., an instance of the transfer unit 104 of FIG. 1) that includes an end-effector 304 (e.g., a gripper).”); an end-effector attached to the robotic arm (see at least Fig. 3, [0037]: “The robotic system 100 can include a robotic arm 302 (e.g., an instance of the transfer unit 104 of FIG. 1) that includes an end-effector 304 (e.g., a gripper).”); and a distance sensor having a vertically oriented field of view (see at least Fig. 3 and [0038]: “In some embodiments, the robotic system 100 can include a first imaging sensor 312 and/or a second imaging sensor 314….The second imaging sensor 314 can include one or more 2D and/or 3D sensors, such as cameras and/or depth sensors, configured to image and/or analyze the task location 116….Also, the second imaging sensor 314 can include one or more cameras and/or depth sensors located at one or more known locations above and facing the task location 116 or an associated space. Accordingly, the second imaging sensor 314 can generate imaging data corresponding to one or more top views of the target object 112 at or within a threshold distance from the task location 116.”), wherein the robotic system is configured to: transfer, using the robotic arm and the end-effector, a target object between a source location and a destination location (see at least Figs. 1, 2, and [0037]: “As illustrated in FIG. 3, the start location 114 can have a pallet 308 with a target stack 310 (e.g., a grouping of objects) thereon. The task location 116 for the robotic arm 302 can be a placement location (e.g., a starting/egress point) on a conveyor 306 (e.g., an instance of the transport unit 106 of FIG. 1). For example, the robotic arm 302 can be configured to pick the objects from the target stack 310 and place them on the conveyor 306 for transport to another destination/task.”), and present, using the robotic arm and the end-effector, the target object within the vertically oriented field of view of the distance sensor before placement of the target object at the destination location (see at least Figs. 1-3 and [0038]: “The robotic system 100 can use one or more of the sensors 216 of FIG. 2 in performing the transfer operation with the robotic arm 302….Also, the second imaging sensor 314 can include one or more cameras and/or depth sensors located at one or more known locations above and facing the task location 116 or an associated space. Accordingly, the second imaging sensor 314 can generate imaging data corresponding to one or more top views of the target object 112 at or within a threshold distance from the task location 116.”).

Regarding claim 18, Diankov teaches the limitations of claim 17. Diankov further teaches wherein the distance sensor is positioned at a location between the source location and the destination location (see at least Figs. 1-3 and [0038]: “The robotic system 100 can use one or more of the sensors 216 of FIG. 2 in performing the transfer operation with the robotic arm 302….Also, the second imaging sensor 314 can include one or more cameras and/or depth sensors located at one or more known locations above and facing the task location 116 or an associated space. Accordingly, the second imaging sensor 314 can generate imaging data corresponding to one or more top views of the target object 112 at or within a threshold distance from the task location 116.”).

Claim Rejections - 35 USC § 103

4. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. 5. Claims 5-6 is/are rejected under 35 U.S.C. 103 as being unpatentable over Diankov et al. (US 20200130963, hereinafter Diankov) in view of Diankov et al. (US 20210053216, hereinafter Diankov 216’). Regarding claim 5, Diankov teaches the limitations of claim 4. Diankov further teaches wherein determining the release height include determining the release height based at least in part on one or more properties of the target object (see at least [0088]: “In some embodiments, the robotic system 100 can derive the motion plans to release the object based on a triggering signal from the release point sensor 318 of FIG. 3 or 518 of FIG. 5. The release point sensor 318 can be configured to generate the triggering signal when the bottom portion of the transferred object crosses a sensing line/plane that corresponds to a safe release height above the placement surface.”). Diankov fails to explicitly teach determining the release height based at least in part on one or more properties of the target object. 
However, Diankov 312’ teaches a method and system for a transport robot that determines a release height based at least in part on one or more properties of a target object (see at least [0192]: “For example, when the packages designated for latter release crosses the sensing line/plane before the first/earlier release package, the robotic system 100 can determine whether the first/early release package can be released at a higher height. The robotic system 100 can determine a higher release height based on a height of the end effector 140 at the time of the triggering event. The robotic system 100 can determine the feasibility of the higher release height based on a height and/or a weight of the earlier release package (e.g., the target package 112).”). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Diankov to incorporate the teachings of Diankov 312’ and provide a means to determines a release height based at least in part on one or more properties of a target object, with a reasonable expectation of success, in order to take into consideration a weight of the object when determining what height is safe to drop the object. Regarding claim 6, modified Diankov teaches the limitations of claim 5. Diankov fails to explicitly teach wherein the one or more properties include a weight of the target object. However, Diankov 312’ teaches a method and system for a transport robot wherein one or more properties include a weight of a target object (see at least [0192]: “For example, when the packages designated for latter release crosses the sensing line/plane before the first/earlier release package, the robotic system 100 can determine whether the first/early release package can be released at a higher height. The robotic system 100 can determine a higher release height based on a height of the end effector 140 at the time of the triggering event. 
The robotic system 100 can determine the feasibility of the higher release height based on a height and/or a weight of the earlier release package (e.g., the target package 112).”). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Diankov to incorporate the teachings of Diankov 216’ and provide one or more properties including a weight of a target object, with a reasonable expectation of success, in order to take into consideration a weight of the object when determining what height is safe to drop the object.

Claim Rejections - 35 USC § 103

6. Claims 8-10 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Diankov et al. (US 20200130963, hereinafter Diankov) in view of Cohen et al. (US 20220135347, hereinafter Cohen).

Regarding claim 8, Diankov teaches the limitations of claim 1. Diankov further teaches wherein: the method further comprises deriving the motion plan (see at least Figs. 1, 7, and [0023]: “As described below, the robotic system can derive plans (e.g., placement locations/orientations, sequence for transferring the objects, and/or corresponding motion plans) for placing and/or stacking the objects. Each of the units can be configured to execute a sequence of actions (e.g., operating one or more components therein) to execute a task.”); deriving the motion plan includes precalculating first commands, first settings, or a first combination thereof for operating the robotic arm and the end-effector based at least in part on a height value for the target object (see at least [0029]: “In one or more embodiments, the master data 252 can include registration data 254 for each such object.
The registration data 254 can include a dimension, a shape (e.g., templates for potential poses and/or computer-generated models for recognizing the object in different poses), a color scheme, an image, identification information (e.g., bar codes, quick response (QR) codes, logos, etc., and/or expected locations thereof), an expected weight, other physical/visual characteristics, or a combination thereof for the objects expected to be manipulated by the robotic system 100. In some embodiments, the master data 252 can include manipulation-related information regarding the objects, such as a center-of-mass (CoM) location or an estimate thereof on each of the objects, expected sensor measurements (e.g., for force, torque, pressure, and/or contact measurements) corresponding to one or more actions/maneuvers, or a combination thereof.”); and updating the motion plan includes updating, based at least in part on the height of the target object, the first commands, the first settings, or the first combination thereof to second commands, second settings, or a second combination thereof (see at least [0045]: “By using a crossing sensor (e.g., the destination crossing sensor 316) to determine the object height 320 during transfer, the robotic system 100 can accurately account (via, e.g., motion planning) for any changes in shapes/dimensions of the objects during transfer. Thus, the robotic system 100 can use the actual object height (e.g., height of the object when suspended instead of the resting height) in transferring the objects, thereby reducing/eliminating any collisions that may have occurred due to the changes in the shapes. In some embodiments, the robotic system 100 can adjust transport speed, transport acceleration, or a combination thereof according to the actual object height, such as to reduce swaying or pendulating motion of the transferred object. 
In some embodiments, the robotic system 100 can use the resting object height and/or the transfer object height to register the unrecognized objects.”; [0088]: “The robotic system 100 can derive the motion plans based on combining the actuator commands/settings in sequence to transfer the recognized objects from the start location 114 to the task location 116. The robotic system 100 can further derive the motion plans based on combining the transfer commands/settings with a predetermined sequence of movements and/or corresponding actuator commands/settings to grip and/or release objects.”). Diankov fails to explicitly teach deriving a motion plan based at least in part on a maximum possible height value for a target object and/or a minimum possible height value for the target object. However, Cohen teaches a system and method for automated packing and processing that derives a motion plan based at least in part on a maximum possible height value for a target object and/or a minimum possible height value for the target object (see at least [0101]: “The pose-in-hand routine determines how the unit is held by the gripper. In order to place objects into boxes and to later pack other objects efficiently, the system knows the pose-in-hand of each object while being grasped as discussed herein. The place planner performs searches in six-dimensional space, and these are done off-line to provide pre-computed paths. In an on-line place planning mode, the system reacts at each step to the preceding placements.”; [0104]: “The pack planning routine executes with an order is received, and each order is evaluated for compatibility with a given box size from smallest to largest…If multiple pack plans exist that fit the objects in a box, the pack plan chooses the plan with the smallest maximum object height. 
The pack plan may maintain certain constraints, including a minimum distance between objects, a minimum distance between objects and the container wall, a maximum object height, that only stackable objects are stacked, and that constraints on object size and characteristics are respected.”). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Diankov to incorporate the teachings of Cohen and provide a means to derive a motion plan based at least in part on a maximum possible height value for a target object and/or a minimum possible height value for the target object, with a reasonable expectation of success, in order to take into consideration the maximum or minimum height of the object to maintain certain constraints when picking, placing, and stacking objects safely.

Regarding claim 9, modified Diankov teaches the limitations of claim 8. Diankov further teaches wherein updating the motion plan includes updating the motion plan prior to the robotic system implementing the first commands, the first settings, or the first combination thereof (see at least Fig. 7, [0029], and [0087-0088]: the robotic system precalculates and updates commands based on the registration data including a dimension for the target object for potential poses.).

Regarding claim 10, modified Diankov teaches the limitations of claim 8. Diankov further teaches wherein updating the motion plan includes updating the motion plan while the robotic system implements at least a subset of the first commands, the first settings, or the first combination thereof (see at least [0045]: “By using a crossing sensor (e.g., the destination crossing sensor 316) to determine the object height 320 during transfer, the robotic system 100 can accurately account (via, e.g., motion planning) for any changes in shapes/dimensions of the objects during transfer.
Thus, the robotic system 100 can use the actual object height (e.g., height of the object when suspended instead of the resting height) in transferring the objects, thereby reducing/eliminating any collisions that may have occurred due to the changes in the shapes. In some embodiments, the robotic system 100 can adjust transport speed, transport acceleration, or a combination thereof according to the actual object height, such as to reduce swaying or pendulating motion of the transferred object. In some embodiments, the robotic system 100 can use the resting object height and/or the transfer object height to register the unrecognized objects.”; [0088]: “The robotic system 100 can derive the motion plans based on combining the actuator commands/settings in sequence to transfer the recognized objects from the start location 114 to the task location 116. The robotic system 100 can further derive the motion plans based on combining the transfer commands/settings with a predetermined sequence of movements and/or corresponding actuator commands/settings to grip and/or release objects.”).

Regarding claim 12, Diankov teaches the limitations of claim 11. Diankov further teaches wherein: the method further comprises deriving the motion plan (see at least Figs. 1, 7, and [0023]: “As described below, the robotic system can derive plans (e.g., placement locations/orientations, sequence for transferring the objects, and/or corresponding motion plans) for placing and/or stacking the objects.
Each of the units can be configured to execute a sequence of actions (e.g., operating one or more components therein) to execute a task.”); deriving the motion plan includes: precalculating, based at least in part on a height value for the target object, third commands, third settings, or a third combination thereof for operating the robotic arm and the end-effector to raise the end-effector to a specified height after disengaging the target object for placing the target object at the destination location (see at least [0029]: “In one or more embodiments, the master data 252 can include registration data 254 for each such object. The registration data 254 can include a dimension, a shape (e.g., templates for potential poses and/or computer-generated models for recognizing the object in different poses), a color scheme, an image, identification information (e.g., bar codes, quick response (QR) codes, logos, etc., and/or expected locations thereof), an expected weight, other physical/visual characteristics, or a combination thereof for the objects expected to be manipulated by the robotic system 100. In some embodiments, the master data 252 can include manipulation-related information regarding the objects, such as a center-of-mass (CoM) location or an estimate thereof on each of the objects, expected sensor measurements (e.g., for force, torque, pressure, and/or contact measurements) corresponding to one or more actions/maneuvers, or a combination thereof.”; Fig. 7, [0087-0088], [0113]: the robotic system precalculates commands based on the registration data including a dimension for the target object for potential poses and further processes the recognized object by transferring the objects to the task location 116 (destination location).
After releasing the objects at the task location, the manipulator raises to a specific height and returns to pick up another object at the start location.), and precalculating fourth commands, fourth settings, or a fourth combination thereof for operating the robotic arm and the end-effector to return the end-effector to the start location after raising the end-effector to the specified height (see at least Fig. 7, [0087-0088], and [0113]: the robotic system precalculates commands based on the registration data including a dimension for the target object for potential poses and further processes the recognized object by transferring the objects to the task location 116 (destination location). After releasing the objects at the task location, the manipulator raises to a specific height and returns to pick up another object at the start location.); and updating the motion plan includes updating, based at least in part on the height of the target object, the third commands, the fourth commands, the third settings, and/or the fourth settings to the second commands, the second settings, or the second combination thereof (see at least [0045]: “By using a crossing sensor (e.g., the destination crossing sensor 316) to determine the object height 320 during transfer, the robotic system 100 can accurately account (via, e.g., motion planning) for any changes in shapes/dimensions of the objects during transfer. Thus, the robotic system 100 can use the actual object height (e.g., height of the object when suspended instead of the resting height) in transferring the objects, thereby reducing/eliminating any collisions that may have occurred due to the changes in the shapes. In some embodiments, the robotic system 100 can adjust transport speed, transport acceleration, or a combination thereof according to the actual object height, such as to reduce swaying or pendulating motion of the transferred object.
In some embodiments, the robotic system 100 can use the resting object height and/or the transfer object height to register the unrecognized objects.”; [0088]: “The robotic system 100 can derive the motion plans based on combining the actuator commands/settings in sequence to transfer the recognized objects from the start location 114 to the task location 116. The robotic system 100 can further derive the motion plans based on combining the transfer commands/settings with a predetermined sequence of movements and/or corresponding actuator commands/settings to grip and/or release objects.”). Diankov fails to explicitly teach precalculating, based at least in part on a maximum possible height value for the target object and/or a minimum possible height value for the target object, commands, settings, or a combination thereof for operating the robotic arm and the end-effector. However, Cohen teaches a system and method for automated packing and processing that derives a motion plan including precalculating, based at least in part on a maximum possible height value for the target object and/or a minimum possible height value for the target object, commands, settings, or a combination thereof for operating a robotic arm and an end-effector (see at least [0101]: “The pose-in-hand routine determines how the unit is held by the gripper. In order to place objects into boxes and to later pack other objects efficiently, the system knows the pose-in-hand of each object while being grasped as discussed herein. The place planner performs searches in six-dimensional space, and these are done off-line to provide pre-computed paths. 
In an on-line place planning mode, the system reacts at each step to the preceding placements.”; [0104]: “The pack planning routine executes with an order is received, and each order is evaluated for compatibility with a given box size from smallest to largest…If multiple pack plans exist that fit the objects in a box, the pack plan chooses the plan with the smallest maximum object height. The pack plan may maintain certain constraints, including a minimum distance between objects, a minimum distance between objects and the container wall, a maximum object height, that only stackable objects are stacked, and that constraints on object size and characteristics are respected.”). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Diankov to incorporate the teachings of Cohen and provide a means to derive a motion plan including precalculating, based at least in part on a maximum possible height value for the target object and/or a minimum possible height value for the target object, commands, settings, or a combination thereof for operating a robotic arm and an end-effector, with a reasonable expectation of success, in order to take into consideration the maximum or minimum height of the object to maintain certain constraints when picking, placing, and stacking objects safely.

Claim Rejections - 35 USC § 103

7. Claims 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Diankov et al. (US 20200130963, hereinafter Diankov) in view of Chitta et al. (US 20200047331, hereinafter Chitta).

Regarding claim 19, Diankov teaches the limitations of claim 17. Diankov further teaches wherein: the destination location is positioned at a top surface of rollers of a conveyor (see at least Figs.
1, 3 and [0037]: “The task location 116 for the robotic arm 302 can be a placement location (e.g., a starting/egress point) on a conveyor 306 (e.g., an instance of the transport unit 106 of FIG. 1). For example, the robotic arm 302 can be configured to pick the objects from the target stack 310 and place them on the conveyor 306 for transport to another destination/task.”); and the distance sensor is positioned at the destination location and the rollers of the conveyor (see at least Fig. 3 and [0038]: “Accordingly, the second imaging sensor 314 can generate imaging data corresponding to one or more top views of the target object 112 at or within a threshold distance from the task location 116.”). Diankov fails to explicitly teach the distance sensor is positioned beneath the destination location and the conveyor. However, Chitta teaches a method and apparatus for manipulating boxes using a zoned gripper that comprises a distance sensor positioned beneath a destination location and a conveyor (see at least Fig. 6 and [0042]: “Referring to FIG. 6, a conveyor sensor 300 (e.g., a distance/position sensor such as a laser distance sensor) is mounted underneath the conveyor 50 and configured to measure a height H of a bottom surface of a box 24T relative to the conveyor sensor 300 as a gripper 200 moves the box 24T to the conveyor 50.”). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Diankov to incorporate the teachings of Chitta and provide a distance sensor positioned beneath a destination location and a conveyor, with a reasonable expectation of success, in order to position the sensor in a discreet location under the conveyor and out of the way of the robotic arm.

Regarding claim 20, modified Diankov teaches the limitations of claim 19.
Diankov fails to explicitly teach wherein at least a portion of the vertically oriented field of view of the distance sensor is unobstructed by the rollers of the conveyor. However, Chitta teaches a method and apparatus for manipulating boxes using a zoned gripper wherein at least a portion of a vertically oriented field of view of a distance sensor is unobstructed by rollers of a conveyor (see at least Fig. 6 and [0042]: “Referring to FIG. 6, a conveyor sensor 300 (e.g., a distance/position sensor such as a laser distance sensor) is mounted underneath the conveyor 50 and configured to measure a height H of a bottom surface of a box 24T relative to the conveyor sensor 300 as a gripper 200 moves the box 24T to the conveyor 50.” Chitta teaches a distance sensor mounted underneath conveyor 50 that can measure the height of the bottom surface of box 24T located above the conveyor; thus, the sensor is unobstructed by any rollers on the conveyor in order to view the bottom surface of the box.). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Diankov to incorporate the teachings of Chitta and provide a means wherein at least a portion of a vertically oriented field of view of a distance sensor is unobstructed by rollers of a conveyor, with a reasonable expectation of success, in order to position the sensor in a discreet location under the conveyor and out of the way of the robotic arm while maintaining a view of the object.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Watts (US 20180056512) teaches a method and apparatus for determining a safe trajectory for movement of an object by a robotic system that accounts for a maximum height from which the robotic system is permitted to drop the object.
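By way of illustration only, and not part of the Office Action record: the safe-release concept the rejections draw from Diankov 216’ ([0192], feasibility of a release height based on the object's height and/or weight) and from Watts (a maximum permitted drop height) might be sketched as follows. The function name, units, and the weight-scaling rule are all hypothetical and are not taken from any cited reference.

```python
def determine_release_height(weight_kg: float,
                             trigger_height_m: float,
                             max_drop_m: float = 0.10) -> float:
    """Pick a release height above the placement surface.

    Hypothetical rule: heavier objects are allowed a smaller free-fall
    distance, and the height at which the crossing sensor triggered
    caps the candidate value.
    """
    # Permitted drop shrinks in proportion to weight (illustrative only).
    permitted_drop = max_drop_m / max(1.0, weight_kg)
    # Never release above the sensed crossing height.
    return min(trigger_height_m, permitted_drop)
```

Under this hypothetical rule, a 2 kg package sensed crossing at 0.5 m would be released from 0.05 m above the surface.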
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TIEN MINH LE whose telephone number is (571)272-3903. The examiner can normally be reached Monday to Friday (8:30am-5:30pm eastern time). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi Tran can be reached on (571)272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /T.M.L./Examiner, Art Unit 3656 /KHOI H TRAN/Supervisory Patent Examiner, Art Unit 3656
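As a reading aid for the repeated [0045] citations above (adjusting a precalculated motion plan from the object height measured in transit, e.g., slowing transport to reduce sway), a minimal sketch follows; the plan representation and the slowdown/clearance rule are hypothetical, not drawn from Diankov.

```python
def update_motion_plan(plan: dict, actual_height_m: float) -> dict:
    """Adjust a precalculated plan when the suspended object turns out
    taller than assumed (hypothetical adjustment rule)."""
    assumed = plan["assumed_height_m"]
    if actual_height_m > assumed:
        # Slow transport in proportion to the height increase to reduce sway.
        plan["speed_mps"] *= assumed / actual_height_m
        # Raise clearance by the extra height to avoid collisions.
        plan["clearance_m"] += actual_height_m - assumed
        plan["assumed_height_m"] = actual_height_m
    return plan
```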

Prosecution Timeline

Oct 23, 2023
Application Filed
May 12, 2025
Response after Non-Final Action
Jan 30, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12566070
DETERMINATION APPARATUS AND DETERMINATION METHOD
2y 5m to grant Granted Mar 03, 2026
Patent 12528325
A CONTROL SYSTEM FOR A VEHICLE
2y 5m to grant Granted Jan 20, 2026
Patent 12508704
Marker Detection Apparatus and Robot Teaching System
2y 5m to grant Granted Dec 30, 2025
Patent 12509122
VEHICLE SELECTION DEVICE AND VEHICLE SELECTION METHOD
2y 5m to grant Granted Dec 30, 2025
Patent 12466074
IMAGE PROCESSING METHOD, IMAGE PROCESSING APPARATUS, ROBOT-MOUNTED TRANSFER DEVICE, AND SYSTEM
2y 5m to grant Granted Nov 11, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
68%
Grant Probability
92%
With Interview (+23.8%)
2y 12m
Median Time to Grant
Low
PTA Risk
Based on 81 resolved cases by this examiner. Grant probability derived from career allow rate.
