Prosecution Insights
Last updated: April 19, 2026
Application No. 18/177,578

SYSTEMS AND METHODS FOR ROBOTIC SYSTEM WITH OBJECT HANDLING

Status: Non-Final OA (§103)
Filed: Mar 02, 2023
Examiner: JUNG, JAEWOOK
Art Unit: 3656
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Mujin Inc.
OA Round: 3 (Non-Final)

Grant Probability: 33% (At Risk)
Expected OA Rounds: 3-4
Time to Grant: 2y 8m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 33% (1 granted / 3 resolved; -18.7% vs TC avg) — grants only 33% of cases
Interview Lift: +100.0% (strong; resolved cases with interview)
Avg Prosecution: 2y 8m typical timeline; 27 currently pending
Total Applications: 30 across all art units (career history)

Statute-Specific Performance

§101: 7.9% (-32.1% vs TC avg)
§103: 53.7% (+13.7% vs TC avg)
§102: 14.1% (-25.9% vs TC avg)
§112: 23.2% (-16.8% vs TC avg)

Tech Center averages are estimates. Based on career data from 3 resolved cases.
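The per-statute deltas above are simple percentage-point gaps between the examiner's allowance rate and the Tech Center average. A minimal sketch of that arithmetic, where the `StatuteStats` type and its field names are hypothetical, and the 40.0% TC average is inferred from the figures shown rather than taken from any official USPTO source:

```python
# Sketch of the "vs TC avg" deltas: each delta is the examiner's allowance
# rate minus the Tech Center average for that statute. StatuteStats and its
# fields are hypothetical; 40.0 is inferred from the figures above.
from dataclasses import dataclass

@dataclass
class StatuteStats:
    statute: str
    examiner_rate: float  # examiner allowance rate (%) after this rejection type
    tc_avg_rate: float    # Tech Center average (%) for the same statute

def delta_vs_tc(s: StatuteStats) -> float:
    """Percentage-point gap between the examiner and the TC average."""
    return s.examiner_rate - s.tc_avg_rate

rows = [
    StatuteStats("101", 7.9, 40.0),
    StatuteStats("103", 53.7, 40.0),
    StatuteStats("102", 14.1, 40.0),
    StatuteStats("112", 23.2, 40.0),
]
for r in rows:
    print(f"§{r.statute}: {delta_vs_tc(r):+.1f}% vs TC avg")
```

Run as-is, this reproduces the four deltas listed above.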

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on January 5, 2026 has been entered.

Response to Amendment

This Office action is in response to the amendments filed January 5, 2026. Claims 16-17 and 21 are cancelled. Claims 1, 18-20, and 22 are amended. Claims 1, 3-14, 18-20, and 22 are pending and addressed below.

Response to Arguments

As discussed in the interview with the applicant, clarifying the feature of "translatable gripping fingers" in the independent claims with supporting language from the specification would be sufficient to overcome the current rejection over Bradski. However, upon further search and consideration of the newly amended claims, the examiner relies on features disclosed by Bradski which were not previously relied upon to reject the claims as amended.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 3-12, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over US9333649B1 (Bradski).

Regarding claims 1, 19, and 20, Bradski discloses a computing system, method, and non-transitory computer readable medium (NTCRM), comprising:

a control system configured to communicate with a robot having a robot arm that includes or is attached to an end effector apparatus having a plurality of grippers, each of the plurality of grippers having translatable gripping fingers configured to translate with respect to each other, and to communicate with a camera;

Bradski discloses a control system configured to communicate with a robot having a robot arm that includes or is attached to an end effector apparatus having a plurality of grippers (see Fig. 3D), each of the plurality of grippers having translatable gripping fingers configured to translate with respect to each other (see Figs. 4A-4C, where Bradski discloses in column 14, lines 25-27, that "the bellows in the suction valve 402 may cause the device to be at full extension when the fluid (or powder) is slack and no surface is in contact"), and to communicate with a camera (Fig. 1B of Bradski, where arm-mounted sensor 106 is a camera shown in Fig. 1A).

at least one processing circuit configured, when the robot is in an object handling environment including a source of objects for transfer to one or more destinations within the object handling environment is provided, to perform the following for transferring a plurality of target objects from the source of objects to the one or more destinations:

Bradski discloses the robot in an object handling environment including a source of objects for transfer to one or more destinations within the object handling environment (see Fig. 2A), performing the following for transferring a plurality of target objects (column 14, lines 35-42) from the source of objects to the one or more destinations:

identifying the plurality of target objects from among a plurality of objects in the source of objects, wherein the plurality of target objects include a first target object of the plurality of target objects associated with a first grasping model, and a second target object of the plurality of the target objects associated with a second grasping model, and identifying the plurality of target objects includes selecting the first target object for grasping by the plurality of grippers of the end effector apparatus and the second target object for grasping by the plurality of grippers of the end effector apparatus;

Bradski discloses identifying the plurality of target objects from among a plurality of objects in the source of objects (see Figs. 2A-2C, where the source of objects is the stack of boxes), including a first target object of the plurality of target objects associated with a first grasping model (column 8, lines 41-55), and a second target object of the plurality of the target objects associated with a second grasping model (column 14, lines 35-42), and selecting the first target object for grasping by the plurality of grippers of the end effector apparatus and the second target object for grasping by the plurality of grippers of the end effector apparatus (column 12, lines 57-59).

generating a plurality of arm approach trajectories for the robot arm to approach the plurality of objects;

See Fig. 5 of Bradski, an example flow chart of the robot's logic. Step 506 shows generating the motion path for the gripper to follow, where the path includes the arm approach trajectory prior to grasping the box, chosen from a plurality of generated trajectories (column 19, lines 62-65 of Bradski: "Therefore, in addition to the evaluation of graspable features, approach trajectories, nearby objects, and other factors, the system may also evaluate the possible robotic arm configurations for each potential grasp point.").

generating a plurality of end effector apparatus approach trajectories for the end effector apparatus to approach the plurality of target objects;

See Fig. 5 of Bradski. As the motion path is described to move an object to a drop-off location from 506, an end effector apparatus approach trajectory is also generated at 508, where the grasp points are determined based on the motion path's potential grasp points.

generating a grasp operation for grasping the plurality of target objects with the plurality of grippers of the end effector apparatus;

See Fig. 5 of Bradski, an example flow chart of the robot.
Specifically, 506 and 508 show the steps of generating both the arm approach trajectory and the end effector trajectory by determining a motion path and selecting a grasp point for the end effector, with 510 showing the actual grasping motion.

outputting an arm approach command to control the robot arm according to the plurality of arm approach trajectories to approach the plurality of objects;

See Fig. 6B of Bradski, where the robotic arm 602 is on an approach path towards the object 608 (column 19, lines 16-19).

outputting an end effector apparatus approach command to control the robot arm in the plurality of end effector apparatus approach trajectories to approach the plurality of target objects; and

See Fig. 6C of Bradski. The robotic arm 602 grasps onto the object 608 at a selected grasp point (column 24, lines 33-35).

outputting an end effector apparatus control command to control the plurality of grippers of the end effector apparatus in the grasp operation to grasp the plurality of target objects;

See the rationale above regarding the end effector apparatus approaching the plurality of target objects.

generating one or more destination trajectories for the robot arm to approach the one or more destinations;

See Fig. 5 of Bradski. Step 506 determines a motion path for moving the physical object to a drop-off location, where a destination trajectory would be the path of the movement of the physical object after grasp.

outputting a robot arm control command to control the robot arm according to the one or more destination trajectories; and

See Fig. 6D of Bradski, where a robot arm picks up an object 608 through a determined motion path 614 (column 24, lines 50-52).

outputting an end effector apparatus release command to control the plurality of grippers of the end effector apparatus to release the plurality of target objects at the one or more destinations.

See Fig. 6D of Bradski, where a robot arm picks up an object 608 through a determined motion path 614 and places the object at the drop-off location 612 (column 24, lines 50-53).

wherein outputting the end effector apparatus control command to control the plurality of grippers of the end effector apparatus in the grasp operation to grasp the plurality of target objects includes sequentially grasping the first target object and the second target object with the plurality of grippers prior to controlling the robot arm according to the one or more destination trajectories.

While Bradski does not explicitly disclose sequentially grasping the first target object and the second target object with the plurality of grippers prior to controlling the robot arm according to the one or more destination trajectories, Bradski discloses that "[i]n some embodiments, the gripper could potentially span several boxes or objects and turn on suction for some or all of the covered objects". One of ordinary skill in the art would find it obvious that the system of Bradski is configured to sequentially grasp the first target object and the second target object with the plurality of grippers, as Bradski discloses control over the suction.

Regarding claims 3 and 5, with all of the limitations of claim 1, the computing system further discloses:

wherein determining the one or more destination trajectories of the robot arm is based on an optimized destination trajectory time for the robot arm to travel from the source to one or more destinations, and wherein determining the plurality of end effector approach trajectories is based on an optimized end effector apparatus approach time for the end effector apparatus in the grasp operation to grasp the plurality of target objects.
See column 22, lines 18-22, of Bradski: "For example, an optimizer may accept arbitrary constraints in the form of costs such as: keeping a particular distance away from objects, and to approach a goal position from a given angle among other possibilities.", where optimizing for time would be an example of an arbitrary constraint possible in the system. For example, column 21, lines 65-66, show an example constraint of the quickest movement path, which implies an optimized time for approaching either the object or the destination after grasp.

Regarding claim 4, with all of the limitations of claim 1, the computing system further discloses:

wherein determining the one or more destination trajectories of the robot arm is based on a projected grip stability between the plurality of grippers of the end effector apparatus and the plurality of target objects.

As stated in the previous action, the examiner notes that projected grip stability is interpreted as a measure of slippage and robustness to collision. See column 25, lines 7-15 of Bradski: "In an example embodiment, the system may make use of specifically learned pick up points to yield efficient object specific pick up in any orientation the object might be in. For example, the robotic arm may analyze the physical environment and use various sensors to form a 3D model and subsequently grasp the object at a selected grasp point based on various factors.
Each time this occurs, the robotic arm may evaluate how well the pick point performed (e.g., in terms of slippage) and store that knowledge."

Regarding claims 6 and 7, with all of the limitations of claims 5 and 1, the computing system further discloses:

wherein the optimized end effector apparatus approach time is determined based on an available grasping model for the plurality of target objects, and wherein determining the grasp operation includes determining at least one grasping model from a plurality of available grasping models for use by the end effector apparatus in the grasp operation.

See column 25, lines 32-37 of Bradski: "Subsequently, in run time, when an object is recognized, the stored information may be accessed and the grasp points may be looked up. In other words, the stored information may be used to determine other grasp points for the robotic manipulator to use in picking up additional physical objects from the physical environment."

Regarding claim 8, with all of the limitations of claim 7, the computing system further discloses:

wherein the at least one processing circuit is further configured to determine a rank to each of the plurality of available grasping models according to a projected grip stability of each of the plurality of available grasping models.

In column 24, lines 39-43, Bradski's disclosure states the use of known models to determine a grasping point, showing that there exists a plurality of available grasping models. With the citation above in the previous section, there exists a point of best grasp chosen for an evaluated model, as well for each model the arm has evaluated. See also the citation of column 22, lines 18-22, regarding optimization with respect to accepted arbitrary constraints, and the citation acknowledging slippage as a performance metric.
Given these capabilities in the system, the slippage performance metric can be used as a constraint by the optimizer of the system for establishing the best-fit grasping points.

Regarding claim 9, with all of the limitations of claim 8, the computing system further discloses:

wherein determining the at least one grasping model for use by the end effector apparatus is based on the rank with a highest determined value of the projected grip stability.

See the citations regarding claims 6, 7, and 8. Bradski's disclosure shows capabilities of storing points of grasping and accessing the storage as needed. The citation in claim 4 shows a capability of evaluating performance of a metric which, in this case, would be slippage.

Regarding claim 10, with all of the limitations of claim 1, the computing system further discloses:

wherein the at least one processing circuit is further configured for: generating one or more detection results, each representing a detected object of the plurality of objects in the source of objects and including a corresponding object representation defining at least one of: a location of the detected object in the source of objects.

See column 8, lines 45-48 of Bradski: "As the robotic arm 102 moves, a sensor 106 on the arm may capture sensor data about the stack of boxes 220 in order to determine shapes and/or positions of individual boxes."

Regarding claims 11 and 12, with all of the limitations of claim 1, the computing system further discloses:

wherein the plurality of objects are substantially the same in terms of size, shape, weight, and material composition, and wherein the plurality of objects vary from each other in size, shape, weight, and material composition.

See column 1, lines 29-33 of Bradski: "In some cases, all of the objects may be of the same type.
In other cases, a container or truck may contain a mix of different types of objects, such as boxed items, cans, tires, or other stackable objects."

Claims 13-14 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over US9333649B1 (Bradski) in further view of US20220016767A1 (Ku).

Regarding claim 13, with all of the limitations of claim 10, the computing system further discloses:

wherein identifying the plurality of target objects from the one or more detection results includes: determining whether available grasping models exist for each detected object; and

See column 24, lines 39-43 of Bradski: "Possible considerations for determining a grasp point include known models, sensed models, sensed scene surfaces, joint angles, planned motion path, potential grasp points, and/or arm windup to help select a grasp point."

pruning the detected object without available grasping models from each detected object.

While Bradski does not disclose pruning detected objects without grasping models from each detected object, Ku, in a similar field of endeavor, discloses a method of determining graspability scores of one or more detected objects by breaking down a region representative of an object into subregions and weighing each region with a score. See Figs. 1, 2, 4, and 5: Fig. 1 provides the overview of the process, Fig. 2 gives an example layout of the system, Fig. 4 provides an illustrative example, and Fig. 5 gives a clearer keypoint layout. One of ordinary skill in the art would have found it obvious, prior to the applicant's effective filing date, to improve the system of Bradski with the pruning/blacklisting of Ku, where one such modification would be to combine the region subdivision of Ku with Bradski's sensor model and arm system.
Ku discloses: "Additionally or alternatively, subdividing the set of keypoints into subsets can blacklist regions of the object with a low success probability, removing them from consideration for subsequent selection of a candidate grasp location." Bradski discloses a system that can store and reuse grasping models associated with detected objects, but the objects themselves are subject to certain orientations and positions that may not be ideal in all cases. Combining Ku's system would allow for a system that can determine graspability from the 2D image obtained by the camera (see Fig. 8) and also filter (and remember) undesirable points of contact with the arm.

Regarding claim 14, with all of the limitations of claim 13, the computing system further discloses:

pruning the detected objects based on at least one of the object orientation, locations of the detected object in the source of objects, and/or inter-object distance.

As listed above, Ku provides a breakdown of a region representative of one or more objects in their respective locations into sub-regions, with quantitative measures regarding which objects get blacklisted.

Regarding claim 18, with all of the limitations of claim 1, the computing system further discloses:

wherein the at least one processing circuit is further configured for: outputting a second end effector apparatus approach command to control the robot arm to approach the second target object; outputting a second end effector apparatus control command to control the end effector apparatus to grasp the second target object;

See Figs. 3A-3C of Bradski, where the end effector apparatus approaches a second target after grasping a first target and grasps the second target object.

generating a destination trajectory for the robot arm to approach the one or more destinations;

See Fig. 5 of Bradski.
Step 506 determines a motion path for moving the physical object to a drop-off location, where a destination trajectory would be the path of the movement of the physical object after grasp.

outputting the robot arm control command to control the robot arm according to the one or more destination trajectories; and

See Fig. 5 of Bradski. Step 510 describes moving through the determined motion path to the drop-off location (the destination trajectory).

outputting the end effector apparatus release command to control the end effector apparatus to release the first target object and the second target object at the one or more destinations.

While Bradski discloses releasing a first target object at one or more destinations (see Figs. 2C and 6D for examples), Bradski does not explicitly disclose releasing a second target object at one or more destinations. Given that Bradski discloses the grasp of two objects (see Figs. 3B and 3C) and control of individual suction cups (column 13, lines 18-21), one of ordinary skill in the art would find it obvious to try, prior to the applicant's effective filing date, to release the two grasped objects sequentially, as the only other alternative would be to release them simultaneously.

Claim 22 is rejected under 35 U.S.C. 103 as being unpatentable over US9333649B1 (Bradski) in view of US20180290307 (Watanabe).

Regarding claim 22, with all of the limitations of claim 1, the computing system further discloses:

wherein selecting the second target object includes determining a disturbance range of the first target object, the disturbance range representing a minimum distance from the first target object at which other nearby objects are determined to be unlikely to shift in position or pose when the first target object is grasped with the plurality of grippers, and pruning each detected object from the plurality of objects that are within the disturbance range of the first target object.
While Bradski does not explicitly describe a minimum distance from the first target object at which other nearby objects are determined to be unlikely to shift in position or pose when the first target object is grasped, Bradski discloses in column 19, lines 62-64: "Therefore, in addition to the evaluation of graspable features, approach trajectories, nearby objects, and other factors, the system may also evaluate the possible robotic arm configurations for each potential grasp point. As a result, the system may select potential grasp point at specific robotic arm configurations that ensure collisions will be avoided during the approach path and/or while the object is picked up.", where collisions between the grasped object and nearby objects are avoided, showing that disturbance ranges are considered when selection is performed. However, Bradski does not explicitly disclose pruning the plurality of objects within the disturbance range of the first target object.

From a similar field of endeavor, Watanabe discloses a gripping apparatus with an information processing apparatus comprising an imaging device. Specifically, the imaging device determines a variable imaging range to identify objects that may be interfered with from gripping the first object (Watanabe, Abstract). One of ordinary skill in the art would find it obvious, prior to the applicant's effective filing date, to combine the system of Watanabe with the system of Bradski to prune the plurality of detected objects within the range, as selecting a second object outside of the range would allow for reliably deciding a subsequent target that will maintain its originally labeled position, reducing computations.

Conclusion

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAEWOOK JUNG, whose telephone number is (571) 272-5470. The examiner can normally be reached Monday - Friday, 9:00 AM - 5:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Wade Miles, can be reached at 571-270-7777. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/J.J./
Examiner, Art Unit 3656

/WADE MILES/
Supervisory Patent Examiner, Art Unit 3656
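For orientation, the multi-object pick-and-place flow recited in claim 1 and mapped to Bradski above (identify targets with available grasping models, approach, sequentially grasp with a multi-gripper end effector, travel to the destination, release) can be summarized as a sketch. This is purely illustrative: every class, method, and name below is hypothetical and drawn neither from the application nor from Bradski.

```python
# Illustrative sketch of the claim-1 control flow: identify targets, follow
# an arm approach trajectory, sequentially grasp two targets with separate
# grippers, then follow a destination trajectory and release. All names
# are hypothetical.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Detection:
    name: str
    grasping_model: Optional[str]  # None => no usable model (pruned)

@dataclass
class Robot:
    n_grippers: int = 2
    log: List[str] = field(default_factory=list)  # records commands, in order

    def follow(self, trajectory: str) -> None:
        self.log.append(f"follow:{trajectory}")

    def grasp(self, gripper: int, obj: Detection) -> None:
        self.log.append(f"grasp[{gripper}]:{obj.name}")

    def release_all(self) -> None:
        self.log.append("release")

def transfer(robot: Robot, detections: List[Detection], destination: str) -> None:
    # Identify targets: prune detections that lack an available grasping model.
    targets = [d for d in detections if d.grasping_model is not None]
    robot.follow("arm-approach")                    # arm approach trajectory
    for i, target in enumerate(targets[: robot.n_grippers]):
        robot.follow(f"ee-approach:{target.name}")  # end effector approach
        robot.grasp(i, target)                      # sequential grasp
    robot.follow(f"destination:{destination}")      # destination trajectory
    robot.release_all()                             # release at destination

r = Robot()
transfer(r, [Detection("boxA", "topface"), Detection("boxB", "sideface"),
             Detection("boxC", None)], "pallet-1")
print(r.log)
```

The key ordering property argued in the rejection (both grasps occur before the destination trajectory) shows up directly in the command log.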

Prosecution Timeline

Mar 02, 2023: Application Filed
Apr 21, 2025: Non-Final Rejection — §103
Jul 15, 2025: Response Filed
Oct 07, 2025: Final Rejection — §103
Nov 20, 2025: Examiner Interview Summary
Nov 20, 2025: Applicant Interview (Telephonic)
Jan 05, 2026: Request for Continued Examination
Feb 08, 2026: Response after Non-Final Action
Mar 21, 2026: Non-Final Rejection — §103 (current)
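Durations shown elsewhere on this page (e.g. "2y 8m") can be reproduced from these event dates by counting whole calendar months elapsed since filing, ignoring the day of month. A minimal sketch; the `pendency` helper is hypothetical, not an actual formula used by the page:

```python
# Hypothetical helper reproducing "Xy Ym" durations from the timeline dates,
# counting whole calendar months and ignoring the day of month.
from datetime import date

def pendency(filed: date, event: date) -> str:
    months = (event.year - filed.year) * 12 + (event.month - filed.month)
    return f"{months // 12}y {months % 12}m"

filed = date(2023, 3, 2)
print(pendency(filed, date(2025, 11, 20)))  # examiner interview: 2y 8m after filing
print(pendency(filed, date(2026, 3, 21)))   # current non-final rejection
```

Note that the interview date lands exactly at the examiner's 2y 8m median prosecution length shown above.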

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12514149 — SYSTEMS AND METHODS FOR SPRAYING SEEDS DISPENSED FROM A HIGH-SPEED PLANTER
Granted Jan 06, 2026 (2y 5m to grant)

Patent 12480561 — VEHICLE AND CONTROL METHOD THEREOF
Granted Nov 25, 2025 (2y 5m to grant)
Based on the 2 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 33%
With Interview: 99% (+100.0%)
Median Time to Grant: 2y 8m
PTA Risk: High
Based on 3 resolved cases by this examiner. Grant probability derived from career allow rate.
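A hedged sketch of how these projections could be computed: the grant probability is the career allow rate (granted divided by resolved), and the interview lift is the relative change in allowance between resolved cases with and without an interview. The 50%/25% with/without rates below are hypothetical placeholders; only the "1 granted / 3 resolved" figure comes from this page.

```python
# Hedged sketch of the projection metrics. Grant probability = career allow
# rate; interview lift = relative improvement of the with-interview allowance
# rate over the without-interview rate. 50.0/25.0 are hypothetical inputs.
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage."""
    return 100.0 * granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Relative lift (%) of the with-interview rate over the without rate."""
    return 100.0 * (rate_with - rate_without) / rate_without

print(f"Grant probability: {allow_rate(1, 3):.0f}%")          # 1 granted / 3 resolved
print(f"Interview lift: {interview_lift(50.0, 25.0):+.1f}%")  # hypothetical rates
```

With such a small sample (3 resolved cases), both figures carry wide uncertainty, which is worth keeping in mind when reading the headline percentages.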
