Prosecution Insights
Last updated: April 19, 2026
Application No. 17/919,806

POSITION SETTING DEVICE FOR SETTING WORKPIECE STACKING POSITION AND ROBOT APPARATUS PROVIDED WITH POSITION SETTING DEVICE

Status: Non-Final OA (§103)
Filed: Oct 19, 2022
Examiner: KASPER, BYRON XAVIER
Art Unit: 3657
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Fanuc Corporation
OA Round: 5 (Non-Final)
Grant Probability: 70% (Favorable)
Predicted OA Rounds: 5-6
Predicted Time to Grant: 3y 0m
Grant Probability with Interview: 88%

Examiner Intelligence

Career Allow Rate: 70% — above average (72 granted / 103 resolved; +17.9% vs TC avg)
Interview Lift: strong, +18.4% allowance on resolved cases with interview
Typical Timeline: 3y 0m average prosecution; 36 applications currently pending
Career History: 139 total applications across all art units

Statute-Specific Performance

§101: 10.9% (-29.1% vs TC avg)
§103: 56.3% (+16.3% vs TC avg)
§102: 11.9% (-28.1% vs TC avg)
§112: 16.4% (-23.6% vs TC avg)
Comparisons are against the Tech Center average estimate • Based on career data from 103 resolved cases

Office Action

§103
Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

2. This communication is responsive to Application No. 17/919,806 and the amendments filed on 1/5/2026.

3. Claims 1 and 6-9 are presented for examination.

Information Disclosure Statement

4. The information disclosure statements (IDS) submitted on 10/19/2022, 3/18/2025, 7/28/2025, and 10/20/2025 have been fully considered by the Examiner.

Response to Arguments

5. Applicant's arguments filed 1/5/2026 with respect to the rejection of claims 1 and 6-9 under 35 U.S.C. 103 have been fully considered, but they are not persuasive.

Regarding independent claim 1, the Applicant argues that the combination of US 10953549 B2 to Diankov, US 20200031593 A1 to Usami, US 5908283 A to Huang, and US 20160167232 A1 to Takeshita fails to disclose all of the limitations recited in claim 1. However, the Examiner respectfully disagrees.

Specifically, regarding Huang, the Applicant argues that Huang fails to teach allowing placement of the second workpiece when the number of divided regions facing the first workpiece is equal to or greater than a determination value, wherein the determination value is smaller than the total number of regions of the divided bottom face of the second workpiece. The Applicant cites Huang Col. 25 lines 15-21 as evidence against Huang teaching this limitation. However, the Examiner respectfully disagrees. As shown in Figure 35A of Huang, the bottom face of the package outline 350 is divided such that each of the four corners of the package is segmented into ‘windows,’ with the remainder of the divided package acting as its own region. The Examiner notes that the ‘plus-shaped’ non-cornered area of the package in Figure 35A of Huang is itself a divided region, even if it is not equal in size to the four corner windows.
Claim 1 requires determining that placement of the second workpiece may be allowed if the number of divided regions facing the first workpieces is equal to or greater than a determination value, wherein the determination value is smaller than the total number of divided regions. Put another way, the Examiner interprets claim 1 to recite allowing stacking of the second workpiece onto the first workpieces when not all of the divided regions of the second workpiece are directly supported by contact with the first workpieces. Col. 25 lines 15-22 of Huang state, in reference to Figures 35A-B of Huang, that the package (second workpiece) is determined to have support when either all four corners face the first workpieces, or three of the corners do and the ratio of the total area of support between the first and second workpiece faces meets a threshold. Under the latter condition, Huang does not require all four of the corners to be sufficiently supported, which the Examiner interprets to satisfy the limitation wherein the determination value is less than the total number of divided regions (i.e., fewer than four supported corners may be considered stable). While Huang adds the requirement that the total area of support meet a threshold, the Examiner notes that this still may not require support of the fourth corner. Huang clearly states that not all of the corner windows require direct support. Therefore, the Examiner submits that Huang still teaches these limitations of claim 1, which will also be described further below.

Also, regarding Takeshita, the Applicant argues that Takeshita fails to disclose determining whether the second workpiece can be arranged on the top face of the first workpiece based on the number of divided regions. The Applicant cites Takeshita paragraph [0055] as evidence against Takeshita teaching this limitation. However, the Examiner respectfully disagrees.
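The divided-region determination discussed above reduces to a count-and-compare test. A minimal sketch follows (function and variable names are hypothetical and the four-region grid is illustrative only; this is a reading of the claim language as characterized in the record, not code from any cited reference):

```python
def placement_allowed(region_supported, determination_value):
    """Allow stacking when the number of divided regions of the second
    workpiece's bottom face that face a first workpiece is at least the
    determination value.

    region_supported: one boolean per divided region (True = that region
    faces / is supported by a first workpiece).
    determination_value: must be smaller than the total number of divided
    regions, i.e., full support of every region is not required.
    """
    total_regions = len(region_supported)
    assert determination_value < total_regions, \
        "determination value must be less than the number of divided regions"
    # Count supported regions and compare against the determination value.
    return sum(region_supported) >= determination_value

# Example: bottom face divided into 4 regions; 3 face first workpieces.
print(placement_allowed([True, True, True, False], 3))  # True
print(placement_allowed([True, False, False, False], 3))  # False
```

Because the determination value is strictly less than the region count, placement can be allowed even when some regions overhang unsupported, which is the point of contention between the Applicant's and the Examiner's readings of Huang.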
The Examiner notes that while the invention of Takeshita stacks workpieces on top of each other, the Examiner did not cite Takeshita to teach these concepts. Rather, the Examiner cited Takeshita to teach the limitation wherein the entire bottom face of the second workpiece is divided into equal parts. As is evident from paragraph [0054] and Figure 11A of Takeshita, the Examiner submits that Takeshita teaches this limitation. Regarding the stacking, the invention of Takeshita is not incompatible with Huang, as this feature of Takeshita was not relied upon. Moreover, the Examiner submits that Huang and Takeshita are compatible inventions, being within the same field of endeavor and solving similar problems. Therefore, for these reasons, the Examiner submits that Takeshita still teaches aspects of claim 1, which will also be described further below. As such, the Examiner submits that the combination of Diankov, Usami, Huang, and Takeshita teaches all of the limitations of claim 1, as will also be described further below. Regarding dependent claims 6-9: as all of these claims depend from claim 1, they remain rejected, as will be described later.

Claim Rejections - 35 USC § 103

6. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

7. The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. 8. Claim(s) 1, 6, and 7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Diankov et al. (US 10953549 B2 hereinafter Diankov) in view of Usami et al. (US 20200031593 A1 hereinafter Usami), Huang et al. (US 5908283 A hereinafter Huang), and Takeshita (US 20160167232 A1 hereinafter Takeshita). Regarding Claim 1, Diankov teaches a robot apparatus (Col. 5 lines 37-39, where “FIG. 1 is an illustration of an example environment in which a robotic system 100 with a dynamic packing mechanism may operate.”), comprising: a position setting device configured to set a position where a second workpiece is stacked on an upper side of a plurality of first workpieces (Col. 11 lines 39-47, where “For illustrative purposes the placement location 350 is shown in FIG. 3B as being adjacent to (i.e., placed on the same horizontal layer/height as) the already-placed objects, such as directly on/contacting the pallet. However, it is understood that the placement location 350 can be on top of the already-placed objects. 
In other words, the robotic system 100 can derive the placement location 350 for stacking the target object 112 over and/or on top of one or more objects already on the pallet.”), (Note: The Examiner interprets the target object 112 as the second workpiece and the already-placed objects as the plurality of first workpieces.); an operation tool configured to grip the second workpiece (Col. 8 lines 20-24, where “The structural members and the joints can form a kinetic chain configured to manipulate an end-effector (e.g., the gripper) configured to execute one or more tasks (e.g., gripping, spinning, welding, etc.) ….”), (Col. 15 lines 51-54, where “For example, the robotic system 100 can operate the robotic arm 502 to grip and pick up the target object 112 from a designated location/portion on a conveyor and place the target object 112 on a pallet.”); a robot configured to move the operation tool (Col. 15 lines 51-54, where “For example, the robotic system 100 can operate the robotic arm 502 to grip and pick up the target object 112 from a designated location/portion on a conveyor and place the target object 112 on a pallet.”); and a controller configured to control the operation tool and the robot (Col. 7 lines 8-15, where “In some embodiments, the processors 202 can be included in a separate/stand-alone controller that is operably coupled to the other electronic/electrical devices illustrated in FIG. 2 and/or the robotic units illustrated in FIG. 1. The processors 202 can implement the program instructions to control/interface with other devices, thereby causing the robotic system 100 to execute actions, tasks, and/or operations.”), the position setting device comprises a sensor configured to detect a shape of the second workpiece (Col. 9 lines 50-58, where “As illustrated in FIG. 3A, some embodiments of the robotic system 100 can use discretized object models 302 to plan/derive placement locations of objects (e.g., the target object 112). 
The discretized object models 302 (shown using dotted lines) can represent exterior physical dimensions, shapes, edges, surfaces, or a combination thereof (shown using dash lines) for arriving or incoming objects (e.g., packages, boxes, cases, etc.) according to a discretization unit (e.g., a unit length).”), (Col. 16 lines 15-24, where “For example, the robotic system 100 can include and/or communicate with a source sensor 504 (e.g., one of the 3D cameras 122 of FIG. 1) located over the start location 114 and/or an incoming path (e.g., conveyor). The robotic system 100 can use the data from the source sensor 504 to generate and/or access the discretized object models 302 of FIG. 3A. In one or more embodiments, the robotic system 100 can image the objects and/or measure one or more dimensions of the objects using the source sensor 504.”), a processor configured to detect a shape of the second workpiece based on an output from the sensor (Col. 8 lines 53-59, where “As described in further detail below, the robotic system 100 (via, e.g., the processors 202) can process the digital image and/or the point cloud to identify the target object 112 of FIG. 1, the start location 114 of FIG. 1, the task location 116 of FIG. 1, a pose of the target object 112, a confidence measure regarding the start location 114 and/or the pose, or a combination thereof.”), (Col. 16 lines 19-24, where “The robotic system 100 can use the data from the source sensor 504 to generate and/or access the discretized object models 302 of FIG. 3A. In one or more embodiments, the robotic system 100 can image the objects and/or measure one or more dimensions of the objects using the source sensor 504.”), acquire shapes and positions of the plurality of first workpieces (Col. 12 lines 46 – 58, where “To calculate and evaluate the measure of support, the robotic system 100 can determine heights/contour for the placement area 340 of FIG. 3B in real-time using one or more of the imaging devices 222 of FIG. 
2. In some embodiments, the robotic system 100 can use depth measures (e.g., point cloud values) from one or more of the imaging devices 222 located above the task location 116. Because a vertical position of the ground and/or of the platform (e.g., pallet) surface (e.g., a height of the platform surface above the facility ground surface) is known, the robotic system 100 can use the depth measure to calculate the heights/contour of the exposed top surface(s) of the platform, the placed objects, or a combination thereof.”), (Note: See Figures 3B, 4A, and 4B of Diankov as well, where the height (shape) and position of the already-placed objects are acquired.), and search for a position where the second workpiece is allowed to be arranged on the upper side of the plurality of first workpieces (Col. 11 lines 44-51, where “In other words, the robotic system 100 can derive the placement location 350 for stacking the target object 112 over and/or on top of one or more objects already on the pallet. As described in detail below, the robotic system 100 can evaluate the heights of the already-placed objects in deriving the placement location 350 to ensure that the object is sufficiently supported when stacked on top of the already-placed objects.”), each workpiece of the first workpiece and the second workpiece includes a top face and a bottom face (Col. 2 lines 34-41, where “Systems and methods for identifying various packaging errors and dynamically packing objects (e.g., packages and/or boxes) are described herein. A robotic system (e.g., an integrated system of devices that executes one or more designated tasks) configured in accordance with some embodiments provides enhanced packing and storage efficiency by dynamically deriving storage locations for the objects and stacking them accordingly.”), (Note: See Figures 3A-B and 4A-B, where the boxes/packages are rectangular in shape. 
Since the boxes/packages are rectangular in shape, the Examiner interprets the boxes/packages to have a top and bottom face, in order for the boxes/packages to also be able to be stacked.), a determination range of a difference in height between a top face of one first workpiece and a top face of another first workpiece is predetermined (Col. 12 lines 46-58, where “To calculate and evaluate the measure of support, the robotic system 100 can determine heights/contour for the placement area 340 of FIG. 3B in real-time using one or more of the imaging devices 222 of FIG. 2. In some embodiments, the robotic system 100 can use depth measures (e.g., point cloud values) from one or more of the imaging devices 222 located above the task location 116. Because a vertical position of the ground and/or of the platform (e.g., pallet) surface (e.g., a height of the platform surface above the facility ground surface) is known, the robotic system 100 can use the depth measure to calculate the heights/contour of the exposed top surface(s) of the platform, the placed objects, or a combination thereof.”), (Col. 13 line 59 – Col. 14 line 4, where “The height difference threshold 416 and the support threshold 418 can correspond to limits used to process and/or validate the candidate positions 360. … In other words, the height difference threshold 416 can be used to define a range of surface heights that can contact and/or support the package placed thereon.”), (Note: See Figures 4A-B of Diankov where the height difference is calculated between adjacent top surfaces.), the processor is configured to determine whether the second workpiece is allowed to be arranged so as to be supported by both workpieces of the one first workpiece and the other first workpiece when a height of the top face of the one first workpiece and a height of the top face of the other first workpiece are different (Col. 
13 lines 7-18, where “For each of the candidate positions 360 that overlap one or more of the already-placed objects, the robotic system 100 can evaluate the placement possibility based on the height measures 402. In some embodiments, the robotic system 100 can evaluate the placement possibility based on identifying the highest value of the height measures 402 overlapped in each of the candidate positions 360. The robotic system 100 can further identify other height measures 402 located in each of the candidate positions 360 with the height measures 402 within a limit of a difference threshold relative to the highest measure of the height measures 402.”), (Col. 13 line 59 – Col. 14 line 7, where “The height difference threshold 416 and the support threshold 418 can correspond to limits used to process and/or validate the candidate positions 360. The height difference threshold 416, which can be predetermined and/or adjusted by an operator and/or an order, can represent allowed deviations from another reference height (e.g., the maximum height 420 corresponding to the highest instance of the height measures 402 in the area overlapped by the discretized object model 302) for contacting and/or supporting packages placed on top. In other words, the height difference threshold 416 can be used to define a range of surface heights that can contact and/or support the package placed thereon. 
As such, relative to the maximum height 420, the lower height limit 422 can correspond to a lower limit for heights within the overlapped area 414 that can provide support for the stacked package.”), the processor allows the second workpiece to be arranged so as to be supported by both workpieces of the one first workpiece and the other first workpiece when a difference in the height falls within the determination range, and prohibits the second workpiece from being arranged so as to be supported by both workpieces of the one first workpiece and the other first workpiece when a difference in the height deviates from the determination range (Col. 14 lines 15-27, where “Accordingly, in one or more embodiments, the robotic system 100 can categorize the unit pixels 310 within the overlapped area 414 according to the height difference threshold 416. For example, the robotic system 100 can categorize the unit pixels 310 having heights satisfying the height difference threshold 416 (i.e., values greater than or equal to the lower height limit 422) as supporting locations 442 (e.g., a grouping of unit pixels 310 that represent a surface capable of having objects stacked thereon, such as represented in FIG. 4B via shaded pixels). The robotic system 100 can categorize the other unit pixels 310 as unqualified locations 444 (e.g., pixels with heights lower than the lower height limit 422).”), (Col. 14 lines 38-48, where “In one or more embodiments, the support threshold 418 can be used to evaluate a supported area (e.g., the unit pixels 310 that can provide support to an object stacked thereon, as can be determined by the height threshold) associated with the supporting locations 442. For example, the robotic system 100 can determine the support area outlines 426 based on extending edges and/or determining lines that extend across or around the unqualified locations 444 to connect corners of outermost/perimeter instances of the supporting locations 442. 
Thus, the support area outlines 426 can exclude the unqualified locations 444.”), (Note: The Examiner interprets the supporting locations 442 of Diankov to be the allowed workpiece placement locations and the unqualified locations 444 of Diankov to be the prohibited workpiece placement locations, as these locations 442 and 444 determine whether the target object of Diankov is able to be placed on the already-placed objects.), and the processor is further configured to determine that a region of the bottom face of the second workpiece faces the first workpiece when at least a part of the region of the bottom face of the second workpiece faces the top face of the first workpiece (Col. 9 line 62 – Col. 10 line 4, where “As illustrated in FIG. 3B, some embodiments of the robotic system 100 can use one or more discretized platform models 304 (e.g., discretized representations of the task locations 116 of FIG. 1) to plan/derive stacking placements of objects. The discretized platform models 304 can represent a placement area 340 (e.g., the physical dimension, shape, or a combination thereof of the task location 116, such as a top surface of the task location 116, a top surface of a package placed thereon, or a combination thereof) according to the discretization unit.”), (Col. 11 lines 42-51, where “However, it is understood that the placement location 350 can be on top of the already-placed objects. In other words, the robotic system 100 can derive the placement location 350 for stacking the target object 112 over and/or on top of one or more objects already on the pallet. As described in detail below, the robotic system 100 can evaluate the heights of the already-placed objects in deriving the placement location 350 to ensure that the object is sufficiently supported when stacked on top of the already-placed objects.”), (Note: See Figures 6A-B of Diankov as well, wherein the top face of the first workpiece 508 faces the bottom face of the second workpiece 112.). 
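The height-difference categorization quoted above from Diankov (supporting locations 442 vs. unqualified locations 444) can be sketched as follows. This is an illustrative reading of the cited passages only; the function name, list-based pixel representation, and units are hypothetical, not taken from the reference:

```python
def categorize_unit_pixels(height_measures, height_difference_threshold):
    """Split unit pixels in the overlapped area into supporting and
    unqualified locations: per the cited scheme, a pixel whose height is
    within the threshold of the maximum height in the overlapped area
    can contact/support a package stacked on top.

    height_measures: height measure per unit pixel (e.g., in mm).
    Returns (supporting_indices, unqualified_indices).
    """
    maximum_height = max(height_measures)
    # Lower height limit relative to the maximum height (cf. Diankov's
    # lower height limit 422 under the height difference threshold 416).
    lower_height_limit = maximum_height - height_difference_threshold
    supporting = [i for i, h in enumerate(height_measures) if h >= lower_height_limit]
    unqualified = [i for i, h in enumerate(height_measures) if h < lower_height_limit]
    return supporting, unqualified

# Five unit pixels with heights in mm; threshold of 10 mm.
sup, unq = categorize_unit_pixels([200, 195, 180, 198, 120], 10)
print(sup)  # [0, 1, 3]
print(unq)  # [2, 4]
```

Pixels at 200, 195, and 198 mm fall within 10 mm of the 200 mm maximum and so qualify as supporting locations; the 180 mm and 120 mm pixels are unqualified.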
Diankov is silent on wherein the controller detects a position and an orientation of the second workpiece based on an output from a sensor, drives the robot so as to grip the second workpiece based on the position and the orientation of the second workpiece, and drives the robot so as to convey the second workpiece to a position, set by the position setting device, where the second workpiece is arranged, the processor sets a plurality of regions by dividing the entire bottom face of the second workpiece into equal parts, the processor calculates the number of a region facing the first workpiece; when the number of the region facing the first workpiece is equal to or more than a predetermined determination value, the processor allows the second workpiece to be arranged on the upper side of the first workpiece; and the predetermined determination value is smaller than a number of the divided regions of the entire bottom face of the second workpiece. However, Usami teaches wherein the controller detects a position and an orientation of the second workpiece based on an output from a sensor ([0060] via “For example, the image analyzing unit 520 analyzes the image or video captured by the first upper camera 110, thereby acquiring the positions and the dimensions of a plurality of packages P accommodated in the container C arranged in the first collected package arrangement area A1. For example, the image analyzing unit 520 acquires the dimensions of the packages P (the width dimensions and the depth dimensions of the packages P) when viewed from above. 
Furthermore, the image analyzing unit 520 analyzes the image or video captured by the first side camera 120, thereby acquiring the height dimension of a package P that is being lifted from the container C arranged in the first collected package arrangement area A1.”), drives the robot so as to grip the second workpiece based on the position and the orientation of the second workpiece ([0062] via “The selecting unit 530 selects one or more packages P that are to be preferentially retrieved, from among the packages P in the container C arranged in the first collected package arrangement area A1 (or the container C arranged in the second collected package arrangement area A2), based on the positional information of the packages P acquired by the image analyzing unit 520. For example, the selecting unit 530 selects packages P that are to be preferentially retrieved, while comprehensively considering the stacked state of the plurality of packages P, the position of the robot apparatus 200 relative to the container C, and the like.”), ([0065] via “The robot drive controller 550 controls driving of the robot apparatus 200. For example, the robot drive controller 550 calculates a route of the arm 220, based on the positional information of the packages P acquired by the image analyzing unit 520, the information indicating the picking packages P selected by the selecting unit 530, and the like.”), and drives the robot so as to convey the second workpiece to a position, set by the position setting device, where the second workpiece is arranged ([0066] via “Furthermore, the robot drive controller 550 determines a movement destination (placement destination) of the package P retrieved from the container C arranged in the first collected package arrangement area A1 (or the container C arranged in the second collected package arrangement area A2), based on a result of the determination performed by the determining unit 540. 
For example, if the package P satisfies the predetermined condition (is regarded as a package that is to be handled), the robot drive controller 550 controls the robot apparatus 200 so as to place the package retrieved from the container C, on the sub conveyor 300 (the first sub conveyor 300A or the second sub conveyor 300B) that is closer to that container C.”). Further, Huang teaches wherein the processor sets a plurality of regions by dividing the bottom face of the second workpiece, the processor calculates the number of a region facing the first workpiece (Col. 24 line 65 – Col. 25 line 10, where “As shown in FIG. 35A, under the package corner support check, four identical windows 351 on corners of the package outline are established on the bottom surface of a placing package. … As long as there is part of any direct supporting surface falling within these windows, expanded windows 352 are used to determine the overlap condition. … For each expanded window, its overlap is checked with all the direct supporting surfaces. If the minimum overlap dimension is above a threshold, then a solid support on package corner is considered established.”), (Note: See Figure 35A of Huang as well.); when the number of the region facing the first workpiece is equal to or more than a predetermined determination value, the processor allows the second workpiece to be arranged on the upper side of the first workpiece; and the predetermined determination value is smaller than a number of the divided regions of the bottom face of the second workpiece (Col. 25 lines 15-22, where “If a package has support on four corners (step 315), or three corners and the ratio between the total area of direct supporting surfaces and that of placing package's bottom surface is sufficiently big, such as 70% (see step 317), the package is considered stable (steps 316 and 321), and the Stability Check is complete. 
Otherwise, the package edge support check 314 is made, with the assistance of edge support calculations from step 314.”), (Note: In this scenario of Huang, the Examiner interprets the support of the three of the four corners with the threshold supporting area being met for the stability check as being smaller than the total number of divided regions, as at least the fourth window and/or all of the bottom surface of the second workpiece are not required to be in contact to be considered supported.). Further, Takeshita teaches wherein the entire bottom face of the second workpiece is divided into equal parts ([0054] via “The placement determining unit 28 obtains the grid information 111 of the resting surface as shown in FIG. 11A, and the grid information 112 of the receiving surface as shown in FIG. 11B. As shown in FIG. 11A, the lower, left-hand corner of a grid cell 113 located at the leftmost bottom of the grid information 111 of the resting surface is set as the origin, and the right arrow extending from the origin denotes the X direction, while the up-pointing arrow extending from the origin denotes the Y direction.”), (Note: See Figure 11A of Takeshita wherein the entire bottom face of the second workpiece is divided into equal parts.). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Usami wherein the controller detects a position and an orientation of the second workpiece based on an output from a sensor, drives the robot so as to grip the second workpiece based on the position and the orientation of the second workpiece, and drives the robot so as to convey the second workpiece to a position, set by the position setting device, where the second workpiece is arranged. Doing so determines, picks, and conveys the workpieces to their destination locations, as stated above by Usami in paragraph [0066]. 
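Huang's stability check as quoted above (Col. 25 lines 15-22) can be sketched as a short decision rule. The function and parameter names are hypothetical, and the 70% ratio is the example value Huang gives rather than a fixed constant of the reference:

```python
def huang_stability_check(corners_supported, direct_support_area,
                          bottom_area, ratio_threshold=0.70):
    """Sketch of the quoted stability check: the package is considered
    stable if all four corner windows have support, or if three corners
    have support and the ratio of the total area of direct supporting
    surfaces to the package's bottom-surface area meets the threshold."""
    if corners_supported == 4:
        return True
    if corners_supported == 3 and direct_support_area / bottom_area >= ratio_threshold:
        return True
    # Otherwise the reference falls through to the edge support check,
    # which is outside this sketch.
    return False

print(huang_stability_check(4, 0.0, 1.0))    # True  (all four corners)
print(huang_stability_check(3, 0.75, 1.0))   # True  (three corners, 75% area)
print(huang_stability_check(3, 0.50, 1.0))   # False (three corners, only 50%)
```

The second branch is the one the Examiner relies on: stability can be found with only three of the four corner windows supported, which is why the determination value is read as smaller than the total number of divided regions.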
In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Huang wherein the processor sets a plurality of regions by dividing the bottom face of the second workpiece, the processor calculates the number of a region facing the first workpiece; when the number of the region facing the first workpiece is equal to or more than a predetermined determination value, the processor allows the second workpiece to be arranged on the upper side of the first workpiece; and the predetermined determination value is smaller than a number of the divided regions of the bottom face of the second workpiece. Doing so determines that the second workpiece sufficiently meets the required stability check threshold, even if not all four corners of the workpiece are supported by determining enough of the bottom surface of the second workpiece would be supported, as stated above by Huang in Col. 25 lines 15-22. In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Takeshita wherein the entire bottom face of the second workpiece is divided into equal parts. 
Doing so improves the speed at which the top face of the first workpiece(s) and the bottom face of the second workpiece are compared to each other for workpiece arrangement, as stated by Takeshita ([0009] via “In the placement determining method as described above, the shape of the resting surface may be compared with the shape of the receiving surface, and it may be determined whether the placement object can be placed on the receiving object, by plotting the shape of the resting surface on a grid so as to obtain grid information of the resting surface, plotting the shape of the receiving surface on a grid so as to obtain grid information of the receiving surface, comparing the grid information of the resting surface with the grid information of the receiving surface, and determining whether the placement object can be placed on the receiving object. With this method, the shape of the resting surface and the shape of the receiving surface can be compared with each other at a high speed.”). Regarding Claim 6, modified reference Diankov teaches the robot apparatus of claim 1, wherein the processor is configured to select a position where the second workpiece is arranged when the second workpiece is allowed to be arranged in a plurality of positions on the upper side of the first workpiece (Col. 9 lines 62-66, where “As illustrated in FIG. 3B, some embodiments of the robotic system 100 can use one or more discretized platform models 304 (e.g., discretized representations of the task locations 116 of FIG. 1) to plan/derive stacking placements of objects.”), (Col. 11 lines 11-17, where “Based on the discretized data/representations, the robotic system 100 can dynamically derive a placement location 350 for the target object 112. As illustrated in FIG. 3B, the robotic system 100 can dynamically derive the placement location 350, even after one or more objects (e.g., illustrated as objects with diagonal fills in FIG. 
3B) have been placed on the placement area 340.”), the processor selects a position where the second workpiece is arranged according to a first condition having a first priority level (Col. 11 line 61 – Col. 12 line 18, where “As described further in detail below, the robotic system 100 can derive the placement location 350 according to a set of placement rules, conditions, parameters, requirements, etc. … The robotic system 100 can evaluate each of the candidate positions 360 according to various parameters/conditions, such as support measure/condition, supported weight in comparison to fragility ratings (e.g., maximum supported weight, such as for packages stacked thereon) of the supporting objects, space/packing implications, or a combination thereof. The robotic system 100 can further evaluate the candidate positions 360 using one or more placement rules, such as collision free requirement, stack stability, customer-specified rules/priorities, package separation requirements or the absence thereof, maximization of total loaded packages, or a combination thereof.”), (Note: The Examiner interprets any of these parameters/conditions as having a first priority level, dependent on the stacking operation.), and, when there are a plurality of positions that satisfy the first condition and where the second workpiece is arranged, the processor selects a position where the second workpiece is arranged according to a second condition having a second priority level (Col. 22 lines 16-52, where “In deriving the placement combinations (e.g., candidate placement locations), the robotic system 100 can test/evaluate locations of the discretized object model 302 of the corresponding package based on iteratively deriving and evaluating candidate stacking scenarios (e.g., potential combinations of unique placement locations for the available packages). 
The candidate stacking scenarios can each be derived based on identifying unique potential locations (e.g., according to a predetermined sequence/rule for placement locations) for the packages according to the above discussed sequence. The candidate stacking scenarios and/or the unique placement locations can be evaluated according to one or more placement criteria (e.g., requirements, constraints, placement costs, and/or heuristic scores). For example, the placement criteria can require that the discretized object models 302 entirely fit within horizontal boundaries of the discretized platform model 304 when placed at the selected location. Also, the placement criteria can require that placement of the discretized object models 302 be within or over a threshold distance relative to the initial placement location (e.g. such as along a horizontal direction) and/or the previous placement location, such as for adjacent placements or separation requirements. … Accordingly, the robotic system 100 can generate multiple unique placement combinations (i.e., candidate placement plans for each layer and/or the candidate stacking scenarios that each include multiple layers) of package placement locations.”), (Col. 22 line 55 – Col. 23 line 7, where “At block 740, the robotic system 100 can calculate/update a placement score for each combination/package placement. … For example, the robotic system 100 can use preference factors (e.g., multiplier weights) and/or equations to describe a preference for: separation distances between packages, differences in package dimensions/fragility ratings/package weights for adjacent packages, the collision probabilities, continuous/adjacent surfaces at the same height, a statistical result thereof (e.g., average, maximum, minimum, standard deviation, etc.), or a combination thereof.
Each combination can be scored according to the preference factors and/or the equations that may be predefined by a system manufacturer, an order, and/or a system operator. In some embodiments, the robotic system 100 can calculate the placement score at the end of the overall placement iterations.”), (Note: The Examiner interprets the different combinations of weights/costs/preference factors to have varying levels of priorities set by the operator/human, as discussed above by Diankov in Col. 22 line 55 – Col. 23 line 7.).

Regarding Claim 7, modified reference Diankov teaches the robot apparatus of claim 1, wherein the sensor is a three-dimensional sensor configured to be able to detect a three-dimensional shape of the first workpiece (Col. 16 lines 33-42, where “Also, the robotic system 100 can include and/or communicate with a destination sensor 506 (e.g., one of the 3D cameras 122 of FIG. 1) located over the task location 116. The robotic system 100 can use the data from the destination sensor 506 to determine and dynamically update the discretized platform models 304 of FIG. 3B. In one or more embodiments, the robotic system 100 can image and/or measure one or more dimensions of the placement area 340 (e.g., the task location 116, such as a pallet, cage, and/or car track).”).

9. Claim(s) 8 is/are rejected under 35 U.S.C. 103 as being unpatentable over Diankov et al. (US 10953549 B2 hereinafter Diankov) in view of Usami et al. (US 20200031593 A1 hereinafter Usami), Huang et al. (US 5908283 A hereinafter Huang), and Takeshita (US 20160167232 A1 hereinafter Takeshita), and further in view of Kimoto et al. (US 20170267467 A1 hereinafter Kimoto).
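The divided-region support test disputed above for claim 1 (placement is allowed when the number of divided regions of the second workpiece's bottom face facing the first workpiece is at least a determination value, where that value is smaller than the total region count) can be sketched as follows. This is a minimal illustration on a boolean occupancy grid; the function name, grid sizes, and threshold are assumptions for illustration only, not values taken from Huang, Takeshita, or the application.

```python
import numpy as np

def placement_allowed(top_face, bottom_shape, position, determination_value):
    """Check whether a second workpiece may be stacked at `position`.

    top_face: 2D boolean grid; True where the first workpiece's top
              face (the receiving surface) is present.
    bottom_shape: (rows, cols) of the second workpiece's divided
                  bottom face, on the same grid resolution.
    position: (row, col) of the bottom face's upper-left region.
    determination_value: minimum number of supported regions; chosen
                         smaller than rows * cols so that partial
                         support (e.g., an overhanging corner) is
                         still acceptable.
    """
    r, c = position
    rows, cols = bottom_shape
    window = top_face[r:r + rows, c:c + cols]
    if window.shape != (rows, cols):  # footprint falls outside the stack area
        return False
    # Count the divided regions that face (are backed by) the first workpiece.
    supported_regions = int(window.sum())
    return supported_regions >= determination_value

# Illustrative 5x5 receiving surface with one unsupported corner cell.
top = np.ones((5, 5), dtype=bool)
top[0, 0] = False
# A 2x2 divided bottom face, requiring 3 of its 4 regions to be supported.
print(placement_allowed(top, (2, 2), (0, 0), determination_value=3))  # True
print(placement_allowed(top, (2, 2), (0, 0), determination_value=4))  # False
```

Because the check is a windowed sum over two pre-plotted grids, it matches the speed rationale quoted from Takeshita: the surfaces are compared cell-by-cell rather than geometrically.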
Regarding Claim 8, modified reference Diankov teaches the robot apparatus of claim 1, but is silent on the robot apparatus further comprising a display part configured to display information related to an arrangement of the second workpiece, wherein, when the processor cannot detect a position where the arrangement of the second workpiece is allowed on the upper side of the first workpiece, the display part indicates that there is no position where the second workpiece is arranged. However, Kimoto teaches a display part configured to display information related to an arrangement of the second workpiece, wherein, when the processor cannot detect a position where the arrangement of the second workpiece is allowed on the upper side of the first workpiece, the display part indicates that there is no position where the second workpiece is arranged ([0083] via “In step 135, after the determination for all the complete patterns, the complete pattern selection unit 43 determines whether or not a complete pattern in which the box can be stowed is present. In step 135, if it is determined that no complete pattern in which the box can be stowed is present, control is terminated. In this case, for example, the control device 2 displays a warning on a display so as to notify the operator that the box cannot be stowed.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Kimoto wherein the robot apparatus further comprises a display part configured to display information related to an arrangement of the second workpiece, wherein, when the processor cannot detect a position where the arrangement of the second workpiece is allowed on the upper side of the first workpiece, the display part indicates that there is no position where the second workpiece is arranged.
Doing so notifies a human operator that the workpiece cannot be stowed so that the operator may take appropriate action, as stated above by Kimoto.

10. Claim(s) 9 is/are rejected under 35 U.S.C. 103 as being unpatentable over Diankov et al. (US 10953549 B2 hereinafter Diankov) in view of Usami et al. (US 20200031593 A1 hereinafter Usami), Huang et al. (US 5908283 A hereinafter Huang), and Takeshita (US 20160167232 A1 hereinafter Takeshita), and further in view of Tokue (JP 2019181620 A hereinafter Tokue (provided by Applicant's IDS)).

Regarding Claim 9, modified reference Diankov teaches the robot apparatus of claim 1, but is silent on wherein the processor adds a predetermined margin width to dimensions of the second workpiece, and determines whether the arrangement of the second workpiece is allowed. However, Tokue teaches wherein the processor adds a predetermined margin width to dimensions of the second workpiece ([0087] – [0088] via “Next, as shown by a 2 dot chain line in FIG. 5 a, the arithmetic unit 8 creates a margin added shape 17 in which a margin of a width B set around the bottom surface shape of the work W is added. The width B of the margin to be added to the periphery of the bottom surface shape of the work W is set to be larger than the maximum value of the error of the position which may occur when the work W is conveyed, in accordance with the accuracy of the position control of the manipulator 3 and the end effector 4 of the robot 2 to be used, for example.”), and determines whether the arrangement of the second workpiece is allowed ([0093] via “In this state, the arithmetic unit 8 performs a process of determining whether or not the work existing part 15 is interfered with the mesh of the bottom mesh data 14 which is overlapped with the work model 16. When it is determined that interference occurs by this determination processing, the calculation unit 8, as indicated by an arrow in FIG.
6 a, calculates a mesh size along the x-axis direction of the workpiece model 16.”), ([0104] via “As shown in FIG. 9, when the workpiece storage position 18 is determined by the storage position search process, the operation unit 8 is provided. A process for determining a target position for storing a workpiece W is performed so that a corresponding corner of an actual workpiece W is positioned at a position shifted by a dimension of a width B of a margin at the time of forming a workpiece model 16 from an upper left corner of a workpiece storage position 18 in an x-axis direction and a y-axis direction, as shown by a 2 dot chain line.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Tokue wherein the processor adds a predetermined margin width to dimensions of the second workpiece, and determines whether the arrangement of the second workpiece is allowed. Doing so accounts for the margin of error of the robot’s placement accuracy, as stated above by Tokue in paragraph [0088], successfully storing the workpiece in this margin-added position, as stated by Tokue ([0107] via “Thus, in the storage system 1 of the present embodiment, the manipulator 3 and the end effector 4 of the robot 2 move in accordance with an instruction from the control device 5, whereby the current workpiece W to be stored is stored in the storage container 6 in a state in which a space defined by at least a margin width B is formed around the workpiece W. Accordingly, in the storage system 1 of the present embodiment, a gap having a width of at least a margin of a margin is maintained between the workpiece W newly stored in the storage container 6 and the side wall 10 existing on the outer periphery of the storage container 6 and on the outer periphery of the bottom surface 9 of the storage container 1.”).

Examiner’s Note

11.
The Examiner has cited particular paragraphs or columns and line numbers in the references applied to the claims above for the convenience of the Applicant. Although the specified citations are representative of the teachings of the art and are applied to specific limitations within the individual claim, other passages and figures may apply as well. It is respectfully requested that the Applicant, in preparing responses, fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner. See MPEP 2141.02 [R-07.2015] VI. A prior art reference must be considered in its entirety, i.e., as a whole, including portions that would lead away from the claimed invention. W.L. Gore & Associates, Inc. v. Garlock, Inc., 721 F.2d 1540, 220 USPQ 303 (Fed. Cir. 1983), cert. denied, 469 U.S. 851 (1984). See also MPEP §2123.

Conclusion

12. Any inquiry concerning this communication or earlier communications from the examiner should be directed to BYRON X KASPER whose telephone number is (571) 272-3895. The examiner can normally be reached Monday - Friday, 8 am - 5 pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Adam Mott, can be reached at (571) 270-5376. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users.
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/BYRON XAVIER KASPER/
Examiner, Art Unit 3657

/ADAM R MOTT/
Supervisory Patent Examiner, Art Unit 3657
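The margin-width determination cited from Tokue for claim 9, in which the second workpiece's bottom-face dimensions are grown by a margin width B before testing whether the arrangement fits, can be sketched as below. This is a simplified rectangle-only model under stated assumptions; the function names and the example dimensions are illustrative and do not come from Tokue's mesh-based implementation.

```python
def margin_added_dimensions(width, depth, margin_b):
    """Grow the bottom face by margin B on every side (cf. Tokue's
    margin-added shape 17): each planar dimension gains 2 * B."""
    return width + 2 * margin_b, depth + 2 * margin_b

def arrangement_allowed(free_width, free_depth, piece_width, piece_depth, margin_b):
    """Allow placement only if the margin-added footprint fits the free
    area, so the gap absorbs the robot's worst-case positioning error."""
    w, d = margin_added_dimensions(piece_width, piece_depth, margin_b)
    return w <= free_width and d <= free_depth

# A 300 x 200 (e.g., mm) workpiece with a 10-unit margin needs 320 x 220 of clear space.
print(arrangement_allowed(320, 220, 300, 200, 10))  # True
print(arrangement_allowed(315, 220, 300, 200, 10))  # False
```

Per the quoted paragraph [0088], B would be chosen larger than the maximum conveyance position error, which is why the margin-added footprint, not the raw workpiece footprint, gates the placement decision.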

Prosecution Timeline

Oct 19, 2022
Application Filed
Nov 05, 2024
Non-Final Rejection — §103
Feb 18, 2025
Response Filed
Feb 28, 2025
Final Rejection — §103
May 30, 2025
Request for Continued Examination
Jun 03, 2025
Response after Non-Final Action
Jul 31, 2025
Non-Final Rejection — §103
Oct 07, 2025
Response Filed
Oct 29, 2025
Final Rejection — §103
Jan 05, 2026
Request for Continued Examination
Feb 12, 2026
Response after Non-Final Action
Feb 18, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594964
METHOD OF AND SYSTEM FOR GENERATING REFERENCE PATH OF SELF DRIVING CAR (SDC)
2y 5m to grant Granted Apr 07, 2026
Patent 12594137
HARD STOP PROTECTION SYSTEM AND METHOD
2y 5m to grant Granted Apr 07, 2026
Patent 12583101
METHOD FOR OPERATING A MODULAR ROBOT, MODULAR ROBOT, COLLISION AVOIDANCE SYSTEM, AND COMPUTER PROGRAM PRODUCT
2y 5m to grant Granted Mar 24, 2026
Patent 12576529
ROBOT SIMULATION DEVICE
2y 5m to grant Granted Mar 17, 2026
Patent 12564962
ROBOT REMOTE OPERATION CONTROL DEVICE, ROBOT REMOTE OPERATION CONTROL SYSTEM, ROBOT REMOTE OPERATION CONTROL METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM
2y 5m to grant Granted Mar 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
70%
Grant Probability
88%
With Interview (+18.4%)
3y 0m
Median Time to Grant
High
PTA Risk
Based on 103 resolved cases by this examiner. Grant probability derived from career allow rate.
