Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
The claims have been rejected for the following reasons:
Claims 1-18 are rejected under 35 U.S.C. § 103.
Claims 15-16 are rejected under 35 U.S.C. § 101.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy of priority Application No. GB2204711.2, filed on 03/31/2022, has been received.
Information Disclosure Statement
The information disclosure statement (IDS) was filed on 9/19/24. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Rejections - 35 USC § 101
Claim 15 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim does not fall within at least one of the four categories of patent-eligible subject matter because it contains the limitation "a computer program". The Federal Circuit has held that a product claim to an intangible collection of information, even if created by human effort, does not fall within any statutory category. (See Digitech, 758 F.3d at 1350.) Review of the specification, and in particular page 2, paragraph 1 of the filed specification, provides embodiments in which the computer program may be interpreted as software, and thus the claimed limitation is directed to non-statutory subject matter.
Claim 16 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim does not fall within at least one of the four categories of patent-eligible subject matter because it contains the limitation "a computer readable data carrier", which can encompass non-statutory transitory forms of signal transmission, such as a propagating electrical or electromagnetic signal per se. (See In re Nuijten, 500 F.3d 1346, 84 USPQ2d 1495 (Fed. Cir. 2007).) Review of the specification, and in particular page 16, paragraph 4 of the filed specification, provides embodiments in which the storage medium may be interpreted as signal transmission, and thus the claimed limitation is directed to non-statutory subject matter.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-6, 12, and 14-18 are rejected under 35 U.S.C. 103 as being unpatentable over Kimura (US 20200078935 A1) in view of Oka (US 20200391385 A1).
Regarding claim 1 Kimura teaches A computer-implemented method of controlling a robotic manipulator for packing an object, the method comprising: obtaining an image of an object grasped by an end effector of the robotic manipulator; (Kimura [0088] reads “The robot 101 has an arm 201, an article recognition sensor 202, the hand 203, a storage box recognition sensor 204, and a sorting box recognition sensor 205. The hand 203 is attached to a tip end of the arm 201, and grips an article. The article recognition sensor 202 recognizes an article to be gripped by the hand 203. The article recognition sensor 202 may be, for example, a camera or a distance image camera that is attached to a tip end of the arm 201 to photograph the direction of the article gripped by the hand 203.”);
determining a major axis of the object in the image; (Kimura [0105] reads “FIG. 5A is a perspective view of the finger-type hand 203 a sandwiching and gripping an article 251 i using the fingers 209 a and 209 b. FIG. 5B is a side view obtained by viewing the finger-type hand 203 a and the article 251 i of FIG. 5A from the y direction (namely, the direction parallel to short sides of the article 251 i in a horizontal surface). FIG. 5C is a side view obtained by viewing the finger-type hand 203 a and the article 251 i of FIG. 5A from the x direction (namely, the direction parallel to long sides of the article 251 i in a horizontal surface). FIG. 5D is a plan view obtained by viewing the finger-type hand 203 a and the article 251 i of FIG. 5A from the z direction (namely, the upper direction).”);
determining a first object pose wherein the major axis of the object is aligned with an axis of a receiving space; and controlling the robotic manipulator to: manipulate the object to the first object pose above the receiving space; (Kimura [0127] reads “In the example, an article 251 m is newly stored in a state where articles 251 i, 251 j, and 251 k are already stored. The articles 251 i and 251 j are installed adjacent to each other in the y direction, and the articles 251 i and 251 k are installed adjacent to each other in the x direction. All the articles are installed so that the long sides thereof are parallel to the x direction. The position of the article 251 m to be installed is a position where the side surface of the article 251 m on the short-side side comes into contact with the article 251 j and the side surface thereof on the long-side side comes into contact with the article 251 k.”);
move the object from the first object pose down into the receiving space; (Kimura [0120] reads “It should be noted that movement including at least components in the vertical direction in such a case that the article 251 is moved from an opening portion of the sorting box 105 to near the bottom surface is also described as “insertion”, and movement including at least components in the horizontal direction in such a case that the article 251 having reached near the bottom surface is brought closer to a wall surface of the sorting box 105 or the direction of another article 251 already installed on the bottom surface is also described as “shift movement”. As will be described later, the moving direction at the time of insertion of the article 251 is different from the moving direction at the time of shift movement. It should be noted that the sorting box 105 is not illustrated in FIG. 7 for convenience of explanation.”);
Kimura does not teach and manipulate the object, in response to detecting by a force sensor a contact force above a predetermined force threshold at the end effector, to a second object pose above the receiving space for initiating a further attempt to place the object in the receiving space.
Oka, in analogous art, teaches and manipulate the object, in response to detecting by a force sensor a contact force above a predetermined force threshold at the end effector, to a second object pose above the receiving space for initiating a further attempt to place the object in the receiving space. (Oka [0094] reads “As illustrated in FIG. 9, the hand 22 can be controlled to move oppositely in the direction of arrow B (repulsive motion) by the force control when the hand 22 or the object OBJ interferes with the wall 14W of the container 14 b or a side surface of the previously set object OBJs while moving along the moving route R.” and [0121] reads “The pressing force of the given threshold or less and the moving speed of the given threshold or less can be predetermined through testing in accordance with a type, a shape, material, or resistance of the object OBJ.”);
It would have been obvious to one with ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the teachings of Kimura with those of Oka to include a method that would allow a robotic system to understand when it contacts an unexpected object. This would allow the robotic system to minimize the damage it could cause when packing objects. (Oka [0005] reads “It is thus preferable to provide an object handling control device, an object handling device, an object handling method, and a computer program product that can avoid a hand or an object grasped by the hand from interfering with an obstacle, which may otherwise damage them, and efficiently transfer the object.”);
Regarding claim 2 Kimura/Oka teaches A computer-implemented method according to claim 1, comprising controlling the robotic manipulator, in response to determining that the object and the end effector are positioned within the receiving space (Kimura [0135] reads “On the other hand, the grip position of the article 251 m to be installed by the finger 209 is shifted from the deal position in the x direction by only +ΔD. Therefore, the position when the article 251 m is inserted into the sorting box 105 (namely, before the shift movement is carried out) is shifted from the ideal position in the x direction by only −ΔD. This state is shown in each of FIG. 9A and FIG. 9B. It should be noted that FIG. 9B is a side view for showing a positional relation among the wall surface, the article 251 j, and the article 251 m shown in FIG. 9A.”);
without detecting a further contact force above the predetermined force threshold, (Oka [0051] reads “Additionally, in the embodiment, if the hand 22 or the object OBJ interferes with an obstacle under the force control based on the detected value of the force sensor 31, the hand 22 and the object OBJ can be moved away from the interfering obstacle. Such force control over the hand 22 and the object OBJ to move away from the interfering obstacle is referred to as repulsive control.” It would be appreciated by one with ordinary skill in the art that, if no contact force is detected, the object motion plan would be carried out as planned.);
to release the object from the end effector. (Kimura [0130] reads “When the finger 209 releases the article 251 m thereafter, the article 251 m comes into contact with the bottom surface of the sorting box 105. A plan view of the arrangement of each article 251 at this time is shown in FIG. 8E, and a side view viewed from the x direction is shown in FIG. 8F.”);
Regarding claim 3 Kimura/Oka teaches A computer-implemented method according to claim 1, wherein the first and second object poses lie in a common plane parallel to an uppermost plane of the receiving space. (Kimura figure 8 below clearly shows that the objects are to be held and placed in such a way that the planes of the surfaces are parallel.);
[Image: media_image1.png (greyscale) — Kimura, figure 8]
Regarding claim 4 Kimura/Oka teaches A computer-implemented method according to claim 3, wherein a displacement of the second object pose relative to the first object pose in the common plane is determined according to a predetermined path of pose displacements. (Oka [0081] reads “First, the integrator 51 performs an order check with reference to a moving work plan instructed by the high-order system, as the planning operation. In this case, the camera 32 a recognizes the objects OBJ in the container 14 a at the initial position HP, and the camera 32 b recognizes previously set objects in the container 14 b at the moving destination RP to acquire the object information and the status information. The grasp plan generator 54 then generates a grasp plan including the order of grasping the objects OBJ. The grasp plan generator 54 also generates a plan of a moving or grasping route to the object OBJ. As the robot arm operation, the robot controller 57 moves the hand 22 from a grasp standby position (home position) to the initial position HP along the generated grasp route. The robot controller 57 causes the hand 22 or the suction pad 22 a to grasp the object OBJ at the grasp position by the suction pad operation. The robot controller 57 causes the hand 22 grasping the object OBJ to move to a moving standby position (for example, in the measuring regions of the laser range scanners 33 a and 33 b), and causes, while the hand is moving, the laser range scanners 33 a and 33 b (for example, LRFs) to estimate the pose of the object OBJ including a size and a grasping pose, as the recognition operation.”);
Regarding claim 5 Kimura/Oka teaches A computer-implemented method according to claim 1, comprising: determining, in response to detecting the contact force by the force sensor, a direction away from a contact point associated with the detected contact force; and controlling the robotic manipulator to move the end effector and object in the determined direction away from the contact point. (Oka [0094] reads “As illustrated in FIG. 9, the hand 22 can be controlled to move oppositely in the direction of arrow B (repulsive motion) by the force control when the hand 22 or the object OBJ interferes with the wall 14W of the container 14 b or a side surface of the previously set object OBJs while moving along the moving route R.”);
Regarding claim 6 Kimura/Oka teaches A computer-implemented method according to claim 1, wherein the image comprises a depth image. (Kimura [0088] reads “The robot 101 has an arm 201, an article recognition sensor 202, the hand 203, a storage box recognition sensor 204, and a sorting box recognition sensor 205. The hand 203 is attached to a tip end of the arm 201, and grips an article. The article recognition sensor 202 recognizes an article to be gripped by the hand 203. The article recognition sensor 202 may be, for example, a camera or a distance image camera that is attached to a tip end of the arm 201 to photograph the direction of the article gripped by the hand 203.”);
Regarding claim 12 Kimura/Oka teaches A computer-implemented method according to claim 1, comprising: determining spatial dimensions of the object based on the image; (Oka [0054] reads “The camera 32 a is located in the initial position to image the object OBJ and the surroundings thereof from above at the initial position HP of the object OBJ to be grasped and moved or conveyed and acquire object information (such as a shape or a size) and status information (such as a stationary pose) on the object OBJ.”);
obtaining spatial dimensions of the receiving space; (Oka [0055] reads “The camera 32 a is located in the initial position to image the object OBJ and the surroundings thereof from above at the initial position HP of the object OBJ to be grasped and moved or conveyed and acquire object information (such as a shape or a size) and status information (such as a stationary pose) on the object OBJ.”);
and determining the first object pose based on a comparison of the spatial dimensions of the object and the spatial dimensions of the receiving space. (Oka [0063] reads “The grasp plan generator 54 calculates a grasping method and a grasping pose of the object OBJ at the initial position HP, and a moving route and via points along which the manipulator 20 or hand 22 is moved to the initial position HP. The grasp plan generator 54 also calculates a moving route and via points of the hand 22 to grasp a next intended object OBJ after releasing the object OBJ at the moving destination RP. In these cases, the object information acquired by the camera 32 a is utilized in calculation of the moving route and via points to move the hand 22 without interfering with surrounding obstacles such as wall surfaces of the containers 14 a and 14 b or an object or objects other than the currently moved object OBJ.”);
Regarding claim 14 Kimura/Oka teaches A computer-implemented method according to claim 1, comprising: determining an end effector pose of the end effector; and adjusting the first object pose based on the end effector pose. (Oka [0105] reads “First, with reference to FIG. 12 and FIG. 13, the following describes an example of setting a passage restricted region for the fingertip TCP, that is, the second region 100, to avoid vertical interference without considering the size of the grasped object OBJ and the size of the hand 22. That is, the passage restricted region for the fingertip TCP is set on the assumption that the sizes of the grasped object OBJ and the hand 22 be infinitely close to zero. In the top view of the container 14 b and the previously set object OBJs in FIG. 12, locations where vertical interference may occur are the wall 14W of the container 14 b enclosed by ellipses C1 and the object OBJs enclosed by an ellipse C2. In this case, the second region 100 (second region 100 t 0) for avoiding interference between the bottom of the object OBJ grasped by the hand 22 and the top end of the wall 14W is set inside and along the wall 14W to include a given first margin value δx (X-direction) and first margin value δy (Y-direction). Similarly, on the top surface of the object OBJs, the second region 100 (second region 100 d 0) is set to cover the periphery of the object OBJs, including the given first margin value δx (X-direction) and first margin value δy (Y-direction) with respect to the top end of the object OBJs.”);
Regarding claim 15 Kimura/Oka teaches A computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the computer- implemented method of claim 1. (Kimura [0218] reads “The central control device 391 is a processor that executes various processes by executing a program (not shown) stored in the main storage device 392. The input device 395 is a device that accepts an input of information from the worker 106, and may be, for example, a keyboard, a mouse, a touch sensor, or the like. The output device 396 is a device that outputs information to the worker 106, and may be, for example, a display device displaying characters, images, or the like. The input device 395 and the output device 396 may be integrated such as, for example, a so-called touch panel. The communication device 394 is a device that communicates with the entire management computer 109 via the network 301.”);
Regarding claim 16 Kimura/Oka teaches A computer-readable data carrier having stored thereon the computer program of claim 15. (Kimura [0218] reads “The central control device 391 is a processor that executes various processes by executing a program (not shown) stored in the main storage device 392. The input device 395 is a device that accepts an input of information from the worker 106, and may be, for example, a keyboard, a mouse, a touch sensor, or the like. The output device 396 is a device that outputs information to the worker 106, and may be, for example, a display device displaying characters, images, or the like. The input device 395 and the output device 396 may be integrated such as, for example, a so-called touch panel. The communication device 394 is a device that communicates with the entire management computer 109 via the network 301.”);
Regarding claim 17 Kimura/Oka teaches A controller for a robotic manipulator, wherein the controller is configured to perform the computer-implemented method of claim 1. (Oka [0046] reads “FIG. 1 is a schematic configuration diagram of an object handling system 1 including an object handling device. The object handling system 1 includes a control device 10, a manipulator 20 (robot arm), a sensor 30, and a housing 40. The object handling system 1 serves to grasp an intended object OBJ to move at an initial position HP, moves or conveys the object OBJ, and releases it at a moving destination RP. Such an operation is referred to as picking, transfer, conveyance of the object OBJ. Of the control device 10, the elements, except for a later-described robot controller 57 (motion controller) that controls the manipulator 20, may be separately provided as an object handling control device.”);
Regarding claim 18 Kimura/Oka teaches A robotic packing system comprising the controller of claim 17 and the robotic manipulator for packing an object. (Oka [0046] reads “FIG. 1 is a schematic configuration diagram of an object handling system 1 including an object handling device. The object handling system 1 includes a control device 10, a manipulator 20 (robot arm), a sensor 30, and a housing 40. The object handling system 1 serves to grasp an intended object OBJ to move at an initial position HP, moves or conveys the object OBJ, and releases it at a moving destination RP. Such an operation is referred to as picking, transfer, conveyance of the object OBJ. Of the control device 10, the elements, except for a later-described robot controller 57 (motion controller) that controls the manipulator 20, may be separately provided as an object handling control device.”);
Claims 7-9 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Kimura in view of Oka as applied above, and further in view of Kanunikov (US 20210129334 A1).
Regarding claim 7 Kimura/Oka teaches A computer-implemented method according to claim 6.
Kimura/Oka does not teach comprising removing from the depth image any features with an associated depth value outside a range corresponding to a volume between the end effector and uppermost plane of the receiving space.
Kanunikov, in analogous art, teaches comprising removing from the depth image any features with an associated depth value outside a range corresponding to a volume between the end effector and uppermost plane of the receiving space. (Kanunikov [0132] reads “In some embodiments, the robotic system 100 can validate the potential obstacle 910 based on the depth measures described above. For example, the robotic system 100 can validate/identify the potential obstacles 910 with one or more of the top surface depth measures greater than or equal to those of the candidate position 801. The robotic system 100 can eliminate from the potential obstacles 910 the preceding objects 902 that have the top surface depth measures less than those of the candidate position 801. In one or more embodiments, the robotic system 100 can identify/eliminate the potential obstacles 910 based on an ambiguity associated with the height of the candidate position 801 and/or the height of the potential obstacles 910.”);
It would have been obvious to one with ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the teachings of Kimura/Oka with those of Kanunikov to include depth-sensing abilities. This would allow the system to better understand the objects to be grasped and better emulate human movements. (Kanunikov [0004] reads “However, despite the technological advancements, robots often lack the sophistication necessary to duplicate human sensitivity and/or adaptability required for executing more complex and intricate tasks. For example, robots often lack the granularity of control and flexibility in the executed actions to fully utilize available resources. Also, robots often are unable to account for deviations or uncertainties that may result from various real-world factors. Accordingly, there remains a need for improved techniques and systems for controlling and managing various aspects of the robots to complete the tasks despite the various real-world factors.”);
Regarding claim 8 Kimura/Oka teaches A computer-implemented method according to claim 6.
Kimura/Oka does not teach comprising projecting the depth image onto a plane parallel to a base of the receiving space to obtain a set of two dimensional points.
Kanunikov, in analogous art, teaches comprising projecting the depth image onto a plane parallel to a base of the receiving space to obtain a set of two dimensional points. (Kanunikov [0126] reads “The depth value(s) for a top surface of the target object 112 in the candidate position 801 can be calculated as the sum of the depth value of the placement surface according to the discretized platform model, height of one or more objects placed or planned for placement between the container floor and the candidate position 801, and/or the height of the target object 112. The corresponding depth value(s) for the end-effector can be calculated as the sum of the calculated depth value of top surface of the target object at the candidate position 801 and the depth value(s) corresponding to the discretized end-effector model.”);
It would have been obvious to one with ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the teachings of Kimura/Oka with those of Kanunikov to include depth-sensing abilities. This would allow the system to better understand the objects to be grasped and better emulate human movements. (Kanunikov [0004] reads “However, despite the technological advancements, robots often lack the sophistication necessary to duplicate human sensitivity and/or adaptability required for executing more complex and intricate tasks. For example, robots often lack the granularity of control and flexibility in the executed actions to fully utilize available resources. Also, robots often are unable to account for deviations or uncertainties that may result from various real-world factors. Accordingly, there remains a need for improved techniques and systems for controlling and managing various aspects of the robots to complete the tasks despite the various real-world factors.”);
Regarding claim 9 Kimura/Oka teaches A computer-implemented method according to claim 8.
Kimura/Oka does not teach wherein determining the major axis of the object in the image is done based on the set of two-dimensional points.
Kanunikov, in analogous art, teaches wherein determining the major axis of the object in the image is done based on the set of two-dimensional points. (Kanunikov [0077] reads “In some embodiments, the robotic system 100 can derive and utilize an axis aligned bounding box (AABB) 730 for a set of objects designated for placement in the container. In other words, the AABB 730 can be a designated planar shape (e.g., a rectangle) that encompasses and/or is coincident with outer-most portions of the objects according to the derived placement plan. For the example illustrated in FIG. 7A, the AABB 730 can be a set of rectangles that are aligned according to a set of predetermined axes (e.g., x, y, and z axes) that coincides with outer-most points of the objects in the packing plan 700. The AABB 730 can represent an overall size (e.g., pack size) of the packing plan 700. The robotic system 100 may derive and use the AABB 730 to adjust the packing plan 700 and account for unexpected real-world conditions (e.g., partially-opened containers and/or warped container walls).”);
It would have been obvious to one with ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the teachings of Kimura/Oka with those of Kanunikov to include depth-sensing abilities. This would allow the system to better understand the objects to be grasped and better emulate human movements. (Kanunikov [0004] reads “However, despite the technological advancements, robots often lack the sophistication necessary to duplicate human sensitivity and/or adaptability required for executing more complex and intricate tasks. For example, robots often lack the granularity of control and flexibility in the executed actions to fully utilize available resources. Also, robots often are unable to account for deviations or uncertainties that may result from various real-world factors. Accordingly, there remains a need for improved techniques and systems for controlling and managing various aspects of the robots to complete the tasks despite the various real-world factors.”);
Regarding claim 13 Kimura/Oka teaches A computer-implemented method according to claim 12.
Kimura/Oka does not teach wherein determining the first object pose comprises adjusting an initial object pose to reduce a determined overhang of the spatial dimensions of the object beyond the spatial dimensions of the receiving space.
Kanunikov, in analogous art, teaches wherein determining the first object pose comprises adjusting an initial object pose to reduce a determined overhang of the spatial dimensions of the object beyond the spatial dimensions of the receiving space. (Kanunikov [0169] reads “With the 3D states, the robotic system 100 can evaluate the placement combinations 744 according to one or more stacking rules (e.g., the horizontal offset rule 776 of FIG. 7C, the support separation rule 786 of FIG. 7C, and/or the vertical offset rule 790 of FIG. 7C). As an illustrative example, the robotic system 100 can calculate a reduced score for the placement combinations 744 or flag locations thereof that violate the overlap requirement 778 of FIG. 7C, the overhang requirement 780 of FIG. 7C, the vertical offset rule 790, the CoM offset requirement 784 of FIG. 7C, or a combination thereof described above.”);
It would have been obvious to one with ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the teachings of Kimura/Oka with those of Kanunikov to include depth-sensing abilities. This would allow the system to better understand the objects to be grasped and better emulate human movements. (Kanunikov [0004] reads “However, despite the technological advancements, robots often lack the sophistication necessary to duplicate human sensitivity and/or adaptability required for executing more complex and intricate tasks. For example, robots often lack the granularity of control and flexibility in the executed actions to fully utilize available resources. Also, robots often are unable to account for deviations or uncertainties that may result from various real-world factors. Accordingly, there remains a need for improved techniques and systems for controlling and managing various aspects of the robots to complete the tasks despite the various real-world factors.”);
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Kimura in view of Oka and Kanunikov as applied above, and further in view of Yamazaki (DE 102014016072 A1).
Regarding claim 10 Kimura/Oka/Kanunikov teaches A computer-implemented method according to claim 9.
Kimura/Oka/Kanunikov does not teach wherein determining the major axis of the object in the image is done by performing Principal Component Analysis using the set of two-dimensional points.
Yamazaki, in analogous art, teaches wherein determining the major axis of the object in the image is done by performing Principal Component Analysis using the set of two-dimensional points. (Yamazaki page 3 paragraph 4 reads “; and a robot control unit that controls the robot to move the hand to the hand position position determined by the hand position attitude calculation unit to pick up the object, wherein the subject identification unit calculates a main component direction of the connected amount by performing a principal component analysis on the related amount applies corresponding three-dimensional points, and identifies the position of the object based on the main component direction.”);
It would have been obvious to one with ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the teachings of Kimura/Oka/Kanunikov with that of Yamazaki to include a more sophisticated method for determining the orientation of objects from measured points. This would allow the system to perform better grasps of the objects. (Yamazaki abstract reads “An article picking device configured to measure surface positions of objects arbitrarily stacked in three-dimensional space by a three-dimensional measuring instrument to detect positional information of the three-dimensional points, to determine a connected quantity formed by connecting closely spaced three-dimensional points of the three-dimensional points, and to identify a position and position of an object based on the position information of three-dimensional points pertaining to the connected set. The location of the object is identified by calculating a principal component direction of the contiguous set by applying principal component analysis to the three dimensional points associated with the contiguous set and identifying the position of the object from the principal component direction.”);
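For illustration only (not drawn from the cited references, whose disclosures control; the point set and function names here are hypothetical), the claimed Principal Component Analysis step can be sketched as computing the eigenvector of the largest eigenvalue of the point set's covariance matrix:

```python
import numpy as np

def major_axis(points_2d):
    """Estimate the major axis of a 2D point set via Principal Component Analysis.

    points_2d: (N, 2) array of two-dimensional points on the object.
    Returns a unit vector along the direction of greatest variance.
    """
    pts = np.asarray(points_2d, dtype=float)
    centered = pts - pts.mean(axis=0)        # center the point cloud
    cov = np.cov(centered, rowvar=False)     # 2x2 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigen-decomposition (ascending order)
    return eigvecs[:, np.argmax(eigvals)]    # eigenvector of the largest eigenvalue

# Example: points scattered along a line tilted 45 degrees
rng = np.random.default_rng(0)
t = rng.uniform(-1.0, 1.0, 200)
pts = np.column_stack([t, t]) + rng.normal(scale=0.01, size=(200, 2))
axis = major_axis(pts)
```

Under these assumptions the recovered axis is (up to sign) approximately (1, 1)/√2, the dominant direction of the synthetic point set.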
Claim(s) 11 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kimura/Oka as applied to claim 1 above, and further in view of Ogawa (EP 3623115 A1).
Regarding claim 11, Kimura/Oka teaches a computer-implemented method according to claim 1.
Kimura/Oka does not teach wherein controlling the robotic manipulator to manipulate the object to the first object pose comprises: determining a planar rotation of the end effector to align the major axis of the object with the axis of the receiving space; and controlling the robotic manipulator to perform the planar rotation of the end effector.
Ogawa, in analogous art, teaches wherein controlling the robotic manipulator to manipulate the object to the first object pose comprises: determining a planar rotation of the end effector to align the major axis of the object with the axis of the receiving space; and controlling the robotic manipulator to perform the planar rotation of the end effector. (Ogawa [0115] reads “As illustrated in FIG. 14B, the hand 220, while being rotated, is similarly controlled as illustrated in FIG. 14A. Also in this case, when the magnitude of the pressing force in each direction is set to 5 [N], target force values in X, Y, Z, A, B, and C directions are set to (5 [N], -5 [N], -5 [N], 0 [Nm], 0 [Nm], 0 [Nm]). However, in FIG. 14B, the robot controller 107 obliquely presses the object 10 against a side surface O1 of the obstacle O, which causes moment around the contact point between the object 10 and the hand 220 about the Z axis. In this case, at the target value of the moment about the Z axis being 0 [Nm], the hand 220 rotates to cancel the generated moment and places the object 10, contacting the set obstacle O, in other words, in parallel to the side surface O1 of the obstacle O. As described above, in the present embodiment, the object handling system 1 can set an appropriate value to the parameter for each axial direction, thereby placing the object 10 in a close contact with the obstacle O depending on the state of the obstacle O inside the container 140b, and housing the object 10 in the container 140b at higher space efficiency.”);
It would have been obvious to one with ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the teachings of Kimura/Oka with that of Ogawa to include a method for manipulating the object to better fit into the receiving space. This would allow the system to package objects together more efficiently. (Ogawa [0003] reads “It is beneficial to provide a novel hand control device with less inconvenience that can ensure grasping of an object.”);
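For illustration only (not drawn from the cited references; the function name and sign convention are hypothetical), the claimed planar-rotation step reduces to computing the signed angle between the object's major axis and the receiving-space axis, wrapped to the smaller of the two equivalent axis rotations:

```python
import numpy as np

def planar_alignment_angle(object_axis, space_axis):
    """Signed in-plane rotation (radians) aligning object_axis with space_axis.

    Both inputs are 2D direction vectors. An axis has no sign, so the result
    is wrapped into (-pi/2, pi/2] to select the smaller rotation.
    """
    a = np.arctan2(object_axis[1], object_axis[0])  # heading of the object's major axis
    b = np.arctan2(space_axis[1], space_axis[0])    # heading of the receiving-space axis
    angle = b - a
    # Rotating an axis by pi yields the same axis, so wrap modulo pi.
    while angle <= -np.pi / 2:
        angle += np.pi
    while angle > np.pi / 2:
        angle -= np.pi
    return angle

# Example: object axis at 90 degrees, receiving-space axis at 0 degrees
theta = planar_alignment_angle([0.0, 1.0], [1.0, 0.0])
```

In this sketch the end effector would then be commanded to rotate by theta (a quarter turn in the example) about the axis normal to the plane before placement.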
Other References Not Cited
Throughout examination, other references were found that are pertinent to the claimed invention. Though these references were not used in this examination, they could be used in future examination and could read on the contents of the current disclosure. These references are: Zak (US 11370111 B2); Wang (US 20220016779 A1); and Ebrahimi (US 20220066456 A1).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOHN MARTIN O'MALLEY whose telephone number is (571)272-6228. The examiner can normally be reached Mon - Fri 9 am - 5 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramon Mercado can be reached at (571) 270 - 5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JOHN MARTIN O'MALLEY/Examiner, Art Unit 3658
/Ramon A. Mercado/Supervisory Patent Examiner, Art Unit 3658