DETAILED ACTION
The amendment to the claims filed on 12/17/2024 is acknowledged. Applicant has canceled claim 10 and added claim 21.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-9 and 11-21 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Robertson et al. (Pub. No.: US 2021/0000013 A1).
Regarding claim 1, Robertson et al. disclose a robotic fruit picking system, the system comprising:
a mobile robot navigating the environment (a robot configured to move over the ground to pick target fruits - par. 65, 108-109), the mobile robot comprising:
a sensor generating sensor data indicative of a physical attribute of the object of interest (e.g., a camera configured to detect target fruit and determine its suitability for picking (par. 133 and 134)); and
a robotic arm comprising an actuator (e.g., picking arm (100) comprising an actuator (par. 128, 225 and 525)), wherein the actuator causes the robotic arm to move relative to the object of interest (e.g., picking arm (100) configured to position the Picking Head for picking and to move picked fruit (par. 92) via the actuator (par. 128, 225 and 525)); and
a computer system (e.g., Robot Control Subsystem / Computer Vision Subsystem – par. 135) communicably coupled to the mobile robot (e.g., robot using the Robot Control Subsystem / Computer Vision Subsystem – par. 19 and 92), wherein the computer system comprises a processor (e.g., computer) and a memory storing an algorithm (e.g., computer memory storing instructions / a program – image processing software – to detect and pick up target fruit (par. 270 and 273)) that, when executed by the processor, causes the computer system to:
receive sensor data generated by the sensor (e.g., processing camera-captured images of a scene (par. 136));
detect the object of interest based on the sensor data (e.g., detect target fruits in the captured image via camera – par. 137);
determine whether the detected object of interest meets or exceeds a first threshold based on the sensor data (e.g., determine an indication of fruit health based on a threshold of acceptability for colour – for instance, the colour of the achenes – based on the captured image via camera (par. 279-281));
generate, upon determining that the object of interest meets or exceeds the first threshold (e.g., determine an indication of fruit health based on a threshold of acceptability for colour (par. 279-281)), a label associated with a keypoint of the object of interest based on the sensor data (e.g., perform semantic segmentation; for instance, to label pixels corresponding to ripe fruit and unripe fruit – par. 148);
determine an orientation and scale of the object of interest in the environment relative to the robotic arm (e.g., estimate pose (position and orientation) and shape / size for detected fruit (par. 138, 151 and 157) relative to the Picking Arm “so that the centroid of the detected fruit appears in the centre of the frame and scaled so that the fruit occupies constant size” (par. 153)) based on the generated label (e.g., based on performed semantic segmentation; for instance, to label pixels corresponding to ripe fruit and unripe fruit – par. 148);
generate a motion plan for the robotic arm (e.g., determining movement of the Picking Arm / Head to an appropriate pose for picking detected fruit(s) without colliding with the plant (par. 111) via a range of possible trajectories (par. 157 and 167 and Figure 15)) based on the determined orientation and scale of the object of interest (e.g., based on estimated pose and shape for detected fruit (par. 138)); and
control the actuator of the robotic arm to cause the robotic arm to perform the generated motion plan (e.g., moving the whole robot forward or backward and the Picking Arm to pick a target / candidate fruit (par. 236 and 240) via actuator(s) (par. 128, 225 and 525) based on the determined movement / trajectories without colliding with the plant (par. 111, 157 and 167)).
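Examiner's note (illustration only): the claim 1 mapping above amounts to a single processing chain – sense, detect, threshold, label, estimate pose/scale, plan, actuate. The sketch below restates that chain in Python; all names, values, and helper functions are the examiner's hypothetical placeholders and are not code from Robertson et al.

    # Illustrative sketch only; all names, values, and helpers are
    # hypothetical and do not appear in Robertson et al.
    from dataclasses import dataclass

    RIPENESS_THRESHOLD = 0.6  # hypothetical "first threshold" on achene colour

    @dataclass
    class Fruit:
        achene_redness: float  # colour score derived from the captured image
        keypoints: dict        # e.g., {"calyx": (u, v), "stem": (u, v)} pixel labels

    def estimate_pose_and_scale(fruit):
        # Hypothetical pose/scale estimate from the labelled keypoints: the
        # calyx-to-stem vector gives an image-plane orientation, its length
        # a rough scale.
        (cx, cy), (sx, sy) = fruit.keypoints["calyx"], fruit.keypoints["stem"]
        orientation = (sx - cx, sy - cy)
        scale = (orientation[0] ** 2 + orientation[1] ** 2) ** 0.5
        return orientation, scale

    def plan_picks(detections):
        # Sense -> detect -> threshold -> label -> pose -> plan, per fruit.
        plans = []
        for fruit in detections:
            if fruit.achene_redness < RIPENESS_THRESHOLD:
                continue  # below the first threshold: not labelled or picked
            pose, scale = estimate_pose_and_scale(fruit)
            plans.append({"pose": pose, "scale": scale})  # stand-in motion plan
        return plans

    # One ripe and one unripe detection; only the first yields a plan.
    print(plan_picks([Fruit(0.8, {"calyx": (10, 5), "stem": (10, 25)}),
                      Fruit(0.3, {"calyx": (40, 8), "stem": (42, 30)})]))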
Regarding claim 2, Robertson et al. disclose a robotic fruit picking system, wherein the object of interest comprises a strawberry (e.g., wherein the target fruit comprises a strawberry – par. 61 and 114).
Regarding claim 3, Robertson et al. disclose a robotic fruit picking system, wherein the generated label comprises an attribute label comprising at least one of: a calyx of the strawberry, a tip of the strawberry, a shoulder of the strawberry, or a stem of the strawberry (e.g., location of the calyx and stem of the target fruit (par. 153, 278 and 114)).
Regarding claim 4, Robertson et al. disclose a robotic fruit picking system, wherein the generated label comprises a segmentation label comprising a masked output that outlines a shape of the strawberry (e.g., performed semantic segmentation of target fruit (par. 148) includes obtaining information about the target (e.g., its shape and size, its suitability for picking, its pose) by obtaining more views from new viewpoints (par. 246)).
Regarding claim 5, Robertson et al. disclose a robotic fruit picking system, wherein generating the label associated with a keypoint of the object of interest based on the sensor data (e.g., perform semantic segmentation; for instance, to label pixels corresponding to ripe fruit and unripe fruit – par. 148) includes:
determining an orientation of the strawberry based on the sensor data (e.g., determine the position and orientation for the detected fruit / strawberry (par. 77) via camera (par. 133 and 134));
determining a viewpoint of the strawberry based on the determined orientation of the strawberry (e.g., determining multiple viewpoints of the target fruit / strawberries using camera – par. 152 and 147); and
applying an instance identification to the sensor data based on the determined orientation and viewpoint of the strawberry (e.g., identification of the calyx and stem of the target fruit (par. 153, 278 and 114)).
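Examiner's note (illustration only): one way the three claim 5 steps mapped above could be realized is sketched below. The principal-axis orientation estimate, the eight-bin viewpoint scheme, and all names are hypothetical choices, not details from Robertson et al.

    # Illustrative sketch only; hypothetical names, not from Robertson et al.
    import numpy as np

    def orientation_from_mask(mask):
        # Principal-axis angle (radians) of the foreground pixels.
        ys, xs = np.nonzero(mask)
        pts = np.stack([xs, ys]).astype(float)
        pts -= pts.mean(axis=1, keepdims=True)
        eigvals, eigvecs = np.linalg.eigh(pts @ pts.T)
        major = eigvecs[:, np.argmax(eigvals)]  # dominant axis of the blob
        return float(np.arctan2(major[1], major[0]))

    def assign_instances(masks):
        # Orientation -> coarse viewpoint bucket -> per-detection instance id.
        out = []
        for i, mask in enumerate(masks):
            theta = orientation_from_mask(mask)
            viewpoint = int((theta + np.pi) / (2 * np.pi) * 8) % 8  # 8 bins
            out.append({"instance_id": i, "viewpoint": viewpoint})
        return out

    # Example: a single elongated (roughly vertical) blob.
    m = np.zeros((20, 20), dtype=np.uint8)
    m[5:15, 9:11] = 1
    print(assign_instances([m]))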
Regarding claim 6, Robertson et al. disclose a robotic fruit picking system wherein, when executed by the processor, the algorithm further causes the computer system to: determine whether the detected object of interest meets or exceeds a second threshold based on the sensor data (e.g., determine that the strawberry achenes become redder when the fruit becomes overripe – based on the captured image via camera (par. 279-281)), and wherein the label is further generated upon determining that the object of interest meets or exceeds the second threshold (e.g., perform semantic segmentation; for instance, to label pixels corresponding to ripe fruit and unripe fruit – par. 148).
Regarding claim 7, Robertson et al. disclose a robotic fruit picking system, wherein determining that the object of interest meets or exceeds the first threshold comprises a determination that the strawberry is not turning (e.g., determining under-ripeness from the colour of the flesh of the fruit / strawberry (par. 281)).
Regarding claim 8, Robertson et al. disclose a robotic fruit picking system, wherein determining that the object of interest meets or exceeds the second threshold comprises a determination that the strawberry is ripe (e.g., determine strawberry ripeness based on the captured image via camera (par. 78 and 177)).
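Examiner's note (illustration only): the two-threshold reading applied to claims 6-8 above reduces to a pair of ordered colour cut-offs. The numeric values below are hypothetical placeholders, not values disclosed by Robertson et al.

    # Illustrative sketch only; threshold values are hypothetical.
    FIRST_THRESHOLD = 0.5   # meets/exceeds -> the strawberry is not turning
    SECOND_THRESHOLD = 0.7  # meets/exceeds -> the strawberry is ripe

    def classify(achene_redness):
        # Per the claim 6 mapping, the keypoint label is generated only
        # when both thresholds are met.
        if achene_redness < FIRST_THRESHOLD:
            return "turning: no label generated"
        if achene_redness < SECOND_THRESHOLD:
            return "not turning, not yet ripe: no label generated"
        return "ripe: generate keypoint label and pick"

    for score in (0.3, 0.6, 0.9):
        print(score, classify(score))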
Regarding claim 9, Robertson et al. disclose a robotic fruit picking system, wherein the robotic arm further comprises a gripper interacting with the strawberry (e.g., a picking head comprising a means of (i) cutting the strawberry stalk and (ii) gripping the cut fruit for transfer (par. 69 and 109)), and wherein performing the generated motion (e.g., determined movement of the Picking Arm / Head to an appropriate pose for picking detected fruit(s) (par. 167 and 157 and Figure 15)) comprises causing the gripper to establish a predetermined angle between a stem of the strawberry and a shoulder of the strawberry (e.g., move the Picking Head to appropriate positions for locating, localizing, and picking target fruit (par. 109) at a particular angle (par. 348)).
Regarding claim 11, Robertson et al. disclose a robotic fruit picking system, wherein the predetermined angle is greater than or equal to five degrees and less than or equal to twenty-five degrees (e.g., adjusting the robot Picking Arm and Head angles relative to the pose of the target fruit, wherein the range of motion is +/- 275 degrees (par. 151-152, 348, and 435), which covers a predetermined angle greater than or equal to five degrees and less than or equal to twenty-five degrees).
Regarding claim 12, Robertson et al. disclose a robotic fruit picking system, wherein the predetermined angle is greater than or equal to ten degrees and less than or equal to sixteen degrees (e.g., adjusting the robot Picking Arm and Head angles relative to the pose of the target fruit, wherein the range of motion is +/- 275 degrees (par. 151-152, 348, and 435), which covers a predetermined angle greater than or equal to ten degrees and less than or equal to sixteen degrees).
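Examiner's note (illustration only): the "covers" reasoning applied to claims 11 and 12 above is a simple interval-containment check, made explicit below with hypothetical code.

    # Illustrative arithmetic only: a +/- 275 degree range of motion
    # (the cited capability) contains both claimed angle intervals.
    RANGE = (-275.0, 275.0)  # degrees

    def covers(rng, interval):
        return rng[0] <= interval[0] and interval[1] <= rng[1]

    print(covers(RANGE, (5.0, 25.0)))   # claim 11 interval -> True
    print(covers(RANGE, (10.0, 16.0)))  # claim 12 interval -> True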
Regarding claim 13, Robertson et al. disclose a robotic fruit picking system, wherein the sensor comprises at least one of: a camera (e.g., camera – par. 67).
Regarding claim 14, Robertson et al. disclose a robotic fruit picking system, wherein the sensor comprises at least one of: a hyperspectral camera, a high-resolution camera, a charge-coupled device sensor, or a complementary metal oxide semiconductor sensor (e.g., cameras sensitive to specific (and possibly non-visible) portions of the EM spectrum – par. 178 and 453).
Regarding claim 15, Robertson et al. disclose a robotic fruit picking system, wherein the mobile robot further comprises a vehicle navigating the environment (e.g., Figure 2 shows a mobile robot configured to move along the ground to appropriate positions for locating, localizing, and picking target fruit – par. 65, 107 and 109), wherein the robotic arm is mounted to the vehicle (Figure 2 shows the Picking Arm mounted on the mobile robot – par. 109).
Regarding claim 16, Robertson et al. disclose a robotic fruit picking system, wherein the computing system is positioned remotely relative to the robot (e.g., remotely controlling the robot, which covers a remotely positioned computer system – par. 223).
Regarding claim 17, Robertson et al. disclose a robotic fruit picking system, wherein the computing system is mounted to the vehicle (e.g., computer vision system mounted on the Picking Head of the robot – par. 67 and 109 and Figure 8).
Regarding claim 18, Robertson et al. disclose a Robot Control Subsystem / Computer Vision Subsystem comprising image analysis software for a robotic fruit picking system, the system / subsystem comprising:
a processor (e.g., Robot Control Subsystem / Computer Vision Subsystem – par. 135) communicably coupled to the robot (e.g., robot using the Robot Control Subsystem / Computer Vision Subsystem – par. 19 and 92); and
a memory storing an algorithm that, when executed by the processor, causes the computer system (e.g., computer memory storing instructions / a program – image processing software – to detect and pick up target fruit (par. 270 and 273) via the Robot Control Subsystem / Computer Vision Subsystem) to:
receive sensor data generated by a sensor of the robot (e.g., processing camera-captured images of a scene (par. 136));
detect an object of interest based on the sensor data (e.g., detect target fruits in the captured image via camera – par. 137);
determine whether the detected object of interest meets or exceeds a first threshold based on the sensor data (e.g., determine an indication of fruit health based on a threshold of acceptability for colour – for instance, the colour of the achenes – based on the captured image via camera (par. 279-281));
generate, upon determining that the object of interest meets or exceeds the first threshold (e.g., determine an indication of fruit health based on a threshold of acceptability for colour (par. 279-281)), a label associated with a keypoint of the object of interest based on the sensor data (e.g., perform semantic segmentation; for instance, to label pixels corresponding to ripe fruit and unripe fruit – par. 148);
determine an orientation and scale of the object of interest in an environment relative to a gripper of the robot (e.g., estimate pose (position and orientation) and shape / size for detected fruit (par. 138, 151 and 157) relative to the Picking Head “so that the centroid of the detected fruit appears in the centre of the frame and scaled so that the fruit occupies constant size” (par. 153 and 26)) based on the generated label (e.g., based on performed semantic segmentation; for instance, to label pixels corresponding to ripe fruit and unripe fruit – par. 148);
generate a motion for the robot (e.g., determining movement of the Picking Arm / Head to an appropriate pose for picking detected fruit(s) without colliding with the plant (par. 111) via a range of possible trajectories (par. 157 and 167 and Figure 15)) based on the determined orientation and scale of the object of interest (e.g., based on estimated pose and shape for detected fruit (par. 138)); and
cause the robot to perform the generated motion (e.g., moving the whole robot forward or backward and the Picking Arm to pick a target / candidate fruit (par. 236 and 240) via actuator(s) (par. 128, 225 and 525) based on the determined movement / trajectories without colliding with the plant (par. 111, 157 and 167)).
Regarding claim 19, Robertson et al. disclose a Robot Control Subsystem / Computer Vision Subsystem comprising image analysis software for a robotic fruit picking system, wherein the object of interest comprises a strawberry (e.g., wherein the target fruit comprises a strawberry – par. 61 and 114), and wherein performing the generated motion (e.g., determined movement of the Picking Arm / Head to an appropriate pose for picking detected fruit(s) (par. 157 and 167 and Figure 15)) comprises causing the gripper of the robot to establish a predetermined angle between a stem of the strawberry and a shoulder of the strawberry (e.g., move the Picking Head to appropriate positions for locating, localizing, and picking target fruit (par. 109) at a particular angle (par. 348)).
Regarding claim 20, Robertson et al. disclose a Robot Control Subsystem / Computer Vision Subsystem comprising image analysis software for a robotic fruit picking system, wherein the predetermined angle is greater than or equal to five degrees and less than or equal to twenty-five degrees (e.g., adjusting the robot Picking Arm and Head angles relative to the pose of the target fruit, wherein the range of motion is +/- 275 degrees (par. 151-152, 348, and 435), which covers a predetermined angle greater than or equal to five degrees and less than or equal to twenty-five degrees).
Regarding claim 21, Robertson et al. disclose a Robot Control Subsystem / Computer Vision Subsystem comprising image analysis software for a robotic fruit picking system, wherein the segmentation label comprises at least one of: a semantic segmentation (e.g., perform semantic segmentation; for instance, to label pixels corresponding to ripe fruit and unripe fruit – par. 148).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jorge O. Peche whose telephone number is (571)270-1339. The examiner can normally be reached Monday-Friday 8:30 AM - 5:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Khoi H. Tran, can be reached at 571-272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Jorge O Peche/Examiner, Art Unit 3656