Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 10/21/2024 has been considered and is in compliance with the provisions of 37 CFR 1.97.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-8 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Faulring et al. (US 2022/0078972; hereinafter Faulring).
Regarding Claim 1:
Faulring discloses an articulated-robot-arm-control device for controlling an articulated robot arm performing work on at least one target object in a work area, the articulated-robot-arm-control device comprising: an imaging device including a camera, provided at the articulated robot arm and configured to capture a plurality of work area images, each of which is an image of the work area, respectively at a plurality of imaging positions that are different from one another relative to the work area (Faulring, Para. [0059-0060], Faulring discloses an articulating robot arm which includes sensors for collecting images of the work targets and work target area, with the sensors including at least a camera);
an image display that displays the plurality of work area images captured by the imaging device (Faulring, Para. [0085-0089], Faulring discloses the collected images may be displayed to the remote operator);
an image processor that determines a reference imaging position that is a position of the imaging device (Faulring, Para. [0067], Faulring discloses the machine vision camera and computing systems determine the three dimensional coordinates within a global reference system); and
a driving controller that drives an actuator of the articulated robot arm, to thereby move the imaging device to the plurality of imaging positions, wherein the image processor includes: a specified-position-coordinate calculator that, in response to an operator of the articulated-robot-arm-control device specifying each of the target objects appearing in at least one work area image of the plurality of work area images displayed by the image display, calculates a set of coordinates corresponding to a position of said each specified target object (Faulring, Para. [0065-0067], Faulring discloses the harvesting system targets and analyzes the target objects as identified by the operator who is analyzing the visual representation as determined by the sensors and determines the coordinate position within the global reference system),
a target detector that detects a plurality of target images, each of which is an image of one of the target objects, in the plurality of work area images (Faulring, Para. [0062-0065], Faulring discloses identifying work targets based on the operator identification from a plurality of captured target images), and a reference-imaging-position determiner that
determines a subset of the target images in each of the plurality of work area images, each target image in the subset having at least one of the calculated sets of coordinates within a boundary thereof (Faulring, Para. [0063-0065], Faulring discloses determining a subset of the target images which are determined to have been identified by the operator as harvestable, such as determining subset properties of the work target which include fruit health, quality, and ripeness),
determines a total number of the target images in said subset for said each work area image (Faulring, Para. [0068], Faulring discloses determining a plurality of mapped target coordinates within the work area and places the target coordinates within a harvest queue order), and
determines the reference imaging position based on the total numbers of the target images for the plurality of work area images (Faulring, Para. [0068-0070], Faulring discloses determining a starting target coordinate (i.e. first harvestable fruit) based on the harvest queue order).
Regarding Claim 2:
Faulring discloses the articulated-robot-arm-control device according to claim 1.
Faulring further discloses wherein the reference-imaging-position determiner selects one work area image from the plurality of work area images based on the total numbers of the target images for the plurality of work area images, and sets an imaging position at which the imaging device captures the selected one work area image as the reference imaging position (Faulring, Para. [0068-0070], Faulring discloses determining a starting target coordinate (i.e. user identified targets) based on the harvest queue order, with the starting target coordinate position being set within the global reference system (Para. [0067])).
Regarding Claim 3:
Faulring discloses the articulated-robot-arm-control device according to claim 1.
Faulring further discloses wherein the reference-imaging-position determiner selects one of the work area images that has a largest total number of the target images, and sets an imaging position at which the imaging device captures the selected work area image as the reference imaging position (Faulring, Para. [0068-0070], Faulring discloses determining a starting target coordinate (i.e. user identified targets) based on the harvest queue order, with the starting target coordinate position being set within the global reference system (Para. [0067]), and with the user identifying the image locations having the most fruits (i.e. most efficient, see at least Para. [0093])).
Regarding Claim 4:
Faulring discloses the articulated-robot-arm-control device according to claim 1.
Faulring further discloses wherein the reference-imaging-position determiner selects one of the work area images that has a smallest total number of the target images, and sets an imaging position at which the imaging device captures the selected work area image as the reference imaging position (Faulring, Para. [0068-0070], Faulring discloses determining a starting target coordinate (i.e. missed single work targets, see at least Para. [0101-0102]) based on the harvest queue order, with the starting target coordinate position being set within the global reference system (Para. [0067])).
Regarding Claim 5:
Faulring discloses the articulated-robot-arm-control device according to claim 1.
Faulring further discloses wherein the reference-imaging-position determiner obtains the reference imaging position based on a ratio between a total number of the target images detected by the target detector and the total number of the target images for each of the plurality of work area images (Faulring, Para. [0068-0070], Faulring discloses determining a starting target coordinate (i.e. user identified targets) based on the harvest queue order, with the starting target coordinate position being set within the global reference system (Para. [0067]), and with the target positions only being chosen if work targets are identified within the captured work area images (Para. [0007], [0073], [0078], [0101])).
Regarding Claim 6:
Faulring discloses the articulated-robot-arm-control device according to claim 1.
Faulring further discloses wherein the driving controller moves the imaging device relative to the work area with predetermined intervals by driving the actuator to thereby position the imaging device at the plurality of imaging positions (Faulring, Para. [0068-0070], Faulring discloses determining a starting target coordinate (i.e. user identified targets along a row) based on the harvest queue order, with the starting target coordinate position being set within the global reference system (Para. [0067])).
Regarding Claim 7:
Faulring discloses the articulated-robot-arm-control device according to claim 1.
Faulring further discloses the imaging device includes the camera configured to capture a second plurality of work area images at the reference imaging position respectively with a plurality of different exposure times of the camera (Faulring, Para. [0135-0137], Faulring discloses a plurality of subassemblies may be configured on the robotic arm to provide lighting at a plurality of specific frequencies), and
the image processor further includes a reference-exposure-time determiner that determines a reference exposure time of the camera based on the second plurality of work area images (Faulring, Para. [0059-0060], Faulring discloses the harvesting system processes the collected imagery through at least a computer which detects the work target in whichever manner the light and sensors are operating).
Regarding Claim 8:
Faulring discloses the articulated-robot-arm-control device according to claim 1.
Faulring further discloses the imaging device captures a second plurality of work area images at the reference imaging position respectively with a plurality of different color tones (Faulring, Para. [0135-0137], Faulring discloses a plurality of subassemblies may be configured on the robotic arm to provide a plurality of lighting exposures during image collection), and
the image processor further includes a color tone determiner that determines a reference color tone based on the second plurality of work area images (Faulring, Para. [0059-0060], [0081], [0085], [0091], Faulring discloses the harvesting system processes the collected imagery through at least a computer which detects the work target in whichever manner the work target color tone is determined).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Yuan et al. (US 2021/0053229) – discloses implementations for coordinating semi-autonomous robots to perform agricultural tasks on a plurality of plants.
Satat (US 2022/0168898) – discloses a method of identifying a target surface of a work object based on a plurality of images collected.
Cherian et al. (US 2023/0267614) – discloses an imaging controller provided for segmenting images to determine depth information of objects and the like.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZACHARY JOSEPH WALLACE whose telephone number is (469)295-9087. The examiner can normally be reached 7:00 am - 5:00 pm, Monday - Friday.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Wade Miles, can be reached at (571) 270-7777. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Z.J.W./Examiner, Art Unit 3656
/WADE MILES/Supervisory Patent Examiner, Art Unit 3656