DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Telleria et al. (US 2018/0283015 A1, hereinafter referred to as “Telleria”).
Regarding claim 1, Telleria discloses a robotic system for performing targeted application of material, the robotic system comprising: a base unit (Figs. 1-4, element 120; paragraph 0026) comprising: a ground positioning system (Figs. 1-2, element 128) to position the base unit (paragraph 0027), and a support (Figs. 1-2, element 122) coupled to the ground positioning system (paragraph 0026); an end effector positioning system (Figs. 1-4, element 140) comprising a first portion (Figs. 1-2, element 142) and a second portion (Figs. 1-2 and 4, element 144), the first portion coupled (via Figs. 1-2, element 130) to the support (paragraphs 0026-0031, 0039); an end effector (Figs. 1-4, element 160, 160M, 160P, 370) coupled to the second portion of the end effector positioning system (paragraphs 0026, 0032, 0037, 0039, 0047); a perception system (Fig. 3, element 324, 364) to detect data associated with a seam (Figs. 6-9, 11 and 18-19, element 620) between two or more components (Figs. 6-9, 11 and 18-19, element 610; paragraphs 0032-0034, 0038, 0067, 0074, 0084, 0119); a planner to generate a plan for the end effector based on the data (paragraphs 0067, 0115); and a control system (Figs. 1-3, element 322) to generate control signals based on the plan for one or more of the ground positioning system, the end effector positioning system, and the end effector to cause the end effector to selectively apply a coating (Fig. 18, element 630; paint) to the seam (paragraphs 0032-0033, 0038, 0041, 0047, 0064, 0067, 0074, 0115, 0126).
Regarding claim 2, Telleria discloses the robotic system of claim 1, wherein the end effector selectively applying the coating to the seam comprises: the end effector (Figs. 10 and 16-18, element 160M, 160M5, 160M6, 160M7) applying the coating on a first portion of the one or more components within a threshold distance of the seam, and the end effector avoiding (via Figs. 10 and 16-18, element 1005, 1605, 1705, 1805) applying the coating on a second portion of the one or more components outside of the threshold distance of the seam (paragraphs 0080, 0101-0102, 0105-0106).
Regarding claim 3, Telleria discloses the robotic system of claim 1, wherein: the robotic system further includes a vision system to capture an image including at least a portion of the seam; and the perception system detects the data associated with the seam based on the image (paragraphs 0058, 0082, 0118, 0124).
Regarding claim 4, Telleria discloses the robotic system of claim 1, wherein: the perception system comprises a seam data determination system; and the seam data determination system determines the data associated with the seam, wherein the data includes a bounding box around the seam, a label identifying an orientation of the seam, optionally a first confidence score associated with the bounding box, optionally a second confidence score associated with the label, and optionally a third confidence score associated with the bounding box and the label (paragraphs 0079, 0083-0084, 0132-0133).
Regarding claim 5, Telleria discloses the robotic system of claim 1, wherein the perception system has: a component orientation detection system (Fig. 3, element 324, 364) to detect orientation of the components (paragraphs 0032-0035); and a seam type identification system to detect a seam type based on the detected orientation of the components (paragraphs 0083-0084).
Regarding claim 6, Telleria discloses the robotic system of claim 1, wherein: the robotic system is communicably coupled to a user input system; the user input system is to receive user input identifying an orientation of the components; and the perception system has a seam type identification system that detects a type of the seam based on the received user input identifying the orientation of the components (paragraphs 0066-0067, 0076, 0083-0084, 0109-0110, 0123, 0126-0131).
Regarding claim 7, Telleria discloses the robotic system of claim 1, wherein: the robotic system is communicably coupled to a user input system (paragraphs 0066-0067, 0076, 0127-0131); the user input system is to receive user input indicative of a location of the seam, an orientation of the seam, and a type of the seam (paragraphs 0083-0084); and the perception system detects the data associated with the seam based on the user input (paragraphs 0083-0084).
Regarding claim 8, Telleria discloses the robotic system of claim 1, wherein: the data associated with the seam is determined based on an image capturing at least a portion of the seam; the planner determines, based on the data associated with the seam, two coordinates in the image corresponding to endpoints of the seam; and the planner translates the two coordinates in the image into coordinates of a three-dimensional coordinate system of the robotic system (paragraphs 0065-0067, 0076-0079, 0084, 0119, 0127-0129).
Regarding claim 9, Telleria discloses the robotic system of claim 8, wherein: the control system generates the control signals based on the coordinates of the three-dimensional coordinate system corresponding to endpoints of the seam to cause the end effector to apply the coating on the seam (paragraphs 0065-0067, 0076-0079, 0111, 0125-0129).
Regarding claim 10, Telleria discloses the robotic system of claim 1, wherein: the robotic system is communicably coupled to a user input system; and the user input system is to receive an input indicative of an orientation of the components (paragraphs 0066-0067, 0076, 0127-0131).
Regarding claim 11, Telleria discloses the robotic system of claim 1, wherein: the robotic system is communicably coupled to a user input system; and the user input system is to receive an input indicative of an orientation of the seam (paragraphs 0066-0067, 0076, 0127-0131).
Regarding claim 12, Telleria discloses the robotic system of claim 1, wherein: the robotic system is communicably coupled to a user input system (paragraphs 0066-0067, 0076, 0127-0131); and the user input system is to receive an input indicative of a type of the seam (paragraphs 0083-0084).
Regarding claim 13, Telleria discloses the robotic system of claim 1, wherein: the robotic system is communicably coupled to a user input system; and the user input system is to receive an input changing a location of the seam (paragraphs 0066-0067, 0076, 0127-0131).
Regarding claim 14, Telleria discloses the robotic system of claim 1, wherein: the robotic system is communicably coupled to a user input system; and the user input system is to receive an input changing a length of the seam (paragraphs 0066-0067, 0076, 0127-0131).
Regarding claim 15, Telleria discloses the robotic system of claim 1, wherein: the robotic system is communicably coupled to a user input system; and the user input system is to receive an input changing an orientation of the seam (paragraphs 0066-0067, 0076, 0127-0131).
Regarding claim 16, Telleria discloses the robotic system of claim 1, wherein: the robotic system is communicably coupled to a user input system; and the user input system is to receive an input changing a type of the seam (paragraphs 0066-0067, 0076, 0083-0084, 0127-0131).
Regarding claim 17, Telleria discloses the robotic system of claim 1, wherein: the robotic system is communicably coupled to a user input system; the user input system is to receive user input indicative of the data associated with the seam and/or the components; the perception system includes a machine learning model that outputs data about the seam and/or the components; and the received user input is used to further train, correct, and/or calibrate the machine learning model (paragraphs 0066-0067, 0076, 0127-0131, 0137).
Regarding claim 18, Telleria discloses the robotic system of claim 1, wherein the end effector is controlled by the control signals generated by the control system to selectively apply a further coating using a fan bias angle that is 180 degrees offset from a fan bias angle used with the coating (paragraphs 0090-0091, 0099, 0102, 0115, 0125).
Regarding claim 19, Telleria discloses the robotic system of claim 1, wherein: the end effector comprises two spray nozzles (Fig. 4, element 106M, 106P), selectively controllable to apply material onto a surface; the control signals cause a first one of the spray nozzles to apply the coating (paragraphs 0039, 0042, 0045, 0053, 0061); and the control signals cause a second one of the spray nozzles to apply a further coating (paragraphs 0039, 0042, 0047, 0054).
Regarding claim 20, Telleria discloses a method for performing targeted application of material, the method comprising: determining, by a perception system (Fig. 3, element 324, 364), data associated with a seam (Fig. 8, element 620) between two or more components (Fig. 8, element 610A, 610C), wherein the data associated with the seam includes a location of the seam, and a type of the seam (paragraphs 0032-0034, 0038, 0067, 0074, 0083-0084, 0119); translating the data associated with the seam from a coordinate system of the perception system (Fig. 3, element 324, 364) to a coordinate system of one or more positioning systems (Figs. 1-2, element 128) of a robotic system (Figs. 1-4, element 100) having an end effector (Figs. 1-4, element 160, 160M, 160P, 370; paragraphs 0026-0031, 0032-0034, 0038-0039, 0067, 0074, 0084, 0119); generating a toolpath for the end effector based on the translated data (paragraphs 0064, 0066-0072); generating control signals for the one or more positioning systems and the end effector based on the toolpath (paragraph 0078); and controlling, using the control signals, an end effector positioning system and the end effector to cause the end effector to selectively apply a coating to the seam (paragraphs 0111, 0125-0126).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DALE MOYER whose telephone number is (571)270-7821. The examiner can normally be reached Monday-Friday 8am-5pm PT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi H Tran can be reached at 571-272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Dale Moyer/Primary Examiner, Art Unit 3656