DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 62-85 are pending.
Election/Restrictions
Claims 75-81 are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to a nonelected invention, there being no allowable generic or linking claim. Election was made without traverse in the reply filed on 9/4/2025.
Claim Objections
Claim 62 is objected to because of the following informalities:
Claim 62, line 8: “an” should be replaced with --and--.
Claim 62, line 13: --a-- should be inserted before “non-transitory computer readable storage medium”.
Claim 71: “the first optical source” and “the second optical source” lack antecedent basis in the claims.
Appropriate correction is required.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 62, 68, 69, 72-74, and 84 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Krause et al. (US Publication No. 2011/0282492) with inherent features demonstrated by Fanuc (“iRVision Fully Integrated Plug & Play Vision System Machine Vision in 2D and 3D”) and iRVision 3DL Operator Manual.
Krause teaches:
Re claim 62. A system for welding at least a portion of a piece, the system comprising:
a 6-axis welding robot, the 6-axis welding robot comprising a robotized arm, the robotized arm having first to sixth axes (robot 12, Figure 1; and paragraph [0047]: “welding torch”);
a vision module mounted on a fourth axis of the robotized arm (vision sensor 28, Fig. 1; and paragraph [0049]: “three-dimensional laser vision sensor 28 (e.g. iRVision® sensor manufactured by FANUC Robotics America, Inc.)”), the vision module comprising:
at least one optical source operable to irradiate the at least portion of the piece along at least two different irradiation paths (paragraph [0075]: “an iRVision® 3DL system manufactured by FANUC Robotics America, Inc. is used to find the start and stop positions with respect to X, Y, and Z. A vision program locates the start position. Then there is a move to the found position, and refining of the position while extracting the Z height. This process is duplicated for the end position.”; and paragraph [0077]: “The A_EMULATE_CAD.TP program locates the start of the section, and then moves a 3D camera to the starting point. A Cross Section Program extracts X, Y, Z position that the profile of the laser line, and finds the desired geometric relationship or junction. … Once the laser profile position is found, the tooling is moved forward a set increment, before the next Cross Section is scrutinized.”. Additionally, Fanuc demonstrates, at the “iRVision 3D with 3DL Sensor” section, that the iRVision® 3DL system used by Krause uses structured laser light projections for reliable detections.); an
a camera configured to receive light emanating from the at least portion of the piece irradiated by the at least one optical source and generate image data representative of the at least portion of the piece (paragraphs [0049 and 0075-0077]); and
a computing device, operatively connected to the camera, the computing device comprising non-transitory computer readable storage medium having stored thereon computer executable instructions that, when executed by a processor, cause the processor to (controller 20, Fig. 1; and paragraph [0055]):
receive the image data generated by the camera (108, Fig. 2; and paragraph [0057]: “The image is transmitted to the controller 20 for extracting information relating to a critical feature or edge represented in the image.”);
obtain a reference welding path to be followed by the 6-axis welding robot, based on the image data (112, Fig. 2; paragraph [0058]: “The controller 20 processes the critical features of the image along with the calibration data retrieved from step 106 to generate a continuous three-dimensional path along the workpiece 22, as shown in steps 110 and 112 respectively.”; and paragraph [0068]: “As a non-limiting example, the path 200 represents an actual seam of the workpiece 204 to be welded by a robotic torch.”); and
send instructions to the 6-axis welding robot to weld the at least portion of the piece according to the reference welding path (114, Fig. 2; and paragraphs [0067 and 0068]: “Once the three-dimensional path is generated, the robot system 10 moves the tool 18 along the path using the critical features to determine a location of the tool 18 along any portion of the path, as shown in step 114. Specifically, the generated path can be used to generate motion commands for the mechanism to trace the path.”).
Re claim 68. Wherein the image data representative of the at least portion of the piece conveys information about a location of at least one welding joint (paragraph [0068]).
Re claim 69. Wherein the instructions cause a relative movement between the robotized arm and the at least portion of the piece (114, Fig. 2; and paragraph [0067]).
Re claim 72. Wherein the camera has a working distance ranging between 300 mm and 1000 mm (iRVision 3DL Operator Manual teaches, at “Installation of the 3D Laser Vision Sensor”, “The 3D Laser Vision Sensor is designed so that the measurement distance ranges from 350 mm to 450 mm.”).
Re claim 73. Wherein the working distance is adjustable (Krause teaches, at Fig. 1, that the vision sensor 28 (iRVision® 3DL sensor) is positioned on a movable portion of a robot, such that the distance to the workpiece is adjustable. The iRVision 3DL Operator Manual teaches, at “Installation of the 3D Laser Vision Sensor”, “The 3D Laser Vision Sensor is designed so that the measurement distance ranges from 350 mm to 450 mm.”, and that the sensor can be used as a hand camera).
Re claim 74. Wherein the computing device is operatively connected to a database adapted to store at least one of reference images, reference data and reference points (paragraphs [0053, 0061, 0074 and 0077]).
Re claim 84. Wherein receiving the image data comprises acquiring the image data (108, Fig. 2; and paragraph [0057]: “In step 108, the vision system 26 generates an image of the workpiece 22 … The image is transmitted to the controller 20 for extracting information relating to a critical feature or edge represented in the image.”).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 63 is rejected under 35 U.S.C. 103 as being unpatentable over Krause et al. (US Publication No. 2011/0282492) as applied to claim 62 above, and further in view of Tyson, II (US Publication No. 2020/0230899).
The teachings of Krause have been discussed above. Krause fails to specifically teach: (re claim 63) wherein the vision module comprises at least one profilometer.
Tyson teaches, at paragraphs [0064 and 0094], such robotic welding sensor systems may include laser profilometers to provide real-time non-contact measurement of the height of materials for continuous real-time measurement of gap, lapping and twists to provide out-of-specification alerts or present remedial operations.
In view of Tyson’s teachings, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to include, with the system as taught by Krause, (re claim 63) wherein the vision module comprises at least one profilometer, with a reasonable expectation of success, since Tyson teaches such robotic welding sensor systems may include laser profilometers to provide real-time non-contact measurement of the height of materials for continuous real-time measurement of gap, lapping and twists to provide out-of-specification alerts or present remedial operations.
Claims 64-66, 70, 71, 82, and 83 are rejected under 35 U.S.C. 103 as being unpatentable over Krause et al. (US Publication No. 2011/0282492) as applied to claim 62 above, and further in view of Boillot et al. (US Publication No. 2021/0001423).
The teachings of Krause have been discussed above. Krause fails to specifically teach: (re claim 64) wherein the at least one optical source comprises a first optical source and a second optical source, wherein the first optical source is configured to irradiate a first irradiation path of the at least two irradiation paths and the second optical source is configured to irradiate a second irradiation path of the at least two irradiation paths; (re claim 65) wherein the first optical source is a first laser source and the second optical source is a second laser source; (re claim 66) wherein the first optical source is configured to emit a first light beam and the second optical source is configured to emit a second light beam, the first light beam having a first spatial profile and the second light beam having a second spatial profile, the first spatial profile and the second spatial profile being line shaped; (re claim 82) wherein the first optical source is a first laser source; and (re claim 83) wherein the second optical source is a second laser source.
Boillot teaches, at Figs. 1-3, and paragraphs [0024 and 0033], projecting laser beams 24, 26 to produce laser lines on a joint 62 of a workpiece 6 using dual laser range finders 16, 18. This allows such welding systems to both track joints and inspect a weld bead as discussed at paragraph [0034].
In view of Boillot’s teachings, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to include, with the system as taught by Krause, (re claim 64) wherein the at least one optical source comprises a first optical source and a second optical source, wherein the first optical source is configured to irradiate a first irradiation path of the at least two irradiation paths and the second optical source is configured to irradiate a second irradiation path of the at least two irradiation paths; (re claim 65) wherein the first optical source is a first laser source and the second optical source is a second laser source; (re claim 66) wherein the first optical source is configured to emit a first light beam and the second optical source is configured to emit a second light beam, the first light beam having a first spatial profile and the second light beam having a second spatial profile, the first spatial profile and the second spatial profile being line shaped; (re claim 82) wherein the first optical source is a first laser source; and (re claim 83) wherein the second optical source is a second laser source, with a reasonable expectation of success, since Boillot teaches projecting laser beams to produce laser lines on a joint of a workpiece using dual laser range finders. This allows such welding systems to both track joints and inspect a weld bead.
Krause fails to specifically teach: (re claim 70) further comprising mechanical fasteners configured to mount the vision module on the fourth axis of the robotized arm; and (re claim 71) wherein the mechanical fasteners comprise:
a first support, the first support being provided on a middle portion of an upper arm of the robotized arm associated with the fourth axis, the first support being configured to hold the camera and one of the first optical source and the second optical source; and
a second support provided on a bottom portion of the upper arm of the robotized arm associated with the fourth axis, the second support being configured to hold a remaining one of the first optical source and the second optical source.
Boillot teaches, at Figs. 1-3, using mechanical fasteners to mount laser range finders 16, 18 near the middle portion of, and on either side of, a robot processing tool 4. This robot processing tool 4 of Boillot corresponds to the tool 18 of Krause, which is mounted such that it moves about the fourth axis of the robot 12 of Krause. The housing 8 of Boillot supports the range finders 16, 18 on a middle portion of a distal portion of a robotic arm, which moves with corresponding rotation of a fourth axis of a robotic arm. When welding a vertical seam, a portion of the housing 8 of Boillot will hold one of the range finders 16 or 18 on a bottom portion of the distal portion of the robotic arm. This allows such welding systems to both track joints and inspect a weld bead, as discussed at paragraph [0034] of Boillot.
In view of Boillot’s teachings, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to include, with the system as taught by Krause, (re claim 70) further comprising mechanical fasteners configured to mount the vision module on the fourth axis of the robotized arm; and (re claim 71) wherein the mechanical fasteners comprise: a first support, the first support being provided on a middle portion of an upper arm of the robotized arm associated with the fourth axis, the first support being configured to hold the camera and one of the first optical source and the second optical source; and a second support provided on a bottom portion of the upper arm of the robotized arm associated with the fourth axis, the second support being configured to hold a remaining one of the first optical source and the second optical source, with a reasonable expectation of success, since Boillot teaches mounting laser range finders 16, 18 near the middle portion of, and on either side of, a robot processing tool 4. The housing 8 of Boillot supports the range finders 16, 18 on a middle portion of a distal portion of a robotic arm, which moves with corresponding rotation of a fourth axis of a robotic arm. When welding a vertical seam, a portion of the housing 8 of Boillot will hold one of the range finders 16 or 18 on a bottom portion of the distal portion of the robotic arm. This allows such welding systems to both track joints and inspect a weld bead.
Claim 67 is rejected under 35 U.S.C. 103 as being unpatentable over Krause et al. (US Publication No. 2011/0282492) as modified by Boillot et al. (US Publication No. 2021/0001423) as applied to claim 66 above, and further in view of Boillot et al. (US Publication No. 2017/0345157; hereinafter Boillot ‘157).
The teachings of Krause have been discussed above. Krause fails to specifically teach: (re claim 67) wherein the first light beam is associated with a first illumination plane and the second light beam is associated with a second illumination plane, the camera being configured to receive a projection of an intersection of the first illumination plane, the second illumination plane, and the at least portion of the piece.
Boillot ‘157 teaches, at Fig. 2 and paragraph [0029], such welding robot sensing systems may project laser beam 66 at an angle different from an angle of laser beam 24. Both laser beams 66 and 24 intersect at the top surface of the workpiece 12 in the illustrated case of Fig. 2. Camera 2 can use the observed laser beams to determine if the robotic tool is at the surface of the workpiece 12, or if the robotic tool is above or below the plane of the workpiece 12 by determining if the laser beams 66 and 24 are coincident on the workpiece 12. Fig. 1 demonstrates that the laser beam 24 is projected in an illumination plane.
In view of Boillot ‘157’s teachings, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to include, with the system as taught by Krause, (re claim 67) wherein the first light beam is associated with a first illumination plane and the second light beam is associated with a second illumination plane, the camera being configured to receive a projection of an intersection of the first illumination plane, the second illumination plane, and the at least portion of the piece, with a reasonable expectation of success, since Boillot ‘157 teaches such welding robot sensing systems may project laser beam 66 at an angle different from an angle of laser beam 24; and Camera 2 can use the observed laser beams to determine if the robotic tool is at the surface of the workpiece 12, or if the robotic tool is above or below the plane of the workpiece 12 by determining if the laser beams 66 and 24 are coincident on the workpiece 12.
Claim 85 is rejected under 35 U.S.C. 103 as being unpatentable over Krause et al. (US Publication No. 2011/0282492) with inherent features demonstrated by Fanuc (“iRVision Fully Integrated Plug & Play Vision System Machine Vision in 2D and 3D”) and iRVision 3DL Operator Manual as applied to claim 62 above.
The teachings of Krause have been discussed above. Krause fails to specifically teach: (re claim 85) wherein the camera has a substantially square field of view, the field of view having a side length ranging from about 30 mm to about 120 mm.
iRVision 3DL Operator Manual teaches, at “Workpiece”, when measurements are performed by the iRVision 3DL sensor used by Krause, the size of a detection part must fit in a field of view of approximately 200 mm x 200 mm for the 3D Laser Vision Sensor with an 8-mm lens or in a field of view of approximately 140 mm x 130 mm for the 3D Laser Vision Sensor with a 12-mm lens, and an approximately 30 mm square laser beam is directed to the measurement area.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention for the camera to have a substantially square field of view, the field of view having a side length ranging from about 30 mm to about 120 mm, since it has been held that where the general conditions of a claim are disclosed in the prior art, discovering the optimum or workable ranges involves only routine skill in the art. In re Aller, 220 F.2d 454, 456, 105 USPQ 233, 235 (CCPA 1955).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SPENCER D PATTON whose telephone number is (571)270-5771. The examiner can normally be reached Monday to Friday 9:00-5:00 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi Tran can be reached at (571)272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SPENCER D PATTON/ Primary Examiner, Art Unit 3656