DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-4 are rejected under 35 U.S.C. 103 as being unpatentable over Sadri et al. US 2021/097674 in view of Asada et al. US 2010/0091272.
Regarding claim 1, Sadri teaches:
1. A method of repairing a defect on a surface, the method comprising: imaging the surface to locate the defect with a first imaging system;
[0031] Any camera(s) that is capable of capturing images of surfaces while the object 10 is illuminated under the aforementioned wavelengths is suitable such as the cameras 106 and 110. Exemplary cameras include image colorimeters and photometers, commercially available from Radiant Vision Systems, LLC. The cameras 106 and 110 are movable (in rotation and translation, with an unlimited number of degrees of freedom (DOF)) such that all features, including corners, pockets or recesses, curved surfaces, flat surfaces, and all surface profile geometries, of the object 10 can be captured. In a variation, the camera 106 can be secured to a movable robot or robot arm to allow the camera 106 to more accurately and comprehensively capture all of the surfaces of the object 10. In a variation, multiple cameras work together to capture all of the surfaces of the object 10 and each can be secured to a respective robot or robot arm.
[0033] In a variation, the cameras (such as cameras 106 and 110) are configured to detect light from the light sources reflected and/or refracted from the surfaces of the object, e.g., by emitting electromagnetic waves that interact with the surfaces of the object 10. Based on these reflections or refractions of the light and the refractive index of the surfaces of the object 10, the camera 106 and 110 can detect surface irregularities on the surfaces of the object 10 and capture images of the detected surface irregularities. One example of a surface irregularity is scratch 19 (FIG. 1). Other non-limiting examples of surface irregularities include dimples, splotches and excess or wrinkled coating material, among others.
Sadri, 0030-0033 and Fig. 2, emphasis added
conducting a repair operation by contacting the surface with an abrasive article, wherein the abrasive article is pressed into contact with the surface in an area of the defect by a robotic repair system;
[0035] Referring to FIG. 3, a coordinate system 150 is displayed such that an image captured by the cameras depicting an irregularity can be displayed relative to the coordinate system 150. The coordinate system 150 can be displayed as a grid, array, or the like, to make it easier for identifying the precise location of the irregularity detected by any of the cameras. The coordinate system 150, shown as a grid 152 overlaying an image of a door panel 154 is illustrated. While a two-dimensional grid 152 and door panel 154 is illustrated in FIG. 3, it is contemplated three-dimensional models may also be illustrated. Coordinates corresponding to an irregularity 156 can be mapped to a set of coordinates corresponding to the door panel 154. In one variation the location of the irregularity 156 can be directly mapped to a 3D CAD (computer aided design) model of the door panel 154 for proper location. Accordingly, the coordinate system 150 allows identification of the location where an irregularity 156 has been identified by any of the cameras so that the irregularity 156 can be quickly and conveniently addressed. For example, and as described below, the irregularity 156 can be corrected by applying an abrasive to a location specified by the mapped corresponding coordinates. Referring to FIG. 4, the object 10 continues to traverse via the conveyor 102 to a finishing station 200. The conveyor 102 may be configured to continuously transport the conveying platform 104 and the object 10 through the finishing station 200 at a substantially constant speed with or without stopping in the finishing station 200. The finishing station 200 includes at least a movable first robot 202 on a first side 204 of the finishing station 200 and at least a movable second robot 206 on a second side 208 of the finishing station 200. The finishing station 200 further includes a first tool station 210 adjacent to the first robot 202 and a second tool station 212 adjacent to the second robot 206. Any tool stations (such as the first tool station 210 and the second tool station 212) have tools, such as sandpaper, polishing stones, grinder pads, grinder stones, and buffing stones, among others, that are appropriate for correcting irregularities of the surfaces of the object 10 that are identified. The tools may be tailored to the makeup of the object 10. For example, a different abrasive tool may operate to correct an irregularity when the object 10 comprises an aluminum alloy, as opposed to steel.
[0038] In operation, in the finishing station 200, a robot (such as the first robot 202) identifies and selects an abrasive, such as sandpaper, polishing stones, grinder pads, grinder stones, and buffing stones, among others, for attachment to the abrasive tool 214 from a respective tool station (such as the first tool station 212) that is tailored to correct the irregularity detected. The robot then moves the abrasive tool 214 to the irregularity of the surface of the object 10, applies a predetermined force to the surface via the abrasive tool 214, and abrades the irregularity for a predetermined amount of time (referred to herein as a “first abrasion”). While the robot abrades the irregularity, the light source 216 illuminates the irregularity and captures photographs of the irregularity. The captured photos are visually inspected by an operator or data corresponding to the captured photos are digitally analyzed to determine whether irregularities are visible, and, if so, whether further abrasion is warranted. If not, the robot again applies the abrasive tool 214 to the irregularity at a predetermined force for a predetermined amount of time (referred to herein as a “second abrasion”). The force and time in the second abrasion may be the same or different from the first abrasion, depending on the changing nature of the irregularity detected after the initial abrasion. If warranted, a third abrasion, a fourth abrasion, and additional abrasions, may occur. If, after a predetermined number of attempts at correcting the irregularity do not satisfactorily correct the irregularity, the object 10 may exit the production cycle for further processing.
Sadri, 0035-0038 and Figs. 2-3, emphasis added
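For illustration only, the repeated-abrasion sequence described in Sadri [0038] amounts to a bounded inspect-and-retry loop. The Python sketch below uses hypothetical names and a toy removal model; only the loop structure (abrade at a predetermined force and time, re-inspect, repeat up to a predetermined number of attempts, otherwise exit the production cycle) is taken from Sadri.

    from dataclasses import dataclass

    MAX_ATTEMPTS = 4  # assumed; Sadri recites only "a predetermined number" of attempts

    @dataclass
    class Irregularity:
        x: float
        y: float
        depth_um: float  # hypothetical severity measure

    def abrade(irr: Irregularity, force_n: float, seconds: float) -> Irregularity:
        # Stand-in for the robot pressing abrasive tool 214 against the surface;
        # the linear removal model is a toy assumption, not from Sadri.
        return Irregularity(irr.x, irr.y, max(0.0, irr.depth_um - 0.5 * force_n * seconds))

    def still_visible(irr: Irregularity) -> bool:
        # Stand-in for the operator or digital inspection step in Sadri [0038].
        return irr.depth_um > 1.0  # assumed visibility threshold

    def correct(irr: Irregularity) -> bool:
        for attempt in range(MAX_ATTEMPTS):
            # Force and dwell time may differ between abrasions, "depending on
            # the changing nature of the irregularity" (Sadri, 0038).
            irr = abrade(irr, force_n=10.0, seconds=2.0 + attempt)
            if not still_visible(irr):
                return True   # corrected; the object stays in production
        return False          # exit the production cycle for further processing

    print(correct(Irregularity(x=0.12, y=0.45, depth_um=25.0)))  # True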
imaging the abraded surface, with a second imaging system, wherein the second imaging system comprises a light source
Sadri, 0035-0038 and Figs. 2-3
However, Sadri fails to explicitly teach the following limitations, which Asada teaches:
and wherein the imaging system operates in a near-dark field mode,
[0113] In a monochrome image captured by the CCD 21b for detecting blue color, the inclined portions 43, 44 of the defect Wb are displayed in dark color, and the remaining portion other than the inclined portions 43, 44 is displayed in light color, as shown in FIG. 7C. In a monochrome image captured by the CCD 21c for detecting green color, the inclined portion 44 of the defect Wb is displayed in light color, and the remaining portion other than the inclined portion 44 is displayed in dark color, as shown in FIG. 7D. Accordingly, the defect Wb can be contrasted with the other portion, and the image processing means 33 can easily extract the defect Wb.
[0115] In the surface inspection apparatus 1 of this embodiment, when the sensor unit 6 is positioned in a preset orientation to be opposed to the inspection surface Wa of the body W to be inspected, as shown in FIG. 8B, blue light B emitted from the blue light source 13 is specularly reflected by the inspection surface Wa, and the reflected light is received by the CCD 21b for detecting blue light. If a defect Wb is present on the inspection surface Wa, inclined portions of the defect Wb are displayed in dark color in a monochrome image captured by the CCD 21b for detecting blue light. Accordingly, the defect Wb can be clearly recognized.
Asada, 0113-0117, 0152-0159, emphasis added.
with the light source and the linescan array in a first configuration with respect to the surface,
[0025] In view of the above situation, a line camera 211 may be used in place of the area camera 201, and combined with the illuminating means 203, as shown in FIG. 27A-FIG. 27C, for example. In operation, the line camera 211 and the illuminating means 203 are moved relative to a body W to be inspected, so as to scan an inspection surface Wa of the body W. In this case, the range that can be photographed per unit time can be increased as compared with the case where the area camera 201 is used, because the line camera 211 generally has a greater frame rate than the area camera 201, and is able to capture image data in a shorter time. Accordingly, the defect Wb can be detected in a relatively short time.
Asada, 0025-0027, 0083-0084, 0113-0117, 0152-0159 and Figs. 16A-F, emphasis added
and in a dark field mode, with the light source and the linescan array in a second configuration,
[0113] In a monochrome image captured by the CCD 21b for detecting blue color, the inclined portions 43, 44 of the defect Wb are displayed in dark color, and the remaining portion other than the inclined portions 43, 44 is displayed in light color, as shown in FIG. 7C. In a monochrome image captured by the CCD 21c for detecting green color, the inclined portion 44 of the defect Wb is displayed in light color, and the remaining portion other than the inclined portion 44 is displayed in dark color, as shown in FIG. 7D. Accordingly, the defect Wb can be contrasted with the other portion, and the image processing means 33 can easily extract the defect Wb.
Asada, 0025-0027, 0083-0084, 0113-0117, 0152-0159 and see Figs. 16A-F for different configuration, emphasis added
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teaching of Asada with the system of Sadri such that the imaging system operates in a near-dark field mode, with the light source and the linescan array in a first configuration with respect to the surface, and in a dark field mode, with the light source and the linescan array in a second configuration, because doing so provides a surface inspection apparatus for quickly and easily detecting defects on a surface of a body to be inspected with high accuracy (Asada, 0028).
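As context for the recited first and second configurations, dark-field imaging conventionally places the detector outside the specular reflection of the light source, while near-dark-field imaging places it just off the edge of that reflection. Neither reference gives numeric geometry, so the following sketch is a generic illustration with assumed angles and an assumed edge margin:

    def specular_angle(incidence_deg: float) -> float:
        # For a locally flat surface, the specular ray leaves at the
        # incidence angle mirrored about the surface normal.
        return incidence_deg

    def imaging_mode(source_deg: float, camera_deg: float,
                     edge_margin_deg: float = 2.0) -> str:
        # Assumed classification: inside the specular lobe (bright field),
        # just off its edge (near dark field), or well outside it (dark
        # field). The 2-degree margin is illustrative, not from Asada.
        offset = abs(camera_deg - specular_angle(source_deg))
        if offset < edge_margin_deg:
            return "bright field"
        if offset < 2 * edge_margin_deg:
            return "near dark field"
        return "dark field"

    print(imaging_mode(source_deg=30.0, camera_deg=33.0))  # near dark field
    print(imaging_mode(source_deg=30.0, camera_deg=45.0))  # dark field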
Furthermore, Sadri teaches:
wherein imaging comprises: scanning the surface in the defect area to obtain a topography of the defect area;
[0031] Any camera(s) that is capable of capturing images of surfaces while the object 10 is illuminated under the aforementioned wavelengths is suitable such as the cameras 106 and 110. Exemplary cameras include image colorimeters and photometers, commercially available from Radiant Vision Systems, LLC. The cameras 106 and 110 are movable (in rotation and translation, with an unlimited number of degrees of freedom (DOF)) such that all features, including corners, pockets or recesses, curved surfaces, flat surfaces, and all surface profile geometries, of the object 10 can be captured. In a variation, the camera 106 can be secured to a movable robot or robot arm to allow the camera 106 to more accurately and comprehensively capture all of the surfaces of the object 10. In a variation, multiple cameras work together to capture all of the surfaces of the object 10 and each can be secured to a respective robot or robot arm.
[0035] Referring to FIG. 3, a coordinate system 150 is displayed such that an image captured by the cameras depicting an irregularity can be displayed relative to the coordinate system 150. The coordinate system 150 can be displayed as a grid, array, or the like, to make it easier for identifying the precise location of the irregularity detected by any of the cameras. The coordinate system 150, shown as a grid 152 overlaying an image of a door panel 154 is illustrated. While a two-dimensional grid 152 and door panel 154 is illustrated in FIG. 3, it is contemplated three-dimensional models may also be illustrated. Coordinates corresponding to an irregularity 156 can be mapped to a set of coordinates corresponding to the door panel 154. In one variation the location of the irregularity 156 can be directly mapped to a 3D CAD (computer aided design) model of the door panel 154 for proper location. Accordingly, the coordinate system 150 allows identification of the location where an irregularity 156 has been identified by any of the cameras so that the irregularity 156 can be quickly and conveniently addressed. For example, and as described below, the irregularity 156 can be corrected by applying an abrasive to a location specified by the mapped corresponding coordinates. Referring to FIG. 4, the object 10 continues to traverse via the conveyor 102 to a finishing station 200. The conveyor 102 may be configured to continuously transport the conveying platform 104 and the object 10 through the finishing station 200 at a substantially constant speed with or without stopping in the finishing station 200. The finishing station 200 includes at least a movable first robot 202 on a first side 204 of the finishing station 200 and at least a movable second robot 206 on a second side 208 of the finishing station 200. The finishing station 200 further includes a first tool station 210 adjacent to the first robot 202 and a second tool station 212 adjacent to the second robot 206. Any tool stations (such as the first tool station 210 and the second tool station 212) have tools, such as sandpaper, polishing stones, grinder pads, grinder stones, and buffing stones, among others, that are appropriate for correcting irregularities of the surfaces of the object 10 that are identified. The tools may be tailored to the makeup of the object 10. For example, a different abrasive tool may operate to correct an irregularity when the object 10 comprises an aluminum alloy, as opposed to steel.
Sadri, 0031, 0035-0038 and Figs. 2-3, emphasis added
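The coordinate mapping described in Sadri [0035] can be pictured as a simple pixel-to-panel transform, so that the abrasive is applied "to a location specified by the mapped corresponding coordinates." The scale and grid origin in this sketch are assumed values for illustration only:

    PIXELS_PER_MM = 4.0        # assumed image resolution
    GRID_ORIGIN_PX = (50, 80)  # assumed pixel location of the grid 152 origin

    def map_to_panel(px: float, py: float) -> tuple:
        # Convert an irregularity's pixel coordinates to door-panel
        # coordinates on the grid 152 overlaying the image.
        ox, oy = GRID_ORIGIN_PX
        return ((px - ox) / PIXELS_PER_MM, (py - oy) / PIXELS_PER_MM)

    print(map_to_panel(250, 180))  # irregularity at panel position (50.0, 25.0) mm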
Furthermore, Asada teaches:
passing the second imaging system over the defect area such that a distance between the second imaging system and the surface is maintained;
[0098] FIG. 5 is a view explaining a method of setting the width of each light source of the illuminating means. The width "D" of each light source 12, 13, 14 as viewed in the sensor movement direction (the width as viewed in the direction of arrangement of the light sources) is determined by a distance between the light source 12, 13, 14 and the inspection surface Wa. For example, if the distance between the blue light source 13 and the inspection surface Wa is equal to "a", as shown in FIG. 5, distance "d" (=D/2) as a half of the width D as viewed in the sensor movement direction, i.e., distance "d" from the center axis Lc of the blue light source 13 to one end of the light source, is geometrically determined according to the following equation (1).
[0103] Then, the robot arm 5 moves the sensor unit 6 while keeping a certain distance between the sensor unit 6 and the body W to be inspected. When the sensor unit 6 reaches a specified point, an imaging start signal is sent to the camera control means 32, so that the imaging unit 3 starts capturing images.
Asada, 0085, 0098-0103, emphasis added.
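The distance-keeping behaviour quoted from Asada [0103] can be pictured as offsetting a known surface profile by a constant standoff. In this sketch the standoff value is assumed (Asada recites only "a certain distance"), and a vertical offset is used as an approximation of an offset along the surface normal for a gently curved panel:

    STANDOFF_MM = 100.0  # assumed standoff between the sensor unit and surface Wa

    def sensor_path(surface_points):
        # surface_points: list of (x, z) samples of the inspection surface.
        # The returned path keeps the sensor STANDOFF_MM above each sample.
        return [(x, z + STANDOFF_MM) for (x, z) in surface_points]

    panel = [(0, 0.0), (50, 1.2), (100, 2.1), (150, 1.8)]  # toy profile, mm
    print(sensor_path(panel))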
and generating an image of the defect area,
[0084] As shown in FIG. 1, the surface inspection apparatus 1 has an irradiating unit 2 that emits a plurality of illumination light beams R, B having mutually different wavelength ranges, an imaging unit 3 that captures images of an inspection surface Wa of a body W to be inspected which is illuminated by the illumination light beams R, G, B from the irradiating unit 2, a control unit 4 that detects a defect Wb on the inspection surface Wa based on image data representing the images captured by the imaging unit 3, and a result display unit 7 that displays the result of detection obtained by the control unit 4.
Asada, 0083-0085, emphasis added.
wherein the image is a near dark field image or a dark field image;
[0113] In a monochrome image captured by the CCD 21b for detecting blue color, the inclined portions 43, 44 of the defect Wb are displayed in dark color, and the remaining portion other than the inclined portions 43, 44 is displayed in light color, as shown in FIG. 7C. In a monochrome image captured by the CCD 21c for detecting green color, the inclined portion 44 of the defect Wb is displayed in light color, and the remaining portion other than the inclined portion 44 is displayed in dark color, as shown in FIG. 7D. Accordingly, the defect Wb can be contrasted with the other portion, and the image processing means 33 can easily extract the defect Wb.
[0115] In the surface inspection apparatus 1 of this embodiment, when the sensor unit 6 is positioned in a preset orientation to be opposed to the inspection surface Wa of the body W to be inspected, as shown in FIG. 8B, blue light B emitted from the blue light source 13 is specularly reflected by the inspection surface Wa, and the reflected light is received by the CCD 21b for detecting blue light. If a defect Wb is present on the inspection surface Wa, inclined portions of the defect Wb are displayed in dark color in a monochrome image captured by the CCD 21b for detecting blue light. Accordingly, the defect Wb can be clearly recognized.
Asada, 0113-0117, 0152-0159, emphasis added.
Furthermore, Sadri teaches:
and generating an evaluation regarding the repair operation based on the generated image.
[0035] Referring to FIG. 3, a coordinate system 150 is displayed such that an image captured by the cameras depicting an irregularity can be displayed relative to the coordinate system 150. The coordinate system 150 can be displayed as a grid, array, or the like, to make it easier for identifying the precise location of the irregularity detected by any of the cameras. The coordinate system 150, shown as a grid 152 overlaying an image of a door panel 154 is illustrated. While a two-dimensional grid 152 and door panel 154 is illustrated in FIG. 3, it is contemplated three-dimensional models may also be illustrated. Coordinates corresponding to an irregularity 156 can be mapped to a set of coordinates corresponding to the door panel 154. In one variation the location of the irregularity 156 can be directly mapped to a 3D CAD (computer aided design) model of the door panel 154 for proper location. Accordingly, the coordinate system 150 allows identification of the location where an irregularity 156 has been identified by any of the cameras so that the irregularity 156 can be quickly and conveniently addressed. For example, and as described below, the irregularity 156 can be corrected by applying an abrasive to a location specified by the mapped corresponding coordinates. Referring to FIG. 4, the object 10 continues to traverse via the conveyor 102 to a finishing station 200. The conveyor 102 may be configured to continuously transport the conveying platform 104 and the object 10 through the finishing station 200 at a substantially constant speed with or without stopping in the finishing station 200. The finishing station 200 includes at least a movable first robot 202 on a first side 204 of the finishing station 200 and at least a movable second robot 206 on a second side 208 of the finishing station 200. The finishing station 200 further includes a first tool station 210 adjacent to the first robot 202 and a second tool station 212 adjacent to the second robot 206. Any tool stations (such as the first tool station 210 and the second tool station 212) have tools, such as sandpaper, polishing stones, grinder pads, grinder stones, and buffing stones, among others, that are appropriate for correcting irregularities of the surfaces of the object 10 that are identified. The tools may be tailored to the makeup of the object 10. For example, a different abrasive tool may operate to correct an irregularity when the object 10 comprises an aluminum alloy, as opposed to steel.
Sadri, 0031, 0035-0038 and Figs. 2-3, emphasis added
Note: The motivation applied to claim 1 above applies equally to claims 2-4 as presented below.
Regarding claim 2, Sadri and Asada teach:
2. The method of claim 1, furthermore, Asada teaches: wherein the second imaging system comprises a linescan array.
Asada, 0025-0027 and Fig. 2
Regarding claim 3, Sadri and Asada teach:
3. The method of claim 1, furthermore, Sadri teaches: wherein the second imaging system comprises a light source.
Sadri, 0031, 0035-0038 and Figs. 2-3
Regarding claim 4, Sadri and Asada teach:
4. The method of claim 1, furthermore, Asada teaches: wherein the image is a near dark field image,
Asada, 0106, 0112-0113 and Figs. 7A-7D
and wherein the method further comprises: passing the second imaging system over the defect area in a second pass such that a second distance between the second imaging system and the surface is maintained; and generating a dark field image of the defect area.
Asada, 0083-0084, 0152-0159 and Figs. 20a-f.
Claim Rejections - 35 USC § 103
Claims 5, 7-9, 11-16 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Asada et al. US 2010/0091272 in view of Sadri et al. US 2021/097674.
Regarding claim 5, Asada teaches:
5. A surface evaluation system comprising: an image capturing system that captures an image of a surface, wherein the image capturing system comprises: a light source;
[0084] As shown in FIG. 1, the surface inspection apparatus 1 has an irradiating unit 2 that emits a plurality of illumination light beams R, B having mutually different wavelength ranges, an imaging unit 3 that captures images of an inspection surface Wa of a body W to be inspected which is illuminated by the illumination light beams R, G, B from the irradiating unit 2, a control unit 4 that detects a defect Wb on the inspection surface Wa based on image data representing the images captured by the imaging unit 3, and a result display unit 7 that displays the result of detection obtained by the control unit 4.
Asada, 0083-0085, emphasis added
an image capturing device configured to capture a near dark field or dark field image of the surface;
[0113] In a monochrome image captured by the CCD 21b for detecting blue color, the inclined portions 43, 44 of the defect Wb are displayed in dark color, and the remaining portion other than the inclined portions 43, 44 is displayed in light color, as shown in FIG. 7C. In a monochrome image captured by the CCD 21c for detecting green color, the inclined portion 44 of the defect Wb is displayed in light color, and the remaining portion other than the inclined portion 44 is displayed in dark color, as shown in FIG. 7D. Accordingly, the defect Wb can be contrasted with the other portion, and the image processing means 33 can easily extract the defect Wb.
[0115] In the surface inspection apparatus 1 of this embodiment, when the sensor unit 6 is positioned in a preset orientation to be opposed to the inspection surface Wa of the body W to be inspected, as shown in FIG. 8B, blue light B emitted from the blue light source 13 is specularly reflected by the inspection surface Wa, and the reflected light is received by the CCD 21b for detecting blue light. If a defect Wb is present on the inspection surface Wa, inclined portions of the defect Wb are displayed in dark color in a monochrome image captured by the CCD 21b for detecting blue light. Accordingly, the defect Wb can be clearly recognized.
Asada, 0113-0117, 0152-0159, emphasis added.
However, Asada fails to explicitly teach the following limitation, which Sadri teaches:
and a movement mechanism configured to move the image capturing device with respect to the curved surface,
[0031] Any camera(s) that is capable of capturing images of surfaces while the object 10 is illuminated under the aforementioned wavelengths is suitable such as the cameras 106 and 110. Exemplary cameras include image colorimeters and photometers, commercially available from Radiant Vision Systems, LLC. The cameras 106 and 110 are movable (in rotation and translation, with an unlimited number of degrees of freedom (DOF)) such that all features, including corners, pockets or recesses, curved surfaces, flat surfaces, and all surface profile geometries, of the object 10 can be captured. In a variation, the camera 106 can be secured to a movable robot or robot arm to allow the camera 106 to more accurately and comprehensively capture all of the surfaces of the object 10. In a variation, multiple cameras work together to capture all of the surfaces of the object 10 and each can be secured to a respective robot or robot arm.
Sadri, 0031-0033, emphasis added.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teaching of Sadri with the system of Asada to provide a movement mechanism configured to move the image capturing device with respect to the curved surface, because doing so allows all features of the object, including corners, pockets or recesses, curved surfaces, flat surfaces, and all surface profile geometries, to be captured (Sadri, 0031).
Furthermore, Asada teaches:
wherein the movement mechanism maintains a fixed distance between the image capturing device and the surface while the image capturing device moves with respect to the surface;
[0098] FIG. 5 is a view explaining a method of setting the width of each light source of the illuminating means. The width "D" of each light source 12, 13, 14 as viewed in the sensor movement direction (the width as viewed in the direction of arrangement of the light sources) is determined by a distance between the light source 12, 13, 14 and the inspection surface Wa. For example, if the distance between the blue light source 13 and the inspection surface Wa is equal to "a", as shown in FIG. 5, distance "d" (=D/2) as a half of the width D as viewed in the sensor movement direction, i.e., distance "d" from the center axis Lc of the blue light source 13 to one end of the light source, is geometrically determined according to the following equation (1).
[0103] Then, the robot arm 5 moves the sensor unit 6 while keeping a certain distance between the sensor unit 6 and the body W to be inspected. When the sensor unit 6 reaches a specified point, an imaging start signal is sent to the camera control means 32, so that the imaging unit 3 starts capturing images.
Asada, 0085, 0098-0101, emphasis added.
a view generator that, based on the near dark field or dark field image, generates a view of the surface.
[0113] In a monochrome image captured by the CCD 21b for detecting blue color, the inclined portions 43, 44 of the defect Wb are displayed in dark color, and the remaining portion other than the inclined portions 43, 44 is displayed in light color, as shown in FIG. 7C. In a monochrome image captured by the CCD 21c for detecting green color, the inclined portion 44 of the defect Wb is displayed in light color, and the remaining portion other than the inclined portion 44 is displayed in dark color, as shown in FIG. 7D. Accordingly, the defect Wb can be contrasted with the other portion, and the image processing means 33 can easily extract the defect Wb.
[0115] In the surface inspection apparatus 1 of this embodiment, when the sensor unit 6 is positioned in a preset orientation to be opposed to the inspection surface Wa of the body W to be inspected, as shown in FIG. 8B, blue light B emitted from the blue light source 13 is specularly reflected by the inspection surface Wa, and the reflected light is received by the CCD 21b for detecting blue light. If a defect Wb is present on the inspection surface Wa, inclined portions of the defect Wb are displayed in dark color in a monochrome image captured by the CCD 21b for detecting blue light. Accordingly, the defect Wb can be clearly recognized.
Asada, 0113-0117, 0152-0159, emphasis added.
Note: The motivation applied to claim 5 above applies equally to claims 7-9, 11-16 and 18 as presented below.
Regarding claim 7, Asada and Sadri teach:
7. The system of claim 5, furthermore, Sadri teaches: wherein the generated view shows surface variations indicative of discrete defects.
Sadri, 0033
Regarding claim 8, Asada and Sadri teach:
8. The system of claim 7, furthermore, Sadri teaches: wherein the discrete defects are dents or similar surface variations; and further comprising a dent evaluator that provides a localized position of the dent and an indication of dent severity.
Sadri, 0031, 0033, 0035-0038 and Figs. 2-3
Regarding claim 9, Asada and Sadri teach:
9. The system of claim 8, furthermore, Sadri teaches: wherein the discrete defects are scratches; and further comprising a scratch evaluator that provides a localized position of the scratch and an indication of scratch severity.
Sadri, 0031, 0033, 0035-0038 and Figs. 2-3
Regarding claim 11, Asada and Sadri teach:
11. The system of claim 5, furthermore, Sadri teaches: and further comprising: a path generator that receives topography information for the curved surface and, based on the topography information, generates a path for the movement mechanism that maintains a relative position of the image capturing device, the light source and the curved surface with respect to each other.
Sadri, 0031, 0033, 0035-0038 and Figs. 2-3
Regarding claim 12, Asada and Sadri teach:
12. The system of claim 11, furthermore, Sadri teaches: wherein the image capturing device, the surface and the light source form a right angle at a point on the surface being imaged.
Sadri, 0031, 0033, 0035-0038 and Figs. 2-3
Regarding claim 13, Asada and Sadri teach:
13. The system of claim 11, furthermore, Asada teaches: wherein the topography information comprises a topography generated based on sensor information from a distance sensor array.
Asada, 0085, 0098-0101, 0103
Regarding claim 14, Asada and Sadri teach:
14. The system of claim 13, furthermore, Asada teaches: wherein the distance sensor array is coupled to the movement mechanism, and moves ahead of the image capturing device,
[0098] FIG. 5 is a view explaining a method of setting the width of each light source of the illuminating means. The width "D" of each light source 12, 13, 14 as viewed in the sensor movement direction (the width as viewed in the direction of arrangement of the light sources) is determined by a distance between the light source 12, 13, 14 and the inspection surface Wa. For example, if the distance between the blue light source 13 and the inspection surface Wa is equal to "a", as shown in FIG. 5, distance "d" (=D/2) as a half of the width D as viewed in the sensor movement direction, i.e., distance "d" from the center axis Lc of the blue light source 13 to one end of the light source, is geometrically determined according to the following equation (1).
[0103] Then, the robot arm 5 moves the sensor unit 6 while keeping a certain distance between the sensor unit 6 and the body W to be inspected. When the sensor unit 6 reaches a specified point, an imaging start signal is sent to the camera control means 32, so that the imaging unit 3 starts capturing images.
Asada, 0085, 0098-0101, 0103, emphasis added.
Furthermore, Sadri teaches:
with respect to the curved surface, and wherein the path generator generates the path and provides the path to the movement mechanism in situ.
[0031] Any camera(s) that is capable of capturing images of surfaces while the object 10 is illuminated under the aforementioned wavelengths is suitable such as the cameras 106 and 110. Exemplary cameras include image colorimeters and photometers, commercially available from Radiant Vision Systems, LLC. The cameras 106 and 110 are movable (in rotation and translation, with an unlimited number of degrees of freedom (DOF)) such that all features, including corners, pockets or recesses, curved surfaces, flat surfaces, and all surface profile geometries, of the object 10 can be captured. In a variation, the camera 106 can be secured to a movable robot or robot arm to allow the camera 106 to more accurately and comprehensively capture all of the surfaces of the object 10. In a variation, multiple cameras work together to capture all of the surfaces of the object 10 and each can be secured to a respective robot or robot arm.
Sadri, 0031-0033, emphasis added.
Regarding claim 15, Asada and Sadri teach:
15. The system of claim 5, furthermore, Asada teaches: wherein the image capturing device is a linescan array,
Asada, 0025-0027
furthermore, Sadri teaches: or a 3D camera.
Sadri, 0035
Regarding claim 16, Asada and Sadri teach:
16. The system of claim 5, furthermore, Asada teaches: and further comprising a lens between the image capturing device and the light source.
Asada, 0093, 0095 and Figs. 1-2
Regarding claim 18, Asada and Sadri teach:
18. The system of claim 5, furthermore, Asada teaches: wherein the surface is a curved surface and wherein maintaining the distance comprises adjusting a position of the imaging system to follow a curvature of the curved surface.
Asada, 0011, 0085, 0095, 0098-0103
Claim Rejections - 35 USC § 103
Claims 6 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Asada et al. US 2010/0091272 and Sadri et al. US 2021/097674 further in view of Nishijima US 2017/0098298.
Regarding claim 6, Asada and Sadri teach:
6. The system of claim 5, however, Asada and Sadri fail to explicitly teach, but Nishijima teaches: wherein the generated view shows surface variations indicative of haze.
[0016] The above object of embodiments of the present invention can be achieved by A haze determination method comprising: acquiring a plurality of images photographed by a stereo camera; calculating parallax from the plurality of images and extracting a set in which the parallax is within a predetermined range, as a segment, at intervals of a certain width in an image lateral direction; extracting segments coupled in the image lateral direction and an image depth direction, as a target; calculating a variation amount of upper end positions and a variation amount of height widths of the segments that constitute the target; and determining that the target is a haze if at least one of the variation amount of the upper end positions and the variation amount of the height widths is greater than a predetermined threshold value.
Nishijima, 0013-0017, emphasis added.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teaching of Nishijima with the system of Asada and Sadri such that the generated view shows surface variations indicative of haze, as the haze determination process is used for a pre-crash safety system of the vehicle (Nishijima, 0042).
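For reference, the decision rule quoted from Nishijima [0016] reduces to a variability test over the segments that make up a coupled target. In the sketch below the stereo-parallax segment extraction is omitted, the "variation amount" is interpreted as a population standard deviation, and both thresholds are assumed rather than taken from Nishijima:

    import statistics

    UPPER_END_VAR_MAX = 4.0     # assumed threshold (pixels)
    HEIGHT_WIDTH_VAR_MAX = 4.0  # assumed threshold (pixels)

    def is_haze(segments) -> bool:
        # segments: (upper_end_row, height_width) pairs, one per lateral
        # interval, for the segments that constitute one target.
        uppers = [s[0] for s in segments]
        heights = [s[1] for s in segments]
        return (statistics.pstdev(uppers) > UPPER_END_VAR_MAX
                or statistics.pstdev(heights) > HEIGHT_WIDTH_VAR_MAX)

    solid_object = [(120, 40), (121, 41), (120, 40), (122, 39)]
    haze_like = [(110, 20), (130, 55), (98, 12), (140, 60)]
    print(is_haze(solid_object), is_haze(haze_like))  # False True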
Note: The motivation applied to claim 6 above applies equally to claim 10 as presented below.
Regarding claim 10, Asada and Sadri teach:
10. The system of claim 5, furthermore, Nishijima teaches: and wherein, based on the image, a haze view or a scratch view, a surface evaluator provides a pass indication or a fail indication based on a comparison of the haze view or scratch view to a threshold, and wherein the pass indication is provided if the haze view or scratch view is outside an acceptable range.
Nishijima 0013-0017
Claim Rejections - 35 USC § 103
Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Asada et al. US 2010/0091272 and Sadri et al. US 2021/097674 further in view of Taya et al. US 2012/0038703.
Regarding claim 17, Asada and Sadri teach:
17. The system of claim 5, however, Asada and Sadri fail to explicitly teach, but Taya teaches: and further comprising a knife edge between the image capturing device and the light source.
[0072] With regard to image clarity, the value of the sharpness which was measured was adopted as the value of the image clarity by using DIAS DOI Image Analysis System made by QEA Inc. The sharpness value is defined as follows. With a white LED as a light source, a knife edge is located between the light source and a measurement sample, and the reflection image of the knife edge on the sample is photographed by a CCD camera (three hundred thousand pixels: 5 per pixel). The pixel visual field is 2.4 mm square. The luminance distribution of the knife edge portion of the reflection image is subjected to the first derivation, and the inverse of the half-value width thereof is defined as the Sharpness value. Accordingly, as a Sharpness value is the larger, the sharper reflection image is obtained, which means that the image clarity is high.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teaching of Taya with the system of Asada and Sadri to provide a knife edge between the image capturing device and the light source, because the luminance distribution of the knife edge portion of the reflection image can be subjected to the first derivation and the inverse of the half-value width thereof defined as the Sharpness value; the larger the Sharpness value, the sharper the reflection image and the higher the image clarity (Taya, 0072).
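Taya's Sharpness metric in [0072] is concrete enough to sketch: take the first derivative of the knife-edge luminance profile, then the inverse of the half-value width of the resulting peak. The edge profiles below are synthetic and the pixel units are assumed:

    import numpy as np

    def sharpness(profile: np.ndarray) -> float:
        d = np.abs(np.diff(profile))      # first derivation of the luminance
        half = d.max() / 2.0
        above = np.nonzero(d >= half)[0]  # samples at or above half maximum
        width = above[-1] - above[0] + 1  # half-value width, in pixels
        return 1.0 / width                # larger value = sharper edge

    x = np.linspace(-5, 5, 101)
    sharp_edge = 1.0 / (1.0 + np.exp(-10 * x))  # steep, clear reflection image
    soft_edge = 1.0 / (1.0 + np.exp(-1 * x))    # blurred reflection image
    print(sharpness(sharp_edge) > sharpness(soft_edge))  # True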
Claim Rejections - 35 USC § 103
Claims 19-24 are rejected under 35 U.S.C. 103 as being unpatentable over Asada et al. US 2010/0091272 in view of Taya et al. US 2012/0038703.
Regarding claim 19, Asada teaches:
19. A robotic surface inspection system comprising: a motive robotic arm;
[0085] The irradiating unit 2 and the imaging unit 3 are provided in a sensor unit 6 attached to a distal end of a robot arm 5, such that these units 2, 3 are fixed integrally to the sensor unit 6. With the robot arm 5 controlled, the sensor unit 6 is moved in a preset sensor movement direction F along the inspection surface Wa, while keeping a constant distance or spacing from the inspection surface Wa of the body W to be inspected, as shown in FIG. 2.
Asada, 0084-0085, emphasis added.
an imaging system, coupled to the motive robotic arm, that captures an image of a surface,
[0084] As shown in FIG. 1, the surface inspection apparatus 1 has an irradiating unit 2 that emits a plurality of illumination light beams R, B having mutually different wavelength ranges, an imaging unit 3 that captures images of an inspection surface Wa of a body W to be inspected which is illuminated by the illumination light beams R, G, B from the irradiating unit 2, a control unit 4 that detects a defect Wb on the inspection surface Wa based on image data representing the images captured by the imaging unit 3, and a result display unit 7 that displays the result of detection obtained by the control unit 4.
[0085] The irradiating unit 2 and the imaging unit 3 are provided in a sensor unit 6 attached to a distal end of a robot arm 5, such that these units 2, 3 are fixed integrally to the sensor unit 6. With the robot arm 5 controlled, the sensor unit 6 is moved in a preset sensor movement direction F along the inspection surface Wa, while keeping a constant distance or spacing from the inspection surface Wa of the body W to be inspected, as shown in FIG. 2.
Asada, 0084-0085, emphasis added.
However, Asada fails to explicitly teach the following, which Taya teaches:
the imaging system comprising: a light source; a knife edge positioned in front of the light source; an image capturing device positioned such that light from the light source passes in front of the knife edge, reflects off the surface to the image capturing device;
[0072] With regard to image clarity, the value of the sharpness which was measured was adopted as the value of the image clarity by using DIAS DOI Image Analysis System made by QEA Inc. The sharpness value is defined as follows. With a white LED as a light source, a knife edge is located between the light source and a measurement sample, and the reflection image of the knife edge on the sample is photographed by a CCD camera (three hundred thousand pixels: 5 per pixel). The pixel visual field is 2.4 mm square. The luminance distribution of the knife edge portion of the reflection image is subjected to the first derivation, and the inverse of the half-value width thereof is defined as the Sharpness value. Accordingly, as a Sharpness value is the larger, the sharper reflection image is obtained, which means that the image clarity is high.
Taya, 0072, emphasis added.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teaching of Taya with the system of Asada to provide an imaging system comprising a light source, a knife edge positioned in front of the light source, and an image capturing device positioned such that light from the light source passes in front of the knife edge and reflects off the surface to the image capturing device, because the luminance distribution of the knife edge portion of the reflection image can be subjected to the first derivation and the inverse of the half-value width thereof defined as the Sharpness value; the larger the Sharpness value, the sharper the reflection image and the higher the image clarity (Taya, 0072).
Furthermore, Asada teaches: wherein a position of the light source and the image capturing device are fixed with respect to each other during an imaging operation;
[0085] The irradiating unit 2 and the imaging unit 3 are provided in a sensor unit 6 attached to a distal end of a robot arm 5, such that these units 2, 3 are fixed integrally to the sensor unit 6. With the robot arm 5 controlled, the sensor unit 6 is moved in a preset sensor movement direction F along the inspection surface Wa, while keeping a constant distance or spacing from the inspection surface Wa of the body W to be inspected, as shown in FIG. 2.
Asada, 0084-0085 and Figs. 1-2, emphasis added.
and a movement mechanism that moves the imaging system with respect to a surface during the imaging operation so that a fixed distance and orientation are maintained between the surface and the imaging system;
[0039] In the surface inspection apparatus according to the above aspect of the invention, the irradiating unit and the imaging unit may move as a unit in a given direction while keeping a specified distance from the inspection surface.
[0085] The irradiating unit 2 and the imaging unit 3 are provided in a sensor unit 6 attached to a distal end of a robot arm 5, such that these units 2, 3 are fixed integrally to the sensor unit 6. With the robot arm 5 controlled, the sensor unit 6 is moved in a preset sensor movement direction F along the inspection surface Wa, while keeping a constant distance or spacing from the inspection surface Wa of the body W to be inspected, as shown in FIG. 2.
Asada, 0039, 0084-0085 and Figs. 1-2, emphasis added.
a surface topography system comprising: a distance sensor array that moves with respect to the surface;
[0098] FIG. 5 is a view explaining a method of setting the width of each light source of the illuminating means. The width "D" of each light source 12, 13, 14 as viewed in the sensor movement direction (the width as viewed in the direction of arrangement of the light sources) is determined by a distance between the light source 12, 13, 14 and the inspection surface Wa. For example, if the distance between the blue light source 13 and the inspection surface Wa is equal to "a", as shown in FIG. 5, distance "d" (=D/2) as a half of the width D as viewed in the sensor movement direction, i.e., distance "d" from the center axis Lc of the blue light source 13 to one end of the light source, is geometrically determined according to the following equation (1).
Asada, 0098-0104 and Figs. 1-2, emphasis added.
and a topography generator that generates a topography based on sensor signals from the distance sensor array;
[0098] FIG. 5 is a view explaining a method of setting the width of each light source of the illuminating means. The width "D" of each light source 12, 13, 14 as viewed in the sensor movement direction (the width as viewed in the direction of arrangement of the light sources) is determined by a distance between the light source 12, 13, 14 and the inspection surface Wa. For example, if the distance between the blue light source 13 and the inspection surface Wa is equal to "a", as shown in FIG. 5, distance "d" (=D/2) as a half of the width D as viewed in the sensor movement direction, i.e., distance "d" from the center axis Lc of the blue light source 13 to one end of the light source, is geometrically determined according to the following equation (1).
Asada, 0098-0104 and Figs. 1-2, emphasis added.
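A topography generator of the kind recited here can be sketched as converting distance readings, taken at known positions of the sensor array, into surface heights. The planar geometry and sensor datum height below are assumptions for illustration; Asada's equation (1) is not reproduced:

    SENSOR_DATUM_MM = 150.0  # assumed height of the distance sensor array datum

    def topography(readings):
        # readings: (x_mm, measured_distance_mm) pairs from the array as it
        # travels over the surface ahead of the imaging system.
        return [(x, SENSOR_DATUM_MM - d) for (x, d) in readings]

    scan = [(0, 150.0), (10, 148.8), (20, 147.9), (30, 148.2)]
    print(topography(scan))  # surface height rises where the distance shrinks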
and a controller that generates movement commands to the motive robotic arm that maintains a relative position of the imaging system with respect to a surface being imaged as the imaging system and the surface are moved with respect to each other,
[0085] The irradiating unit 2 and the imaging unit 3 are provided in a sensor unit 6 attached to a distal end of a robot arm 5, such that these units 2, 3 are fixed integrally to the sensor unit 6. With the robot arm 5 controlled, the sensor unit 6 is moved in a preset sensor movement direction F along the inspection surface Wa, while keeping a constant distance or spacing from the inspection surface Wa of the body W to be inspected, as shown in FIG. 2.
[0096] The control unit 4 consists of a computer or an electronic circuit device, or the like, which is housed in a control board (not shown). The control unit 4 executes control programs, so as to serve as a light source control means 31, camera control means 32, image processing means 33 and a lens aperture control means 34, as its internal functions.
Asada, 0084-0085, 0096-0097 and Figs. 1-2, emphasis added.
and wherein the controller generates the movement commands based on the generated topography.
[0084] As shown in FIG. 1, the surface inspection apparatus 1 has an irradiating unit 2 that emits a plurality of illumination light beams R, B having mutually different wavelength ranges, an imaging unit 3 that captures images of an inspection surface Wa of a body W to be inspected which is illuminated by the illumination light beams R, G, B from the irradiating unit 2, a control unit 4 that detects a defect Wb on the inspection surface Wa based on image data representing the images captured by the imaging unit 3, and a result display unit 7 that displays the result of detection obtained by the control unit 4.
[0096] The control unit 4 consists of a computer or an electronic circuit device, or the like, which is housed in a control board (not shown). The control unit 4 executes control programs, so as to serve as a light source control means 31, camera control means 32, image processing means 33 and a lens aperture control means 34, as its internal functions.
Asada, 0084-0085, 0096-0097 and Figs. 1-2, emphasis added.
Note: The motivation applied to claim 19 above applies equally to claims 20-24 as presented below.
Regarding claim 20, Asada and Taya teach:
20. The system of claim 19, furthermore, Asada teaches: wherein the orientation comprises a right angle formed between the image capturing device, the surface, and the light source.
Asada, 0094 and Fig. 6
Regarding claim 21, Asada and Taya teach:
21. The system of claim 19, furthermore, Asada teaches: wherein, in a first movement sequence, the distance sensor array captures topography information and wherein, in a second movement sequence, the imaging system captures image information.
Asada, 0098-0103
Regarding claim 22, Asada and Taya teach:
22. The system of claim 19, furthermore, Asada teaches: wherein the surface topography system and the imaging system are both active during a movement sequence, wherein the topography generator generates the topography in-situ, and wherein the controller generates the movement commands in-situ based on received topography information from the topography generator in substantially real-time.
Asada, 0084-0085 and Figs. 1-2
Regarding claim 23, Asada and Taya teach:
23. The system of claim 19, furthermore, Asada teaches: wherein the surface is a curved surface.
Asada, 0011
Regarding claim 24, Asada and Taya teach:
24. The system of claim 23, furthermore, Asada teaches: wherein the curved surface comprises curvature in two directions.
Asada, 0011
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 25, 28-31 and 35 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Asada et al. US 2010/0091272.
Regarding claim 25, Asada teaches:
25. A method of evaluating a surface, the method comprising: imaging the surface, using a line scan array imaging system, to produce an image of the surface,
[0093] The imaging unit 3 includes a line camera 21, a lens system 24, and a prism 25 (see FIG. 3), as shown in FIG. 1. The line camera 21 is in the form of a linear array sensor for capturing color images, and has three CCDs (Charge Coupled Device) 21a, 21b, 21c for detecting red light, blue light and green light, respectively, as shown in FIG. 3.
Asada, 0093-0095, emphasis added.
wherein the imaging system moves along an imaging path with respect to the surface, and wherein the imaging path maintains a substantially constant distance between the line scan array imaging system and the surface;
[0085] The irradiating unit 2 and the imaging unit 3 are provided in a sensor unit 6 attached to a distal end of a robot arm 5, such that these units 2, 3 are fixed integrally to the sensor unit 6. With the robot arm 5 controlled, the sensor unit 6 is moved in a preset sensor movement direction F along the inspection surface Wa, while keeping a constant distance or spacing from the inspection surface Wa of the body W to be inspected, as shown in FIG. 2.
Asada, 0085, 0098-0103 and Fig. 2, emphasis added.
processing the image to generate a processed image;
[0107] FIG. 6 is a view showing how a defect Wb is presented as a contrast in images captured, and FIG. 7A through FIG. 7D are schematic views of images captured by the imaging unit 3 in a condition shown in FIG. 6.
Asada, 0107 and Fig. 2, emphasis added.
and automatically generating an evaluation, using an image evaluator to evaluate the image or processed image, wherein the evaluation comprises an indication of surface quality.
[0009] The above-mentioned visual inspection conducted by a worker imposes a great physical burden on the worker, and it is thus difficult for the worker to continue the inspecting operation for a long time, which makes it difficult to increase the productivity. Also, the visual inspection depends largely on the ability or efficiency of the worker, and the inspection quality varies to a great extent from one worker to another, which makes it difficult to maintain uniform quality.
[0111] In the color image, therefore, a top portion 41 of the defect Wb and a flat or smooth portion 42 other than the defect Wb are indicated in blue color B, and the inclined portion 43 of the defect Wb is indicated in red color R, while the inclined portion 44 of the defect Wb is indicated in green color as shown in FIG. 7A.
Asada, 0009, 0111-0113 and Fig. 7a-d, emphasis added.
Regarding claim 28, Asada teaches:
28. The method of claim 25, wherein the imaging system is mounted on a robotic arm, and wherein the imaging system is moved along the imaging path by the robotic arm.
[0085] The irradiating unit 2 and the imaging unit 3 are provided in a sensor unit 6 attached to a distal end of a robot arm 5, such that these units 2, 3 are fixed integrally to the sensor unit 6. With the robot arm 5 controlled, the sensor unit 6 is moved in a preset sensor movement direction F along the inspection surface Wa, while keeping a constant distance or spacing from the inspection surface Wa of the body W to be inspected, as shown in FIG. 2.
Asada, 0084-0085 and Fig. 2, emphasis added.
Regarding claim 29, Asada teaches:
29. The method of claim 28, wherein the imaging path is generated by a controller based on a topography of the curved surface.
Asada, 0011 and Fig. 2
Regarding claim 30, Asada teaches:
30. The method of claim 29, wherein the topography is provided to the controller from a distance sensor array that detects the topography as the distance sensor array travels over the curved surface.
Asada, 0011 and Fig. 2
Regarding claim 31, Asada teaches:
31. The method of claim 30, wherein the distance sensor array is mounted to the robotic arm, such that the distance sensor array travels ahead of the imaging system, and wherein the controller generates the imaging path in situ based on incoming sensor signals from the distance sensor array.
Asada, 0084-0085 and Fig. 2, emphasis added.
Regarding claim 35, Asada teaches:
35. The method of claim 25, wherein the imaging system comprises a linescan array.
Asada, 0093
Claim Rejections - 35 USC § 103
Claim 26 is rejected under 35 U.S.C. 103 as being unpatentable over Asada et al. US 2010/0091272 in view of Nishijima US 2017/0098298.
Regarding claim 26, Asada teaches:
26. The method of claim 25, however, Asada fails to explicitly teach, but Nishijima teaches: wherein the line scan array imaging system is in a haze imaging mode, and the processed image is a haze image,
[0016] The above object of embodiments of the present invention can be achieved by A haze determination method comprising: acquiring a plurality of images photographed by a stereo camera; calculating parallax from the plurality of images and extracting a set in which the parallax is within a predetermined range, as a segment, at intervals of a certain width in an image lateral direction; extracting segments coupled in the image lateral direction and an image depth direction, as a target; calculating a variation amount of upper end positions and a variation amount of height widths of the segments that constitute the target; and determining that the target is a haze if at least one of the variation amount of the upper end positions and the variation amount of the height widths is greater than a predetermined threshold value.
Nishijima, 0013-0017, emphasis added.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teaching of Nishijima with the system of Asada such that the line scan array imaging system is in a haze imaging mode and the processed image is a haze image, as the haze determination process is used for a pre-crash safety system of the vehicle (Nishijima, 0042).
Furthermore, Asada teaches: and wherein the haze imaging mode comprises the imaging system in a dark field configuration.
Asada, 0106, 0113
Claim Rejections - 35 USC § 103
Claim 27 is rejected under 35 U.S.C. 103 as being unpatentable over Asada et al. US 2010/0091272 in view of Sadri et al. US 2021/097674.
Regarding claim 27:
27. The method of claim 25. Asada fails to explicitly teach, but Sadri teaches: wherein the processed image is a scratch image, and wherein the indication of surface quality is a scratch quantity, scratch severity, scratch depth, or scratch location,
[0033] In a variation, the cameras (such as cameras 106 and 110) are configured to detect light from the light sources reflected and/or refracted from the surfaces of the object, e.g., by emitting electromagnetic waves that interact with the surfaces of the object 10. Based on these reflections or refractions of the light and the refractive index of the surfaces of the object 10, the camera 106 and 110 can detect surface irregularities on the surfaces of the object 10 and capture images of the detected surface irregularities. One example of a surface irregularity is scratch 19 (FIG. 1). Other non-limiting examples of surface irregularities include dimples, splotches and excess or wrinkled coating material, among others.
Sadri, 0033, emphasis added.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teaching of Sadri with the system of Asada such that the processed image is a scratch image and the indication of surface quality is a scratch quantity, scratch severity, scratch depth, or scratch location; as such, surface irregularities on the surfaces of the object can be detected and then corrected (Sadri, 0033).
Furthermore, Asada teaches: and wherein the scratch image is captured while the line scan imaging system is in a near-dark field configuration.
[0106] The image processing unit 33 performs image processing on the images captured by the respective CCDs 21a, 21b, 21c, so that a portion of the image where the reflected light is not incident upon the imaging unit 3 is displayed in a dark color, and the dark portion is extracted as a defect Wb. Then, the position and image, or the like, of the defect Wb are displayed on the result display unit 7.
Asada, 0106, 0113, emphasis added.
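The dark-portion extraction of paragraph 0106 can be sketched as follows; the threshold value and the use of OpenCV are illustrative assumptions rather than Asada's disclosed implementation.

    import cv2

    def extract_defects(gray_image, dark_threshold=40):
        # Pixels where reflected light did not reach the imaging unit appear dark;
        # invert-threshold them and extract dark portions as candidate defects Wb.
        _, dark_mask = cv2.threshold(gray_image, dark_threshold, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(dark_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return contours  # positions can then be shown on the result display unit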
Claim Rejections - 35 USC § 103
Claims 32-34 and 36-38 are rejected under 35 U.S.C. 103 as being unpatentable over Asada et al. US 2010/0091272 further in view of Sadri et al. US 2021/097674.
Regarding claim 32:
32. The method of claim 31. Asada fails to explicitly teach, but Sadri teaches: wherein the imaging path comprises the robot arm changing a relative position or orientation of the imaging system with respect to the curved surface as the imaging path is executed.
[0031] Any camera(s) that is capable of capturing images of surfaces while the object 10 is illuminated under the aforementioned wavelengths is suitable such as the cameras 106 and 110. Exemplary cameras include image colorimeters and photometers, commercially available from Radiant Vision Systems, LLC. The cameras 106 and 110 are movable (in rotation and translation, with an unlimited number of degrees of freedom (DOF)) such that all features, including corners, pockets or recesses, curved surfaces, flat surfaces, and all surface profile geometries, of the object 10 can be captured. In a variation, the camera 106 can be secured to a movable robot or robot arm to allow the camera 106 to more accurately and comprehensively capture all of the surfaces of the object 10. In a variation, multiple cameras work together to capture all of the surfaces of the object 10 and each can be secured to a respective robot or robot arm.
Sadri, 0031, 0035-0038 and Figs. 2-3, emphasis added
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teaching of Sadri with the system of Asada such that the imaging path comprises the robot arm changing a relative position or orientation of the imaging system with respect to the curved surface as the imaging path is executed; as such, all features, including corners, pockets or recesses, curved surfaces, flat surfaces, and all surface profile geometries, of the object can be captured (Sadri, 0031).
Note: The motivation applied to claim 32 above applies equally to claims 33-34 and 36-38 as presented below.
Regarding claim 33, Asada and Sadri teach:
33. The method of claim 32. Furthermore, Sadri teaches: wherein changing the relative orientation of the imaging system with respect to the robot arm maintains a relative orientation of the imaging system with respect to the curved surface as the imaging path is executed.
Sadri, 0031, 0035-0038 and Figs. 2-3
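A minimal sketch of the re-orientation recited in claims 32-33 follows; the axis-angle formulation and all names are hypothetical and not drawn from Sadri.

    import numpy as np

    def camera_orientation(surface_normal):
        # Rotate the camera's +z (optical) axis to point against the local surface
        # normal, so the relative orientation to the curved surface stays fixed.
        z_cam = np.array([0.0, 0.0, 1.0])
        target = -surface_normal / np.linalg.norm(surface_normal)
        axis = np.cross(z_cam, target)
        angle = np.arccos(np.clip(np.dot(z_cam, target), -1.0, 1.0))
        return axis, angle  # axis-angle command for the robot arm controller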
Regarding claim 34, Asada teaches:
34. The method of claim 30, wherein the distance sensor array travels a topography path to detect the topography,
Asada, 0098-0101 and Fig. 2
Furthermore, Sadri teaches: and wherein the topography path is based on a retrieved 3D model of the curved surface.
Sadri, 0031, 0035
Regarding claim 36, Asada teaches:
36. The method of claim 25, furthermore, Sadri teaches: wherein the imaging system comprises a 3D camera.
Sadri, 0035
Regarding claim 37, Asada teaches:
37. The method of claim 25, furthermore, Sadri teaches: wherein the indication of surface quality comprises an orange peel characterization, a defect residual indication, a scratch indication or a haze indication.
Sadri, 0033
Regarding claim 38, Asada teaches:
38. The method of claim 25, furthermore, Sadri teaches: wherein the curved surface comprises a repaired area, and wherein the indication of surface quality comprises an indication of repair quality for the repaired area.
Sadri, 0011, 0031, 0035
Prior Art
The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure.
Allen et al. US 2019/0096057
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANIEL T TEKLE whose telephone number is (571)270-1117. The examiner can normally be reached Monday-Friday 8:00-4:30 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William Vaughn can be reached at 571-272-3922. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DANIEL T TEKLE/Primary Examiner, Art Unit 2481