DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The amendment filed on 12/01/2025 has been entered.
Response to Arguments
Applicant's arguments filed 12/01/2025 have been fully considered but they are not persuasive.
Applicant argues that “Fujii does not teach the claimed … specific width/length geometry of the high-intensity zone” (Page 11 of applicant’s remarks filed 12/01/2025) and that “Nakanishi does not compute a center of gravity of the high-intensity region, nor does it define any feature with respect to such a centroid. It also does not decompose the bright region into a main portion and one or more elongated portions extending from the main portion, and does not define a length as the maximal extension in the direction of relative advancement from a center of gravity of the high-intensity zone to the farthest elongated portion” (Page 12 of applicant’s remarks filed 12/01/2025). However, as currently written, the claims merely provide a clarifying definition of the term “a length of each high-intensity zone”. While said limitations effectively limit said term to the provided definition, the term “length of each high-intensity zone” does not appear anywhere else in the claim. As such, the limitation defining the length of each high-intensity zone is not effectively limiting. The prior art teaches measuring length using a different definition, but the applicant’s claims do not specifically require, for example, that the characteristic parameter be defined by said length of each high-intensity zone. The applicant’s claims currently only recite that the characteristic parameter is defined by “a width of the high-intensity zone” and additionally “another geometric feature of the high-intensity zone or an intensity of the high-intensity zone”. The applicant’s definition of “a length of each high-intensity zone” can reasonably be considered a geometric feature of the high-intensity zone, but the prior art’s defined length can also reasonably be considered a geometric feature of the high-intensity zone.
The Office recommends that the applicant amend the claims to positively recite the term that the current claims define if the applicant intends for the limitation of “a length of each high-intensity zone” to be limiting.
Applicant’s other arguments with respect to claim(s) 1 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. A new rejection has been made over FUJII (US 20210162544 A1) in view of NAKANISHI (CN 107803585 A) and Stork (US 20110278277 A1).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1-2, 5-8, 12-13, 15-21, and 24-27 is/are rejected under 35 U.S.C. 103 as being unpatentable over FUJII (US 20210162544 A1) in view of NAKANISHI (CN 107803585 A) and Stork (US 20110278277 A1).
Regarding claim 1, FUJII (US 20210162544 A1) teaches a laser treatment method of a metallic work piece (Paragraph 9, laser processing of a workpiece) comprising at least the steps of:
a) directing a laser beam onto the work piece at a working zone of the working piece in order to execute a cutting and/or piercing (Paragraph 21, laser oscillator 2 emits a laser beam so as to irradiate the workpiece 11; Paragraph 100, cutting process where the workpiece is cut by the processing);
b) executing a relative movement between the laser beam and the work piece at a determined velocity (Paragraph 22, control device 6 controls the laser oscillator 2 and the drive unit 5 such that the laser beam scans a processing path on the workpiece 11 according to set processing conditions; Paragraph 32, processing speed is a set processing condition);
c) acquiring a plurality of acquired images of the working zone (Paragraph 23, detection unit 7 is a sensor for observing a state of the workpiece 11 being processed and outputs a result of the observation as a time series signal; Paragraph 149, detection unit 7 includes a camera for observing the surface condition of the workpiece; Paragraph 24, information is gathered from the processing path of the laser beam on the workpiece);
d) determining, during which an analysis group determines respective time courses of a plurality of characteristic parameters from the acquired images (Paragraph 23, detection unit 7 is a sensor for observing a state of the workpiece 11 being processed and outputs a result of the observation as a time series signal);
e) calculating, during which an analysis group calculates at least one statistical parameter from the respective time course of the characteristic parameters (Paragraph 39, processed state observation unit 8 obtains evaluation information by evaluating each section on the basis of the time series signal acquired from the detection unit 7 during processing of each section; Paragraphs 42-43, the processed state observation unit 8 obtains feature values representing the characteristics of the processing including an average and standard deviation of measured values);
f) establishing, during which an analysis group establishes a quality value from the statistical parameters (Paragraph 43, a reference value is set for each of the standard deviation and a degree of defectiveness occurring in each section is evaluated according to a difference from the reference value); and
g) controlling one or more process parameters (Paragraph 52, control device 6 changes the operation of the laser processing apparatus 1 to eliminate the estimated cause of defection), the one or more process parameters comprising at least one of: an intensity of the laser beam and/or a laser frequency of the laser beam (Paragraph 32, laser operating conditions include output intensity and output frequency) and/or a position of the focus of the laser beam (Paragraph 32, position of the focal point of the laser beam) and/or the determined velocity (Paragraph 32, processing speed) and/or a gas jet and/or a gas pressure of the gas jet, in function of the quality value,
wherein as part of element e), determining respective probabilistic distributions from the respective time courses of the characteristic parameters over a defined time that is constant (Paragraph 23, detection unit 7 observes a state of the laser processing apparatus 1 during processing and outputs a time series signal; Paragraph 24, observation unit obtains evaluation information in each of a plurality of sections on the basis of the time series signal acquired from the detection unit; Paragraph 32, processing speed is predetermined; Paragraph 49, sections may be divided at regular intervals of processing or at every certain distance along the processing path; since the time series signal is output only during the processing, the amount of processing is predetermined by sections, and the processing speed is also predetermined, the time series is determined for a determined time), and determining the respective statistical parameters from the respective probabilistic distributions (Paragraph 39, processed state observation unit 8 obtains evaluation information by evaluating each section on the basis of the time series signal acquired from the detection unit 7 during processing of each section; Paragraphs 42-43, the processed state observation unit 8 obtains feature values representing the characteristics of the processing including an average and standard deviation of measured values), wherein, for determining each quality value (for the at least one quality value), the analysis group analyzes a plurality of acquired images acquired in succession during the defined time (Paragraph 23, detection unit 7 is a sensor for observing a state of the workpiece 11 being processed and outputs a result of the observation as a time series signal; Paragraph 149, detection unit 7 includes a camera for observing the surface condition of the workpiece; Paragraph 24, information is gathered from the processing path of the laser beam on the workpiece).
FUJII fails to teach:
successive pluralities of acquired images cover the defined time and partially overlap by at least one acquired image,
wherein during step d), a transformation sub-step is performed, during which each acquired image is transformed into a respective transformed image; the transformation step being a thresholding sub-phase during which each acquired image is segmented in order to obtain a transformed image corresponding to a binary image;
wherein during step d), each acquired transformed image is analyzed to determine the one or more characteristic parameters;
wherein each acquired transformed image comprises a high-intensity zone,
wherein each high-intensity zone is defined by the zones of the respective acquired image, which have intensities that are greater than or equal to a determined intensity threshold;
wherein each characteristic parameter is obtained from the respective high-intensity zones of the respective acquired images and at least one characteristic parameter is, or is a function of, a width of the high-intensity zone and at least one other characteristic parameter is defined by, or as a function of, another geometrical feature of the high-intensity zone or an intensity of the high-intensity zone; and
wherein the respective width of each high-intensity zone is defined by a maximal extension of the respective high-intensity zone in a direction perpendicular to a direction of relative advancement; and
wherein each high-intensity zone comprises a main portion and one or more elongated portions extending from the main portion, and
wherein a length of each high-intensity zone is defined as a maximal extension of the high-intensity zone in the direction of relative advancement from a center of gravity of the high-intensity zone, the length corresponding to a maximal extension of an elongated portion of the high-intensity zone that is farthest from the center of gravity.
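For illustration only (not part of any cited reference), the claimed centroid-based length definition could be sketched as follows; the axis convention, helper name, and use of a pixel centroid are the examiner's assumptions:

```python
import numpy as np

def zone_length(binary: np.ndarray) -> float:
    """Illustrative sketch: maximal extension of a binary high-intensity zone
    along the (assumed) direction of relative advancement, here taken as the
    column axis, measured from the zone's center of gravity."""
    ys, xs = np.nonzero(binary)          # pixel coordinates of the zone
    if xs.size == 0:
        return 0.0
    cx = xs.mean()                       # center of gravity along the advancement axis
    return float(np.abs(xs - cx).max())  # extension of the farthest pixel from the centroid
```

Under this sketch, an elongated portion trailing away from the main portion increases the length because it moves pixels farther from the centroid along the advancement axis.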
NAKANISHI (CN 107803585 A) teaches a laser processing machine, wherein:
wherein during step d), a transformation sub-step is performed, during which each acquired image is transformed into a respective transformed image; the transformation step being a thresholding sub-phase during which each acquired image is segmented in order to obtain a transformed image corresponding to a binary image (Paragraphs 44-47, image processing unit 6 performs image processing using information on a position on the workpiece irradiated with the processing laser head; Paragraph 48-49, binarization unit 32 binarizes the image and supplies data of the image to the region identification unit);
wherein during step d), each acquired transformed image is analyzed to determine the one or more characteristic parameters (Paragraphs 50-51, feature amount extraction unit 34 extracts a size X1 length and a size Y1 width of the area from the image);
wherein each acquired transformed image comprises a high-intensity zone (Paragraph 48, image comprises an area with pixels with values greater than a threshold value),
wherein each high-intensity zone is defined by the zones of the respective acquired image, which have intensities that are greater than or equal to a determined intensity threshold (Paragraph 48, image comprises an area with pixels with values greater than a threshold value);
wherein each characteristic parameter is obtained from the respective high-intensity zones of the respective acquired images and at least one characteristic parameter is, or is a function of, a width of the high-intensity zone (Paragraphs 50-51, feature amount extraction unit 34 extracts a size Y1 width of the area from the image) and at least one other characteristic parameter is defined by, or as a function of, another geometrical feature of the high-intensity zone (Paragraphs 50-51, feature amount extraction unit 34 extracts a size X1 length) or an intensity of the high-intensity zone (Paragraph 48, image comprises an area with pixels with values greater than a threshold value; Paragraphs 50-51, feature amount extraction unit 34 extracts a size X1 length and a size Y1 width of the area from the image); and
wherein the respective width of each high-intensity zone is defined by a maximal extension of the respective high-intensity zone in a direction perpendicular to a direction of relative advancement (Paragraph 50, width direction is a direction perpendicular to the relative movement direction; Paragraph 46, relative movement direction is the direction parallel to the cut groove produced by laser processing);
wherein each high-intensity zone comprises a main portion and one or more elongated portions extending from the main portion (Figures 4a-5b, each high-intensity zone AR1-AR4 has a variety of sizes and thus the feature extraction of the high-intensity zone could reasonably consist of a high-intensity zone comprising a main portion and one or more elongated portions; AR2, for example, could reasonably be considered as containing a main portion and an elongated portion extending from the main portion), and
wherein a length of each high-intensity zone is defined as a maximal extension of the high-intensity zone in the direction of relative advancement from a center of gravity of the high-intensity zone, the length corresponding to a maximal extension of an elongated portion of the high-intensity zone that is farthest from the center of gravity (Figures 4-5, the limitation as provided above merely provides a definition for the term “a length of each high-intensity zone”, which is not otherwise present in the claim; said limitation provides clarification on the definition of said term but, given that such term is not otherwise present in the claim, said term and limitation are not limiting).
It would thus have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified FUJII with NAKANISHI such that the image is binarized and analyzed to determine characteristic parameters. This would have been done so that the machining state can be grasped with high precision (NAKANISHI Paragraph 15).
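As an illustration only of the binarization and width-extraction concepts discussed above (the threshold value, axis convention, and function names are the examiner's assumptions, not NAKANISHI's disclosure):

```python
import numpy as np

def binarize(image: np.ndarray, threshold: int) -> np.ndarray:
    """Thresholding sub-phase sketch: pixels at or above the assumed intensity
    threshold form the high-intensity zone of the resulting binary image."""
    return (image >= threshold).astype(np.uint8)

def zone_width(binary: np.ndarray) -> int:
    """Maximal extension perpendicular to the (assumed) direction of relative
    advancement; the advancement axis is taken here as columns, so the width
    spans rows occupied by the zone."""
    rows = np.flatnonzero(binary.any(axis=1))
    return int(rows.max() - rows.min() + 1) if rows.size else 0
```

Feature extraction of the kind cited above (a size X1 length and a size Y1 width) would then operate on such a binary image.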
The Office further notes that the determination of defects from parameters determined from images of the laser processing region is known in the art as evidenced by Paragraph 62 and Figure 2 of Serruys (US 20090050612 A1).
FUJII modified with NAKANISHI fails to teach:
successive pluralities of acquired images cover the defined time and partially overlap by at least one acquired image,
Stork (US 20110278277 A1) teaches a method and device for monitoring a laser processing operation, wherein:
for determining each quality value, the analysis group analyzes a plurality of acquired images acquired in succession during the defined time (Paragraph 83, two or more images are taken with a plurality of cameras are subsequently processed with each other to form at least one image), and successive pluralities of acquired images cover the defined time and partially overlap by at least one acquired image (Paragraphs 83-85, high dynamic range method wherein multiple images are captured of the same location wherein multiple images which overlap over the same location are averaged by weighting; Paragraph 102, said images can be used to perform closed-loop control of a laser processing operation),
It would thus have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified FUJII with Stork such that at least some of the images overlap and images which overlap over the same location are averaged. This would have been done to properly display intensity values of said areas, which are distributed over a wide range, in a single image (Stork Paragraph 83), which is useful for closed-loop control of a laser processing operation (Stork Paragraph 102).
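The overlapping-window concept as applied above can be sketched as follows (illustrative only; the window and step sizes, and the function name, are assumptions):

```python
def sliding_windows(frames, window, step):
    """Group successive acquired images into pluralities covering a defined
    time; with step < window, consecutive pluralities partially overlap by
    at least one acquired image."""
    return [frames[i:i + window] for i in range(0, len(frames) - window + 1, step)]

# With window=3 and step=2, consecutive groups share exactly one frame.
groups = sliding_windows(list(range(6)), window=3, step=2)
```

Each group would then be analyzed to yield one quality value for its defined time.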
Regarding claim 2, FUJII as modified teaches the laser treatment method according to claim 1, further comprising
at least one h) repeat step during which at least the steps c) - f) are repeated one or more times (Paragraph 38, processing path is divided into four sections; Paragraph 39, processed state observation unit obtains the evaluation information by evaluating each section; Paragraph 36, detection unit gathers information during a specific section).
Regarding claim 5, FUJII as modified teaches the laser treatment method according to claim 1, wherein
as part of element e) the statistical parameter is standard deviation (Paragraph 43, feature values include standard deviation of the measured values acquired from the time series signal).
While FUJII does not explicitly teach that the statistical parameter is variance, one of ordinary skill in the art would recognize that variance is merely the square of the standard deviation and would have found it obvious to use variance as the statistical parameter instead of standard deviation, as the use of one over the other is merely a matter of obvious engineering choice.
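The observation above that variance is simply the square of the standard deviation can be verified with a short numeric check (the sample values are arbitrary and illustrative only):

```python
import statistics

samples = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
sd = statistics.pstdev(samples)      # population standard deviation
var = statistics.pvariance(samples)  # population variance
# variance equals the square of the standard deviation
assert abs(var - sd ** 2) < 1e-12    # here sd == 2.0 and var == 4.0
```

A threshold on one quantity is therefore interchangeable with a threshold on the square (or square root) of the other.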
Regarding claim 6, FUJII as modified teaches the laser treatment method according to claim 1, wherein
the statistical parameter and the quality value characterize a presence and/or a formation and/or a quantity of dross (Paragraph 48, processed state observation unit uses the feature values to obtain the evaluation information indicating classification information which includes dross).
The Office further notes that monitoring of the volume of dross in laser processing apparatuses is well known in the art as evidenced by Mochizuki (US 11167377 B2).
Regarding claim 7, FUJII as modified teaches the laser treatment method according to claim 1, further comprising
the step of defining a desired quality value (Paragraph 43, reference value is set for the standard deviation);
wherein during the step of controlling, the process parameter or the process parameters are controlled as a function of the quality value and the desired quality value (Paragraph 43, a degree of defectiveness is defined as a value obtained based on the difference of each section from the reference value; Paragraph 52, control device 6 changes the operation of the laser processing apparatus 1 to eliminate the estimated cause of defection).
Regarding claim 8, FUJII as modified teaches the laser treatment method according to claim 7, wherein
during the step g), the process parameter or the process parameters are controlled in a manner to obtain a quality value equal to the desired quality value (Paragraph 52, control device 6 changes the operation of the laser processing apparatus 1 to eliminate the estimated cause of defection).
The Office further notes having a control device which corrects process parameters in a manner such as to achieve a quality value equal to a desired quality value is well known in the art as evidenced by Paragraph 48 of CALEFATI (US 20110192825 A1).
Regarding claim 12, FUJII as modified teaches the laser treatment method according to claim 1, wherein
during the step f), the quality value is obtained from the statistical parameter via a linear function or a non-linear function (Paragraph 45, processed state observation unit 8 can obtain the evaluation information from the processed state by using linear discrimination, logistic regression, etc.; Paragraph 46, evaluation information can also be evaluated using a neural network or the like).
Further, since all functions are by definition either linear or non-linear, any function would satisfy the limitation of “a linear function or a non-linear function”.
Regarding claim 13, FUJII as modified teaches the laser treatment method according to claim 1.
FUJII fails to explicitly teach:
wherein during the step c), the acquired images are acquired at a rate of at least 1000 frames per second.
However, it would have been obvious to one of ordinary skill in the art to use a high frame rate, as a higher frame rate would allow for more detailed time-series image data. A higher frame rate resulting in better accuracy of data is known in the art as evidenced by TRAN (US 20220143704 A1). Furthermore, processing units with up to 1 million frames per second are known in the art as evidenced by Li (US 20170355635 A1), and determining the optimal frame rate for image acquisition would be the result of routine optimization.
Regarding claim 15, FUJII as modified teaches the laser treatment method according to claim 13.
FUJII fails to explicitly teach:
during the step c), the acquired images are acquired at a rate of at least 1500 frames per second.
However, it would have been obvious to one of ordinary skill in the art to use a high frame rate, as a higher frame rate would allow for more detailed time-series image data. A higher frame rate resulting in better accuracy of data is known in the art as evidenced by TRAN (US 20220143704 A1). Furthermore, processing units with up to 1 million frames per second are known in the art as evidenced by Li (US 20170355635 A1), and determining the optimal frame rate for image acquisition would be the result of routine optimization.
Regarding claim 16, FUJII as modified teaches the laser treatment method according to claim 1, further comprising
at least one h) repeat step during which at least the steps a) - f) are repeated one or more times (Paragraph 39, during the processing of each section the state observation unit 8 obtains the evaluation information by evaluating each section on the basis of the time series signal acquired from the detection unit 7 during processing of each section; Paragraph 38, processing path is divided into four sections).
Regarding claim 17, FUJII teaches a laser treatment machine configured to cut and/or pierce a work piece (Paragraph 9, laser processing of a workpiece), the laser treatment machine comprising:
a control unit (control device 6) configured to control an actuation of the laser treatment machine (Paragraph 31, control device 6 controls the laser oscillator and the drive unit 5 such that the laser beam scans the processing path on the workpiece);
an emission source of a laser beam operatively connected to the control unit and configured to emit the laser beam (Paragraph 21, laser oscillator 2 oscillates and emits a laser beam);
an optical group configured to control the laser beam (Paragraph 28, optical path 3 is a path for transmitting the laser beam output from the laser oscillator 2 to the processing head 4); and
a movement device operatively connected to the control unit (Paragraph 31, control device 6 controls the drive unit 5) and configured to execute a relative movement between the laser beam and the work piece at a determined velocity (Paragraph 30, drive unit 5 controls the relative position between the processing head 4 and the workpiece 11), the laser treatment machine being configured at least to:
a) direct a laser beam onto the work piece at a working zone of the working piece in order to execute a cutting and/or piercing (Paragraph 21, laser oscillator 2 emits a laser beam so as to irradiate the workpiece 11; Paragraph 100, cutting process where the workpiece is cut by the processing);
b) execute a relative movement between the laser beam and the work piece at a determined velocity (Paragraph 22, control device 6 controls the laser oscillator 2 and the drive unit 5 such that the laser beam scans a processing path on the workpiece 11 according to set processing conditions; Paragraph 32, processing speed is a set processing condition);
c) acquire a plurality of acquired images of the working zone (Paragraph 23, detection unit 7 is a sensor for observing a state of the workpiece 11 being processed and outputs a result of the observation as a time series signal; Paragraph 149, detection unit 7 includes a camera for observing the surface condition of the workpiece; Paragraph 24, information is gathered from the processing path of the laser beam on the workpiece);
d) determine, by an analysis group, a time course of at least one characteristic parameter from the acquired images (Paragraph 23, detection unit 7 is a sensor for observing a state of the workpiece 11 being processed and outputs a result of the observation as a time series signal);
e) calculate, by an analysis group, at least one statistical parameter from the time course of the characteristic parameter (Paragraph 39, processed state observation unit 8 obtains evaluation information by evaluating each section on the basis of the time series signal acquired from the detection unit 7 during processing of each section; Paragraphs 42-43, the processed state observation unit 8 obtains feature values representing the characteristics of the processing including an average and standard deviation of measured values);
f) establish, by an analysis group, a quality value from the statistical parameter (Paragraph 43, a reference value is set for each of the standard deviation and a degree of defectiveness occurring in each section is evaluated according to a difference from the reference value); and
g) control one or more process parameters (Paragraph 52, control device 6 changes the operation of the laser processing apparatus 1 to eliminate the estimated cause of defection), the one or more process parameters comprising at least one of: an intensity of the laser beam and/or a laser frequency of the laser beam (Paragraph 32, laser operating conditions include output intensity and output frequency) and/or a position of the focus of the laser beam (Paragraph 32, position of the focal point of the laser beam) and/or the determined velocity (Paragraph 32, processing speed) and/or a gas jet and/or a gas pressure of the gas jet, in function of the quality value,
wherein as part of element e), the analysis group determines respective probabilistic distributions from the respective time courses of the characteristic parameters over a defined time that is constant (Paragraph 23, detection unit 7 observes a state of the laser processing apparatus 1 during processing and outputs a time series signal; Paragraph 24, observation unit obtains evaluation information in each of a plurality of sections on the basis of the time series signal acquired from the detection unit; Paragraph 32, processing speed is predetermined; Paragraph 49, sections may be divided at regular intervals of processing or at every certain distance along the processing path; since the time series signal is output only during the processing, the amount of processing is predetermined by sections, and the processing speed is also predetermined, the time series is determined for a determined time), and determines the respective statistical parameters from the respective probabilistic distributions (Paragraph 39, processed state observation unit 8 obtains evaluation information by evaluating each section on the basis of the time series signal acquired from the detection unit 7 during processing of each section; Paragraphs 42-43, the processed state observation unit 8 obtains feature values representing the characteristics of the processing including an average and standard deviation of measured values), wherein for determining each quality value, the analysis group analyzes a plurality of acquired images acquired in succession during the defined time (Paragraph 23, detection unit 7 is a sensor for observing a state of the workpiece 11 being processed and outputs a result of the observation as a time series signal; Paragraph 149, detection unit 7 includes a camera for observing the surface condition of the workpiece; Paragraph 24, information is gathered from the processing path of the laser beam on the workpiece),
FUJII fails to teach:
successive pluralities of acquired images cover the defined time and partially overlap by at least one acquired image
wherein during step d), a transformation sub-step is performed, during which each acquired image is transformed into a respective transformed image; the transformation step being a thresholding sub-phase during which each acquired image is segmented in order to obtain a transformed image corresponding to a binary image;
wherein during step d), each acquired transformed image is analyzed to determine the one or more characteristic parameters;
wherein each acquired transformed image comprises a high-intensity zone,
wherein each high-intensity zone is defined by the zones of the respective acquired image, which have intensities that are greater than or equal to a determined intensity threshold;
wherein each characteristic parameter is obtained from the respective high intensity zones of the respective acquired images and at least one characteristic parameter is, or is a function of, a width of the high-intensity zone and at least one other characteristic parameter is defined by, or as a function of, another geometrical feature of the high-intensity zone or an intensity of the high-intensity zone;
wherein the respective width of each high-intensity zone is defined by a maximal extension of the respective high-intensity zone in a direction perpendicular to a direction of relative advancement;
wherein each high-intensity zone comprises a main portion and one or more elongated portions extending from the main portion; and
wherein a length of each high-intensity zone is defined as a maximal extension of the high-intensity zone in the direction of relative advancement from a center of gravity of the high-intensity zone, the length corresponding to a maximal extension of an elongated portion of the high-intensity zone that is farthest from the center of gravity.
NAKANISHI (CN 107803585 A) teaches a laser processing machine, wherein:
wherein during step d), a transformation sub-step is performed, during which each acquired image is transformed into a respective transformed image; the transformation step being a thresholding sub-phase during which each acquired image is segmented in order to obtain a transformed image corresponding to a binary image (Paragraphs 44-47, image processing unit 6 performs image processing using information on a position on the workpiece irradiated with the processing laser head; Paragraph 48-49, binarization unit 32 binarizes the image and supplies data of the image to the region identification unit);
wherein during step d), each acquired transformed image is analyzed to determine the one or more characteristic parameters (Paragraphs 50-51, feature amount extraction unit 34 extracts a size X1 length and a size Y1 width of the area from the image);
wherein each acquired transformed image comprises a high-intensity zone (Paragraph 48, image comprises an area with pixels with values greater than a threshold value),
wherein each high-intensity zone is defined by the zones of the respective acquired image, which have intensities that are greater than or equal to a determined intensity threshold (Paragraph 48, image comprises an area with pixels with values greater than a threshold value);
wherein each characteristic parameter is obtained from the respective high intensity zones of the respective acquired images and at least one characteristic parameter is, or is a function of, a width of the high-intensity zone (Paragraphs 50-51, feature amount extraction unit 34 extracts a size Y1 width of the area from the image) and at least one other characteristic parameter is defined by, or as a function of, another geometrical feature of the high-intensity zone (Paragraphs 50-51, feature amount extraction unit 34 extracts a size X1 length) or an intensity of the high-intensity zone (Paragraph 48, image comprises an area with pixels with values greater than a threshold value; Paragraphs 50-51, feature amount extraction unit 34 extracts a size X1 length and a size Y1 width of the area from the image);
wherein the respective width of each high-intensity zone is defined by a maximal extension of the respective high-intensity zone in a direction perpendicular to a direction of relative advancement (Paragraph 50, width direction is a direction perpendicular to the relative movement direction; Paragraph 46, relative movement direction is the direction parallel to the cut groove produced by laser processing); and wherein
each high-intensity zone comprises a main portion and one or more elongated portions extending from the main portion (Figures 4a-5b, each high-intensity zone AR1-AR4 has a variety of sizes, and thus a high-intensity zone could reasonably be considered as comprising a main portion and one or more elongated portions; AR2, for example, could reasonably be considered as containing a main portion and an elongated portion extending from the main portion)2; and
wherein a length of each high-intensity zone is defined as a maximal extension of the high-intensity zone in the direction of relative advancement from a center of gravity of the high-intensity zone, the length corresponding to a maximal extension of an elongated portion of the high-intensity zone that is farthest from the center of gravity (Figures 4-5, the limitation above merely provides a definition for the term “a length of each high-intensity zone,” which is not otherwise present in the claim; said limitation clarifies the definition of said term but, given that the term appears nowhere else in the claim, it is not limiting).
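For illustration only, the thresholding and geometric measurements recited in the limitations above can be sketched as follows. This is a minimal sketch, assuming the direction of relative advancement runs along the image columns; the function name, array layout, and NumPy implementation are the Office's illustrative assumptions and are not drawn from FUJII, NAKANISHI, or the claims beyond the quoted definitions:

```python
import numpy as np

def zone_features(image, threshold):
    """Binarize an image and compute the width, center of gravity, and
    centroid-based length of the high-intensity zone, per the claimed
    definitions. Assumes advancement runs along the columns (axis 1)."""
    binary = image >= threshold              # thresholding sub-step
    ys, xs = np.nonzero(binary)              # pixels of the high-intensity zone
    if xs.size == 0:
        return None
    # width: maximal extension perpendicular to the advancement direction
    width = int(ys.max() - ys.min() + 1)
    # center of gravity of the high-intensity zone
    cy, cx = float(ys.mean()), float(xs.mean())
    # length: maximal extension in the advancement direction measured from
    # the center of gravity (reaching the farthest elongated portion)
    length = float(np.abs(xs - cx).max())
    return {"width": width, "centroid": (cy, cx), "length": length}
```

For example, a rectangular main portion with a single elongated tail extending in the advancement direction yields a length measured from the zone's centroid to the tip of that tail.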
It would have thus been obvious to someone of ordinary skill in the art before the filing date of the claimed invention to have modified FUJII with NAKANISHI and have the image be binarized and analyzed to determine characteristic parameters. This would have been done such that the machining state can be grasped with high precision (NAKANISHI Paragraph 15).
The Office further notes that the determination of defects from parameters determined from images of the laser processing region is known in the art as evidenced by Paragraph 62 and Figure 2 of Serruys (US 20090050612 A1).
FUJII fails to teach:
successive pluralities of acquired images cover the defined time and partially overlap by at least one acquired image
Stork (US 20110278277 A1) teaches a method and device for monitoring a laser processing operation, wherein:
for determining each quality value, the analysis group analyzes a plurality of acquired images acquired in succession during the defined time (Paragraph 83, two or more images taken with a plurality of cameras are subsequently processed with each other to form at least one image), and successive pluralities of acquired images cover the defined time and partially overlap by at least one acquired image (Paragraphs 83-85, high dynamic range method wherein multiple images are captured of the same location, and images which overlap over the same location are averaged by weighting; Paragraph 102, said images can be used to perform closed-loop control of a laser processing operation),
It would have thus been obvious to someone of ordinary skill in the art before the filing date of the claimed invention to have modified FUJII with Stork and have at least some of the images overlap such that images which overlap over the same location are averaged. This would have been done to properly display intensity values of said areas which are distributed over a wide range in a single image (Stork Paragraph 83), which is useful for closed-loop control of a laser processing operation (Stork Paragraph 102).
Regarding claim 18, FUJII as modified teaches the laser treatment machine according to claim 17, wherein
the laser treatment machine is further configured such that at least the elements c) - f) are repeated one or more times (Paragraph 38, processing path is divided into four sections; Paragraph 39, processed state observation unit obtains the evaluation information by evaluating each section during processing of each section; Paragraph 36, detection unit gathers information during a specific section).
Regarding claim 19, FUJII as modified teaches the laser treatment machine according to claim 17, wherein
the laser treatment machine is further configured such that at least the elements a) - f) are repeated one or more times (Paragraph 39, during the processing of each section the state observation unit 8 obtains the evaluation information by evaluating each section on the basis of the time series signal acquired from the detection unit 7 during processing of each section; Paragraph 38, processing path is divided into four sections).
Regarding claim 20, FUJII as modified teaches the laser treatment machine according to claim 17, wherein
the laser treatment machine is further configured such that in element e) the statistical parameter is standard deviation (Paragraph 43, feature values include standard deviation of the measured values acquired from the time series signal).
While FUJII does not explicitly teach that the statistical parameter is variance, one of ordinary skill in the art would recognize that variance is merely the square of the standard deviation, and would have found it obvious to use variance as the statistical parameter instead of standard deviation, as the use of one over the other is merely a matter of obvious engineering choice.
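The mathematical equivalence relied on above can be verified numerically; this is a minimal illustrative check (the sample values are arbitrary and assumed for illustration):

```python
import numpy as np

# Variance is exactly the square of the standard deviation, so either
# quantity serves equally as the statistical parameter.
values = np.array([1.0, 2.0, 4.0, 7.0])
std = values.std()
var = values.var()
assert np.isclose(var, std ** 2)
```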
Regarding claim 21, FUJII as modified teaches the laser treatment method according to claim 1.
NAKANISHI further teaches:
the other geometrical feature is at least one of: a length, or a width (Paragraphs 50-51, feature amount extraction unit 34 extracts a size Y1 width and a size Y2 width).
It would have been obvious for the same motivation as claim 1.
Regarding claim 24, FUJII as modified teaches the laser treatment machine according to claim 17.
NAKANISHI further teaches:
the intensity is the at least one other characteristic parameter (Paragraph 48, image comprises an area with pixels with values greater than a threshold value; Paragraphs 50-51, feature amount extraction unit 34 extracts a size X1 length and a size Y1 width of the area from the image; whether the intensity is above or below the threshold is determined).
It would have been obvious for the same motivation as claim 17.
Regarding claim 25, FUJII as modified teaches the laser treatment method according to claim 1, wherein:
as part of element c), the acquired images are acquired by a video camera (Paragraph 149, a camera for observing the surface condition of the workpiece),
NAKANISHI further teaches:
the camera being disposed in a coaxial configuration with an optical axis along which the laser beam propagates (Paragraph 32, imaging optical system 21 can observe the workpiece W coaxially with the irradiation optical system 15).
It would have been obvious for the same motivation as claim 1.
The Office further notes that use of a video camera coaxially with a laser beam is well known in the art as evidenced by KIM (KR 20050108702 A).
Regarding claim 26, FUJII as modified teaches the laser treatment method of claim 1.
NAKANISHI further teaches:
the direction of relative advancement being a movement of the laser beam relative to the work piece (Paragraph 50, width direction is a direction perpendicular to the relative movement direction; Paragraph 46, relative movement direction is the direction parallel to the cut groove produced by laser processing).
It would have been obvious for the same motivation as claim 1.
Regarding claim 27, FUJII as modified teaches the laser treatment machine of claim 17.
NAKANISHI further teaches:
the direction of relative advancement being a movement of the laser beam relative to the work piece (Paragraph 50, width direction is a direction perpendicular to the relative movement direction; Paragraph 46, relative movement direction is the direction parallel to the cut groove produced by laser processing).
It would have been obvious for the same motivation as claim 17.
Claim(s) 9 is/are rejected under 35 U.S.C. 103 as being unpatentable over FUJII (US 20210162544 A1) in view of NAKANISHI (CN 107803585 A) and Stork (US 20110278277 A1) as applied to claim 1 above, and further in view of Hesse (US 20180326534 A1).
Regarding claim 9, FUJII as modified teaches the laser treatment method according to claim 1.
FUJII as modified fails to teach:
each acquired image is a heat image
Hesse (US 20180326534 A1) teaches a method for monitoring a laser cutting process, wherein:
each acquired image is a heat image (Paragraphs 14 and 47, a thermal image of the zone of interaction can be captured and the evaluation apparatus can determine at least one characteristic value of the laser cutting)
It would have thus been obvious to someone of ordinary skill in the art before the filing date of the claimed invention to have modified FUJII with Hesse and have the acquired image be a heat image. This would have been done as a thermal image is known in the art as an acceptable alternative to detecting other forms of light when capturing an image of a detail of the workpiece (Hesse Paragraph 14).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to FRANKLIN JEFFERSON WANG whose telephone number is (571)272-7782. The examiner can normally be reached M-F 10AM-6PM (EST).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ibrahime Abraham can be reached at (571) 270-5569. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/F.J.W./Examiner, Art Unit 3761
/IBRAHIME A ABRAHAM/Supervisory Patent Examiner, Art Unit 3761
1 The Office further notes that it is well known in the art that captured high-intensity zones from a laser cutting region of interaction can consist of a main portion and one or more elongated portions extending from the main portion as evidenced by Figures 7A and 7B of Hesse (US 20180326534 A1) based on the formation of burr being formed during the cutting process (Hesse Paragraphs 72-73).
2 The Office further notes that it is well known in the art that captured high-intensity zones from a laser cutting region of interaction can consist of a main portion and one or more elongated portions extending from the main portion as evidenced by Figures 7A and 7B of Hesse (US 20180326534 A1) based on the formation of burr being formed during the cutting process (Hesse Paragraphs 72-73).