Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The amendment presents claims 1-2, 6, and 13-14 as amended, claims 7 and 8 as cancelled, and claims 16 and 17 as added. Claims 1-6 and 9-17 are pending and under examination.
The amendment is sufficient to overcome the previously indicated objection to claim 6, the rejections of claims 7 and 13 under 35 U.S.C. 112(b), and the prior art rejections of independent claims 1 and 14 under 35 U.S.C. 103 in view of Kim and Mitani.
Further grounds of rejection, necessitated by the amendment, are presented herein.
Response to Arguments
Applicant's arguments filed 02/10/2026 have been fully considered but they are not persuasive.
Applicant contends that:
“…the neuromorphic sensor is directed to a vapor capillary or to a melt pool or to a cutting kerf. In Kim, the camera is directed to the welding bead. Mitani does not describe any laser machining process at all.” See Remarks, page 7.
Applicant further contends that “[A]lthough Fukushima discloses the use of CNNs and RNNs…Fukushima does not even use a camera.” See Remarks, page 7.
In response, the Examiner respectfully disagrees. As detailed herein, the inclusion of “vapor capillary,” “melt pool,” or “cutting kerf” does not provide a patentable distinction over the prior art.
A claim is only limited by positively recited elements. Thus, "[i]nclusion of the material or article worked upon by a structure being claimed does not impart patentability to the claims." In re Otto, 312 F.2d 937, 136 USPQ 458, 459 (CCPA 1963); see also In re Young, 75 F.2d 996, 25 USPQ 69 (CCPA 1935).
In Otto, the claims were directed to a core member for hair curlers (i.e., a particular device) and a method of making the core member (i.e., a particular method of making that device) and "not to a method of curling hair wherein th[e] particular device is used." 312 F.2d at 940. The court held that patentability of the claims cannot be based "upon a certain procedure for curling hair using th[e] device and involving a number of steps in the process." The court noted that "the process is irrelevant as is the recitation involving the hair being wound around the core" in terms of determining patentability of the particular device. Id. Therefore, the inclusion of the material or article worked upon by a structure being claimed does not impart patentability to the claims.
In Young, a claim to a machine for making concrete beams included a limitation to the concrete reinforced members made by the machine as well as the structural elements of the machine itself. The court held that the inclusion of the article formed within the body of the claim did not, without more, make the claim patentable.
In In re Casey, 370 F.2d 576, 152 USPQ 235 (CCPA 1967), an apparatus claim recited "[a] taping machine comprising a supporting structure, a brush attached to said supporting structure, said brush being formed with projecting bristles which terminate in free ends to collectively define a surface to which adhesive tape will detachably adhere, and means for providing relative motion between said brush and said supporting structure while said adhesive tape is adhered to said surface." An obviousness rejection was made over a reference to Kienzle which taught a machine for perforating sheets. The court upheld the rejection stating that "the references in claim 1 to adhesive tape handling do not expressly or impliedly require any particular structure in addition to that of Kienzle." Id. at 580-81. The perforating device had the structure of the taping device as claimed, the difference was in the use of the device, and "the manner or method in which such machine is to be utilized is not germane to the issue of patentability of the machine itself." Id. at 580.
In this case, the limitation of “a cutting kerf, a vapor capillary and/or a melt pool surrounding the vapor capillary” refers to the material or article worked upon during laser machining, which does not structurally define or limit the claimed monitoring system. In other words, the claim is directed to a system for monitoring a laser machining process on a workpiece and not to a laser machining process itself. The details of the machining area do not structurally limit the monitoring system or the claimed image sensor. As such, the image sensor of Kim, being described as a high speed camera imaging the machining area, would be structurally capable of imaging a cutting kerf, vapor capillary, and/or melt pool surrounding the vapor capillary. See also MPEP 2112 and 2114.
With respect to Applicant’s arguments against Fukushima, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). In this case, Fukushima was not relied upon to teach a camera.
For the reasons detailed herein, the Examiner maintains, based on the preponderance of evidence, that the rejections of claims 1-6 and 9-17 under 35 U.S.C. 103 are proper.
Claim Interpretation
The claims use the term “neuromorphic image sensor.” Such term is understood, in view of the instant specification (e.g., para. 0014), to include “event-based image sensor” and “event-based camera.”
Claim Objections
Claim 16 is objected to because of the following informalities: “compuring unit” and “funciton” should be “computing unit” and “function,” respectively. Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 2 and 16 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claim 2 recites that the image sensor is configured to generate image data from “an area in advance of the machining area and/or an area in the wake of the machining area.” Claim 1 has been amended to recite that the image sensor is configured to generate image data from a machining area of the workpiece. It is unclear whether the “areas” in advance and in the wake of the machining area are included in the machining area or whether claim 2 intends for the image sensor to generate additional image data from such areas. In other words, claim 1 requires the image data to be from a machining area while claim 2 states that the image data is from areas in advance or in the wake of the machining area. It is unclear, for instance, whether claim 2 is attempting to broaden the area being imaged or to provide additional areas for imaging. For purposes of examination, claim 2 will be interpreted as imaging any area of the machining area.
Claim 16 recites that the input data is determined by a further transfer function and that the further transfer function between the image data and the input data is formed by a trained neural network. However, claim 1 is amended to recite that output data is based on the input data by a transfer function and that the transfer function between the input data and the output data is formed by a trained neural network. It is unclear if the trained neural network of claim 16 is referring to the same as required in claim 1 or to a second trained neural network. If the neural network of claim 16 is the same as required in claim 1, it is unclear in what way, if any, the transfer function of claim 16 and that of claim 1 are different. For purposes of examination, claim 16 will be understood to reference the same neural network of claim 1 and for the transfer functions to refer to the different layers, between which data is passed, inherent in such a neural network.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim(s) 1, 2, 4, 6, 9-13, and 16-17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US20150001196) in view of Mitani (US20220349707; relying on effective filing date of 10/28/2019) and in further view of Fukushima (US20200406392; relying on effective filing date of 06/28/2019).
Regarding claim 1, Kim teaches a system for monitoring a laser machining process on a workpiece (para. 0003; “a method and an apparatus for monitoring a laser welding bead, and more particularly, to a technology capable of determining in real time whether or not welding defects are generated during laser welding.”), said system (Fig. 1-3) comprising:
[media_image1.png: Fig. 1 of Kim]
a neuromorphic image sensor configured to generate image data from a machining area of said workpiece (Kim, camera 114; para. 0043; a high speed camera imaging the machining area; the neuromorphic modification and the recited contents of the machining area are addressed below), and
a computing unit (classification processor 1244 which is operatively coupled to camera 114 via interface 121 and processor 1242) configured to determine input data based on the image data (para. 0052; “determines whether or not welding defects are generated using the feature variable and classifies a defect form (pattern).”), and to determine output data (Fig. 3, processor 1244 operatively coupled to controller 1246 such that controller 1246 receives output data from processor 1244. Para. 0053; controller 1246 controls the operation of the laser welding machine 10 depending on data from processor 1244) based on the input data by means of a transfer function (processor 1244 includes quality inspection classification processor 710 and precise inspection classification processor 720-Figs. 7-9) (para. 0077-0079; processors 710 and 720 utilize a neural network) [Here, the neural network algorithm is understood to refer to a transfer function utilized by the computing unit to generate the output data], said output data containing information (defects from bead shape-para. 0010) about the laser machining process,
wherein the transfer function between the input data and the output data and/or the further transfer function between the image data and the input data is formed by a trained neural network (neural network of Kim).
Kim teaches substantially the claimed invention including using an image sensor to capture an image. Kim teaches using a high speed camera for real time monitoring of a resulting weld bead (see para. 0043) (para. 0010; measuring bead shape) where the camera senses light reflected from the surface of the weld bead. Kim is silent on the sensor (i.e., camera) being a neuromorphic sensor.
Kim is additionally silent on the machining area being imaged including a cutting kerf, a vapor capillary and/or a melt pool surrounding the vapor capillary.
A claim is only limited by positively recited elements. Thus, "[i]nclusion of the material or article worked upon by a structure being claimed does not impart patentability to the claims." In re Otto, 312 F.2d 937, 136 USPQ 458, 459 (CCPA 1963); see also In re Young, 75 F.2d 996, 25 USPQ 69 (CCPA 1935).
In Otto, the claims were directed to a core member for hair curlers (i.e., a particular device) and a method of making the core member (i.e., a particular method of making that device) and "not to a method of curling hair wherein th[e] particular device is used." 312 F.2d at 940. The court held that patentability of the claims cannot be based "upon a certain procedure for curling hair using th[e] device and involving a number of steps in the process." The court noted that "the process is irrelevant as is the recitation involving the hair being wound around the core" in terms of determining patentability of the particular device. Id. Therefore, the inclusion of the material or article worked upon by a structure being claimed does not impart patentability to the claims.
In Young, a claim to a machine for making concrete beams included a limitation to the concrete reinforced members made by the machine as well as the structural elements of the machine itself. The court held that the inclusion of the article formed within the body of the claim did not, without more, make the claim patentable.
In In re Casey, 370 F.2d 576, 152 USPQ 235 (CCPA 1967), an apparatus claim recited "[a] taping machine comprising a supporting structure, a brush attached to said supporting structure, said brush being formed with projecting bristles which terminate in free ends to collectively define a surface to which adhesive tape will detachably adhere, and means for providing relative motion between said brush and said supporting structure while said adhesive tape is adhered to said surface." An obviousness rejection was made over a reference to Kienzle which taught a machine for perforating sheets. The court upheld the rejection stating that "the references in claim 1 to adhesive tape handling do not expressly or impliedly require any particular structure in addition to that of Kienzle." Id. at 580-81. The perforating device had the structure of the taping device as claimed, the difference was in the use of the device, and "the manner or method in which such machine is to be utilized is not germane to the issue of patentability of the machine itself." Id. at 580.
In this case, the limitation of “a cutting kerf, a vapor capillary and/or a melt pool surrounding the vapor capillary” refers to the material or article worked upon during laser machining, which does not structurally define or limit the claimed monitoring system. In other words, the claim is directed to a system for monitoring a laser machining process on a workpiece and not to a laser machining process itself. The details of the machining area do not structurally limit the monitoring system or the claimed image sensor. As such, the image sensor of Kim, being described as a high speed camera imaging the machining area, would be structurally capable of imaging a cutting kerf, vapor capillary, and/or melt pool surrounding the vapor capillary. See also MPEP 2112 and 2114.
In a related field, Mitani teaches a measurement device for measuring the shape of an object or information indicating the shape of an object (para. 0002), which is considered reasonably pertinent to quality control (see, for instance, paras. 0003, 0016, and 0035 of the instant application, which detail monitoring for quality control purposes). In other words, Mitani is concerned with improving the measurement of an object, which would be reasonably pertinent to the quality control of the resulting weld beads formed by a laser processing machine.
Furthermore, Mitani’s imaging unit for measuring the shape of an object is considered analogous to Kim’s imaging unit for measuring the shape of a weld bead.
Mitani’s system uses an event-based sensor (event-based camera 30; para. 0060) that receives reflected light from the surface of an object (R; Fig. 1), which is considered analogous to the arrangement of the imaging camera (114) receiving reflected light from the surface of the workpiece (Fig. 2) of Kim.
Mitani teaches that the event-based camera generates “an image from the event data outputted from the imaging unit” and, specifically, “outputs event data (specifically, two-dimensional point data, time, and polarities of luminance changes) including two-dimensional point data that identifies the position of each pixel that undergoes luminance changes” (para. 0060). Mitani also teaches that the camera captures and outputs data regarding luminance changes determined by respective pixels of the camera (para. 0038 and 0061).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Kim with Mitani by substituting the pixel-based high speed camera of Kim with the pixel-based event-based camera of Mitani, because doing so would provide an imaging unit that reduces data communication and generates images of the object at a higher speed (see Mitani, para. 0007).
Furthermore, the proposed combination would amount to a simple substitution of art-recognized cameras (high speed vs. event-based) performing the same function of imaging light reflected from the surface of the workpiece in order to determine a shape, and the results of the substitution would have been predictable (see MPEP 2144.06-II).
The combination of Kim and Mitani teaches substantially the claimed invention except for wherein the trained neural network comprises a convolutional neural network, a binary neural network, and/or a recurrent neural network.
Fukushima relates to an evaluation system for monitoring a laser welding system (Abstract) and teaches a trained neural network being a convolutional neural network, a binary neural network, and/or a recurrent neural network (para. 0081; “evaluation model 52d may be generated using various techniques known in the machine learning art. Examples of the techniques include, but are not limited to, various deep learning techniques such as CNN (Convolutional Neural Network) and RNN (Recurrent Neural Network). Another non-limiting example technique is SVM (Support Vector Machine). It is to be noted that these techniques have been provided for exemplary purposes, and it is possible to select any other learning technique suitable for the information sought to be obtained in generating the evaluation model 52d.”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Kim, as modified by Mitani, with Fukushima by replacing the type of neural network of modified Kim with the neural network of Fukushima, because doing so would provide an alternative machine learning technique known in the art.
Regarding claim 2, the primary combination, as applied in claim 1, teaches each claimed limitation, including wherein said neuromorphic image sensor (Kim as modified to include the event-based camera of Mitani) is configured to generate image data from an area in advance of the machining area and/or an area in the wake of the machining area (Kim, image data from the weld bead).
Regarding claim 4, the primary combination, as applied in claim 1, teaches each claimed limitation, including wherein said neuromorphic image sensor (Kim as modified to include the event-based camera of Mitani) comprises a plurality of pixels configured to generate image data independently of one another in response to changes in brightness sensed by the respective pixel (Mitani; Abstract; “The object is optically imaged by an imaging unit and an image based on event data is acquired. The event data, which are outputted from the image sensor, include two-dimensional point data that specifies the positions of pixels corresponding to the pixels that had luminance changes responsively to the stripe pattern projected. Based on the event data, an image of the object is obtained.” Para. 0016; “the imaging unit is provided with an image sensor, the image sensor outputting event data including data of two-dimensional points whose positions of pixels are specified corresponding to changes of luminescence when receiving the light, and is configured to generate the captured images from the event data outputted by the image sensor.” ).
Regarding claim 6, the primary combination, as applied in claim 1, teaches each claimed limitation, including wherein said computing unit (Kim, 1244) is configured to determine the input data by a further transfer function based on the image data, and/or [Note: the use of “and/or” allows, under broadest reasonable interpretation for only one limitation to be met by the prior art] wherein the image data transmitted from said neuromorphic image sensor (Kim as modified by Mitani) are the input data (as detailed above in claim 1. The image data from the sensor is the input data going into the computing unit).
Regarding claim 9, the primary combination, as applied in claim 1, teaches each claimed limitation, including wherein the information about the laser machining process includes information about a state of the laser machining process, about a machining result, about a machining error and/or about a machining area of said workpiece [Note: the use of “and/or” allows, under broadest reasonable interpretation for only one limitation to be met by the prior art] (Kim, as detailed in claim 1 above, teaches that the information includes a shape of the weld bead and determining a defect based on that information).
Regarding claim 10, the primary combination, as applied in claim 1, teaches each claimed limitation, including wherein the computing unit (Kim, 1244) is configured to output the output data as control data for a laser machining system carrying out the laser machining process (Fig. 3 of Kim shows processor 1244 operatively coupled to controller 1246 such that controller 1246 receives output data from processor 1244. Para. 0053 discloses that controller 1246 controls operation of the laser welding machine based on data from processor 1244).
Regarding claim 11, the primary combination, as applied in claim 1, teaches each claimed limitation, including a laser machining system (Kim, machining system 10) for machining a workpiece (20A/B) using a laser beam (laser welding; para. 0003), said laser machining system comprising: a laser machining head (the machining head is taken as the head shown in Figure 1 that outputs the laser onto workpiece 20A/B and generates the weld bead 30) for radiating a laser beam onto said workpiece; and the system according to claim 1 (as detailed above).
Regarding claim 12, the primary combination, as applied in claim 11, teaches each claimed limitation, including wherein [Kim] said computing unit (1244) is arranged on or in said laser machining head (10), and/or wherein said neuromorphic image sensor (Kim as modified by Mitani; Kim’s camera is shown on and outside of the machining head in Figure 1. The proposed combination seeks to use Mitani’s camera in the same location as Kim’s camera) is arranged on an outside of said laser machining head and/or on said laser machining head.
Regarding claim 13, the primary combination, as applied in claim 11, teaches each claimed limitation, including [Kim] a laser source configured to generate the laser beam (Kim teaches that system 10 is for laser welding; a laser source is inherently required); and a control unit configured to control, based on the output data determined by said computing unit, said laser machining system and/or said laser machining head and/or said laser source and/or to control the laser machining process (Fig. 3 of Kim shows processor 1244 operatively coupled to controller 1246 such that controller 1246 receives output data from processor 1244. Para. 0053 discloses that controller 1246 controls operation of the laser welding machine based on data from processor 1244).
Regarding claim 16, the primary combination, as applied in claim 1, teaches each claimed limitation, including wherein the computing unit (processor 1244 of Kim) is configured to determine the input data by a further transfer function based on the image data, and wherein the further transfer function between the image data and the input data is formed by a trained neural network (For purposes of examination, claim 16 will be understood to reference the same neural network of claim 1 and for the transfer functions to refer to the different layers, between which data is passed, inherent in such a neural network. See neural network of Kim, as detailed in claim 1).
Regarding claim 17, the primary combination, as applied in claim 1, teaches each claimed limitation, except for wherein the cutting kerf includes a cutting front and/or a puncture hole.
However, as detailed above in claim 1, the details of the machining area do not provide a patentable distinction over the prior art.
Claim(s) 3 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US20150001196) in view of Mitani (US20220349707; relying on effective filing date of 10/28/2019), Fukushima (US20200406392; relying on effective filing date of 06/28/2019), and further in view of Takigawa (US20200387131).
Regarding claim 3, the primary combination, as applied in claim 1, teaches each claimed limitation, except for wherein said neuromorphic image sensor is configured to transmit image data to said computing unit continuously and/or asynchronously.
Takigawa relates to a laser machining system in which image data from an imaging sensor [(Fig. 2; 13) (Fig. 5; 15) (Fig. 7; 17)] is used as input data to a computing unit (machine learning device 6; Fig. 8). Takigawa teaches the laser machining system including a controller (33; Fig. 8) that receives an output from the computing unit (6) and controls at least one parameter in response (para. 0138).
Takigawa also teaches the computing unit (6) acquiring imaging data continuously, as well as at a point in time when an abnormality occurs (para. 0137).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Kim, as modified by Mitani and Fukushima, with Takigawa by adding the continuous supplying of imaging data taught by Takigawa to the temporal operation of the imaging unit of modified Kim and to the functionality of the computing unit of Kim, because doing so would provide additional data points that would improve the functionality of the computing unit by predicting the occurrence of a defect (see para. 0137 of Takigawa).
Claim(s) 5 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US20150001196) in view of Mitani (US20220349707; relying on effective filing date of 10/28/2019), Fukushima (US20200406392; relying on effective filing date of 06/28/2019), and in further view of Lee (US20150030204).
Regarding claim 5, the primary combination, as applied in claim 4, teaches each claimed limitation, including wherein the image data of a pixel comprise at least a pixel address corresponding to the pixel (Mitani; output data includes two-dimensional point data that identifies the location of a pixel-para. 0038).
The combination is silent on wherein the image data of a pixel comprises a time stamp corresponding to the sensed change in brightness.
Lee relates to a system for analyzing at least one of an appearance of an object and a motion of an object (Abstract).
Lee teaches using an event-based vision sensor that asynchronously provides an output signal in response to detection of a predetermined event, which includes a change in brightness of light incident on the event-based vision sensor (para. 0030).
Lee further teaches that the “event signal may include a time stamp at which a time of a predetermined event is detected, an indicator for indicating a type of an event, and an index of a pixel in which the predetermined event is detected” and the “time stamps corresponding to the pixels of a resolution may be stored in a table in memory, thereby time signals of event times for pixels may be utilized, as discussed below” (para. 0040). Lee, therefore, teaches the image data of a pixel comprising a time stamp corresponding to the sensed change in brightness.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Kim, as modified by Mitani and Fukushima, with Lee by adding the time stamp data taught by Lee to the image data from the imaging unit of modified Kim and to the functionality of the computing unit of Kim, because doing so would allow for the classifying of pixel patterns (Lee, para. 0041), thereby further improving imaging of the workpiece (Lee teaches using the determined pattern to analyze the shape, outline, or location of the object; in this case, providing time stamp data to classify a pixel pattern would improve the monitoring system’s ability to accurately measure the shape of the weld bead and, as a result, detect a defect).
Claim(s) 14-15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US20150001196) in view of Mitani (US20220349707; relying on effective filing date of 10/28/2019), Fukushima (US20200406392; relying on effective filing date of 06/28/2019), and in further view of Lee (US20150030204).
Regarding claim 14, Kim teaches a method for monitoring a laser machining process on a workpiece (para. 0003; “a method and an apparatus for monitoring a laser welding bead, and more particularly, to a technology capable of determining in real time whether or not welding defects are generated during laser welding.”), said method (Fig. 1-3) comprising:
generating image data from a surface of said workpiece using a neuromorphic image sensor (Kim, para. 0043; a high speed camera sensing light reflected from the surface of the weld bead; the neuromorphic modification is addressed below),
determining input data based on the image data (classification processor 1244 which is operatively coupled to camera 114 via interface 121 and processor 1242) (para. 0052; “determines whether or not welding defects are generated using the feature variable and classifies a defect form (pattern).”),
determining output data based on the input data by a transfer function (Fig. 3, processor 1244 operatively coupled to controller 1246 such that controller 1246 receives output data from processor 1244. Para. 0053; controller 1246 controls the operation of the laser welding machine 10 depending on data from processor 1244) (processor 1244 includes quality inspection classification processor 710 and precise inspection classification processor 720-Figs. 7-9) (para. 0077-0079; processors 710 and 720 utilize a neural network) [Here, the neural network algorithm is understood to refer to a transfer function utilized by the computing unit to generate the output data], said output data containing information about the laser machining process (defects from bead shape-para. 0010),
wherein the transfer function between the input data and the output data and/or the further transfer function between the image data and the input data is formed by a trained neural network (neural network of Kim).
Kim teaches substantially the claimed invention including using an image sensor to capture an image. Kim teaches using a high speed camera for real time monitoring of a resulting weld bead (see para. 0043) (para. 0010; measuring bead shape) where the camera senses light reflected from the surface of the weld bead. Kim is silent on the sensor (i.e., camera) being a neuromorphic sensor.
In a related field, Mitani teaches a measurement device for measuring the shape of an object or information indicating the shape of an object (para. 0002), which is considered reasonably pertinent to quality control (see, for instance, paras. 0003, 0016, and 0035 of the instant application, which detail monitoring for quality control purposes). In other words, Mitani is concerned with improving the measurement of an object, which would be reasonably pertinent to the quality control of the resulting weld beads formed by a laser processing machine.
Furthermore, Mitani’s imaging unit for measuring the shape of an object is considered analogous to Kim’s imaging unit for measuring the shape of a weld bead.
Mitani’s system uses an event-based sensor (event-based camera 30; para. 0060) that receives reflected light from the surface of an object (R; Fig. 1), which is considered analogous to the arrangement of the imaging camera (114) receiving reflected light from the surface of the workpiece (Fig. 2) of Kim.
Mitani teaches that the event-based camera generates “an image from the event data outputted from the imaging unit” and, specifically, “outputs event data (specifically, two-dimensional point data, time, and polarities of luminance changes) including two-dimensional point data that identifies the position of each pixel that undergoes luminance changes” (para. 0060). Mitani also teaches that the camera captures and outputs data regarding luminance changes determined by respective pixels of the camera (para. 0038 and 0061).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Kim with Mitani by substituting the pixel-based high speed camera of Kim with the pixel-based event-based camera of Mitani, because doing so would provide an imaging unit that reduces data communication and generates images of the object at a higher speed (see Mitani, para. 0007).
Furthermore, the proposed combination would amount to a simple substitution of art-recognized cameras (high speed vs. event-based) performing the same function of imaging light reflected from the surface of the workpiece in order to determine a shape, and the results of the substitution would have been predictable (see MPEP 2144.06-II).
The combination of Kim and Mitani teaches substantially the claimed invention except for wherein the trained neural network comprises a convolutional neural network, a binary neural network, and/or a recurrent neural network.
Fukushima relates to an evaluation system for monitoring a laser welding system (Abstract) and teaches a trained neural network being a convolutional neural network, a binary neural network, and/or a recurrent neural network (para. 0081; “evaluation model 52d may be generated using various techniques known in the machine learning art. Examples of the techniques include, but are not limited to, various deep learning techniques such as CNN (Convolutional Neural Network) and RNN (Recurrent Neural Network). Another non-limiting example technique is SVM (Support Vector Machine). It is to be noted that these techniques have been provided for exemplary purposes, and it is possible to select any other learning technique suitable for the information sought to be obtained in generating the evaluation model 52d.”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Kim, as modified by Mitani, with Fukushima by replacing the type of neural network of modified Kim with the neural network of Fukushima, because doing so would provide an alternative machine learning technique known in the art.
The combination of Kim, Mitani, and Fukushima teaches each claimed limitation, including wherein the image data of a pixel comprise at least a pixel address corresponding to the pixel (Mitani; output data includes two-dimensional point data that identifies the location of a pixel-para. 0038).
The combination is silent on wherein the image data of a pixel comprises a time stamp corresponding to the sensed change in brightness.
Lee relates to a system for analyzing at least one of an appearance of an object and a motion of an object (Abstract).
Lee teaches using an event-based vision sensor that asynchronously provides an output signal in response to detection of a predetermined event, which includes a change in brightness of light incident on the event-based vision sensor (para. 0030).
Lee further teaches that the “event signal may include a time stamp at which a time of a predetermined event is detected, an indicator for indicating a type of an event, and an index of a pixel in which the predetermined event is detected” and the “time stamps corresponding to the pixels of a resolution may be stored in a table in memory, thereby time signals of event times for pixels may be utilized, as discussed below” (para. 0040). Lee, therefore, teaches the image data of a pixel comprising a time stamp corresponding to the sensed change in brightness.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Kim, as modified by Mitani and Fukushima, with Lee by adding the time stamp data taught by Lee to the image data from the imaging unit of modified Kim and to the functionality of the computing unit of Kim, because doing so would allow for the classifying of pixel patterns (Lee, para. 0041), thereby further improving imaging of the workpiece (Lee teaches using the determined pattern to analyze the shape, outline, or location of the object; in this case, providing time stamp data to classify a pixel pattern would improve the monitoring system’s ability to accurately measure the shape of the weld bead and, as a result, detect a defect).
Regarding claim 15, the primary combination, as applied in claim 14, teaches each claimed limitation, including controlling, in real time, at least one parameter of the laser machining process based on the determined output data (Fig. 3 of Kim shows processor 1244 operatively coupled to controller 1246 such that controller 1246 receives output data from processor 1244. Para. 0053 discloses that controller 1246 controls operation of the laser welding machine based on data from processor 1244) (para. 0009; “an apparatus and a method for monitoring a laser welding bead capable of easily managing a welding production process by performing welding bead quality monitoring in real time and immediately notifying a user of defects upon sensing the defects.”) (see also Fig. 4; steps S412-S422).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JUSTIN C DODSON whose telephone number is (571)270-0529. The examiner can normally be reached Mon.-Fri. 1:00-9:00 PM (ET).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Steven Crabb can be reached at (571)270-5095. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JUSTIN C DODSON/ Primary Examiner, Art Unit 3761