DETAILED ACTION
This is the first Office action regarding application number 18/012,990, filed on 12/27/2022, which is a national stage entry under 35 U.S.C. 371 of PCT/CN2022/131435, filed November 11, 2022, which claims priority to Chinese patent application No. 202111645358.6, filed December 29, 2021.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Acknowledgment is made of applicant's claim for foreign priority based on an application filed in the People’s Republic of China on 12/29/2021. It is noted, however, that applicant has not filed a certified copy of the English translation of the CN 202111645358.6 application.
Drawings
The drawings are objected to under 37 CFR 1.83(a). The drawings must show every feature of the invention specified in the claims. Therefore, the following claimed limitations must be shown or the feature(s) canceled from the claim(s). No new matter should be entered. The limitations are:
an acquisition module
a judgment module
an identification module
a calibration module
a mapping module
a fitting module
a comparison module
a determination module
Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Specification
The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant’s cooperation is requested in correcting any errors of which applicant may become aware in the specification.
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:
an acquisition module in claim 10, interpreted as a camera as described in paragraph [74] of the original disclosure, and equivalents thereof
a judgment module in claim 10. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “judgment module” is interpreted as any software, and equivalents thereof.
an identification module in claim 10. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “identification module” is interpreted as any software, and equivalents thereof.
a calibration module in claim 10. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “calibration module” is interpreted as any software, and equivalents thereof.
a mapping module in claim 10. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “mapping module” is interpreted as any software, and equivalents thereof.
a fitting module in claim 10. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “fitting module” is interpreted as any software, and equivalents thereof.
a comparison module in claim 10. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “comparison module” is interpreted as any software, and equivalents thereof.
a determination module in claim 10. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “determination module” is interpreted as any software, and equivalents thereof.
“acquiring a two-dimensional welding image and a three-dimensional welding image” in claim 1, interpreted as being performed through a camera as described in paragraph [74] of the original disclosure, and equivalents thereof
performing welding fume judgment in claim 1. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “performing welding fume judgment” is interpreted as being performed through any software, and equivalents thereof.
performing an identification process on the two-dimensional welding image in claim 1. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “performing an identification process” is interpreted as being performed through any software, and equivalents thereof.
performing a calibration process in claim 1. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “performing a calibration process” is interpreted as being performed through any software, and equivalents thereof.
mapping the welding seam edge image of the two-dimensional welding image to the three-dimensional welding image in claim 1. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “mapping the welding seam edge image of the two-dimensional welding image to the three-dimensional welding image” is interpreted as being performed through any software, and equivalents thereof.
performing an identification process on the three-dimensional composite image in claim 1. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “performing an identification process” is interpreted as being performed through any software, and equivalents thereof.
performing an offset comparison process in claim 1. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “performing an offset comparison process” is interpreted as being performed through any software, and equivalents thereof.
determining the welding seam edge image in claim 1. Paragraph [145] describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. Based on a lack of disclosure and for the purposes of examination, “determining the welding seam edge image” is interpreted as being performed through any software, and equivalents thereof.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112(a)
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 1-12 are rejected under 35 U.S.C. 112(a) or pre-AIA 35 U.S.C. 112, first paragraph, as failing to comply with the enablement requirement. The claims contain subject matter which was not described in the specification in such a way as to enable one skilled in the art to which it pertains, or with which it is most nearly connected, to make and/or use the invention.
Claim 1 recites “performing welding fume judgment … performing an identification process … performing a calibration process … mapping the welding seam edge image … performing an identification process … performing an offset comparison process … determining the welding seam edge image”. Paragraph [145] of the original disclosure describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitations. It is not clear whether the limitations are executed through hardware or through software, and it is not clear whether the applicant is claiming a software program or integrated circuits.
Claim 10 recites “a judgment module…an identification module …a calibration module ….a mapping module …a fitting module …a comparison module …a determination module”. Paragraph [145] of the original disclosure describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. It is not clear if the limitations are executed through hardware or through software. It is not clear if the applicant is claiming a software program or integrated circuits.
Claims 2-9 and 11-12 are rejected based on their dependency from claim 1.
Claim Rejections - 35 USC § 112(b)
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA ), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
The following claim limitations invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function.
Claim 1 recites “performing welding fume judgment … performing an identification process … performing a calibration process … mapping the welding seam edge image … performing an identification process … performing an offset comparison process … determining the welding seam edge image”. Paragraph [145] of the original disclosure describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitations. It is not clear whether the limitations are executed through hardware or through software, and it is not clear whether the applicant is claiming a software program or integrated circuits.
Claim 10 recites “a judgment module…an identification module …a calibration module ….a mapping module …a fitting module …a comparison module …a determination module”. Paragraph [145] of the original disclosure describes “all or some of steps and systems in the method disclosed above may be implemented as software, firmware, hardware and appropriate combinations thereof.” The written description fails to disclose the corresponding structure for performing the entire claimed function and to clearly link the structure to the function. The specification does not provide any structure or description for the claimed limitation. It is not clear if the limitations are executed through hardware or through software. It is not clear if the applicant is claiming a software program or integrated circuits.
Therefore, claims 1 and 10 are indefinite and are rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.
Claims 2-9 and 11-12 are rejected based on their dependency from claim 1.
Applicant may:
(a) Amend the claim so that the claim limitation will no longer be interpreted as a limitation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph;
(b) Amend the written description of the specification such that it expressly recites what structure, material, or acts perform the entire claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(c) Amend the written description of the specification such that it clearly links the structure, material, or acts disclosed therein to the function recited in the claim, without introducing any new matter (35 U.S.C. 132(a)).
If applicant is of the opinion that the written description of the specification already implicitly or inherently discloses the corresponding structure, material, or acts and clearly links them to the function so that one of ordinary skill in the art would recognize what structure, material, or acts perform the claimed function, applicant should clarify the record by either:
(a) Amending the written description of the specification such that it expressly recites the corresponding structure, material, or acts for performing the claimed function and clearly links or associates the structure, material, or acts to the claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(b) Stating on the record what the corresponding structure, material, or acts, which are implicitly or inherently set forth in the written description of the specification, perform the claimed function. For more information, see 37 CFR 1.75(d) and MPEP §§ 608.01(o) and 2181.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-12 are rejected under 35 U.S.C. 101 because the claimed invention is directed to the abstract idea of mental processes.
Regarding claim 1,
Step 1: Applying Step 1, the preamble of independent claim 1 recites a method, which falls within the statutory category of a process.
Step 2A, prong one: In order to apply Step 2A, a recitation of claim 1 is copied below. (The bolded portions of the claim constitute an abstract idea; the remaining limitations are “additional elements.”) The claim recites:
A method for determining a welding seam quality detection region, comprising: acquiring a two-dimensional welding image and a three-dimensional welding image of a target workpiece that has been welded; performing welding fume judgment on the two-dimensional welding image to obtain a welding fume judgment result; performing an identification process on the two-dimensional welding image according to the welding fume judgment result to obtain a welding seam edge image; performing a calibration process on the two-dimensional welding image and the three- dimensional welding image to obtain a mapping relation matrix; mapping the welding seam edge image of the two-dimensional welding image to the three- dimensional welding image according to the mapping relation matrix to obtain a three-dimensional composite image; performing an identification process on the three-dimensional composite image to obtain a welding seam target datum line; performing an offset comparison process on the welding seam edge image and the welding seam target datum line to obtain an offset comparison result; and in response to the offset comparison result being consistent with a preset threshold range, determining the welding seam edge image as the welding seam quality detection region.
As to the first step of the patent eligibility analysis (Step 2A, First Prong), the highlighted portion of the claim constitutes an abstract idea, because it can be construed as reciting mental processes (MPEP 2106.04(a)(2)(III)(B) and (C)). The steps of acquiring images, judging images, identifying the seam edge, obtaining a mapping matrix, mapping one image to another, comparing an offset with a threshold, and making a determination based on the comparison data are mental processes that can be performed with or without a physical aid.
Step 2A, prong two: Under step 2A prong two, this judicial exception is not integrated into a practical application because the claim does not recite additional elements.
Applicant’s specification does not include any discussion of how the claimed invention provides a technical improvement realized by the claim over the prior art or any explanation of a technical problem having an unconventional technical solution that is expressed in the claim. That is, the specification fails to provide sufficient details regarding the manner in which the claimed invention accomplishes any technical improvement or solution.
Step 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because the limitations regarding acquiring images, judging images, identifying the seam edge, obtaining a mapping matrix, mapping one image to another, comparing an offset with a threshold, and making a determination based on the comparison data are mental processes that can be performed with or without a physical aid (MPEP 2106.04(a)(2)(III)(B) and (C)).
For the foregoing reasons, claim 1 is directed to an abstract idea of mental processes and is rejected as not patent eligible under 35 U.S.C. 101.
Claims 2-9 add more mental processes to claim 1. Claim 2 recites mental processes of acquiring, judging, and determining. Claim 3 recites mental processes of performing elimination or searching for the seam edge. Claim 4 recites performing a calculation on acquired data. Claim 5 recites the mental process of identifying the seam edge. Claim 6 recites the mental process of identifying different regions. Claim 7 recites mental processes of acquiring data, performing a calculation, and performing fitting. Claim 8 recites mental processes of calculating an offset and comparing it with a threshold value. Claim 9 recites the mental process of performing a coordinate transformation. For the foregoing reasons, claims 2-9 are directed to an abstract idea of mental processes and are rejected as not patent eligible under 35 U.S.C. 101.
Claim 11 adds a generic computer to execute the mental processes of claim 1 (MPEP 2106.04(a)(2)(III)(C)(1)).
Claim 12 adds a generic computer-readable storage medium to execute the mental processes of claim 1 (MPEP 2106.04(a)(2)(III)(C)(1)).
However, these limitations in claims 11-12 as claimed do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The abstract idea and additional elements as claimed are so generic that this amounts to merely adding the words “apply it” to the abstract idea (MPEP 2106.05(f)(3)). For the foregoing reasons, claims 11-12 are directed to an abstract idea without significantly more and are rejected as not patent eligible under 35 U.S.C. 101.
Regarding claim 10,
Step 1: Applying Step 1, the preamble of independent claim 10 recites a device, which falls within the statutory category of a machine.
Step 2A, prong one: In order to apply Step 2A, a recitation of claim 10 is copied below. (The bolded portions of the claim constitute an abstract idea; the remaining limitations are “additional elements.”) The claim recites:
A device for determining a welding seam quality detection region, comprising: an acquisition module configured for acquiring a two-dimensional welding image and a three- dimensional welding image of a target workpiece that has been welded; a judgment module configured for performing welding fume judgment on the two-dimensional welding image to obtain a welding fume judgment result; an identification module configured for performing an identification process on the two- dimensional welding image according to the welding fume judgment result to obtain a welding seam edge image; a calibration module configured for performing a calibration process on the two-dimensional welding image and the three-dimensional welding image to obtain a mapping relation matrix; a mapping module configured for mapping the welding seam edge image of the two-dimensional welding image to the three-dimensional welding image according to the mapping relation matrix to obtain a three-dimensional composite image; a fitting module configured for performing an identification process on the three-dimensional composite image to obtain a welding seam target datum line; a comparison module configured for performing an offset comparison process on the welding seam edge image and the welding seam target datum line to obtain an offset comparison result; and a determination module configured for, in response to the offset comparison result being consistent with a preset threshold range, determining the welding seam edge image as the welding seam quality detection region.
As to the first step of the patent eligibility analysis (Step 2A, First Prong), the highlighted portion of the claim constitutes an abstract idea, because it can be construed as reciting mental processes (MPEP 2106.04(a)(2)(III)(B) and (C)). The steps of acquiring images, judging images, identifying the seam edge, obtaining a mapping matrix, mapping one image to another, comparing an offset with a threshold, and making a determination based on the comparison data are mental processes that can be performed with or without a physical aid.
Step 2A, Prong Two: This judicial exception is not integrated into a practical application because, even though the claim recites the additional elements of acquisition, judgment, identification, calibration, mapping, fitting, comparison, and determination modules, these additional elements amount to simply implementing the abstract idea on a computer. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
Applicant’s specification does not include any discussion of how the claimed invention provides a technical improvement realized by the claim over the prior art or any explanation of a technical problem having an unconventional technical solution that is expressed in the claim. That is, the specification fails to provide sufficient details regarding the manner in which the claimed invention accomplishes any technical improvement or solution.
Step 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because the limitations regarding acquiring images, judging images, identifying a seam edge, obtaining a mapping matrix, mapping one image to another, comparing an offset with a threshold, and making a determination based on the comparison data are mental processes performed on a generic computer (MPEP 2106.04(a)(2)(III)(C)(1)).
The abstract idea and additional elements as claimed are so generic that they amount to merely adding the words “apply it” to the abstract idea (MPEP 2106.05(f)(3)).
For the foregoing reasons, claim 10 is directed to an abstract idea of mental processes, and is rejected as not patent eligible under 35 U.S.C. 101.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claim(s) 1, 8-9, and 11-12 is/are rejected under 35 U.S.C. 103 as being unpatentable over Shang et al., CN 113192029 (hereafter Shang), and further in view of Jin et al., CN 111583211 (hereafter Jin).
Regarding claim 1,
A method for determining a welding seam quality detection region, comprising: (Abstract teaches “The invention claims a welding seam identification method based on ToF”)
acquiring a two-dimensional welding image and a three-dimensional welding image of a target workpiece that has been welded; (Page 3, paragraph 6 teaches “the original welding line image comprises: an amplitude image and a depth image”.)
performing welding fume judgment on the two-dimensional welding image to obtain a welding fume judgment result; (The claim is interpreted as filtering the 2D image to remove noise. Shang teaches collecting an amplitude image and filtering the image to reduce the background light in page 5, paragraph 10-11.)
performing an identification process on the two-dimensional welding image according to the welding fume judgment result to obtain a welding seam edge image; (Page 6, paragraph 2 teaches “step S4, extracting the edge feature of the binarization image by Gabor filter, and obtaining the edge image of the welding seam”.)
performing a calibration process on the two-dimensional welding image and the three-dimensional welding image to obtain a mapping relation matrix; (Page 4, paragraph 9 teaches “firstly converting the world coordinate system into camera coordinate system through rigid transformation; then the camera coordinate system is converted into the image coordinate system through the perspective projection; at last, the image coordinate system is discretized to obtain the pixel coordinate system.” Here the rigid transformation and conversion are performed through a mapping matrix as taught in pages 5-6 of the original Chinese document.)
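The coordinate-system chain quoted above (world → camera via rigid transformation, camera → image via perspective projection, image → pixel via discretization) is the standard pinhole camera model. As a minimal sketch for illustration only — the rotation, translation, and intrinsic values below are invented for the example and are not taken from Shang or from the instant application:

```python
import numpy as np

# Extrinsic rigid transform, world -> camera (illustrative: identity rotation,
# camera placed 2 units in front of the world origin)
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])

# Intrinsic matrix, camera -> pixel (illustrative focal lengths / principal point)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def world_to_pixel(p_world):
    """Project a 3D world point to pixel coordinates via the pinhole model."""
    p_cam = R @ p_world + t          # rigid transformation (world -> camera)
    uvw = K @ p_cam                  # perspective projection (camera -> image)
    return uvw[:2] / uvw[2]          # normalize/discretize to pixel coordinates

u, v = world_to_pixel(np.array([0.0, 0.0, 0.0]))
```

With these invented values the world origin projects onto the assumed principal point (320, 240).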
mapping the welding seam edge image of the two-dimensional welding image to the three-dimensional welding image according to the mapping relation matrix to obtain a three-dimensional composite image; (Abstract teaches “by identifying the welding seam image, obtaining the two dimensional information of the welding seam, then combining the corresponding depth information, calculating the three-dimensional coordinate of the welding seam; constructing a world coordinate system, a camera coordinate system, a conversion relation between the image coordinate system and the pixel coordinate system; according to the conversion relation, converting the welding seam three-dimensional coordinate into the space coordinate in the world coordinate system;”)
performing an identification process on the three-dimensional composite image to obtain a welding seam target datum line; (Abstract teaches “then combining the corresponding depth information, calculating the three-dimensional coordinate of the welding seam;”)
Shang is silent about performing an offset comparison process on the welding seam edge image and the welding seam target datum line to obtain an offset comparison result; and in response to the offset comparison result being consistent with a preset threshold range, determining the welding seam edge image as the welding seam quality detection region.
Jin teaches performing an offset comparison process on the welding seam edge image and the welding seam target datum line to obtain an offset comparison result; (Page 10, paragraph 4 teaches “after obtaining the contrast image, performing subtraction operation to the contrast image and the template image, and taking the absolute value of the result as the image to be compared for representing the difference between the contrast image and the template image.”)
and in response to the offset comparison result being consistent with a preset threshold range, determining the welding seam edge image as the welding seam quality detection region. (Page 10, paragraph 5 teaches “after obtaining the image to be compared, the pixel value of the pixel point included in the image to be compared is greater than the preset threshold value area, as the difference area”.)
Even though Jin is silent about comparing welding seam images from 2D and 3D measurements, before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to compare the 2D and 3D images in Shang to obtain an offset value between them and compare the offset value with a preset threshold value as taught in Jin. One of ordinary skill in the art would have been motivated to do so in order to “obtaining the difference area between the contrast image and the template graph; and obtaining the defect detection result of the detection image according to the difference area” as taught in page 2, paragraph 11 in Jin.
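Jin's comparison step — subtract the template image from the contrast image, take the absolute value, and compare against a preset threshold — can be sketched as follows; the pixel values and the threshold constant are invented for illustration and are not taken from Jin:

```python
import numpy as np

# Illustrative template and contrast images (small integer arrays)
template = np.array([[10, 10, 10],
                     [10, 10, 10]], dtype=np.int32)
contrast = np.array([[10, 60, 10],
                     [10, 10, 55]], dtype=np.int32)

diff = np.abs(contrast - template)            # absolute-difference image
PRESET_THRESHOLD = 30
difference_region = diff > PRESET_THRESHOLD   # pixels exceeding the preset threshold

# No pixels over threshold would indicate the offset stays within the allowed range
offset_within_range = not difference_region.any()
```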
Regarding claim 8,
The method for determining a welding seam quality detection region according to claim 1, wherein the performing an offset comparison process on the welding seam edge image and the welding seam target datum line to obtain an offset comparison result comprises: calculating an offset of a boundary of the welding seam edge image relative to the welding seam target datum line; (Shang is silent about this.
Page 10, paragraph 4 in Jin teaches “after obtaining the contrast image, performing subtraction operation to the contrast image and the template image, and taking the absolute value of the result as the image to be compared for representing the difference between the contrast image and the template image.”)
and comparing the offset with the preset threshold range to obtain the offset comparison result. (Page 10, paragraph 5 teaches “after obtaining the image to be compared, the pixel value of the pixel point included in the image to be compared is greater than the preset threshold value area, as the difference area”.)
Even though Jin is silent about comparing welding seam images from 2D and 3D measurements, before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to compare the 2D and 3D images in Shang to obtain an offset value between them and compare the offset value with a preset threshold value as taught in Jin. One of ordinary skill in the art would have been motivated to do so in order to “obtaining the difference area between the contrast image and the template graph; and obtaining the defect detection result of the detection image according to the difference area” as taught in page 2, paragraph 11 in Jin.
Regarding claim 9,
The method for determining a welding seam quality detection region according to claim 1, wherein the performing a calibration process on the two-dimensional welding image and the three-dimensional welding image to obtain a mapping relation matrix comprises: performing the calibration process on the two-dimensional welding image and the three-dimensional welding image to obtain a plurality of corner point coordinates of the two-dimensional welding image and a plurality of corner point coordinates of the three-dimensional welding image; (Fig. 2 in Shang teaches obtaining coordinate points of 2D image and 3D image)
and performing a transformation matrix calculation process according to the plurality of corner point coordinates of the two-dimensional welding image and the plurality of corner point coordinates of the three-dimensional welding image to obtain the mapping relation matrix. (Abstract in Shang teaches “by identifying the welding seam image, obtaining the two dimensional information of the welding seam, then combining the corresponding depth information, calculating the three-dimensional coordinate of the welding seam; constructing a world coordinate system, a camera coordinate system, a conversion relation between the image coordinate system and the pixel coordinate system; according to the conversion relation, converting the welding seam three-dimensional coordinate into the space coordinate in the world coordinate system;”)
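A transformation matrix calculation from paired corner point coordinates is conventionally posed as a least-squares problem. The sketch below fits a 2×3 affine mapping matrix to synthetic corner correspondences; the corner points and the underlying transform are invented for illustration and are not drawn from Shang or the claims:

```python
import numpy as np

# Synthetic 2D corner points and their counterparts after a known affine map
src = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
A_true = np.array([[2.0, 0.0, 5.0],
                   [0.0, 3.0, 7.0]])          # scale by (2, 3), translate by (5, 7)
dst = (A_true @ np.hstack([src, np.ones((4, 1))]).T).T

def fit_affine(src_pts, dst_pts):
    """Least-squares 2x3 affine mapping matrix from point correspondences."""
    X = np.hstack([src_pts, np.ones((len(src_pts), 1))])   # homogeneous source coords
    A, *_ = np.linalg.lstsq(X, dst_pts, rcond=None)
    return A.T                                             # 2x3 mapping matrix

A_est = fit_affine(src, dst)
```

With exact correspondences the estimated matrix recovers the invented transform; with noisy corner detections the same call returns the best least-squares fit.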
Regarding claim 11,
A computer, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the method for determining a welding seam quality detection region according to claim 1.
(Shang is silent about this.
Jin teaches in page 4, paragraph 13 “the invention further claims an electronic device, comprising a processor and a memory, the memory is stored with a computer program, a processor for executing the computer program,”.)
Even though Jin is silent about comparing welding seam images from 2D and 3D measurement, before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to store and execute the method in Shang using a computer as taught in Jin. One of ordinary skill in the art would have been motivated to do so in order to “obtaining the difference area between the contrast image and the template graph; and obtaining the defect detection result of the detection image according to the difference area” as taught in page 2, paragraph 11 in Jin.
Regarding claim 12,
A computer-readable storage medium, storing a computer-executable instruction, wherein the computer-executable instruction is used for executing the method for determining a welding seam quality detection region according to claim 1. (Shang is silent about this.
Jin teaches in page 14, paragraph 1 “the embodiment of the invention further claims a computer-readable storage medium, a computer-readable storage medium is stored with a computer program, a computer program is executed, realizing the method embodiment provided by the defect detection method”.)
Even though Jin is silent about comparing welding seam images from 2D and 3D measurement, before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to store and execute the method in Shang using a computer readable storage medium as taught in Jin. One of ordinary skill in the art would have been motivated to do so in order to “obtaining the difference area between the contrast image and the template graph; and obtaining the defect detection result of the detection image according to the difference area” as taught in page 2, paragraph 11 in Jin.
Claim(s) 2-4 is/are rejected under 35 U.S.C. 103 as being unpatentable over Shang, and Jin as applied to claim 1 above, and further in view of Wikipedia 2020, web.archive.org/web/20200622012910/https://en.wikipedia.org/wiki/Thresholding_(image_processing), June 2020 (hereafter Wikipedia2020).
Regarding claim 2,
The method for determining a welding seam quality detection region according to claim 1, wherein the performing welding fume judgment on the two-dimensional welding image to obtain a welding fume judgment result comprises: acquiring a minimum enclosing graph of a welding seam region in the two-dimensional welding image by using an edge identification tool; (Page 5, paragraph 11 in Shang teaches “cutting the amplitude image to obtain the image containing the welding seam region, and filtering the image; the purpose of using the filtering process is to reduce the influence of the light in the environment to the amplitude image.”)
judging whether welding fume exists in the two-dimensional welding image according to an edge line parameter of the minimum enclosing graph; and in response to the edge line parameter being greater than a threshold, determining the welding fume judgment result as existence of the welding fume, or in response to the edge line parameter being smaller than the threshold, determining the welding fume judgment result as non-existence of the welding fume. (The claim is interpreted as comparing a parameter of the image to a threshold value to detect noise.
Primary combination of references is silent about this. Wikipedia2020 teaches “The simplest thresholding methods replace each pixel in an image with a black pixel if the image intensity I_{i,j} is less than some fixed constant T (that is, I_{i,j} < T), or a white pixel if the image intensity is greater than that constant.”)
[media_image1.png: screenshot of Wikipedia2020]
Even though Wikipedia2020 is silent about welding fume, before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to apply the method of comparing a parameter value from the image with a threshold value to reduce noise, as taught in Wikipedia2020, to the method in Shang. One of ordinary skill in the art would have been motivated to do so because “In digital image processing, thresholding is the simplest method of segmenting images. From a grayscale image, thresholding can be used to create binary images” as taught in Wikipedia2020.
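The fixed-constant thresholding described in Wikipedia2020 reduces to a single comparison per pixel. A minimal sketch, with an invented sample image and an invented constant T:

```python
import numpy as np

def threshold(image, T):
    """Binarize: intensity below T becomes 0 (black), otherwise 255 (white)."""
    return np.where(image < T, 0, 255).astype(np.uint8)

gray = np.array([[12, 200],
                 [90,  40]], dtype=np.uint8)   # illustrative grayscale values
binary = threshold(gray, T=100)
```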
Regarding claim 3,
The method for determining a welding seam quality detection region according to claim 2, wherein the performing an identification process on the two-dimensional welding image according to the welding fume judgment result to obtain a welding seam edge image comprises: in response to the welding fume judgment result indicating existence of the welding fume, performing a welding fume elimination process on the two-dimensional welding image to obtain the welding seam edge image; or in response to the welding fume judgment result indicating non-existence of the welding fume, performing a searching process on the two-dimensional welding image through the edge identification tool to obtain the welding seam edge image. (The claim is interpreted as in response to the welding fume judgment result indicating existence of the welding fume, performing a welding fume elimination process on the two-dimensional welding image to obtain the welding seam edge image.
Page 5, paragraph 11 in Shang teaches “cutting the amplitude image to obtain the image containing the welding seam region, and filtering the image; the purpose of using the filtering process is to reduce the influence of the light in the environment to the amplitude image.”)
Regarding claim 4,
The method for determining a welding seam quality detection region according to claim 3, wherein in response to the welding fume judgment result indicating existence of the welding fume, performing a welding fume elimination process on the two-dimensional welding image to obtain the welding seam edge image comprises: performing a histogram equalization process on the two-dimensional welding image to obtain the two-dimensional welding image subjected to the equalization process; (Page 6, paragraph 1-2 in Shang teaches “Specifically, the threshold value is obtained by calculating the local image Gaussian weighted average, the amplitude image after pre-processing using histogram method to determine the binarization threshold value, obtaining the binary image capable of reflecting the whole image and local features. step S4, extracting the edge feature of the binarization image by Gabor filter, and obtaining the edge image of the welding seam”.)
performing a quantization process on the two-dimensional welding image subjected to the equalization process to obtain a welding fume region and the welding seam region; (Page 6, paragraph 1-2 in Shang teaches “Specifically, the threshold value is obtained by calculating the local image Gaussian weighted average, the amplitude image after pre-processing using histogram method to determine the binarization threshold value, obtaining the binary image capable of reflecting the whole image and local features. step S4, extracting the edge feature of the binarization image by Gabor filter, and obtaining the edge image of the welding seam”.)
and performing a searching process on the welding seam region through the edge identification tool to obtain the welding seam edge image. (Page 6, paragraph 1-2 in Shang teaches “Specifically, the threshold value is obtained by calculating the local image Gaussian weighted average, the amplitude image after pre-processing using histogram method to determine the binarization threshold value, obtaining the binary image capable of reflecting the whole image and local features. step S4, extracting the edge feature of the binarization image by Gabor filter, and obtaining the edge image of the welding seam”.)
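The histogram equalization step recited in claim 4 can be sketched as the classic cumulative-distribution remapping. The implementation below is a generic textbook version with an invented sample image; it is not code from Shang or from the instant application:

```python
import numpy as np

def equalize_histogram(image):
    """Classic histogram equalization for an 8-bit grayscale image."""
    hist = np.bincount(image.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    # Map each gray level through the normalized cumulative distribution
    lut = np.round((cdf - cdf_min) / (image.size - cdf_min) * 255.0).astype(np.uint8)
    return lut[image]

gray = np.array([[0, 64],
                 [128, 255]], dtype=np.uint8)   # illustrative input levels
equalized = equalize_histogram(gray)            # stretched across the 0-255 range
```

After equalization the output levels span the full 0–255 range, which is the property that makes a subsequent histogram-based binarization threshold easier to choose.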
Claim(s) 5-6 is/are rejected under 35 U.S.C. 103 as being unpatentable over Shang, and Jin as applied to claim 1 above, and further in view of Mo et al., CN 111462110 A (hereafter Mo).
Regarding claim 5,
The method for determining a welding seam quality detection region according to claim 1, wherein the performing an identification process on the three-dimensional composite image to obtain a welding seam target datum line comprises: performing an identification process on the three-dimensional composite image to obtain a first region and a second region welded with the first region; and performing a fitting process according to the first region and the second region to obtain the welding seam target datum line. (Primary combination of references is silent about this.
Page 4, paragraph 4-6 in Mo teaches “selecting the first region of interest from the height map; determining the first edge line and the second edge line of the target welding seam according to the height difference between the plurality of pixel points in the first region of interest and the preset reference surface; the area image between the first edge line and the second edge line, as the welding seam area of the target welding seam.” It is understood that the welding seam joins two regions within the region of interest.
Page 11, paragraph 3 teaches “In addition, in the embodiment of the invention, a plurality of welding edge point can be, but not limited to 3, 5, 10, obtaining the plurality of welding edge point, can be through least squares method to fit the plurality of welding edge point, determining the target welding object, the edge straight line of the welding side.”)
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to identify welding seam line in 3D image in Shang by the fitting process as taught in Mo. One of ordinary skill in the art would have been motivated to do so because “from the height map, determining the welding seam area for representing the target welding seam, and then analyzing the welding seam area, obtaining the characteristic parameter of the target welding seam, at last, according to the characteristic parameter, obtaining the quality detection result of the target welding seam, wherein the target welding material comprises a substrate, a welding piece, and welding the welding piece on the substrate to form the target welding seam” as taught in abstract in Mo.
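The least-squares fitting of welding edge points to an edge straight line that Mo describes corresponds to an ordinary degree-1 polynomial fit. A sketch with synthetic edge points; the line parameters are invented for illustration:

```python
import numpy as np

# Synthetic edge points lying on the line y = 0.5 x + 2 (values illustrative)
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = 0.5 * xs + 2.0

# Least-squares straight-line fit through the edge points
slope, intercept = np.polyfit(xs, ys, deg=1)
```

With more than two (possibly noisy) edge points, the same call returns the line minimizing the squared vertical residuals, which would serve as the fitted datum line.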
Regarding claim 6,
The method for determining a welding seam quality detection region according to claim 5, wherein the performing an identification process on the three-dimensional composite image to obtain a first region and a second region welded with the first region comprises: acquiring height data of the three-dimensional composite image; (Page 3, paragraph 6 in Shang teaches “the original welding line image comprises: an amplitude image and a depth image” )
and performing an identification process on the three-dimensional composite image according to the height data to obtain the first region and the second region welded with the first region, wherein a height value of the first region is different from a height value of the second region. (Primary combination of references is silent about this.
Page 4, paragraph 4-6 in Mo teaches “selecting the first region of interest from the height map; determining the first edge line and the second edge line of the target welding seam according to the height difference between the plurality of pixel points in the first region of interest and the preset reference surface; the area image between the first edge line and the second edge line, as the welding seam area of the target welding seam.” It is understood that the welding seam joins two regions within the region of interest.
Fig. 4 and 6 teach that region of interest welds surfaces 221 and 210 with different heights.)
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to identify welding seam line in 3D image in Shang according to height value of different pixels as taught in Mo. One of ordinary skill in the art would have been motivated to do so because “from the height map, determining the welding seam area for representing the target welding seam, and then analyzing the welding seam area, obtaining the characteristic parameter of the target welding seam, at last, according to the characteristic parameter, obtaining the quality detection result of the target welding seam, wherein the target welding material comprises a substrate, a welding piece, and welding the welding piece on the substrate to form the target welding seam” as taught in abstract in Mo.
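Identifying two welded regions by their height difference relative to a preset reference surface, as Mo describes, can be sketched as a per-pixel comparison. The height map and reference value below are invented for illustration:

```python
import numpy as np

# Illustrative height map: left surface lower than right surface
height = np.array([[1.0, 1.0, 4.0, 4.0],
                   [1.0, 1.0, 4.0, 4.0]])

REFERENCE = 2.5                      # preset reference surface (illustrative)
first_region = height < REFERENCE    # lower surface
second_region = height >= REFERENCE  # higher surface welded with the first
```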
Claim(s) 7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Shang, Jin, and Mo as applied to claim 5 above, and further in view of the US patent application publication of Mo (US 2023/0139733, hereafter Mo) and Sobel edge detector, https://web.archive.org/web/20200114044245/https://homepages.inf.ed.ac.uk/rbf/HIPR2/sobel.htm, Jan. 2020 (hereafter Sobel edge detector).
Regarding claim 7,
The method for determining a welding seam quality detection region according to claim 5, wherein the performing a fitting process according to the first region and the second region to obtain the welding seam target datum line comprises: acquiring a height value of the first region and a height value of the second region; (Primary combination of references is silent about this.
Page 4, paragraph 4-6 in Mo teaches “selecting the first region of interest from the height map; determining the first edge line and the second edge line of the target welding seam according to the height difference between the plurality of pixel points in the first region of interest and the preset reference surface;”)
performing a first-order derivation process on the height value of the first region and the height value of the second region to obtain at least two edge points between the first region and the second region; (The US patent application publication of Mo teaches in paragraph [104] “During actual implementation, a plurality of weldment edge points may be acquired in a second region of interest of the height map with preset search parameters through a Sobel operator, and then the plurality of weldment edge points are fitted to determine the edge straight line on the welding side of the target weldment.” Sobel operator inherently performs spatial gradient measurement on an image as evidenced by Sobel edge detector. )
[media_image2.png: screenshot of Sobel edge detector]
and performing the fitting process on the at least two edge points by using a welding seam target datum line tool to obtain the welding seam target datum line. (Page 11, paragraph 3 in Mo teaches “In addition, in the embodiment of the invention, a plurality of welding edge point can be, but not limited to 3, 5, 10, obtaining the plurality of welding edge point, can be through least squares method to fit the plurality of welding edge point, determining the target welding object, the edge straight line of the welding side.”)
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to identify welding seam line in 3D image in Shang by the fitting process as taught in Mo. One of ordinary skill in the art would have been motivated to do so because “from the height map, determining the welding seam area for representing the target welding seam, and then analyzing the welding seam area, obtaining the characteristic parameter of the target welding seam, at last, according to the characteristic parameter, obtaining the quality detection result of the target welding seam, wherein the target welding material comprises a substrate, a welding piece, and welding the welding piece on the substrate to form the target welding seam” as taught in abstract in Mo.
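The spatial-gradient measurement the Sobel operator performs, cited as evidence for the first-order derivation step, can be sketched with the standard horizontal kernel from the HIPR2 page; the step-edge test image is invented for illustration:

```python
import numpy as np

# Sobel horizontal-gradient kernel (as given in the HIPR2 reference)
SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

def sobel_x_response(image):
    """Slide the kernel over the image (correlation form, 'valid' region only)."""
    h, w = image.shape
    out = np.empty((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = (image[i:i + 3, j:j + 3] * SOBEL_X).sum()
    return out

# Vertical step edge: dark left half, bright right half
img = np.array([[0.0, 0.0, 10.0, 10.0]] * 4)
gx = sobel_x_response(img)   # uniform strong response across the step edge
```

Edge points would then be taken where the gradient magnitude peaks, and fitted to a line as in the least-squares step above.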
Claim(s) 10 is/are rejected under 35 U.S.C. 103 as being unpatentable over Shang, and further in view of Jin and Mo.
A device for determining a welding seam quality detection region, comprising: (Page 3, paragraph 1 in Shang teaches “With the development of precision electronic technology and microelectronic technique, solving the problem of low resolution of camera based on ToF technology, many noise points, high cost, the flight distance measuring method based on high performance photoelectron is widely applied in each field.” It is implied that the ToF method is implemented with microelectronic devices.)
an acquisition module configured for acquiring a two-dimensional welding image and a three-dimensional welding image of a target workpiece that has been welded; (Page 3, paragraph 6 in Shang teaches “obtaining the original welding line image of the welding piece to be processed by camera based on ToF technology; the original welding line image comprises: an amplitude image and a depth image”.)
a judgment module configured for performing welding fume judgment on the two-dimensional welding image to obtain a welding fume judgment result; (Shang teaches collecting an amplitude image and filtering the image to reduce the background light in page 5, paragraph 10-11. Page 3, paragraph 1 in Shang teaches “With the development of precision electronic technology and microelectronic technique, solving the problem of low resolution of camera based on ToF technology, many noise points, high cost, the flight distance measuring method based on high performance photoelectron is widely applied in each field.” It is implied that the judgment module is implemented with microelectronic devices.)
an identification module configured for performing an identification process on the two-dimensional welding image according to the welding fume judgment result to obtain a welding seam edge image; (Page 6, paragraph 2 teaches “step S4, extracting the edge feature of the binarization image by Gabor filter, and obtaining the edge image of the welding seam”.
Page 3, paragraph 1 in Shang teaches “With the development of precision electronic technology and microelectronic technique, solving the problem of low resolution of camera based on ToF technology, many noise points, high cost, the flight distance measuring method based on high performance photoelectron is widely applied in each field.” It is implied that the identification module is implemented with microelectronic devices.)
a calibration module configured for performing a calibration process on the two-dimensional welding image and the three-dimensional welding image to obtain a mapping relation matrix; (Page 4 paragraph 9 teaches “firstly converting the world coordinate system into camera coordinate system through rigid transformation; then the camera coordinate system is converted into the image coordinate system through the perspective projection; at last, the image coordinate system is discretized to obtain the pixel coordinate system.” Here rigid transformation, and conversion are performed through mapping matrix as taught in pages 5-6 of the original Chinese document.
Page 3, paragraph 1 in Shang teaches “With the development of precision electronic technology and microelectronic technique, solving the problem of low resolution of camera based on ToF technology, many noise points, high cost, the flight distance measuring method based on high performance photoelectron is widely applied in each field.” It is implied that the calibration module is implemented with microelectronic devices.)
a mapping module configured for mapping the welding seam edge image of the two-dimensional welding image to the three-dimensional welding image according to the mapping relation matrix to obtain a three-dimensional composite image; (Abstract teaches “by identifying the welding seam image, obtaining the two dimensional information of the welding seam, then combining the corresponding depth information, calculating the three-dimensional coordinate of the welding seam; constructing a world coordinate system, a camera coordinate system, a conversion relation between the image coordinate system and the pixel coordinate system; according to the conversion relation, converting the welding seam three-dimensional coordinate into the space coordinate in the world coordinate system;”
Page 3, paragraph 1 in Shang teaches “With the development of precision electronic technology and microelectronic technique, solving the problem of low resolution of camera based on ToF technology, many noise points, high cost, the flight distance measuring method based on high performance photoelectron is widely applied in each field.” It is implied that the mapping module is implemented with microelectronic devices.)
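The conversion chain quoted above (world coordinate system → camera coordinate system via rigid transformation → image coordinate system via perspective projection → pixel coordinate system via discretization) can be sketched as follows. All numeric values are hypothetical placeholders for illustration, not calibration data from Shang.

```python
import numpy as np

# Illustrative sketch of the quoted conversion chain: world -> camera
# (rigid transformation), camera -> image (perspective projection),
# image -> pixel (discretization). All numeric values below are
# hypothetical placeholders, not calibration data from Shang.

R = np.eye(3)                     # rotation of the rigid transform
t = np.array([0.0, 0.0, 0.5])     # translation of the rigid transform
fx, fy = 800.0, 800.0             # focal lengths in pixel units
cx, cy = 320.0, 240.0             # principal point (pixel coordinates)

def world_to_pixel(p_world):
    p_cam = R @ p_world + t                          # world -> camera
    x, y = p_cam[0] / p_cam[2], p_cam[1] / p_cam[2]  # camera -> image plane
    return fx * x + cx, fy * y + cy                  # image -> pixel

u, v = world_to_pixel(np.array([0.1, 0.0, 1.0]))
```

In practice the rotation, translation, and intrinsic values would come from the calibration process the claim recites; together they constitute the mapping relation matrix between the two-dimensional and three-dimensional images.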
Shang is silent about a fitting module configured for performing an identification process on the three-dimensional composite image to obtain a welding seam target datum line; a comparison module configured for performing an offset comparison process on the welding seam edge image and the welding seam target datum line to obtain an offset comparison result; and a determination module configured for, in response to the offset comparison result being consistent with a preset threshold range, determining the welding seam edge image as the welding seam quality detection region.
Mo teaches a fitting module configured for performing an identification process on the three-dimensional composite image to obtain a welding seam target datum line; (Page 11, paragraph 3 in Mo teaches “In addition, in the embodiment of the invention, a plurality of welding edge point can be, but not limited to 3, 5, 10, obtaining the plurality of welding edge point, can be through least squares method to fit the plurality of welding edge point, determining the target welding object, the edge straight line of the welding side.”
Page 6, paragraph 12 teaches “the embodiment of the invention claims an electronic device, comprising a processor and a memory, the memory is stored with a computer program, a processor for executing the computer program”.)
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to identify the welding seam line in the 3D image in Shang by the fitting process as taught in Mo. One of ordinary skill in the art would have been motivated to do so because “from the height map, determining the welding seam area for representing the target welding seam, and then analyzing the welding seam area, obtaining the characteristic parameter of the target welding seam, at last, according to the characteristic parameter, obtaining the quality detection result of the target welding seam, wherein the target welding material comprises a substrate, a welding piece, and welding the welding piece on the substrate to form the target welding seam” as taught in abstract in Mo.
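The fitting step quoted from Mo (least-squares fit through a plurality of welding edge points to determine the edge straight line) can be sketched as follows; the sample points are made up for illustration only.

```python
import numpy as np

# Least-squares fit of a straight line through weld-edge points, as
# the quoted passage from Mo describes; the sample points below are
# made up for illustration only.
edge_points = np.array([[0.0, 1.0],
                        [1.0, 3.1],
                        [2.0, 4.9],
                        [3.0, 7.0],
                        [4.0, 9.1]])
slope, intercept = np.polyfit(edge_points[:, 0], edge_points[:, 1], 1)
# The fitted line y = slope * x + intercept serves as the edge
# straight line (the welding seam target datum line, in the claim
# language).
```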
The combination of Shang and Mo is silent about a comparison module configured for performing an offset comparison process on the welding seam edge image and the welding seam target datum line to obtain an offset comparison result; and a determination module configured for, in response to the offset comparison result being consistent with a preset threshold range, determining the welding seam edge image as the welding seam quality detection region.
Jin teaches a comparison module configured for performing an offset comparison process on the welding seam edge image and the welding seam target datum line to obtain an offset comparison result; (Page 10, paragraph 4 in Jin teaches “after obtaining the contrast image, performing subtraction operation to the contrast image and the template image, and taking the absolute value of the result as the image to be compared for representing the difference between the contrast image and the template image.”
Jin teaches in page 4, paragraph 13 “the invention further claims an electronic device, comprising a processor and a memory, the memory is stored with a computer program, a processor for executing the computer program,”.)
and a determination module configured for, in response to the offset comparison result being consistent with a preset threshold range, determining the welding seam edge image as the welding seam quality detection region. (Page 10, paragraph 5 teaches “after obtaining the image to be compared, the pixel value of the pixel point included in the image to be compared is greater than the preset threshold value area, as the difference area”.
Jin teaches in page 4, paragraph 13 “the invention further claims an electronic device, comprising a processor and a memory, the memory is stored with a computer program, a processor for executing the computer program,”.)
Even though Jin is silent about comparing welding seam images from 2D and 3D measurements, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to compare the 2D and 3D images in Shang to obtain an offset value between them and to compare the offset value with a preset threshold value as taught in Jin. One of ordinary skill in the art would have been motivated to do so in order to be “obtaining the difference area between the contrast image and the template graph; and obtaining the defect detection result of the detection image according to the difference area” as taught in page 2, paragraph 11 in Jin.
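The comparison and determination steps quoted from Jin (subtract the template image from the contrast image, take the absolute value, then keep pixels whose difference exceeds a preset threshold as the difference area) can be sketched as follows. The arrays and the threshold are illustrative, not data from Jin.

```python
import numpy as np

# Sketch of the comparison step quoted from Jin: subtract the template
# image from the contrast image, take the absolute value, and keep the
# pixels whose difference exceeds a preset threshold as the difference
# area. The arrays and threshold below are illustrative, not data
# from Jin.
template = np.array([[10, 10], [10, 10]], dtype=np.int16)
contrast = np.array([[10, 25], [10, 10]], dtype=np.int16)

diff = np.abs(contrast - template)   # absolute-value difference image
threshold = 5                        # preset threshold value
difference_area = diff > threshold   # boolean mask of differing pixels
```

The boolean mask plays the role of the claimed determination step: regions where the comparison result exceeds the preset threshold are flagged as the quality detection region.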
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to FAHMIDA FERDOUSI whose telephone number is (303) 297-4341. The examiner can normally be reached Monday-Friday, 9:00 AM-3:00 PM PST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Steven Crabb can be reached at (571)270-5095. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/FAHMIDA FERDOUSI/ Examiner, Art Unit 3761