Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Interpretation
During patent examination, pending claims must be “given their broadest reasonable interpretation consistent with the specification.” MPEP 2111; see also MPEP 2173.02. Limitations appearing in the specification but not recited in the claim are not read into the claim. In re Prater, 415 F.2d 1393, 1404-05, 162 USPQ 541, 550-551 (CCPA 1969). See also In re Zletz, 893 F.2d 319, 321-22, 13 USPQ2d 1320, 1322 (Fed. Cir. 1989) (“During patent examination the pending claims must be interpreted as broadly as their terms reasonably allow”). The reason is simply that during patent prosecution, when claims can be amended, ambiguities should be recognized, scope and breadth of language explored, and clarification imposed. An essential purpose of patent examination is to fashion claims that are precise, clear, correct, and unambiguous. Only in this way can uncertainties of claim scope be removed, as much as possible, during the administrative process.
The Examiner respectfully requests that Applicant, in preparing responses, fully consider the entirety of the cited reference(s) as potentially teaching all or part of the claimed invention. It is noted that REFERENCES ARE RELEVANT AS PRIOR ART FOR ALL THEY CONTAIN.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 6-8 and 18-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claims 6 and 18 each currently recite, inter alia, an “AI” model. Under U.S. practice, claim terms should be spelled out in full before they appear as an abbreviation or acronym, e.g., “artificial intelligence (AI)” model.
Claims 7-8 and 19-20 depend from claims 6 and 18, respectively, fail to cure the deficiency on which claims 6 and 18 stand rejected, and are therefore rejected on the same basis.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claim 14 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter.
Claim 14 recites a computer program product, which, under its broadest reasonable interpretation, encompasses transitory propagating signals per se. A signal embodied in a transitory carrier or electromagnetic wave is a natural phenomenon, and claims that recite nothing but the physical characteristics of a form of energy are not eligible for patent protection; no preemption of such phenomena is permitted. Therefore, the claim is rejected as being directed to non-statutory subject matter. The rejection may be obviated by amending the claim to recite a non-transitory computer-readable medium.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1-7, 9-13, and 15-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by U.S. Patent Application Publication No. 2019/0122913 A1 to Lauber et al. (hereinafter Lauber).
With regards to claim 1, Lauber discloses:
1. A computer-implemented method for assessing a component quality of an electron emitter, which is part of a cathode facility, wherein the cathode facility comprises a cathode head and the electron emitter inserted in the cathode head, the method comprising (see, Fig. 9, and detailed description, including, the electron column 901 includes an electron beam source 903 configured to generate electrons that are focused to specimen 904 by one or more elements 905. The electron beam source 903 may include, for example, a cathode source or emitter tip, para. 0074):
receiving an electron emitter image dataset, wherein items of image information of the electron emitter image dataset at least partially map the electron emitter inserted in the cathode head (see, Summary, and detailed description, including, selecting a first local section from the reference image and a second local section from the test image; determining an estimated rotation offset and translation offset from the first local section and the second local section; performing a rough alignment on the test image thereby making a partially-aligned test image);
receiving an electron emitter geometry model from a memory unit (see, detailed description, including, the deep learning model used for the semiconductor inspection applications disclosed herein is configured as a Visual Geometry Group (VGG) network (a network, specifically a VGG network, possesses at least one memory unit for models), para. 0115);
transforming the received electron emitter geometry model to the items of image information of the electron emitter image dataset, wherein an item of electron emitter geometry information of the electron emitter inserted in the cathode head is calculated as an output parameter of the transformation (see, Fig. 3, and detailed description, including, of skew comparison 300. At 301, a fast Fourier transform is performed on the first local section from the reference image, yielding a reference scene function. At 302, a fast Fourier transform is performed on the second local section from the test image, yielding a test scene function. At 303, the test scene function is compared to the reference scene function to determine the skew angle. Skew comparison 300 can involve determining the phase, or the normalized product of the reference scene function and the test scene function. An inverse fast Fourier transform is performed on the phase to yield the offset. This offset can be determined at one or more locations on the image. The skew angle can then be the difference in offset magnitudes divided by the distance between them. Alternately, an inverse tangent of the difference in magnitudes divided by the distance between them can be the skew angle, para. 0045);
ascertaining a degree of similarity of the electron emitter inserted in the cathode head with at least one further electron emitter by using the electron emitter geometry information (see, as above, and At 303, the test scene function is compared to the reference scene function to determine the skew angle. Skew comparison 300 can involve determining the phase, or the normalized product of the reference scene function and the test scene function. An inverse fast Fourier transform is performed on the phase to yield the offset. This offset can be determined at one or more locations on the image, para. 0045); and
assessing the component quality as a function of the ascertained degree of similarity with the at least one further electron emitter (see, as above, and further, The skew angle can then be the difference in offset magnitudes divided by the distance between them. Alternately, an inverse tangent of the difference in magnitudes divided by the distance between them can be the skew angle, para. 0045).
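For illustration only, and not as part of the record, the phase-correlation procedure quoted above from Lauber at para. 0045 (a fast Fourier transform of each local section, a normalized cross-power product, and an inverse transform whose peak yields the offset) may be sketched as follows; the function name and array sizes are illustrative and are not drawn from the reference:

```python
import numpy as np

def phase_correlation_offset(reference: np.ndarray, test: np.ndarray):
    """Estimate the (row, col) translation offset between two equally
    sized image sections via the normalized cross-power spectrum."""
    R = np.fft.fft2(reference)       # "reference scene function" (para. 0045)
    T = np.fft.fft2(test)            # "test scene function"
    cross = np.conj(R) * T
    cross /= np.abs(cross) + 1e-12   # normalized product (the "phase")
    corr = np.fft.ifft2(cross).real  # inverse FFT; peak lies at the offset
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past half the image size correspond to negative (wrapped) shifts.
    rows, cols = reference.shape
    dr = peak[0] - rows if peak[0] > rows // 2 else peak[0]
    dc = peak[1] - cols if peak[1] > cols // 2 else peak[1]
    return int(dr), int(dc)
```

Per para. 0045 as quoted above, a skew angle would then follow from offsets estimated at two or more image locations, e.g., as the difference in offset magnitudes divided by the distance between them.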
With regards to claim 2, Lauber discloses:
2. The method of claim 1, wherein the electron emitter image dataset is a single electron emitter image dataset and the assessing assesses the component quality of the electron emitter using the single electron emitter image dataset of the electron emitter (see, Fig. 3, and detailed description, including, At 301, a fast Fourier transform is performed on the first local section from the reference image, yielding a reference scene function. At 302, a fast Fourier transform is performed on the second local section from the test image, yielding a test scene function. At 303, the test scene function is compared to the reference scene function to determine the skew angle, separate section performance, is interpreted as being within a single emitter dataset, and the analysis of the skew angle, is interpreted as quality parameter, para. 0045).
With regards to claim 3, Lauber discloses:
3. The method of claim 1, wherein the electron emitter geometry model is annotated with a large number of measuring points describing the electron emitter geometry (see, detailed description, including VGG networks, a Visual Geometry Group (VGG) network. For example, VGG networks were created by increasing the number of convolutional layers while fixing other parameters of the architecture. Adding convolutional layers to increase depth is made possible by using substantially small convolutional filters in all of the layers. Like the other neural networks described herein, VGG networks were created and trained to analyze features for determining rotation and translation offsets. VGG networks also include convolutional layers followed by fully connected layers, para. 0115).
With regards to claim 4, Lauber discloses:
4. The method of claim 1, wherein the transforming takes place while minimizing a complex correlation factor (see, detailed description, including, a fast Fourier transform on the first local section from the reference image to obtain a reference scene function; performing a fast Fourier transform on the second local section from the test image to obtain a test scene function; comparing the test scene function to the reference scene function to determine the skew angle, para. 0098).
With regards to claim 5, Lauber discloses:
5. The method of claim 1, wherein the electron emitter geometry information describes at least one of a spatial shift of at least one measuring point or a relative distance between two measuring points (see, Fig. 3, and detailed description, including, An inverse fast Fourier transform is performed on the phase to yield the offset. This offset can be determined at one or more locations on the image. The skew angle can then be the difference in offset magnitudes divided by the distance between them. Alternately, an inverse tangent of the difference in magnitudes divided by the distance between them can be the skew angle, para. 0045).
With regards to claim 6, Lauber discloses:
6. The method of claim 1, wherein the ascertaining the degree of similarity comprises:
inputting the electron emitter geometry information into an AI model trained via a machine learning method (see, Fig. 3, and detailed description, including, the skew comparison can comprise using a machine learning module to determine the skew angle. The machine learning module can be a deep-learning image classification to recognize the best image features to align to, para. 0047), and
providing the degree of similarity at an output of the AI model (see, detailed description, including, the skew comparison can comprise maximizing projection contrast, using correlations and salient image information in a special domain, or other similar methods, para. 0048).
With regards to claim 7, Lauber discloses:
7. The method of claim 6, wherein the ascertaining the degree of similarity comprises:
dimensionally reducing the input electron emitter geometry information, and the ascertaining ascertains the degree of similarity via the dimensionally reduced electron emitter geometry information (see, Fig. 3, and detailed description, including, At 301, a fast Fourier transform is performed on the first local section from the reference image, yielding a reference scene function. At 302, a fast Fourier transform is performed on the second local section from the test image, yielding a test scene function. At 303, the test scene function is compared to the reference scene function to determine the skew angle, separate section performance, is interpreted as dimensionally reducing a single emitter dataset, and the analysis of the resulting skew angle, is interpreted as ascertaining a result of the degree of similarity via the dimensionally reduced electron emitter geometry information, para. 0045).
With regards to claim 9, Lauber discloses:
9. The method of claim 1, wherein the assessed component quality of the electron emitter is stored in a memory unit (see, detailed description, including, Program code or instructions for the processor 814 or 908 to implement various methods and functions may be stored in readable storage media, such as a memory in the electronic data storage unit 815 or 909, respectively, or other memory, para. 0086).
With regards to claim 10, Lauber discloses:
10. A computer-implemented method for providing a trained AI model (see, detailed description, including, the deep learning model used for the semiconductor inspection applications disclosed herein is configured as a Visual Geometry Group (VGG) network (a network, specifically a VGG network, possesses at least one memory unit for models), para. 0115), the method comprising:
receiving performance data of further electron emitters as input data (Fig. 8, and detailed description, including, any other suitable optical elements (not shown). Examples of such optical elements include, but are not limited to, polarizing component(s), spectral filter(s), spatial filter(s), reflective optical element(s), apodizer(s), beam splitter(s) (such as beam splitter 813), aperture(s), and the like, which may include any such suitable optical elements known in the art. In addition, the optical based subsystem 801 may be configured to alter one or more of the elements of the illumination subsystem based on the type of illumination to be used for generating the optical based output, para. 0063);
applying a neural network, which comprises an encoder and a decoder, to the input data, wherein an output vector is calculated, wherein the encoder maps a first number of input values to a second number of output values, and wherein the decoder maps a second number of input values to a first number of output values, wherein the second number is smaller than the first number (see, detailed description, including, the deep learning model is a machine learning model. Machine learning can be generally defined as a type of artificial intelligence (AI) that provides computers with the ability to learn without being explicitly programmed. Machine learning focuses on the development of computer programs that can teach themselves to grow and change when exposed to new data. Machine learning explores the study and construction of algorithms that can learn from and make predictions on data. Such algorithms overcome following strictly static program instructions by making data driven predictions or decisions, through building a model from sample inputs, para. 0110);
adapting a parameter of the neural network based on a comparison of the output vector with the input data (see, detailed description, including, When the input layer receives an input, it passes on a modified version of the input to the next layer. In a deep network, there are many layers between the input and output, allowing the algorithm to use multiple processing layers, composed of multiple linear and non-linear transformations, para. 0106); and
outputting the decoder as the trained AI model (see, detailed description, including, a GoogleNet may include layers such as convolutional, pooling, and fully connected layers such as those described further herein configured and trained to analyze features for determining rotation and translation offsets, para. 0114).
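For illustration only, and not as part of the record, the encoder/decoder arrangement recited in claim 10 (an encoder mapping a first number of input values to a smaller second number, a decoder mapping them back, and parameters adapted by comparing the output vector with the input data) may be sketched as a minimal linear autoencoder; all names and dimensions are illustrative:

```python
import numpy as np

def train_autoencoder(X, n_hidden, epochs=500, lr=0.02, seed=42):
    """Linear autoencoder: the encoder maps n_in inputs to n_hidden
    values (n_hidden < n_in), the decoder maps them back, and both
    weight matrices are adapted from the reconstruction error, i.e.
    a comparison of the output vector with the input data."""
    n_in = X.shape[1]
    rng = np.random.default_rng(seed)
    W_enc = rng.normal(0, 0.1, (n_hidden, n_in))   # first -> second number
    W_dec = rng.normal(0, 0.1, (n_in, n_hidden))   # second -> first number
    for _ in range(epochs):
        Z = X @ W_enc.T                  # encode
        X_hat = Z @ W_dec.T              # decode: the output vector
        err = X_hat - X                  # compare output with input data
        # Gradient steps on the mean squared reconstruction error.
        W_dec -= lr * err.T @ Z / len(X)
        W_enc -= lr * (err @ W_dec).T @ X / len(X)
    return W_enc, W_dec
```

Per the final step of claim 10, the trained decoder (here, W_dec) would then be output as the trained model.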
With regards to claim 11, Lauber discloses:
11. The method of claim 10, wherein the receiving further receives items of electron emitter geometry information of the further electron emitters as part of the input data (see, Fig. 8, and detailed description, including, any other suitable optical elements (not shown). Examples of such optical elements include, but are not limited to, polarizing component(s), spectral filter(s), spatial filter(s), reflective optical element(s), apodizer(s), beam splitter(s) (such as beam splitter 813), aperture(s), and the like, which may include any such suitable optical elements known in the art. In addition, the optical based subsystem 801 may be configured to alter one or more of the elements of the illumination subsystem based on the type of illumination to be used for generating the optical based output, para. 0063).
With regards to claim 12, Lauber discloses:
12. A cathode facility comprising:
a cathode head (see, fig. 9, and detailed description, including, The electron beam source 903 may include, for example, a cathode source or emitter tip, para. 0074); and from above, claim 1,
an electron emitter inserted in the cathode head, wherein a component quality of the electron emitter is assessed with the method of claim 1 (see, as above, a cathode source or emitter tip, para. 0074) and (A computer-implemented method for assessing a component quality of an electron emitter, which is part of a cathode facility, wherein the cathode facility comprises a cathode head and the electron emitter inserted in the cathode head, the method comprising (see, Fig. 9, and detailed description, including, the electron column 901 includes an electron beam source 903 configured to generate electrons that are focused to specimen 904 by one or more elements 905. The electron beam source 903 may include, for example, a cathode source or emitter tip, para. 0074):
receiving an electron emitter image dataset, wherein items of image information of the electron emitter image dataset at least partially map the electron emitter inserted in the cathode head (see, Summary, and detailed description, including, selecting a first local section from the reference image and a second local section from the test image; determining an estimated rotation offset and translation offset from the first local section and the second local section; performing a rough alignment on the test image thereby making a partially-aligned test image);
receiving an electron emitter geometry model from a memory unit (see, detailed description, including, the deep learning model used for the semiconductor inspection applications disclosed herein is configured as a Visual Geometry Group (VGG) network (a network, specifically a VGG network, possesses at least one memory unit for models), para. 0115);
transforming the received electron emitter geometry model to the items of image information of the electron emitter image dataset, wherein an item of electron emitter geometry information of the electron emitter inserted in the cathode head is calculated as an output parameter of the transformation (see, Fig. 3, and detailed description, including, of skew comparison 300. At 301, a fast Fourier transform is performed on the first local section from the reference image, yielding a reference scene function. At 302, a fast Fourier transform is performed on the second local section from the test image, yielding a test scene function. At 303, the test scene function is compared to the reference scene function to determine the skew angle. Skew comparison 300 can involve determining the phase, or the normalized product of the reference scene function and the test scene function. An inverse fast Fourier transform is performed on the phase to yield the offset. This offset can be determined at one or more locations on the image. The skew angle can then be the difference in offset magnitudes divided by the distance between them. Alternately, an inverse tangent of the difference in magnitudes divided by the distance between them can be the skew angle, para. 0045);
ascertaining a degree of similarity of the electron emitter inserted in the cathode head with at least one further electron emitter by using the electron emitter geometry information (see, as above, and At 303, the test scene function is compared to the reference scene function to determine the skew angle. Skew comparison 300 can involve determining the phase, or the normalized product of the reference scene function and the test scene function. An inverse fast Fourier transform is performed on the phase to yield the offset. This offset can be determined at one or more locations on the image, para. 0045); and
assessing the component quality as a function of the ascertained degree of similarity with the at least one further electron emitter (see, as above, and further, The skew angle can then be the difference in offset magnitudes divided by the distance between them. Alternately, an inverse tangent of the difference in magnitudes divided by the distance between them can be the skew angle, para. 0045)).
With regards to claim 13, Lauber discloses:
13. The cathode facility of claim 12, wherein the electron emitter is a flat emitter (see, detailed description, including, The electron beam source 903 may include, for example, a cathode source or emitter tip. The one or more elements 905 may include, for example, a gun lens, an anode, a beam limiting aperture, a gate valve, a beam current selection aperture, an objective lens, and a scanning subsystem, all of which may include any such suitable elements known in the art).
With regard to claim 14, see the rejection under 35 U.S.C. 101 set forth above.
14. A computer program product with program code means, when executed by a computing unit, cause the computing unit to perform the method of claim 1.
With regard to claim 15, claim 15 (a method claim) recites substantially similar limitations to claim 3 (a method claim) and is therefore rejected using the same art and rationale set forth above.
With regard to claim 16, claim 16 (a method claim) recites substantially similar limitations to claim 4 (a method claim) and is therefore rejected using the same art and rationale set forth above.
With regard to claim 17, claim 17 (a method claim) recites substantially similar limitations to claim 5 (a method claim) and is therefore rejected using the same art and rationale set forth above.
With regard to claim 18, claim 18 (a method claim) recites substantially similar limitations to claim 6 (a method claim) and is therefore rejected using the same art and rationale set forth above.
With regard to claim 19, claim 19 (a method claim) recites substantially similar limitations to claim 7 (a method claim) and is therefore rejected using the same art and rationale set forth above.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 8 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Lauber in view of U.S. Patent Application Publication No. 2024/0054632 A1 to Yu et al. (hereinafter Yu).
With regards to claim 8, Lauber fails to explicitly disclose:
8. The method of claim 6, wherein the ascertaining the degree of similarity comprises:
distancing a first category with electron emitter geometries with an artifact from a second category with electron emitter geometries without the artifact, and the ascertaining ascertains the degree of similarity via an allocation of the electron emitter geometry information to the first category or the second category.
Yu discloses:
the ascertaining the degree of similarity comprises:
distancing a first category with electron emitter geometries with an artifact from a second category with electron emitter geometries without the artifact, and the ascertaining ascertains the degree of similarity via an allocation of the electron emitter geometry information to the first category or the second category (see, Fig. 5, and detailed description, including, Two different reference images 504 and 506 may be generated for test image 500 as described further herein. Since the two reference images have at least one different characteristic, the reference images may include different nuisances or artifacts. For example, reference image 504 includes nuisance or artifact 508 while reference image 506 does not include any nuisances or artifacts. The reference images may be used to generate different difference images for the test image. For example, difference image 512 may be generated from Test.sub.i−Ref1.sub.i step 510, and difference image 516 may be generated from Test.sub.i−Ref2.sub.i step 514, Since the reference images are different, the two different difference images will also be different in important ways. For example, difference image 512 includes real defect 502 from test image 500 and nuisance or artifact 508 from reference image 504 since those are the two differences between test image 500 and reference image 504.para. 0088-0089).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Lauber and Yu before them, to combine with Lauber the features of Yu, including distancing a first category with electron emitter geometries with an artifact from a second category with electron emitter geometries without the artifact, and ascertaining the degree of similarity via an allocation of the electron emitter geometry information to the first category or the second category (see Yu, Fig. 5, and detailed description at paras. 0088-0089, as quoted above).
Therefore, a rationale to support a conclusion that a claim would have been obvious is that all the claimed elements were known in the prior art and one skilled in the art could have combined the elements as claimed by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art.1
With regard to claim 20, claim 20 (a method claim) recites substantially similar limitations to claim 8 (a method claim) and is therefore rejected using the same art and rationale set forth above.
A sampling of the prior art made of record and not relied upon, but considered pertinent to Applicants’ disclosure, includes: U.S. Patent No. 6,036,564 to Scholten et al., which discusses inspection of an assembly of electrodes for an electron gun. The relative positions of a number of apertures (at least three, but preferably four) are determined by means of two optical systems: one for determining the positions of two apertures of electrodes, e.g., the G1 and the G2 electrodes, and the other for determining the position of the other aperture or apertures.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to WILLIAM D. TITCOMB whose telephone number is (571)270-5190. The examiner can normally be reached 9:30 AM - 6:30 PM (M-F).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Stephen C. Hong can be reached at 571-272-4124. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
WILLIAM D. TITCOMB
Primary Examiner
Art Unit 2178
/WILLIAM D TITCOMB/Primary Examiner, Art Unit 2178 2-11-2026
1 KSR International Co. v. Teleflex Inc., 127 S.Ct. 1727, 82 U.S.P.Q.2d 1385 (2007).