Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 10/21/2025 has been entered.
Response to Amendment
This communication is in response to the amendment filed on 10/21/2025.
Claims 1, 6, and 12 are currently amended. Claims 1-16 are currently pending.
Response to Arguments
Applicant’s arguments filed on 10/21/2025, on pages 7-10 under REMARKS, with respect to the 35 U.S.C. 103 rejections of claims 1-16 have been fully considered and are persuasive. The rejections of the claims have been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of US 2018/0197285 A1, which sums the grey value projections of a single column or row into a single one-dimensional value rather than assigning an averaged sum value to an entire region, as argued by Applicant on pages 7-8 of REMARKS.
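The distinction drawn above, between collapsing a single column or row into one one-dimensional value and assigning one summed brightness value to an entire region, can be illustrated with a short sketch. This is an illustrative toy example only; the array values and variable names are hypothetical and do not come from any cited reference:

```python
import numpy as np

# Toy 4x4 grey-level image (values chosen only for illustration).
img = np.array([[10, 20, 30, 40],
                [10, 20, 30, 40],
                [10, 20, 30, 40],
                [10, 20, 30, 40]], dtype=float)

# Column/row projection in the style of VINCIGUERRA: each column (or row)
# collapses to a single one-dimensional value by summing its grey levels.
col_projection = img.sum(axis=0)   # one value per column: 40, 80, 120, 160
row_projection = img.sum(axis=1)   # one value per row

# Claim-style projection: the sum divided by the number of pixels in the
# column, i.e. the column mean.
col_projection_avg = img.mean(axis=0)   # 10, 20, 30, 40

# Regional value in the style of SONG: the image is divided into equal
# regions and ONE brightness value is assigned to the entire region
# (here 2x2 regions; SONG sums the square of each pixel's brightness).
regions = img.reshape(2, 2, 2, 2).swapaxes(1, 2)  # 2x2 grid of 2x2 blocks
region_values = (regions ** 2).sum(axis=(2, 3))   # one value per region
```

The column projection keeps one value per column (a one-dimensional signature along the x axis), while the regional value collapses a whole block of pixels to a single number for the entire region.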
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or non-obviousness.
Claims 1-4, 6-10, and 12-15 are rejected under 35 U.S.C. § 103 as being obvious over US 6,292,582 B1 to LIN et al. (hereinafter “LIN”) in view of US 2023/0005242 A1 to SONG et al. (hereinafter “SONG”) in further view of US 2018/0197285 A1 to VINCIGUERRA et al. (hereinafter “VINCIGUERRA”).
As per claim 1, LIN discloses a method comprising: aligning a setup image to a runtime image at a target using a processor, thereby generating aligned images (via a computing system comprising an onboard processor that uses an alignment file 32 to align a wafer for an image (runtime image) via a camera 26 and compares the image to pixel-based representations of said image (setup image) in order to analyze the images of said wafer for defects and anomalies; figure 4; column 2, lines 27-56; column 9, lines 7-64); determining, using the processor, an image projection in perpendicular x and y directions for polygons in the aligned images (generating image mappings using the computing system in the x and y directions via decomposition window 98, wherein the window uses polygons 96 to map the image when searching for defects using the described search pattern, and wherein search regions may be rotated 0 to 90 degrees (perpendicular); column 10, lines 5-65; column 14, lines 1-40); and aligning, using the processor, the image projections for the setup image and the runtime image (aligning via the alignment file 32 run by the computing system comprising the processor in order to align primitive sets 172 and 173 (note: image primitives are generally understood to be the basic elements (shapes, lines, points, and curves) that make up an image, and projections are formed by projecting 3D objects onto a 2D surface using elements provided by the image primitives); column 9, lines 26-61; column 13, line 55 - column 14, line 28; column 15, lines 1-33).
LIN fails to disclose wherein the setup image is a golden image or a reference image, and wherein the runtime image is an image generated during inspection; determining a normalized cross-correlation score for the aligned images, wherein the normalized cross-correlation score measures a similarity of two types of data as a function of a displacement of one relative to another; and determining an image projection in perpendicular x and y directions for polygons in the aligned images, wherein the image projection is a sum of grey level values along pixels of one of a column or row of one of the aligned images divided by a number of the pixels in the one of the column or the row.
SONG discloses wherein the setup image is a golden image or a reference image, and wherein the runtime image is an image generated during inspection (a pattern identification and positioning system adapted for placing a semiconductor chip on a substrate by using a template image (setup image, which acts as the golden/reference image), wherein an integral image (runtime image) is converted (generated) from the template image and the target image; figure 2; paragraphs [0016], [0020-0023], [0026]); determining, using the processor, a normalized cross-correlation score for the aligned images, wherein the normalized cross-correlation score measures a similarity of two types of data as a function of a displacement of one relative to another (similarity calculation module 400 calculates a similarity between the template image and the comparison target area of the target image; similarity calculation module 400 calculates the similarity between the template image and the comparison target area (displacement) by performing a normalized cross-correlation calculation according to equation 1, which comprises multiple data types as inputs (a similarity of two types of data as a function of a displacement of one relative to another), on the brightness value for each region of the template image and the comparison target area in units of regions; figure 2; paragraphs [0010], [0036-0037], [0063]); and determining, using the processor, an image projection in perpendicular x and y directions for polygons in the aligned images (the template region module 210 may divide the template image into a total of 64 regions, which are rectangles (polygons) as seen in FIG. 6B, by dividing the template image into 8 equal parts horizontally and vertically (perpendicular x and y directions) as shown in FIG. 6B; the template region module 210 sums the square of the brightness value (grey level value) of each of the pixels belonging to each of the regions equally divided as described above and sets the sum as the brightness value for each region, the regions being arranged/displayed in columns and rows along the x and y axes of the image of FIG. 6B; figures 4A-B & 6B; paragraphs [0010], [0031-0032], [0034]) of one of the aligned images divided by a number of pixels in the one of the column or row (the computing system using template region module 210 is adapted to sum the square of the brightness value of each of the pixels in each region of the template image and sets the sum as a brightness value for each region; for example, the template region module 210 divides the template image into a total of 64 regions by dividing the template image into 8 equal parts horizontally and vertically (creating columns and rows) as shown in FIG. 6B, and template region module 210 sums the square of the brightness value of each of the pixels belonging to each of the regions equally divided as described above and sets the sum as the brightness value for each region; figs. 6A-6B; paragraphs [0031-0032], [0035]; Equation 1).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify LIN to have wherein the setup image is a golden image or a reference image, and wherein the runtime image is an image generated during inspection; wherein the normalized cross-correlation score measures a similarity of two types of data as a function of a displacement of one relative to another; and wherein the image projection is a sum of grey level values along a column or row, as taught by the SONG reference. The suggestion/motivation for doing so would have been to provide the ability to normalize and cross-correlate the image values of two comparison images, one acting as a reference image, via module 210, which converts the template image into an image in units of regions; by such a method, the resolution of the template image may be lowered, but the speed of searching for a pattern of the template image within the target image would be improved, as suggested by SONG at paragraphs [0016], [0032], and [0037]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine SONG with LIN to obtain the invention as specified in claim 1.
VINCIGUERRA discloses wherein the image projection is a sum of grey level values along pixels in one of a column or row (a sum of the value of the grey levels of the columns and rows of the raw or filtered three-dimensional image, forming a one-dimensional signature sum value; paragraph [0034]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to further modify LIN to have wherein the image projection is a sum of grey level values along pixels in one of a column or row, as taught by the VINCIGUERRA reference. The suggestion/motivation for doing so would have been to provide, as argued by Applicant at the top of page 8 of REMARKS, a one-dimensional sum value of the grey values in a particular column or row, as suggested by VINCIGUERRA at paragraph [0034]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to further combine VINCIGUERRA with LIN in view of SONG to obtain the invention as specified in claim 1.
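For background, the normalized cross-correlation recited in the claim is a standard similarity measure. The sketch below is a generic textbook formulation, not code from any cited reference; the function name and sample patches are hypothetical:

```python
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Generic normalized cross-correlation of two equally sized patches.

    Returns a similarity score in [-1, 1]; 1.0 means the patches are
    identical up to a brightness gain and offset.
    """
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    if denom == 0:
        return 0.0  # a constant patch carries no pattern to correlate
    return float((a * b).sum() / denom)

setup = np.array([[1, 2], [3, 4]])
runtime_same = setup * 2 + 5            # same pattern, different gain/offset
runtime_diff = np.array([[4, 1], [2, 3]])

print(ncc(setup, runtime_same))         # 1.0 (perfect match)
print(ncc(setup, runtime_diff))         # -0.2 (poor match)
```

A score of 1.0 indicates the two patches match up to a brightness gain and offset, which is why the measure is useful for comparing a reference (setup) image with a runtime image despite illumination differences; the score is evaluated as a function of the displacement of one image relative to the other.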
As per claim 2, LIN in view of SONG in further view of VINCIGUERRA discloses the method of claim 1. Modified LIN further discloses determining, using the processor, offsets between the setup image and the runtime image for an inspection frame after aligning the image projections (wherein the method carried out by the computing system, which includes the processing components, performs a comparison between a first and a second image offset from one another, as seen in figures 10a and 10b, and can compare offsets of two images of the same structure, which would include the primitives of the images acting as the setup and runtime alignment images; figs. 10a-10q; column 13, lines 55-67).
As per claim 3, LIN in view of SONG in further view of VINCIGUERRA discloses the method of claim 2. Modified LIN further discloses determining, using the processor, offsets between a design and the runtime image for the inspection frame (wherein the method carried out by the computing system, which includes the processing components, performs a comparison (inspection) between a first and a second image offset from one another, as seen in figures 10a and 10b, and can compare offsets of two images of the same structure; figs. 10a-10q; column 13, lines 55-67; column 23, line 45 - column 24, line 22).
As per claim 4, LIN in view of SONG in further view of VINCIGUERRA discloses the method of claim 3. Modified LIN further discloses placing care areas on the runtime image based on an offset correction using the processor (diagnosing and classifying a detected defect/anomaly, using alignment files in automatic modes via an anomaly detecting and locating computer 30, as a repairable defect or not repairable (care area, which in the specification is defined as “areas identified which need offset correction”); an in-tolerance or out-of-tolerance algorithm is used to determine repair; column 13, lines 55-67; column 23, line 45 - column 24, line 22).
As per claim 6, LIN discloses a system comprising: a stage configured to hold a semiconductor wafer (an xy stage 22 which holds semiconductor wafer 20; column 24, lines 8-40); an energy source configured to direct a beam at the semiconductor wafer on the stage (an energy source 25 that can produce energy, such as a laser and other light modalities, to direct energy at the xy stage 22 holding wafer 20; column 24, lines 8-40); a detector configured to receive the beam reflected from the semiconductor wafer on the stage (wherein wafer anomaly detection system 1 (detector) receives light energy from the energy source 25 after the light is directed toward the wafer 20 on stage 22; column 24, lines 8-40); and a processor in electronic communication with the detector (wherein the detection system 1 is carried out via a computer processor resident in computer 46; column 24, lines 8-40), wherein the processor is configured to: align a setup image to a runtime image at a target, thereby generating aligned images (via the computing system comprising an onboard processor that uses an alignment file 32 to align a wafer for an image (runtime image) via a camera 26 and compares the image to pixel-based representations of said image (setup image) in order to analyze the images of said wafer for defects and anomalies; figure 4; column 2, lines 27-56; column 9, lines 7-64; column 24, lines 8-65); determine an image projection in perpendicular x and y directions for polygons in the aligned images (generating image mappings using the computing system in the x and y directions via decomposition window 98, wherein the window uses polygons 96 to map the image when searching for defects using the described search pattern, and wherein search regions may be rotated 0 to 90 degrees (perpendicular); column 10, lines 5-65; column 14, lines 1-40); and align the image projections for the setup image and the runtime image (aligning via the alignment file 32 run by the computing system comprising the processor in order to align primitive sets 172 and 173 (note: image primitives are generally understood to be the basic elements (shapes, lines, points, and curves) that make up an image, and projections are formed by projecting 3D objects onto a 2D surface using elements provided by the image primitives); column 9, lines 26-61; column 13, line 55 - column 14, line 28; column 15, lines 1-33).
LIN fails to disclose wherein the setup image is a golden image or a reference image, and wherein the runtime image is an image generated during inspection; determine a normalized cross-correlation score for the aligned images, wherein the normalized cross-correlation score measures a similarity of two types of data as a function of a displacement of one relative to another; and determine an image projection in perpendicular x and y directions for polygons in the aligned images, wherein the image projection is a sum of grey level values along pixels of one of a column or row of one of the aligned images divided by a number of the pixels in the one of the column or the row.
SONG discloses wherein the setup image is a golden image or a reference image, and wherein the runtime image is an image generated during inspection (a pattern identification and positioning system adapted for placing a semiconductor chip on a substrate by using a template image (setup image, which acts as the golden/reference image), wherein an integral image (runtime image) is converted (generated) from the template image and the target image; figure 2; paragraphs [0016], [0020-0023], [0026]); determine a normalized cross-correlation score for the aligned images, wherein the normalized cross-correlation score measures a similarity of two types of data as a function of a displacement of one relative to another (similarity calculation module 400 calculates a similarity between the template image and the comparison target area of the target image; similarity calculation module 400 calculates the similarity between the template image and the comparison target area (displacement) by performing a normalized cross-correlation calculation according to equation 1, which comprises multiple data types as inputs (a similarity of two types of data as a function of a displacement of one relative to another), on the brightness value for each region of the template image and the comparison target area in units of regions; figure 2; paragraphs [0010], [0036-0037], [0063]); and determine an image projection in perpendicular x and y directions for polygons in the aligned images (the template region module 210 may divide the template image into a total of 64 regions, which are rectangles (polygons) as seen in FIG. 6B, by dividing the template image into 8 equal parts horizontally and vertically (perpendicular x and y directions) as shown in FIG. 6B; the template region module 210 sums the square of the brightness value (grey level value) of each of the pixels belonging to each of the regions equally divided as described above and sets the sum as the brightness value for each region, the regions being arranged/displayed in columns and rows along the x and y axes of the image of FIG. 6B; figures 4A-B & 6B; paragraphs [0010], [0031-0032], [0034]) of one of the aligned images divided by a number of the pixels in the one of a column or row (the computing system using template region module 210 is adapted to sum the square of the brightness value of each of the pixels in each region of the template image and sets the sum as a brightness value for each region; for example, the template region module 210 divides the template image into a total of 64 regions by dividing the template image into 8 equal parts horizontally and vertically (creating columns and rows) as shown in FIG. 6B, and template region module 210 sums the square of the brightness value of each of the pixels belonging to each of the regions equally divided as described above and sets the sum as the brightness value for each region; figs. 6A-6B; paragraphs [0031-0032], [0035]; Equation 1).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify LIN to have wherein the setup image is a golden image or a reference image, and wherein the runtime image is an image generated during inspection; wherein the normalized cross-correlation score measures a similarity of two types of data as a function of a displacement of one relative to another; and wherein the image projection is a sum of grey level values along a column or row, as taught by the SONG reference. The suggestion/motivation for doing so would have been to provide the ability to normalize and cross-correlate the image values of two comparison images, one acting as a reference image, via module 210, which converts the template image into an image in units of regions; by such a method, the resolution of the template image may be lowered, but the speed of searching for a pattern of the template image within the target image would be improved, as suggested by SONG at paragraphs [0016], [0032], and [0037]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine SONG with LIN to obtain the invention as specified in claim 6.
VINCIGUERRA discloses wherein the image projection is a sum of grey level values along pixels in one of a column or row (a sum of the value of the grey levels of the columns and rows of the raw or filtered three-dimensional image, forming a one-dimensional signature sum value; paragraph [0034]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to further modify LIN to have wherein the image projection is a sum of grey level values along pixels in one of a column or row, as taught by the VINCIGUERRA reference. The suggestion/motivation for doing so would have been to provide, as argued by Applicant at the top of page 8 of REMARKS, a one-dimensional sum value of the grey values in a particular column or row, as suggested by VINCIGUERRA at paragraph [0034]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to further combine VINCIGUERRA with LIN in view of SONG to obtain the invention as specified in claim 6.
As per claim 7, LIN in view of SONG in further view of VINCIGUERRA discloses the system of claim 6. Modified LIN further discloses wherein the energy source is a light source, and wherein the beam is a beam of light (a laser light source which produces a laser light beam; column 5, lines 52-58).
As per claim 8, LIN in view of SONG in further view of VINCIGUERRA discloses the system of claim 6. Modified LIN further discloses wherein the processor is further configured to determine offsets between the setup image and the runtime image for an inspection frame after the image projections are aligned (wherein the method carried out by the computing system, which includes the processing components, performs a comparison between a first and a second image offset from one another, as seen in figures 10a and 10b; column 13, lines 55-67).
As per claim 9, LIN in view of SONG in further view of VINCIGUERRA discloses the system of claim 8. Modified LIN further discloses wherein the processor is further configured to determine offsets between a design and the runtime image for the inspection frame (wherein the method carried out by the computing system, which includes the processing components, performs a comparison (inspection) between a first and a second image offset from one another, as seen in figures 10a and 10b, and can compare offsets of two images of the same structure; figs. 10a-10q; column 13, lines 55-67; column 23, line 45 - column 24, line 22).
As per claim 10, LIN in view of SONG in further view of VINCIGUERRA discloses the system of claim 9. Modified LIN further discloses wherein the processor is further configured to place care areas on the runtime image based on an offset correction (diagnosing and classifying a detected defect/anomaly, using alignment files in automatic modes via an anomaly detecting and locating computer 30, as a repairable defect or not repairable (care area, which in the specification is defined as “areas identified which need offset correction”); an in-tolerance or out-of-tolerance algorithm is used to determine repair; column 13, lines 55-67; column 23, line 45 - column 24, line 22).
As per claim 12, LIN discloses a non-transitory computer-readable storage medium (a computing system comprising an onboard processor that uses an alignment file 32 to align a wafer for an image (runtime image) via a camera 26 and compares the image to pixel-based representations of said image (setup image) in order to analyze the images of said wafer for defects and anomalies; figure 4; column 2, lines 27-56; column 9, lines 7-64), comprising one or more programs for executing the following steps on one or more computing devices: aligning a setup image to a runtime image at a target, thereby generating aligned images (via the computing system comprising the onboard processor using the alignment file 32 and camera 26 as described above; figure 4; column 2, lines 27-56; column 9, lines 7-64; column 24, lines 8-65); determining an image projection in perpendicular x and y directions for polygons in the aligned images (generating image mappings using the computing system in the x and y directions via decomposition window 98, wherein the window uses polygons 96 to map the image when searching for defects using the described search pattern, and wherein search regions may be rotated 0 to 90 degrees (perpendicular); column 10, lines 5-65; column 14, lines 1-40); and aligning the image projections for the setup image and the runtime image (aligning via the alignment file 32 run by the computing system comprising the processor in order to align primitive sets 172 and 173 (note: image primitives are generally understood to be the basic elements (shapes, lines, points, and curves) that make up an image, and projections are formed by projecting 3D objects onto a 2D surface using elements provided by the image primitives); column 9, lines 26-61; column 13, line 55 - column 14, line 28; column 15, lines 1-33).
LIN fails to disclose wherein the setup image is a golden image or a reference image, and wherein the runtime image is an image generated during inspection; determining a normalized cross-correlation score for the aligned images, wherein the normalized cross-correlation score measures a similarity of two types of data as a function of a displacement of one relative to another; and determining an image projection in perpendicular x and y directions for polygons in the aligned images, wherein the image projection is a sum of grey level values along pixels of one of a column or row of one of the aligned images divided by a number of the pixels in the one of the column or the row.
SONG discloses wherein the setup image is a golden image or a reference image, and wherein the runtime image is an image generated during inspection (a pattern identification and positioning system adapted for placing a semiconductor chip on a substrate by using a template image (setup image, which acts as the golden/reference image), wherein an integral image (runtime image) is converted (generated) from the template image and the target image; figure 2; paragraphs [0016], [0020-0023], [0026]); determining a normalized cross-correlation score for the aligned images, wherein the normalized cross-correlation score measures a similarity of two types of data as a function of a displacement of one relative to another (similarity calculation module 400 calculates a similarity between the template image and the comparison target area of the target image; similarity calculation module 400 calculates the similarity between the template image and the comparison target area (displacement) by performing a normalized cross-correlation calculation according to equation 1, which comprises multiple data types as inputs (a similarity of two types of data as a function of a displacement of one relative to another), on the brightness value for each region of the template image and the comparison target area in units of regions; figure 2; paragraphs [0010], [0036-0037], [0063]); and determining an image projection in perpendicular x and y directions for polygons in the aligned images (the template region module 210 may divide the template image into a total of 64 regions, which are rectangles (polygons) as seen in FIG. 6B, by dividing the template image into 8 equal parts horizontally and vertically (perpendicular x and y directions) as shown in FIG. 6B; the template region module 210 sums the square of the brightness value (grey level value) of each of the pixels belonging to each of the regions equally divided as described above and sets the sum as the brightness value for each region, the regions being arranged/displayed in columns and rows along the x and y axes of the image of FIG. 6B; figures 4A-B & 6B; paragraphs [0010], [0031-0032], [0034]) of one of the aligned images divided by a number of the pixels in the one of the column or the row (the computing system using template region module 210 is adapted to sum the square of the brightness value of each of the pixels in each region of the template image and sets the sum as a brightness value for each region; for example, the template region module 210 divides the template image into a total of 64 regions by dividing the template image into 8 equal parts horizontally and vertically (creating columns and rows) as shown in FIG. 6B, and template region module 210 sums the square of the brightness value of each of the pixels belonging to each of the regions equally divided as described above and sets the sum as the brightness value for each region; figs. 6A-6B; paragraphs [0031-0032], [0035]; Equation 1).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify LIN to have wherein the setup image is a golden image or a reference image, and wherein the runtime image is an image generated during inspection; wherein the normalized cross-correlation score measures a similarity of two types of data as a function of a displacement of one relative to another; and wherein the image projection is a sum of grey level values along a column or row, as taught by the SONG reference. The suggestion/motivation for doing so would have been to provide the ability to normalize and cross-correlate the image values of two comparison images, one acting as a reference image, via module 210, which converts the template image into an image in units of regions; by such a method, the resolution of the template image may be lowered, but the speed of searching for a pattern of the template image within the target image would be improved, as suggested by SONG at paragraphs [0016], [0032], and [0037]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine SONG with LIN to obtain the invention as specified in claim 12.
VINCIGUERRA discloses wherein the image projection is a sum of grey level values along pixels in one of a column or row (a sum of the value of the grey levels of the columns and rows of the raw or filtered three-dimensional image, forming a one-dimensional signature sum value; paragraph [0034]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to further modify LIN to have wherein the image projection is a sum of grey level values along pixels in one of a column or row, as taught by the VINCIGUERRA reference. The suggestion/motivation for doing so would have been to provide, as argued by Applicant at the top of page 8 of REMARKS, a one-dimensional sum value of the grey values in a particular column or row, as suggested by VINCIGUERRA at paragraph [0034]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to further combine VINCIGUERRA with LIN in view of SONG to obtain the invention as specified in claim 12.
As per claim 13, LIN in view of SONG in further view of VINCIGUERRA discloses the non-transitory computer-readable storage medium of claim 12. Modified LIN further discloses wherein the steps further include determining offsets between the setup image and the runtime image for an inspection frame after aligning the image projections (wherein the method carried out by the computing system, which includes the processing components, performs a comparison between a first and a second image offset from one another, as seen in figures 10a and 10b; column 13, lines 55-67).
As per claim 14, LIN in view of SONG in further view of VINCIGUERRA discloses the non-transitory computer-readable storage medium of claim 13. Modified LIN further discloses wherein the steps further include determining offsets between a design and the runtime image for the inspection frame (wherein the method carried out by the computing system, which includes the processing components, performs a comparison between a first and second image offset from one another, as seen in figures 10a and 10b, for comparison (inspection), and can compare offsets of two images of the same structure; figs 10a-10q; column 13, lines 55-67; column 23, line 45-column 24, line 22).
As per claim 15, LIN in view of SONG in further view of VINCIGUERRA discloses the non-transitory computer-readable storage medium of claim 14. Modified LIN further discloses wherein the steps further include placing care areas on the runtime image based on an offset correction using the processor (diagnosing and classifying a detected defect/anomaly, using alignment files in automatic modes via an anomaly detecting and locating computer 30, as a repairable defect or not repairable (care area, which in the specification is defined as "areas identified which need offset correction"); an in-tolerance or out-of-tolerance algorithm is used to determine repair; column 13, lines 55-67; column 23, line 45-column 24, line 22).
Claims 5, 11, and 16 are rejected under 35 U.S.C. § 103 as being obvious over US 6,292,582 B1 to Y. LIN et al. (hereinafter "LIN") in view of US 2023/0005242 A1 to SONG et al. (hereinafter "SONG") in further view of US 2018/0197285 A1 to VINCIGUERRA et al. (hereinafter "VINCIGUERRA") in further view of US 2006/0038987 A1 to S. MAEDA et al. (hereinafter "MAEDA").
As per claim 5, LIN in view of SONG in further view of VINCIGUERRA discloses the method of claim 1. Modified LIN fails to disclose wherein aligning the image projections includes: determining projection peak locations for the polygons in the aligned images along the x direction; adjusting at least one of the runtime image or the setup image so the projection peak locations overlap along the x direction; determining projection peak locations for the polygons in the aligned images along the y direction; and adjusting at least one of the runtime image or the setup image so the projection peak locations overlap along the y direction.
MAEDA discloses wherein aligning the image projections includes: determining projection peak locations for the polygons in the aligned images along the x direction (determining the peak-to-peak value shift as it appears in the image in the x direction; figure 40; paragraphs [0210-0215]); adjusting at least one of the runtime image or the setup image so the projection peak locations overlap along the x direction (reducing (adjusting) the peak shift from ¼ to ½, which may also be adjusted to ⅛ in the opposite direction if desired, wherein the image shift is an adjustable value as desired in the x direction; figure 40; paragraphs [0210-0215]); determining projection peak locations for the polygons in the aligned images along the y direction (determining the peak-to-peak value shift as it appears in the image in the y direction; figure 40; paragraphs [0210-0215]); and adjusting at least one of the runtime image or the setup image so the projection peak locations overlap along the y direction (reducing (adjusting) the peak shift from ¼ to ½, which may also be adjusted to ⅛ in the opposite direction if desired, wherein the image shift is an adjustable value as desired in the y direction; figure 40; paragraphs [0210-0215]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to further modify LIN to have the ability to determine projection peak locations and adjust the runtime image to cause peak overlap in the x and y directions of the image, as taught by the MAEDA reference. The suggestion/motivation for doing so would have been that image matching precision is directly related to image pattern information, wherein as the pattern information decreases so does image matching precision, making it desirable to maximize the image pattern information, as suggested by MAEDA at paragraphs [0212-0213]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine MAEDA with LIN in view of SONG in further view of VINCIGUERRA to obtain the invention as specified in claim 5.
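The peak-alignment steps mapped above (locating the projection peak along each axis, then shifting one image so the peaks overlap) may be sketched as follows. The argmax peak criterion and the circular shift are simplifying assumptions made for illustration only; they are not MAEDA's disclosed implementation, and all names are hypothetical.

```python
import numpy as np

def projection_peak(img, axis):
    # Sum grey levels along the given axis and return the index
    # of the resulting one-dimensional projection's peak.
    return int(np.argmax(img.sum(axis=axis)))

def align_by_peaks(setup, runtime):
    # Shift the runtime image so its x and y projection peaks
    # overlap those of the setup image (circular shift for
    # simplicity; a real system would crop or pad instead).
    dx = projection_peak(setup, axis=0) - projection_peak(runtime, axis=0)
    dy = projection_peak(setup, axis=1) - projection_peak(runtime, axis=1)
    return np.roll(np.roll(runtime, dx, axis=1), dy, axis=0)
```

After alignment, the x- and y-projection peak locations of the shifted runtime image coincide with those of the setup image.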
As per claim 11, LIN in view of SONG in further view of VINCIGUERRA discloses the system of claim 6. Modified LIN fails to disclose wherein aligning the image projections includes: determining projection peak locations for the polygons in the aligned images along the x direction; adjusting at least one of the runtime image or the setup image so the projection peak locations overlap along the x direction; determining projection peak locations for the polygons in the aligned images along the y direction; and adjusting at least one of the runtime image or the setup image so the projection peak locations overlap along the y direction.
MAEDA discloses wherein aligning the image projections includes: determining projection peak locations for the polygons in the aligned images along the x direction (determining the peak-to-peak value shift as it appears in the image in the x direction; paragraphs [0210-0215]); adjusting at least one of the runtime image or the setup image so the projection peak locations overlap along the x direction (reducing (adjusting) the peak shift from ¼ to ½, which may also be adjusted to ⅛ in the opposite direction if desired, wherein the image shift is an adjustable value as desired in the x direction; paragraphs [0210-0215]); determining projection peak locations for the polygons in the aligned images along the y direction (determining the peak-to-peak value shift as it appears in the image in the y direction; paragraphs [0210-0215]); and adjusting at least one of the runtime image or the setup image so the projection peak locations overlap along the y direction (reducing (adjusting) the peak shift from ¼ to ½, which may also be adjusted to ⅛ in the opposite direction if desired, wherein the image shift is an adjustable value as desired in the y direction; paragraphs [0210-0215]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to further modify LIN to have the ability to determine projection peak locations and adjust the runtime image to cause peak overlap in the x and y directions of the image, as taught by the MAEDA reference. The suggestion/motivation for doing so would have been that image matching precision is directly related to image pattern information, wherein as the pattern information decreases so does image matching precision, making it desirable to maximize the image pattern information, as suggested by MAEDA at paragraphs [0212-0213]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine MAEDA with LIN in view of SONG in further view of VINCIGUERRA to obtain the invention as specified in claim 11.
As per claim 16, LIN in view of SONG in further view of VINCIGUERRA discloses the non-transitory computer-readable storage medium of claim 12. Modified LIN fails to disclose wherein the steps further include: determining projection peak locations for the polygons in the aligned images along the x direction; adjusting at least one of the runtime image or the setup image so the projection peak locations overlap along the x direction; determining projection peak locations for the polygons in the aligned images along the y direction; and adjusting at least one of the runtime image or the setup image so the projection peak locations overlap along the y direction.
MAEDA discloses wherein the steps further include: determining projection peak locations for the polygons in the aligned images along the x direction (determining the peak-to-peak value shift as it appears in the image in the x direction; paragraphs [0210-0215]); adjusting at least one of the runtime image or the setup image so the projection peak locations overlap along the x direction (reducing (adjusting) the peak shift from ¼ to ½, which may also be adjusted to ⅛ in the opposite direction if desired, wherein the image shift is an adjustable value as desired in the x direction; paragraphs [0210-0215]); determining projection peak locations for the polygons in the aligned images along the y direction (determining the peak-to-peak value shift as it appears in the image in the y direction; paragraphs [0210-0215]); and adjusting at least one of the runtime image or the setup image so the projection peak locations overlap along the y direction (reducing (adjusting) the peak shift from ¼ to ½, which may also be adjusted to ⅛ in the opposite direction if desired, wherein the image shift is an adjustable value as desired in the y direction; paragraphs [0210-0215]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to further modify LIN to have the ability to determine projection peak locations and adjust the runtime image to cause peak overlap in the x and y directions of the image, as taught by the MAEDA reference. The suggestion/motivation for doing so would have been that image matching precision is directly related to image pattern information, wherein as the pattern information decreases so does image matching precision, making it desirable to maximize the image pattern information, as suggested by MAEDA at paragraphs [0212-0213]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine MAEDA with LIN in view of SONG in further view of VINCIGUERRA to obtain the invention as specified in claim 16.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DEVIN JACOB DHOOGE whose telephone number is (571) 270-0999. The examiner can normally be reached 7:30-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Andrew Bee, can be reached at (571) 270-5183. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Devin Dhooge/
USPTO Patent Examiner
Art Unit 2677
/ANDREW W BEE/Supervisory Patent Examiner, Art Unit 2677