Prosecution Insights
Last updated: April 19, 2026
Application No. 18/962,859

IMAGE PROCESSING APPARATUS, INSPECTION APPARATUS, REVIEW APPARATUS, IMAGE PROCESSING METHOD, INSPECTION METHOD, AND REVIEW METHOD

Non-Final Office Action (§103)
Filed: Nov 27, 2024
Examiner: DAGNEW, MEKONNEN D
Art Unit: 2638
Tech Center: 2600 — Communications
Assignee: Lasertec Corporation
OA Round: 1 (Non-Final)
Grant Probability: 83% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 83%, above average (604 granted / 728 resolved; +21.0% vs Tech Center average)
Interview Lift: +15.8% on resolved cases with interview (strong)
Typical Timeline: 2y 6m average prosecution; 29 applications currently pending
Career History: 757 total applications across all art units

Statute-Specific Performance

§101: 4.5% (-35.5% vs TC avg)
§102: 21.5% (-18.5% vs TC avg)
§103: 63.7% (+23.7% vs TC avg)
§112: 6.3% (-33.7% vs TC avg)
Baseline: Tech Center average estimate. Based on career data from 728 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over ONO (US 20150091714 A1) in view of Yuan et al. (US 20120314971 A1).

As to Claim 1: ONO teaches an image processing apparatus comprising: an evaluation parameter acquisition unit (¶0039 and module 160) configured to acquire an evaluation parameter for evaluating the information about each pixel of the plurality of pixels included in an evaluation image (¶0039), based on comparison between a reference image and the evaluation image (¶0039; note that the evaluation value deriving module 160 extracts any one of the blocks from one of the images (reference image) to be used as a reference, and extracts multiple blocks from the other image (comparison image) to be used as a comparison target; the evaluation value deriving module 160 then derives the multiple evaluation values indicative of correlations between the reference block and the respective comparison blocks), the comparison being based on information about each pixel of a plurality of pixels included in the reference image and information about each pixel of a plurality of pixels included in the evaluation image corresponding to the reference image (¶0040; note that the pattern matching may be a comparison in luminance per block between the pair of images); and a statistical value acquisition unit configured to acquire a statistical value of the evaluation parameters for a plurality of predetermined specific pixels in the evaluation image (¶0045; note that the evaluation value deriving module 160 subtracts the above-described average value Ab from the luminance Eb(i,j) of the pixels 202 within the reference block 204 to derive an average value difference luminance EEb(i,j) as Equation 2: EEb(i,j) = Eb(i,j) - Ab).

Yuan is a similar or analogous system to the claimed invention, as evidenced by Yuan's teaching that the luminance modification parameters may be applied to the non-linear function to push the luminance values of the regions of the image to the luminance values associated with the respective optimal zones of the regions, and that additional techniques may be applied to the modified image in order to reduce artifacts such as the halo effect, overamplified dark or bright tones, and so on. This would have prompted a predictable variation of ONO by applying Yuan's known principle of a standardized evaluation parameter acquisition unit configured to acquire a standardized evaluation parameter for evaluating the information about each pixel of the plurality of pixels included in the evaluation image by standardizing the evaluation parameter based on the statistical value (¶¶0042-0046; note that the optimal zone assignment module 432 may generate or retrieve from the data store 416 histograms showing luminance values with respect to intensity for a first region of the input image 422 that is adjacent to a second region. The optimal zone assignment module 432 may determine a distance between the mean luminance value of the first region and the mean luminance value of the second region. In a particular example, the first region may be assigned to zone II and have a mean luminance value of 51, and the second region may be assigned to zone VII and have a mean luminance value of 179; in this example, the difference between the mean luminance values of the first region and the second region is 128); and an evaluation unit configured to evaluate the standardized evaluation parameter (¶¶0020, 0025; note that FIG. 2 illustrates an image that has been segmented into a number of regions, each region assigned a respective zone indicating the luminance of that region. In particular, FIG. 2 includes an input image 202 and a segmented image 204 that includes a number of regions 206-230. Each of the regions 206-230 has been assigned a particular zone of a plurality of zones, and each of the zones is associated with a range of luminance values on a scale of luminance values 232. The pixels of each respective region 206-230 have been grouped together because of similarities between the luminance values of the pixels. For example, pixels associated with the region 216 shown in the input image 202 have been assigned to zone VII because these pixels have luminance values in the range associated with zone VII in the scale of luminance values 232).

In view of motivations such as producing a modified version of the input image that improves the appearance of overexposed and/or underexposed regions of the input image, thereby further improving the exposure of an input image by automatically modifying a non-linear function that characterizes the luminance of shadow, mid-tone, and highlight portions of the image, as disclosed in ¶0004 of Yuan, one of ordinary skill in the art would have implemented the claimed variation of the prior art system of ONO. Therefore, the claimed invention would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.

As to Claim 2: the evaluation parameter acquisition unit acquires, as the evaluation parameter of each pixel, a difference value obtained by comparing the luminance of each pixel in the reference image with the luminance of each pixel in the evaluation image (Yuan ¶¶0042-0046).

As to Claim 3: ONO in view of Yuan further teaches that the specific pixels include at least one pixel among the pixels corresponding to pixels whose luminance in the reference image belongs to a specific luminance range and the pixels corresponding to pixels whose luminance in the evaluation image belongs to the specific luminance range (ONO ¶¶0016, 0066, 0067; note that if the average value of the luminance of the pixels within the predetermined area is low, the evaluation range setting module 162 sets the evaluation range smaller, and, on the other hand, if the average value is high, the evaluation range setting module 162 sets the evaluation range larger).

As to Claim 10: ONO in view of Yuan further teaches a processing unit configured to acquire a processed value through arithmetic processing based on the standardized evaluation parameter and the evaluation parameter, wherein the evaluation unit evaluates the processed value (Yuan ¶¶0058-0060; note that the process 600 includes identifying one or more regions of the image that are adjacent to the particular region of the image, and at 612 a relative contrast between the particular region and the one or more adjacent regions is determined; a distance between the mean luminance value of the particular region and the mean luminance value of each of the adjacent regions may be determined).

Claims 4-13 are rejected under 35 U.S.C. 103 as being unpatentable over ONO (US 20150091714 A1) in view of Yuan et al. (US 20120314971 A1), and further in view of KASAHARA (US 20170262974 A1).

As to Claim 4: KASAHARA is a similar or analogous system to the claimed invention, as evidenced by KASAHARA's teaching of an image processing apparatus 130 and an output device 140, where the camera 120 captures an image of a target 110 and the image processing apparatus 130 determines a state of the target 110 using the captured image. This would have prompted a predictable variation of ONO by applying KASAHARA's known principle of the statistical value acquisition unit acquiring a specific statistical value of the evaluation parameter for the specific pixels corresponding to the pixels belonging to the specific luminance range, and the standardized evaluation parameter acquisition unit acquiring the standardized evaluation parameter by standardizing the evaluation parameters of the specific pixels based on the specific statistical value (¶¶0046-0050; note that the EM algorithm that uses the mean pixel value and the pixel variance as parameters is performed at S402. In the learning algorithm of the present embodiment, a z-score is used as a first feature. The z-score is given by Equation (1); it gives the deviation value of the n-th image from the k-th model image at pixel (x,y), representing the distance from the mean value on the assumption that the probabilistic model follows a normal distribution. In the present embodiment, an example where the z-score is used as the first feature is described; however, the first feature is not limited to the z-score, but can be any feature that allows probabilistic model calculation using the pixel values).

In view of motivations such as the parameters being usable in inspection, recognition, determination, and the like of the target, thereby enabling efficient detection of an anomaly in an image even when a plurality of shapes are learned simultaneously, as disclosed in ¶0074 of KASAHARA, one of ordinary skill in the art would have implemented the claimed variation of the prior art system of ONO. Therefore, the claimed invention would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.

As to Claim 5: ONO in view of Yuan and KASAHARA further teaches that the statistical value acquisition unit divides the specific luminance range into a plurality of specific luminance range parts, divides the specific pixels into a plurality of specific pixel parts corresponding to the pixel parts belonging to the respective luminance range parts, and acquires, for each specific pixel part (ONO ¶¶0039-0040), a specific statistical value part that is a statistical value of the evaluation parameter for the specific pixel parts, and that the standardized evaluation parameter acquisition unit acquires the standardized evaluation parameter for the pixels corresponding to each specific pixel part by standardizing the evaluation parameter of each specific pixel part based on each specific statistical value part (KASAHARA ¶¶0036-0038).

As to Claim 6: ONO in view of Yuan and KASAHARA further teaches that at least one of the reference image and the evaluation image includes an image of a specimen having a pattern (ONO ¶¶0039-0040), and that the specific pixels include at least one pixel among the pixels corresponding to the pixels belonging to specific regions partitioned based on the pattern in the reference image and the pixels corresponding to the pixels belonging to the specific regions partitioned based on the pattern in the evaluation image (Yuan ¶¶0042-0046).

As to Claim 7: ONO in view of Yuan and KASAHARA further teaches that the statistical value acquisition unit acquires a specific statistical value of the evaluation parameter for the specific pixels corresponding to the pixels belonging to the specific regions (KASAHARA ¶¶0036-0038), and that the standardized evaluation parameter acquisition unit acquires the standardized evaluation parameter by standardizing the evaluation parameters of the specific pixels based on the specific statistical value (KASAHARA ¶¶0036-0038).

As to Claim 8: ONO in view of Yuan and KASAHARA further teaches that the statistical value acquisition unit divides the specific regions into a plurality of specific region parts, divides the specific pixels into a plurality of specific pixel parts corresponding to the pixel parts belonging to the respective specific region parts, and acquires, for each specific pixel part (KASAHARA ¶¶0036-0038; note that in a statistical process, pixel values of pixels (x,y) are acquired from the images; mean pixel values, each of which is a mean value of the pixel values at a pixel (x,y) acquired from the plurality of images, and variances at the pixels are obtained; the mean pixel value corresponds to the first parameter and the pixel variance corresponds to the second parameter; although the statistical process of the present invention is performed using the mean values and the variances in this example, any parameter that describes the probability distribution of pixel values may be used), a specific statistical value part that is a statistical value of the evaluation parameter for the specific pixel parts, and that the standardized evaluation parameter acquisition unit acquires the standardized evaluation parameter for the pixels corresponding to each specific pixel part by standardizing the evaluation parameter of each specific pixel part based on each specific statistical value part (KASAHARA ¶¶0063-0067).

As to Claim 9: ONO in view of Yuan and KASAHARA further teaches that the evaluation unit performs at least one of: evaluation of error existence for a subject specimen in the evaluation image based on comparison between the standardized evaluation parameter and a predetermined threshold value (KASAHARA ¶0062; note that Z_nk can be an index of the deviation distance from the mean. Accordingly, Z_nk has the characteristic that when the target has a defect at pixel (x,y), the value of S_n(x,y) becomes large, whereas when the pixel value is close to the mean value at pixel (x,y), the value of S_n(x,y) approaches zero. Accordingly, at S604, the clustering unit 135 clusters the values of S_n(x,y) to determine a defect of the target. The defect area determiner 136 may eliminate from the defect detection pixels lying in an area of a size equal to or smaller than a minimum size of defect that can occur in the target, in light of computational error or the like. After the determination at S604, processing proceeds to S605, where an inspection result is generated and processing ends); evaluation of, for the pixel, a correction aspect to be performed on the evaluation image based on the standardized evaluation parameter; and evaluation of error existence for a corrected image obtained by correcting the evaluation image based on the standardized evaluation parameter (KASAHARA ¶0033 and display unit 122).

As to Claim 11: ONO in view of Yuan and KASAHARA further teaches an inspection apparatus comprising the image processing apparatus according to claim 1, for a subject specimen in the evaluation image (KASAHARA ¶0057).

As to Claim 12: ONO in view of Yuan and KASAHARA further teaches a review apparatus comprising: the image processing apparatus according to claim 1; and a monitor for reviewing the reference image, the evaluation image, the comparison image, and the standardized image (ONO ¶¶0033, 0040 and display unit 122).

As to Claim 13: Claim 13 is the image processing method corresponding to claim 1 and is addressed in the discussion of claim 1.

As to Claim 14: ONO in view of Yuan and KASAHARA further teaches an inspection method comprising the image processing method according to claim 13, for inspecting a subject specimen in the evaluation image (KASAHARA ¶¶0052, 0060).

As to Claim 15: ONO in view of Yuan and KASAHARA further teaches a review method comprising: the image processing method according to claim 13; and reviewing the reference image, the evaluation image, the comparison image, and the standardized image with a monitor (ONO ¶¶0039-0040 and KASAHARA ¶¶0036-0038).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MEKONNEN D DAGNEW, whose telephone number is (571) 270-5092. The examiner can normally be reached 8:00 AM-5:00 PM, M-Th. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Lin Ye, can be reached at 571-272-7372. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR; status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MEKONNEN D DAGNEW/
Primary Examiner, Art Unit 2638
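The per-pixel standardization this rejection pieces together (ONO's mean-difference luminance of Equation 2, combined with a z-score in the style of KASAHARA's Equation 1) can be sketched in a few lines of Python. This is an illustrative reconstruction only, not code from any cited reference; the function name, the flattened 1-D pixel lists, and the 3-sigma defect threshold are assumptions made for the sketch.

```python
from statistics import mean, pstdev

def standardized_evaluation_parameter(reference, evaluation):
    """Standardize per-pixel differences between an evaluation image and
    a reference image (both flattened to 1-D lists of luminances).

    The raw evaluation parameter is a luminance difference, as in ONO's
    Equation 2 (EEb(i,j) = Eb(i,j) - Ab); standardizing it by its mean
    and standard deviation yields a z-score, as in KASAHARA's Equation 1.
    """
    diff = [e - r for r, e in zip(reference, evaluation)]  # evaluation parameter
    m, s = mean(diff), pstdev(diff)                        # statistical values
    if s == 0:  # flat difference image: nothing stands out
        return [0.0] * len(diff)
    return [(d - m) / s for d in diff]                     # standardized parameter

# A defective pixel shows up as a large |z|; flag it with a threshold.
ref = [100.0] * 16        # 4x4 reference image, flattened
ev = list(ref)
ev[10] = 160.0            # one anomalous pixel
z = standardized_evaluation_parameter(ref, ev)
defects = [i for i, v in enumerate(z) if abs(v) > 3.0]
print(defects)  # [10]
```

A pixel whose standardized difference exceeds the threshold is flagged as a candidate defect, mirroring the comparison against a predetermined threshold value recited for claim 9.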

Prosecution Timeline

Nov 27, 2024: Application Filed
Feb 21, 2026: Non-Final Rejection, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593143, "SOLID-STATE IMAGING DEVICE": granted Mar 31, 2026 (2y 5m to grant)
Patent 12586142, "IMAGE CAPTURING METHOD AND DISPLAY METHOD FOR RECOGNIZING A RELATIONSHIP AMONG A PLURALITY OF IMAGES DISPLAYED ON A DISPLAY SCREEN": granted Mar 24, 2026 (2y 5m to grant)
Patent 12585173, "LENS BARREL": granted Mar 24, 2026 (2y 5m to grant)
Patent 12581022, "DATA CREATION METHOD AND DATA CREATION PROGRAM": granted Mar 17, 2026 (2y 5m to grant)
Patent 12574662, "THRESHOLD VALUE DETERMINATION METHOD, THRESHOLD VALUE DETERMINATION PROGRAM, THRESHOLD VALUE DETERMINATION DEVICE, PHOTON NUMBER IDENTIFICATION SYSTEM, PHOTON NUMBER IDENTIFICATION METHOD, AND PHOTON NUMBER IDENTIFICATION PROCESSING PROGRAM": granted Mar 10, 2026 (2y 5m to grant)
Study what changed to get these applications past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 83%
With Interview: 99% (+15.8%)
Median Time to Grant: 2y 6m
PTA Risk: Low
Based on 728 resolved cases by this examiner. Grant probability derived from career allow rate.
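As a quick sanity check, the headline projections follow from simple arithmetic on the quoted career figures. The one assumption here (not stated on the page) is that the interview lift is applied as additive percentage points on top of the base allow rate.

```python
granted, resolved = 604, 728          # quoted career figures
allow_rate = 100 * granted / resolved
print(round(allow_rate))              # 83, the quoted grant probability

interview_lift = 15.8                 # percentage points, per the page
print(round(allow_rate + interview_lift))  # 99, the "With Interview" figure
```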
