Prosecution Insights
Last updated: April 19, 2026
Application No. 18/383,452

OPTICAL DETECTION DEVICE OF DETECTING WHETHER A TARGET OBJECT HAS DIFFERENT SURFACE TREATMENT FEATURES AND METHOD OF FORMING MARKERS

Non-Final OA: §102, §103
Filed
Oct 24, 2023
Examiner
AZIMA, SHAGHAYEGH
Art Unit
2671
Tech Center
2600 — Communications
Assignee
Pixart Imaging Inc.
OA Round
1 (Non-Final)
Grant Probability: 82% (Favorable)
OA Rounds: 1-2
To Grant: 2y 7m
With Interview: 93%

Examiner Intelligence

Career Allow Rate: 82% (grants above average; 286 granted / 350 resolved; +19.7% vs TC avg)
Interview Lift: +11.4% (moderate lift, comparing resolved cases with vs. without an interview)
Typical Timeline: 2y 7m avg prosecution; 36 applications currently pending
Career History: 386 total applications across all art units
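The headline 82% and +19.7% figures follow directly from the raw career counts above. A minimal sketch of that arithmetic, where the 62.0% Tech Center baseline is an assumption back-computed from the stated delta rather than a figure from the report:

```python
# Hedged sketch: deriving the headline examiner statistics from the raw
# career counts shown above. Variable names and the TC baseline are
# illustrative assumptions, not values taken from USPTO data directly.

granted = 286          # from "286 granted / 350 resolved"
resolved = 350
tc_avg_allow = 0.620   # assumed baseline implied by the "+19.7% vs TC avg" delta

career_allow_rate = granted / resolved
print(f"Career allow rate: {career_allow_rate:.1%}")                  # ~81.7%, shown as 82%
print(f"Delta vs TC avg:   {career_allow_rate - tc_avg_allow:+.1%}")  # +19.7%
```

The "interview lift" metric is computed analogously, as the difference between the allow rates of resolved cases with and without an examiner interview.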

Statute-Specific Performance

§101: 15.8% (-24.2% vs TC avg)
§103: 42.5% (+2.5% vs TC avg)
§102: 13.9% (-26.1% vs TC avg)
§112: 14.5% (-25.5% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 350 resolved cases

Office Action

§102 §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

This action is in response to the applicant's communication filed on 11/26/2025. In virtue of this communication, claims 1-16, as elected by the applicant in the filing of 11/26/2025, are currently pending in the instant application. Claims 17-18 are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to a nonelected embodiment, there being no allowable generic or linking claim.

Drawings

The drawings received on 10/24/2023 have been reviewed by the Examiner and are acceptable.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Friedhoff (US 2006/0177137).

As per claim 1: An optical detection device of detecting whether a target object has different surface treatment features, the optical detection device comprising: "an image sensor adapted to acquire a detection image containing the target object;" (Friedhoff, ¶[0020] discloses an imaging system that includes an image-capture device capable of capturing a multi-spectral image of a scene. Further, ¶[0022] discloses utilizing the imaging device for identifying objects and shadows. ¶[0042] discloses that a digital image of a scene provides a two-dimensional matrix of pixels, each having a brightness value that is a product of the illumination intensity corresponding to a portion of the scene imaged onto that pixel and the surface reflectance of that portion.) "and an operation processor electrically connected to the image sensor," (¶[0020] discloses an imaging system that includes an image-capture device capable of capturing a multi-spectral image of a scene and a processor programmed to detect edges in the image and analyze the changes in brightness values across such boundaries to differentiate illumination boundaries from reflection boundaries.) "and adapted to compute variation of an image shading parameter of the detection image and further to determine whether a boundary of the surface treatment features are detected by the image sensor in accordance with the variation of the image shading parameter." (Friedhoff, ¶[0010] discloses that a brightness boundary can be identified as a reflectance boundary if the brightness shifts across all of the spectral bands do not exhibit concordance, e.g., the brightness value measured in one or more of the bands does not increase or decrease along with the values measured in the other bands. ¶[0016] discloses distinguishing reflectance boundaries from illumination boundaries by identifying one or more brightness boundaries in the image, and identifying each brightness boundary as a reflection or an illumination boundary based on a comparison of a spectral signature related to brightness values. ¶[0017] discloses that the brightness boundary can then be identified as a reflectance or an illumination boundary based on correlation of changes of brightness values across the boundary in the wavelength bands.
For example, the boundary can be identified as a reflectance boundary when at least one of the wavelength bands exhibits a decrease, and others exhibit an increase, in brightness from a bright side, i.e., a side having a higher brightness, to the other side of the boundary. Alternatively, the boundary can be identified as an illumination boundary when the wavelength bands exhibit a decrease in brightness from a bright side of the boundary to the other. ¶[0018] discloses that for each of the wavelength bands and each of the brightness boundaries, a brightness difference across the boundary is calculated. ¶[0020] discloses an imaging system that includes an image-capture device capable of capturing a multi-spectral image of a scene and a processor programmed to detect edges in the image and analyze the changes in brightness values across such boundaries to differentiate illumination boundaries from reflection boundaries. Further see ¶[0048] and ¶[0051].)

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 2, 14, 15, and 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Friedhoff (US 2006/0177137) in view of Rueb (US 2022/0003536).

As per claim 2: The optical detection device of claim 1. However, Friedhoff is silent on the following, which would have been obvious in view of Rueb from a similar field of endeavor: "wherein the image shading parameter is an image contrast value or a speckle size of the detection image." (Rueb, Figure 4, ¶[0024] discloses a high-contrast speckle pattern that is primarily the result of fine, almost imperceivable scratches or micro-scratches on the surface of the aluminum. Further, Figure 5, ¶[0025] discloses a low-contrast speckle pattern. ¶[0030] discloses that the demarcation between two different speckle characteristics defines the actual boundary of the placed piece 26 on the work surface 24. The actual boundary is then compared to the required boundary visibly defined by the laser template during placement.) Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Rueb's technique of detecting different characteristics of a placed piece into Friedhoff's technique, to provide the known and expected uses and benefits of Rueb's technique over Friedhoff's technique of differentiating reflection and illumination of different surface boundaries. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement. Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Rueb into Friedhoff in order to improve identification of correct component placement in a workspace. (Refer to Rueb, paragraph [0002].)

As per claim 14: The optical detection device of claim 1. However, Friedhoff is silent on the following, which would have been obvious in view of Rueb from a similar field of endeavor: "wherein the optical detection device further comprises a laser light source adapted to project an illumination beam onto the target object." (Rueb, ¶[0008] discloses that a laser beam provides a very intense illumination of the piece or component and the work surface, overcoming any poor reflectivity of the surface. Thus, the resulting interference characteristic of the returned light, or speckle, is capable of distinguishing the microstructure of the surface of the inspected piece or component from the work surface. ¶[0024] discloses that the interaction of coherent light, in this example generated by a laser, with a surface of material may be examined through various modes.) Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Rueb's technique of detecting different characteristics of a placed piece into Friedhoff's technique, to provide the known and expected uses and benefits of Rueb's technique over Friedhoff's technique of differentiating reflection and illumination of different surface boundaries. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement.
Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Rueb into Friedhoff in order to improve identification of different components in a workspace. (Refer to Rueb, paragraph [0002].)

As per claim 15: The optical detection device of claim 14. Friedhoff as modified by Rueb further discloses "wherein a size of the surface treatment features is greater than or equal to an illumination range of the illumination beam projected onto the target object." (Rueb, ¶[0021] discloses a laser illuminating an area that is large relative to an area that is imaged by a camera. Thus, the camera images an area that is smaller than, for example, a diameter of a laser beam, resulting in an image that appears as a fine speckle pattern. In this arrangement, a transition of a laser beam from one surface to a new surface that includes a texture transition causes an abrupt change in the laser speckle pattern. A variable speckle pattern response is shown varying with surface changes as the scan moves from one surface to another surface.)

As per claim 16: The optical detection device of claim 1. However, Friedhoff is silent on the following, which would have been obvious in view of Rueb from a similar field of endeavor: "wherein the optical detection device is applied to detect whether the target object made by metal material has the different surface treatment features." (Rueb, ¶[0008] discloses that the resulting interference characteristic of the returned light, or speckle, is capable of distinguishing the microstructure of the surface of the inspected piece or component from the work surface. The process and system of the present invention may be used to identify alignment of any materials, whether visibly distinguishable or not. This includes placement of trusses and aligning nail plates, fabrics, metallic components, mating metallic structures, and any two components that may be assembled or mated in a manner that requires verification that the assembly was performed properly. Further see ¶[0021].) Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Rueb's technique of detecting different characteristics of a placed piece into Friedhoff's technique, to provide the known and expected uses and benefits of Rueb's technique over Friedhoff's technique of differentiating reflection and illumination of different surface boundaries. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement. Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Rueb into Friedhoff in order to improve identification of different components in a workspace. (Refer to Rueb, paragraph [0002].)

Claim(s) 3 is/are rejected under 35 U.S.C. 103 as being unpatentable over Friedhoff (US 2006/0177137) in view of Mcelvain et al. (US 2021/0360139).

As per claim 3: The optical detection device of claim 1. However, Friedhoff is silent on the following, which would have been obvious in view of Mcelvain from a similar field of endeavor: "wherein the operation processor computes the variation of the image shading parameter of the detection image when an auto exposure mode of the image sensor is actuated." (Mcelvain, ¶[0006] discloses an auto exposure method for a spatially-multiplexed-exposure (SME) high-dynamic-range (HDR) image sensor that includes (a) retrieving raw image data from an exposure of the SME HDR image sensor. ¶[0008] discloses an auto exposure method for an image sensor that includes (a) evaluating variance for each of a plurality of histograms of the pixel values from a respective plurality of individual exposures of the image sensor. ¶[0109] discloses that FIG. 11 illustrates one auto exposure method 1100, for a color SME HDR image sensor, that generates and considers an HDR histogram of luminance values. ¶[0112] discloses that in a step 1130, method 1100 performs steps 520 and 530 to generate an HDR histogram of luminance values based upon the exposure-time-specific luminance values obtained in step 1120. Further see ¶[0216], which discloses that step 4310 determines an optimal exposure time based at least in part upon minimization of entropy variance Var(S.sub.i), and step 4320 outputs the optimal exposure time to the single-exposure-time image sensor. For example, exposure time controller 4200 may, in step 4310, process raw image data 4280 from single-exposure-time image sensor 4210 and employ entropy variance optimizer 4250 to determine an optimal exposure time 4290.) Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Mcelvain's auto-exposure technique for an image sensor into Friedhoff's technique, to provide the known and expected uses and benefits of Mcelvain's technique over Friedhoff's technique of differentiating reflection and illumination of different surface boundaries. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement. Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Mcelvain into Friedhoff in order to optimize exposure time and eliminate edge artifacts and motion artifacts. (Refer to Mcelvain, paragraphs [0003]-[0005].)

Claim(s) 4 is/are rejected under 35 U.S.C. 103 as being unpatentable over Friedhoff (US 2006/0177137) in view of Hyland et al. (US 2020/0193212).

As per claim 4: The optical detection device of claim 1. However, Friedhoff is silent on the following, which would have been obvious in view of Hyland from a similar field of endeavor: "wherein the operation processor is adapted to further compare the image shading parameter with a predefined threshold, and determine the boundary is detected by the image sensor when the image shading parameter crosses the predefined threshold." (Hyland, ¶[0019] discloses measuring a range of intensity in the image and determining a threshold value with reference to a percentage of the range of intensity in the image. ¶[0057] discloses that an intensity-based method is applied to determine a first boundary; as already discussed, an intensity-based method is any method that identifies a boundary based on the intensity of the image, such as a thresholding method. ¶[0058] discloses that each pixel is categorized as 'particle' if 'above' the threshold, and 'not particle' if 'below' the threshold. The term 'above' may mean having a higher intensity value than a threshold intensity. The first boundary may be determined from the boundaries of the regions identified as 'particle' in the logical image. ¶[0087] discloses that the intensity-based method, which in this case is an adaptive thresholding method (as already described), correctly identifies the boundary 160 of the particle 140 by categorizing pixels in the image as particle 161 and not-particle 162 based on a threshold intensity.) Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Hyland's boundary identification technique into Friedhoff's technique, to provide the known and expected uses and benefits of Hyland's technique over Friedhoff's technique of differentiating reflection and illumination of different surface boundaries. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement. Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Hyland into Friedhoff in order to accurately identify edges of particles in an image. (Refer to Hyland, paragraph [0002].)

Claim(s) 5 is/are rejected under 35 U.S.C. 103 as being unpatentable over Friedhoff (US 2006/0177137) in view of Hyland et al. (US 2020/0193212), further in view of Guha et al. (US 6,750,466).

As per claim 5: The optical detection device of claim 4. However, Friedhoff as modified by Hyland does not explicitly disclose the following, which would have been obvious in view of Guha from a similar field of endeavor: "wherein the operation processor is adapted to further compute a period length of the image shading parameter crossing the predefined threshold, and utilize the period length to acquire a size of one of the surface treatment features." (Guha, Col. 3, lines 60-62 and 65-67, discloses that the outputs of the multi-group thresholders are video signals that include potential web flaw data; the prioritized signal from the multi-pipeline flaw detection pre-processor is sent to a run line encoder to determine the start and stop pixels for the detected web flaws. Further, Col. 8, lines 14-21, discloses that the group information from the multi-level thresholder 172 is sent to a run length encoder ("RLE") 184 to generate data regarding the location of the pixels that are on the leading and the following edge of a group. For example, a first group that exceeds a threshold, as determined by the multi-level thresholder 172, may be identified as located on a first line number along the machine direction, and starting at pixel 1000 and ending at pixel 1010 along the cross direction. Further, Col. 8, lines 57-60, discloses that 2D blob analysis data 192, including the bounding box data, the area, the length, the width, and the aspect ratio, is analyzed according to predetermined inspect/reject criteria 194 to determine whether each identified blob is a flaw or defect. Col. 9, lines 55-57, discloses that the multi-level thresholder utilizes up to 16 designation groups. Upon completion of the inspect/reject criteria analysis 230, the flaw classifications and statistics, flaw dimensions, and image data 232 are transmitted to the host computer.) Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Guha's technique of web material inspection in a machine into Friedhoff's technique as modified by Hyland, to provide the known and expected uses and benefits of Guha's technique over the technique of Friedhoff as modified by Hyland for differentiating reflection and illumination of different surface boundaries.
The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement. Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Guha into Friedhoff as modified by Hyland in order to accurately identify web material flaws and defects in a system. (Refer to Guha, Col. 1, Background, lines 11-19.)

Claim(s) 6 and 7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Friedhoff (US 2006/0177137) in view of Hyland et al. (US 2020/0193212), further in view of Palmen et al. (US 2017/0060494).

As per claim 6: The optical detection device of claim 4. However, Friedhoff as modified by Hyland does not explicitly disclose the following, which would have been obvious in view of Palmen from a similar field of endeavor: "wherein the operation processor is adapted to further compute a number of times that the image shading parameter crosses the predefined threshold, and utilize the number of times to acquire a number of the boundary detected by the image sensor." (Palmen, ¶[0019]; further, ¶[0105] discloses that edge determination can then be done by counting the number of crossings at the Global Threshold, confirming whether the count conforms to or is considered non-conforming to a known bar code symbology. ¶[0108] discloses that each transition from a bar to a space, or a space to a bar, treating the quiet zones as spaces, is an "edge" whose contrast is determined as the difference between the peak values of space reflectance and bar reflectance in that space and that bar. Each edge in the scan profile may be measured.) Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Palmen's printing-process inspection technique into Friedhoff's technique as modified by Hyland, to provide the known and expected uses and benefits of Palmen's technique over the technique of Friedhoff as modified by Hyland for differentiating reflection and illumination of different surface boundaries. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement. Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Palmen into Friedhoff as modified by Hyland in order to provide faster printing and scanning with higher quality. (Refer to Palmen, ¶[0016].)

As per claim 7: The optical detection device of claim 6. Friedhoff as modified by Hyland and Palmen discloses "wherein the operation processor is adapted to further compute a period length of the image shading parameter crossing the predefined threshold, and utilize the period length and the number of times to acquire barcodes or encode data represented by arrangement of the surface treatment features." (Palmen, ¶[0019]; further, ¶[0105] discloses that edge determination can then be done by counting the number of crossings at the Global Threshold, confirming whether the count conforms to or is considered non-conforming to a known bar code symbology. ¶[0111] discloses that a bar code will pass on Decode when the established bar and space widths can be converted into the correct series of valid characters using the standard Reference Decode algorithm. ¶[0112] discloses that each symbology has published dimensional relationships for element widths, and its decode algorithm provides margins or tolerances for errors in the printing and reading process. Decodability measures the amount of margin left for the reading process after printing the bar code, in the widths of elements or element combinations that are measured by the symbology decode algorithm.)

Claim(s) 8 is/are rejected under 35 U.S.C. 103 as being unpatentable over Friedhoff (US 2006/0177137) in view of Huang et al. (US 2013/0215257).

As per claim 8: The optical detection device of claim 1. However, Friedhoff does not explicitly disclose the following, which would have been obvious in view of Huang from a similar field of endeavor: "wherein the operation processor is adapted to further detect a structural feature formed on the target object for computing relative position change of the image sensor and the target object." (Huang, ¶[0034] discloses illuminating a work surface S at a first brightness value and a second brightness value. The image sensor 13 receives reflected light from the work surface S and outputs first image frames I.sub.1 corresponding to the first brightness value and second image frames I.sub.2 corresponding to the second brightness value.
The processing unit 15 calculates a differential image (I.sub.1-I.sub.2) of the first image frames I.sub.1 and the second image frames I.sub.2 temporally adjacent to each other, calculates a displacement according to the differential image (I.sub.1-I.sub.2), and identifies an operating state according to an average intensity (B.sub.1-B.sub.2) of the differential image (I.sub.1-I.sub.2), wherein the average intensity (or the average intensity difference) may be calculated by: (1) calculating a first intensity B.sub.1 of the first image frame I.sub.1 and a second intensity B.sub.2 of the second image frame I.sub.2 first, and then calculating a difference (or an average intensity difference) of the first intensity B.sub.1 and the second intensity B.sub.2; or (2) calculating the differential image (I.sub.1-I.sub.2) first, and then directly calculating the average intensity (B.sub.1-B.sub.2) of the differential image (I.sub.1-I.sub.2).) Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Huang's optical navigation technique into Friedhoff's technique, to provide the known and expected uses and benefits of Huang's technique over Friedhoff's technique of differentiating reflection and illumination of different surface boundaries. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement. Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Huang into Friedhoff in order to accurately detect displacement position on the workspace. (Refer to Huang, paragraph [0006].)

Claim(s) 9 is/are rejected under 35 U.S.C. 103 as being unpatentable over Friedhoff (US 2006/0177137) in view of Huang et al. (US 2013/0215257), further in view of Cahill et al. (US 2006/0054608).

As per claim 9: The optical detection device of claim 8. However, Friedhoff as modified by Huang does not explicitly disclose the following, which would have been obvious in view of Cahill from a similar field of endeavor: "wherein the operation processor is adapted to further analyze and acquire an interval between plural boundaries of the surface treatment features, and utilize the interval to calibrate the relative position change." (Cahill, ¶[0122] discloses calibrating the laser marking field by marking a grid on a test mirror and measuring the grid. ¶[0130] discloses that the calibration target may be within the "tool area" as shown; a correction for offset, scale, and rotation is applied. Further see ¶[0134] and ¶[0147]. ¶[0210] discloses that the images are superimposed and, using pattern-matching software, a correction offset, angle, and scale is calculated to align the bottom camera's image to the top camera. FIG. 17A illustrates a calibration target, the image of which is to vary with offset, scale, and rotation.) Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Cahill's technique of calibrating a laser marking system into Friedhoff's technique as modified by Huang, to provide the known and expected uses and benefits of Cahill's technique over the technique of Friedhoff as modified by Huang for differentiating reflection and illumination of different surface boundaries. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement. Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Cahill into Friedhoff as modified by Huang in order to accurately position a laser beam in a desired position. (Refer to Cahill, paragraph [0008].)

Claim(s) 10 and 11 is/are rejected under 35 U.S.C. 103 as being unpatentable over Friedhoff (US 2006/0177137) in view of Miller et al. (US 2025/0052979).

As per claim 10: The optical detection device of claim 1. However, Friedhoff does not explicitly disclose the following, which would have been obvious in view of Miller from a similar field of endeavor: "wherein the operation processor is adapted to further compute a difference between a first pixel having maximum pixel intensity and a second pixel having minimum pixel intensity of the detection image for being the image shading parameter." (Miller, ¶[0087] discloses that the images are analyzed by determining a maximum and minimum pixel intensity in each image, and then calculating the difference between the maximum and minimum pixel intensity to yield a contrast metric. Among a set of images, the image with the highest contrast metric (i.e., the largest difference between maximum and minimum pixel intensities) may be determined as the focus part.) Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Miller's surface-sensing technique into Friedhoff's technique, to provide the known and expected uses and benefits of Miller's technique over Friedhoff's technique of differentiating reflection and illumination of different surface boundaries. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement. Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Miller into Friedhoff in order to accurately identify positioning of an object pattern. (Refer to Miller, paragraph [0006].)
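The contrast metric cited from Miller against claim 10 reduces to a one-line computation over pixel intensities. A minimal sketch under that reading; the function name and sample image are illustrative assumptions, not material from the application or the reference:

```python
# Sketch of the cited contrast metric:
# image shading parameter = (max pixel intensity) - (min pixel intensity).

def contrast_metric(image):
    """Return max-minus-min pixel intensity for a 2-D grid of intensities."""
    pixels = [p for row in image for p in row]
    return max(pixels) - min(pixels)

detection_image = [
    [12, 14, 15],
    [13, 98, 16],   # a bright speckle against a dark background
    [11, 15, 14],
]
print(contrast_metric(detection_image))  # 98 - 11 = 87
```

Under Miller's approach, the image with the highest such metric among a set is taken as the in-focus one; claim 11 varies this by averaging intensities over pixel groups rather than using single extreme pixels.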
As per claim 11, The optical detection device of claim 1, However Friedhoff does not explicitly disclose the following which would have been obvious in view of Miller from similar filed of endeavor “wherein the operation processor is adapted to further divide all pixels of the detection image into a first pixel group and a second pixel group in accordance with each pixel intensity of theforesaid all pixels, and compute a difference between first average intensity of the first pixel group and second average intensity of the second pixel group for being the image shading parameter.” (Miller, ¶[0087] discloses the images are analyzed by determining a maximum and minimum pixel intensity in each image, and then calculating the difference between the maximum and minimum pixel intensity to yield a contrast metric. Among a set of images, the images with the highest contrast metric (i.e., the largest difference between maximum and minimum pixel intensities) may be determined as focus part. The maximum and/or minimum pixel intensities may be determined from a single pixel intensity, or may be averaged over a group of pixels.) Before the effective filing date of the claimed invention it would have been obvious to a person of ordinary skill in the art to combine Miller technique of surface sensing into Friedhoff technique to provide the known and expected uses and benefits of Miller technique over differentiating of reflection and illumination of different surface boundary technique of Friedhoff. The proposed combination would have constituted a mere arrangement of old elements with each performing their known function, the combination yielding no more than one would expect from such an arrangement. Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Miller to Friedhoff in order to accurately identify positioning of an object pattern. (Refer to Miller paragraph [0006].) Claim(s) 13 is/are rejected under 35 U.S.C. 
103 as being unpatentable over Friedhoff (US 2006/0177137), in view of Menon et al. (US 2019/0228507). As per claim 13, the optical detection device of claim 1: Friedhoff does not explicitly disclose the following, which would have been obvious in view of Menon, from a similar field of endeavor: "wherein the image sensor excludes an optical lens." (Menon, ¶[0004] discloses that the light can be exposed onto the image sensor without passing the light through an image modification element; for example, the light may not be passed through a lens. The image sensor can typically be a complementary metal-oxide-semiconductor (CMOS) sensor. ¶[0029] discloses a lensless imaging device that does not use a lens or other image modification element to focus the light exposed onto the imaging sensor. ¶[0030] discloses that the image sensor is exposed to light without passing the light through an image modification element; the excluded image modification element may be a lens, an aperture, a diffraction grating, a mask, a filter, or another element designed to filter, focus, or otherwise adjust the focal point of light on the image sensor.) Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Menon's lensless-imaging technique with Friedhoff's technique of differentiating reflection and illumination at surface boundaries, to provide the known and expected uses and benefits of Menon's technique. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, and yielding no more than one would expect from such an arrangement. Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Menon into Friedhoff in order to reduce the expense and complexity of imaging systems. (Refer to Menon paragraph [0003].)
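Claim 11's variant of the shading parameter splits all pixels into two groups by intensity and takes the difference of the group averages. A minimal Python sketch follows; the mean-intensity threshold is an assumed split rule, since the claim only requires that the grouping be done "in accordance with each pixel intensity" and does not fix how:

```python
def shading_parameter(pixels):
    """Difference between the average intensities of the bright and dark
    pixel groups of a 2-D image (list of rows of integer intensities).

    Pixels at or below the image's mean intensity form the dark group and
    the rest form the bright group -- an assumed choice of split rule,
    not one dictated by the claim language."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    dark = [p for p in flat if p <= mean]
    bright = [p for p in flat if p > mean]
    if not bright:  # uniform image: every pixel falls in one group
        return 0.0
    return sum(bright) / len(bright) - sum(dark) / len(dark)
```

On a half-dark, half-bright test image (intensities 0 and 200) this yields 200.0, while a uniform image yields 0.0, mirroring the max-minus-min metric of claim 10 but with per-group averaging as Miller ¶[0087] also contemplates.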
Allowable Subject Matter
Claim 12 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims, subject to resolution of the rejections and objections set forth in this action. The following is a statement of reasons for the indication of allowable subject matter: the prior art of record, alone or in combination, fails to teach or suggest the limitations set forth in claim 12.
Contact
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHAGHAYEGH AZIMA, whose telephone number is (571) 272-1459. The examiner can normally be reached Monday-Friday, 9:30-6:30. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Vincent Rudolph, can be reached at (571) 272-8243. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center; unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.
/SHAGHAYEGH AZIMA/
Examiner, Art Unit 2671

Prosecution Timeline

Oct 24, 2023
Application Filed
Jan 23, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586350
DETERMINING AUDIO AND VIDEO REPRESENTATIONS USING SELF-SUPERVISED LEARNING
2y 5m to grant · Granted Mar 24, 2026
Patent 12573209
ROBUST INTERSECTION RIGHT-OF-WAY DETECTION USING ADDITIONAL FRAMES OF REFERENCE
2y 5m to grant · Granted Mar 10, 2026
Patent 12561989
VEHICLE LOCALIZATION BASED ON LANE TEMPLATES
2y 5m to grant · Granted Feb 24, 2026
Patent 12530867
Action Recognition System
2y 5m to grant · Granted Jan 20, 2026
Patent 12525049
PERSON RE-IDENTIFICATION METHOD, COMPUTER-READABLE STORAGE MEDIUM, AND TERMINAL DEVICE
2y 5m to grant · Granted Jan 13, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
82%
Grant Probability
93%
With Interview (+11.4%)
2y 7m
Median Time to Grant
Low
PTA Risk
Based on 350 resolved cases by this examiner. Grant probability derived from career allow rate.
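The projection figures above can be reproduced from the examiner's career data (286 granted of 350 resolved, +11.4% interview lift). The additive combination of base rate and lift is an assumption consistent with the numbers the dashboard displays, not a documented formula:

```python
def allow_rate_pct(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100 * granted / resolved


def with_interview(base_pct: float, lift_pct: float) -> float:
    """Grant probability with an interview, modeled here as a simple
    additive lift on the base rate (an assumed model)."""
    return base_pct + lift_pct


base = allow_rate_pct(286, 350)              # ~81.7, displayed as 82%
boosted = with_interview(round(base), 11.4)  # ~93.4, displayed as 93%
```

Rounding 286/350 gives the 82% grant probability shown, and adding the 11.4-point interview lift gives the 93% "With Interview" figure.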
