Prosecution Insights
Last updated: April 19, 2026
Application No. 17/766,338

MOTION IN IMAGES USED IN A VISUAL INSPECTION PROCESS

Status: Non-Final OA (§103)
Filed: Apr 04, 2022
Examiner: PATEL, JAYESH A
Art Unit: 2677
Tech Center: 2600 — Communications
Assignee: Siemens Aktiengesellschaft
OA Round: 5 (Non-Final)
Grant Probability: 83% (Favorable)
OA Rounds: 5-6
To Grant: 3y 0m
With Interview: 88%

Examiner Intelligence

Career Allow Rate: 83% (above average; 739 granted / 887 resolved; +21.3% vs TC avg)
Interview Lift: +5.2% (moderate +5% lift; resolved cases with interview, with vs. without)
Typical Timeline: 3y 0m avg prosecution; 33 currently pending
Career History: 920 total applications across all art units
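The headline allow rate above is a simple ratio of the card's own counts. A minimal sketch, assuming only the figures shown (739 granted out of 887 resolved), reproduces it:

```python
# Career allow rate from the resolved-case counts shown above.
granted = 739
resolved = 887

allow_rate = granted / resolved * 100
print(f"Career allow rate: {allow_rate:.1f}%")  # 83.3%, displayed rounded as 83%
```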

Statute-Specific Performance

§101: 11.1% (-28.9% vs TC avg)
§103: 40.9% (+0.9% vs TC avg)
§102: 14.5% (-25.5% vs TC avg)
§112: 25.0% (-15.0% vs TC avg)
Tech Center averages are estimates • Based on career data from 887 resolved cases
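Each row pairs the examiner's rate with a delta against the Tech Center average, so the implied baseline can be recovered by subtraction. A sketch using only the figures in the table above (it shows that all four rows imply the same ~40% baseline estimate):

```python
# Recover the implied Tech Center baseline for each statute:
# examiner rate - delta = baseline.
examiner = {"101": 11.1, "103": 40.9, "102": 14.5, "112": 25.0}
delta    = {"101": -28.9, "103": 0.9, "102": -25.5, "112": -15.0}

baseline = {s: round(examiner[s] - delta[s], 1) for s in examiner}
print(baseline)  # {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}
```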

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 02/27/2026 has been entered.

Response to Arguments

Applicant's arguments filed on 02/09/2026 have been fully considered, but they are not persuasive. Applicant argues at page 8 of the remarks: [Independent claim 1 has been amended to recite, inter alia, the limitation "analyze a distribution of the presence of motion within the image to obtain a determination of whether the motion originated from the camera or from the item on the inspection line, the processor obtaining the determination of the motion originating from the camera when the distribution is within a majority of the image". Claim 25 has been correspondingly amended. Support for these amendments may be found, for example, at paragraph [0055] of U.S. Pub. No. 2023/0138331, i.e., the published application. No new matter has been added. The combination of cited references fails to teach or suggest at least this limitation.] The examiner respectfully disagrees. Tzur discloses the processor obtaining the determination of the motion originating from the camera when the distribution is within a majority of the image (Fig. 11; para. 0073 discloses "if the image data acquisition device includes a motion detector 111, the existence or absence of motion of the device or objects within the image may be utilized by the pruning function 109. Motion is typically detected in digital cameras between preview images in order to then compensate for it, or as part of a compression algorithm (i.e., the processor executing the algorithm and obtaining the determination) for the resulting image data, or possibly other purposes. If the user is shaking the camera while an image is being captured, motion of the entire image is detected (i.e., motion in the majority of the image is detected) from one preview image to the next (i.e., distribution)", meeting the amended limitations of claims 1 and 25). Therefore, the reference (Tzur) used in the last Office action still reads on the amended limitations of claims 1 and 25. The rejections are presented below.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 4, 11 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over NPL1 (Restoration of blurred images for surface roughness evaluation using machine vision, B.
Dhanasekar et al., ELSEVIER, 2010, Pages 268-276), hereafter NPL1, in view of Tzur et al. (US2009016964), hereafter Tzur.

1. Regarding claim 1, NPL1 discloses a visual inspection system (Fig. 3 and pages 270-271 show and disclose a visual inspection system) comprising a processor which applies an image processing algorithm to an image of an item on an inspection line (Fig. 3 shows a computer with a display and keyboard, which obviously has a processor, and pages 270-271, sections 6-7, disclose "application of the Richardson-Lucy technique" (i.e., the algorithm) to process images ("an image of 200x200 pixels" is processed) of an item (i.e., specimen 2) being inspected on the inspection line, meeting the above claim limitations), the processor being configured to: receive an image of the item on the inspection line, captured by a camera (Fig. 3 shows the images captured by a camera 3 of the specimen 2 (i.e., the item) on the inspection line, received by the computer for processing, meeting the above claim limitations; see pages 270-271, sections 6-7, for the above explanation); detect blurring in the image, said blurring indicating a presence of motion when the image was captured by the camera (page 271, section 7, discloses that only the 200x200 pixels were chosen and that the captured images (i.e., the image) are blurred due to the uniform motion of the linear stage as shown in Fig. 3, and Figs. 5(b) and 7(b) show the "blurred image", meeting the limitations of detecting blurring in the image, said blurring indicating a presence of motion when the image of the specimen 2 (i.e., the item) was captured by the camera; page 271, section 7, and Fig. 3 show and disclose "the captured images are blurred due to uniform motion of the linear stage" on which the specimen 2 (i.e., the item) is placed for evaluation, i.e., obtaining the determination of the motion); and control a display of a user interface device based on the determination (Fig. 3 shows the control of a display of the user interface device (i.e., a computer) used in the specimen vision, and Figs. 5 and 7 show the display of the original, blurred, and restored image output, i.e., the control of the display based on the determination on the computer shown in Fig. 3 (a user interface device), meeting the above claim limitations of controlling a display; the examiner notes that the specifics of "control" are not required by the current claim).

As seen above, NPL1 discloses determination of the motion when the image was captured and restoring the image. NPL1 also discloses in section 7 that "the captured images are blurred due to uniform motion of the linear stage as shown in fig 3", which would obviously mean that the motion originated from the item on the inspection line. NPL1, however, fails to disclose analyzing a distribution of the presence of motion, determining whether the motion originated from the camera or from the item on the inspection line, and the processor obtaining the determination of the motion originating from the camera when the distribution is within a majority of the image.

Tzur discloses analyzing a distribution of the presence of motion within the image and whether the motion originated from the camera or from the object within the image (Fig. 11; para. 0073 discloses "if the image data acquisition device includes a motion detector 111, the existence or absence of motion of the device or objects within the image may be utilized by the pruning function 109. Motion is typically detected in digital cameras between preview images in order to then compensate for it, or as part of a compression algorithm for the resulting image data, or possibly other purposes. If the user is shaking the camera while an image is being captured, motion of the entire image is detected from one preview image to the next (i.e., distribution). But motion may also be detected in individual portions or windows of an image (i.e., distribution), which then detects motion of one or more objects within the scene" (i.e., the item on the inspection line), meeting the above claim limitations), and the processor obtaining the determination of the motion originating from the camera when the distribution is within a majority of the image (Fig. 11, para. 0073, as quoted above: motion of the entire image (i.e., motion in the majority of the image) is detected from one preview image to the next (i.e., distribution) as part of a compression algorithm, i.e., the processor executing the algorithm and obtaining the determination).

Before the effective filing date of the invention, NPL1 and Tzur were combinable because they are from the same field of endeavor and analogous art of image processing. The suggestion/motivation would be an efficient, reduced-hardware (i.e., lower cost) and faster-processing (i.e., processing in a short amount of time) device/system (para 0008). Therefore, it would have been obvious to one of ordinary skill in the art to have recognized the advantages of Tzur in the system of NPL1 to obtain the invention as specified in claim 1.

2. Regarding claim 2, NPL1 and Tzur disclose the system of claim 1. NPL1 shows and discloses wherein the camera is mounted on the inspection line (Fig. 3 shows the camera 3 mounted on the inspection line).

3. Regarding claim 4, NPL1 and Tzur disclose the system of claim 1.
Tzur further discloses wherein the processor is further configured to receive input from a motion detector attached to the camera, and wherein the determination of the origin of motion is obtained based on the input from the motion detector (Figs. 1 and 11 and paras 0008 and 0073 show the device with the processor in the camera 11 and the motion detector 111, and disclose receiving input from a motion detector attached to the camera, the determination of the origin of motion being obtained based on the input from the motion detector, meeting the claim limitations).

4. Regarding claim 11, NPL1 and Tzur disclose the system of claim 1. NPL1 further discloses being configured to control the image processing algorithm when the blurring in the image is detected (page 271, section 7, discloses applying/performing/controlling the image processing algorithm (i.e., the RL algorithm) when the blurring is detected and producing the restored image, see Figs. 5 and 7, meeting the above claim limitations; the examiner notes that the specifics of "control" are not required by the current claim).

5. Regarding claim 14, NPL1 and Tzur disclose the system of claim 1. NPL1 further discloses wherein the processor is further configured to detect the blurring in the image by applying an image processing algorithm (Figs. 3-7 and page 271 show and disclose wherein the processor is further configured to detect the blurring in the image (i.e., the blurred image has a PSNR of 21.032 and 21.132 in Figs. 5 and 7) by applying an image processing algorithm and producing the restored image after the 100th iteration, meeting the claim limitations).

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over NPL1 in view of Tzur, and further in view of Nachman (US20150170367), hereafter Nachman.

6. Regarding claim 5, NPL1 and Tzur disclose the system of claim 4. Tzur shows and discloses a motion detector 111 in Fig. 11. NPL1 and Tzur are, however, silent and fail to disclose wherein the motion detector comprises at least one of a gyroscope and an accelerator. Nachman further discloses wherein the motion detector comprises at least one of a gyroscope and an accelerator (paras 0026 and 0037-0038 and Fig. 1, element 165, show and disclose this limitation, meeting the claim limitations). Before the effective filing date of the invention, NPL1, Tzur and Nachman were combinable because they are from the same field of endeavor and analogous art of image processing. The suggestion/motivation would be an accurate device/system (para 0052). Therefore, it would have been obvious to one of ordinary skill in the art to have recognized the advantages of Nachman in the system of NPL1 and Tzur to obtain the invention as specified in claim 5.

Claims 6 and 8 are rejected under 35 U.S.C. 103 as being unpatentable over NPL1 in view of Tzur, and further in view of Sbihli et al. (US20140182373), hereafter Sbihli.

7. Regarding claim 6, NPL1 and Tzur disclose the system of claim 1. NPL1 discloses the processor and the display, as seen in Fig. 3. Tzur discloses whether the origin of the motion is from the camera or the object (i.e., the item on the inspection line) (Fig. 11 and para 0073). NPL1 and Tzur are, however, silent and fail to disclose further wherein the processor is further configured to cause a notification to be displayed on the display of the user interface device, the notification indicating the origin of the motion. Sbihli discloses wherein the processor is configured to cause a notification to be displayed on the display of the user interface device (see para [0017]; "FIG. 8 is a front view of an embodiment of a display screen of the distributed NDT system of FIG. 1 with motion feedback"), the notification indicating the origin of the motion (see para [0064]; "The distributed NDT system 10 may utilize the motion data to provide feedback to the probe operator 253. The feedback may notify the probe operator 253 of spatial factors that affect the sensor data 174, such as the speed, the position, the angle 264, the orientation of the NDT probe 250 relative to the workpiece 256, and the spacing 266 of the NDT probe 250 from the workpiece 256. Using the feedback, the probe operator 253 may adjust the NDT probe 250 on subsequent inspections to produce desirable sensor data 174 results, such as sensor data 174 that is obtained while the spatial factors are within one or more reference ranges. The feedback may enable the probe operator 253 to improve the quality and consistency of the sensor data 174 by adjusting the position, movement, and/or orientation of the NDT probe 250"). Before the effective filing date of the invention, NPL1, Tzur and Sbihli were combinable because they are from the same field of endeavor and are analogous art of image processing. The suggestion/motivation would be to provide feedback to the probe operator to adjust the motion, and to enable filtering out some sensor data from inclusion in a recorded data set (see para [0062]). Therefore, it would have been obvious to one of ordinary skill in the art to have recognized the advantages of Sbihli in the system of NPL1 and Tzur to obtain the invention as specified in claim 6.

8. Regarding claim 8, NPL1 and Tzur disclose the system of claim 1. NPL1 discloses the processor and the display, as seen in Fig. 3. Tzur discloses whether the origin of the motion is from the camera or the object (i.e., the item on the inspection line) (Fig. 11 and para 0073).
NPL1 and Tzur are, however, silent and fail to disclose further wherein the processor is further configured to cause a notification to be displayed on the display of the user interface device, the notification indicating an action to be completed by a user to reduce the motion. Sbihli further discloses wherein the processor is configured to cause a notification to be displayed on the display of the user interface device, the notification indicating an action to be done by a user to reduce the motion (see Figs. 7-8 and para [0064], quoted above for claim 6: the feedback may enable the probe operator 253 to improve the quality and consistency of the sensor data 174 by adjusting the position, movement, and/or orientation of the NDT probe 250, i.e., the user reducing the motion or movement of the sensor). Before the effective filing date of the invention, NPL1, Tzur and Sbihli were combinable because they are from the same field of endeavor and are analogous art of image processing. The suggestion/motivation would be to provide feedback to the probe operator to adjust the motion, and to enable filtering out some sensor data from inclusion in a recorded data set (see para [0062]). Therefore, it would have been obvious to one of ordinary skill in the art to have recognized the advantages of Sbihli in the system of NPL1 and Tzur to obtain the invention as specified in claim 8.

Claims 7 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over NPL1 in view of Tzur, and further in view of Kiuchi et al. (US 20110050889 A1), hereafter Kiuchi.

9. Regarding claim 7, NPL1 and Tzur disclose the system of claim 1. NPL1 discloses the processor and the display, as seen in Fig. 3. NPL1 and Tzur are, however, silent and fail to disclose further wherein the processor is further configured to cause a notification to be displayed on the display of the user interface device, the notification indicating the item was not inspected. Kiuchi further discloses wherein the processor is configured to cause a notification to be displayed on the display of the user interface device, the notification indicating the item was not inspected (see para [0119]; "In the flow analysis result area 410, a support message such as a hint and a word of caution is displayed such that the user who has little expert knowledge can appropriately assemble or adjust the flows. In the flow analysis result area 410, the user is notified of a warning message in a situation in which the parallel processing is not efficiently performed"; see also para [0128]; "when determining that the resources are not efficiently utilized, the image processing apparatus 100 notifies the user of contents that the resources are not efficiently utilized in the flow analysis result area 410 of the user interface screen 400... At this point, a message that 'processing is not efficiently allocated because of large difference in processing time between sections' is displayed in the flow analysis result area 410. A highlight 405 of the processing item of the cause is displayed when the user selects the message with a cursor CRS or the like. That is, the blocks of the item of 'labeling' in the inspection flow area 404 and the item of 'labeling' in the timing chart area 420 are highlighted"). Before the effective filing date of the invention, NPL1, Tzur and Kiuchi were combinable because they are from the same field of endeavor and are analogous art of image processing. The suggestion/motivation would be a fastest-speed processing system (para 0082). Therefore, it would have been obvious to one of ordinary skill in the art to have recognized the advantages of Kiuchi in the system of NPL1 and Tzur to obtain the invention as specified in claim 7.

10. Regarding claim 10, NPL1 and Tzur disclose the system of claim 1. NPL1 discloses the processor and the display, as seen in Fig. 3, and the determination, as seen in claim 1. NPL1 and Tzur are, however, silent and fail to disclose further wherein the processor is further configured to control a programmable logic controller (PLC) based on the determination. Kiuchi further discloses wherein the processor is configured to control a programmable logic controller (PLC) based on the determination (see para [0053]; "More specifically, the I/O controller 118 is connected to the hard disk 120, the camera interface 122, the input interface 124, the PLC interface 126, the communication interface 128, and the data reader/writer 130"). Before the effective filing date of the invention, NPL1, Tzur and Kiuchi were combinable because they are from the same field of endeavor and are analogous art of image processing. The suggestion/motivation would be a fastest-speed processing system (para 0082). Therefore, it would have been obvious to one of ordinary skill in the art to have recognized the advantages of Kiuchi in the system of NPL1 and Tzur to obtain the invention as specified in claim 10.

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over NPL1 in view of Tzur, and further in view of Jo et al. (US 20180232870A1), hereafter Jo.

11. Regarding claim 9, NPL1 and Tzur disclose the system of claim 1. NPL1 discloses the processor and the display of the user interface device (i.e., a computer with a display screen and keyboard), as seen in Fig. 3. NPL1 and Tzur are, however, silent and fail to disclose further wherein the processor is further configured to cause a notification to be displayed on the display of the user interface device, the notification to be displayed during a set up stage performed prior to an inspection stage. Jo teaches the notification to be displayed during a set up stage, prior to an inspection stage (see paras [0045, 0047]; "the inspection apparatus 120 according to the present embodiment may predict inspection accuracy of cap sealing of a particular container through a setting process, before (prior to) performing an actual inspection. For example, the predicted inspection accuracy may be notified to a user by predicting the inspection accuracy according to a result of the previous learning based on the types of a good product and a not good production", the operator being notified by displaying good product and not good product, as disclosed in paras 0060-0066). Before the effective filing date of the invention, NPL1, Tzur and Jo were combinable because they are from the same field of endeavor and are analogous art of image processing. The suggestion/motivation would be to determine whether the cap sealing corresponding to the thermal image data is a good product or a user-check product (see para [0045]). Therefore, it would have been obvious to one of ordinary skill in the art to have recognized the advantages of Jo in the system of NPL1 and Tzur to obtain the invention as specified in claim 9.

Claims 12-13 are rejected under 35 U.S.C. 103 as being unpatentable over NPL1 in view of Tzur, and further in view of Cote et al. (US 20170048442A1), hereafter Cote.

12. Regarding claim 12, NPL1 and Tzur disclose the system of claim 11.
NPL1 discloses the image processing algorithm, as seen on page 271 and in the disclosure. Tzur discloses processing the image to provide a high-resolution image in para 0068. NPL1 and Tzur are, however, silent and fail to disclose further wherein the image processing algorithm comprises obtaining a high dynamic range (HDR) image of the item and inspecting the item in the HDR image. Cote teaches wherein the image processing algorithm comprises obtaining a high dynamic range (HDR) image of the item and inspecting the item in the HDR image (see para [0084]; "the image processing module of some embodiments examines several normally exposed images that the image capturing module returns. From this group, the image processing module selects the normally exposed image that is the sharpest and that best matches the captured overexposed and underexposed images. Alternatively, in some embodiments, rather than using normally exposed images from the frame buffer, after an HDR capture command is received, the mobile device takes one or more images at a normal exposure as well as the overexposed and underexposed images. In some such embodiments, the image processing module selects one of the normally exposed images (e.g., the sharpest, the last, etc.) to use for generating a composite HDR image. In some embodiments, the normally exposed images are taken after the HDR capture command rather than from the frame buffer"). Before the effective filing date of the invention, NPL1, Tzur and Cote were combinable because they are from the same field of endeavor and are analogous art of image processing. The suggestion/motivation would be to perform the search at a different level in the search hierarchy (see para [0084]). Therefore, it would have been obvious to one of ordinary skill in the art to have recognized the advantages of Cote in the system of NPL1 and Tzur to obtain the invention as specified in claim 12.

13. Regarding claim 13, NPL1, Tzur and Cote disclose the system of claim 12. Cote further discloses wherein the processor is configured to cause a display notification regarding use of the HDR image (see para [0101]; "the controller module 535 simply notifies the processing pipeline 525 that the overexposed and underexposed images are being captured, and the processing pipeline 525 retrieves the correct images from the frame buffer 520") to be displayed on the display of the user interface device (see para [0083]; paras 0098 and 0102 and Fig. 15 also disclose the HDR mode command and HDR image generation instructions (as seen in Fig. 15), meeting the limitations of a display notification regarding use of the HDR image; the examiner notes that the specifics of display are not required by the current claim).

Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over NPL1 in view of Tzur, and further in view of Ben-Ezra et al. (US20050047672A1), hereafter Ben-Ezra.

14. Regarding claim 16, NPL1 and Tzur disclose the system of claim 1. NPL1 discloses the processor and the image processing algorithm processing the image of the specimen/object on the stage, as seen in Fig. 3 and the disclosure. NPL1 and Tzur are, however, silent and fail to disclose further wherein the processor is further configured to detect the item in the image and detect the blurring at a location of the item in the image. Ben-Ezra discloses wherein the processor is further configured to detect the item in the image and detect the blurring at a location of the item in the image (paras 0003, 0016 and 0028-0029 and Figs. 8c-8d show and disclose detecting an item/object in the image and detecting the blurring at the location of the item/object in the image). Before the effective filing date of the invention, NPL1, Tzur and Ben-Ezra were combinable because they are from the same field of endeavor and are analogous art of image processing.
The suggestion/motivation would be an advantageous, low-cost image de-blurring system (para 0068). Therefore, it would have been obvious to one of ordinary skill in the art to have recognized the advantages of Ben-Ezra in the system of NPL1 and Tzur to obtain the invention as specified in claim 16.

Claims 25-29 are rejected under 35 U.S.C. 103 as being unpatentable over NPL1 in view of Stavely (US20040130628), hereafter Stavely, and further in view of Tzur.

15. Regarding claim 25, NPL1 discloses a method for visual inspection of an item from images of the item on an inspection line, the images being captured during a current inspection window (Figs. 3-7 and pages 270-272 show a method of visual inspection of an item (specimen surface) on an inspection line (as seen in Fig. 3) by capturing the images (i.e., during the current inspection window (time), as seen in Fig. 3; page 271, section 7, discloses blurred images of the machined surfaces) and by processing the images to remove the blurring (i.e., deblurring, producing the deblurred or restored image as seen in Figs. 5c and 7c)), the method comprising: determining, via a processor, the blurred image (Fig. 3 shows the computer (i.e., via a processor), and page 270, section 7, discloses the processing of the captured blurred images of the machined surface during an inspection; Figs. 5c and 7c show the blurred images of the machined surface (i.e., an item) on the inspection line. The examiner notes that NPL1 discloses capturing the blurred images during the inspection, i.e., capturing, using the camera 3, the blurred images as the stage 1 on which the specimen 2 is placed moves (the uniform motion of the stage shown in Fig. 3 being the presence of motion). As the stage is moving and the plurality of inspection-window images are being captured over time, the blurred images being captured would include a previous inspection window (time) as well as the current inspection window (time), meeting the limitation of the blurred image indicating a presence of motion when the images are captured in a previous inspection window).

Stavely discloses determining, via a processor, a blurred pattern indicating the presence of the motion in the previous image (Figs. 1, 1B, 2A-2B, 3B and 4 and paras 0017-0019, 0021-0023, 0025-0026 and 0031-0034 show and disclose "a video preview mode in which digital frames are acquired at a rate of 30 frames per second and shown on display 125"; the presence of motion is indicated by processing the blurred region/pattern (Figs. 2A-2B) in the preview image mode (i.e., the previous inspection window), and Figs. 3B and 4 and paras 0021-0023, 0031 and 0034 show and disclose controlling the delay in the capture of the digital image, i.e., timing the capture of the digital image (i.e., within the current window) based on the blurred pattern (i.e., the measurement of the motion or blurring in the region as seen in Figs. 2A and 2B, when the blur caused by the motion is at a minimum), meeting the claim limitations).

As seen above, both NPL1 and Stavely disclose capturing and processing images. NPL1 and Stavely are, however, silent and fail to disclose a determination of whether the motion originated from the camera or from the item on the inspection line being obtained based on an analysis of a distribution of presence of motion within the image, the processor obtaining the determination of the motion originating from the camera when the distribution is within a majority of the image. Tzur discloses determining whether the motion originated from the camera or from the item on the inspection line based on an analysis of a distribution of presence of motion within the image, and the processor obtaining the determination of the motion originating from the camera when the distribution is within a majority of the image (Fig. 11, para 0073, as quoted above for claim 1: if the user is shaking the camera while an image is being captured, motion of the entire image (i.e., motion in the majority of the image) is detected from one preview image to the next (i.e., distribution), while motion may also be detected in individual portions or windows of an image (i.e., distribution), which then detects motion of one or more objects within the scene (i.e., the item on the inspection line)).

Before the effective filing date of the invention, NPL1, Stavely and Tzur were combinable because they are from the same field of endeavor and analogous art of image processing. The suggestion/motivation would be an advantageous and reduced-cost motion tracking apparatus/system/method (paras 0004-0006), as taught by Stavely, and an efficient, reduced-hardware (i.e., lower cost) and faster-processing (i.e., processing in a short amount of time) device/system (para 0008), as taught by Tzur. Therefore, it would have been obvious to one of ordinary skill in the art to have recognized the advantages of Tzur and Stavely in the system of NPL1 to obtain the invention as specified in claim 25.

16. Regarding claim 26, NPL1, Stavely and Tzur disclose the method of claim 25.
Stavely further discloses wherein determining the blurred pattern in the images comprises receiving, at the processor, input from a motion detector in operative communication with the camera (figs. 1A-1C and 2A-2C and paras 0021-0022 show and disclose this limitation).

17. Regarding claim 27, NPL1, Stavely and Tzur disclose the method of claim 25. Stavely further discloses wherein determining the blurred pattern in the images comprises applying, by the processor, image processing to the images (para 0023 and figs. 2A-2C disclose the processing of the pixels in determining the motion/blur pattern in the images, meeting the above claim limitations).

18. Regarding claim 28, NPL1, Stavely and Tzur disclose the method of claim 27. NPL1 discloses the images of the item (specimen surface), and Stavely discloses further comprising: determining the (figs. 1, 1B, 2A-2B, 3B and 4; paras 0017-0019, 0021-0023, 0025-0026 and 0031-0034). NPL1 and Stavely together would therefore meet the limitations of further comprising: determining the

19. Regarding claim 29, NPL1, Stavely and Tzur disclose the method of claim 25. Stavely discloses determining the blurring pattern (as seen in figs. 2A-2C) in the video preview mode (i.e., during the set-up stage and not in the inspection stage), as seen in figs. 1, 1B, 2A-2B, 3B and 4 and paras 0017-0019, 0021-0023, 0025-0026 and 0031-0034, meeting the limitation of further comprising determining a motion pattern during a set-up stage, prior to an inspection stage.

Examiner's Note: Examiner has cited figures and paragraphs in the references as applied to the claims above for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well.

Applicant is respectfully requested, in preparing responses, to fully consider each reference in its entirety as potentially teaching all or part of the claimed invention, as well as the context of the passages as taught by the prior art or discussed by the examiner. Examiner has also cited references on form PTO-892 that are not relied upon but are relevant and pertinent to applicant's disclosure and may also read on the claims and claimed limitations (as anticipatory or as rendering them obvious). Applicant is advised to consider these references in preparing the response/amendments in order to expedite prosecution.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAYESH PATEL, whose telephone number is (571) 270-1227. The examiner can normally be reached Mon-Fri. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Andrew Bee, can be reached at 571-270-5183. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

JAYESH PATEL
Primary Examiner
Art Unit 2677

/JAYESH A PATEL/
Primary Examiner, Art Unit 2677
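The "majority of the image" heuristic that the Office Action reads onto Tzur's para 0073 can be illustrated in code. The sketch below is hypothetical, not the applicant's claimed implementation or Tzur's: it differences two grayscale frames and attributes the motion to the camera when moving pixels cover a majority of the image (global motion, e.g. camera shake), and to an object within the scene (e.g., an item on the inspection line) otherwise.

```python
def motion_origin(prev_frame, curr_frame, diff_threshold=10, majority=0.5):
    """Classify inter-frame motion as 'camera' or 'object'.

    Frames are 2-D lists of grayscale pixel values. A pixel counts as
    "moving" when its absolute inter-frame difference exceeds
    diff_threshold. If the moving fraction exceeds `majority`, the
    motion is attributed to the camera (global motion); otherwise to an
    object within the scene (local motion).
    """
    moving = total = 0
    for row_a, row_b in zip(prev_frame, curr_frame):
        for a, b in zip(row_a, row_b):
            total += 1
            if abs(a - b) > diff_threshold:
                moving += 1
    fraction = moving / total
    return ("camera" if fraction > majority else "object"), fraction

# Global shift (camera shake): nearly every pixel changes.
prev = [[(x + y) * 20 % 256 for x in range(8)] for y in range(8)]
shaken = [[(x + y + 1) * 20 % 256 for x in range(8)] for y in range(8)]
print(motion_origin(prev, shaken)[0])   # -> camera

# Local change (item moving on the line): only one region changes.
still = [row[:] for row in prev]
still[2][3] = 255
print(motion_origin(prev, still)[0])    # -> object
```

A production system would use block-wise motion vectors rather than raw pixel differences, but the majority-coverage test on the motion distribution is the same idea the amended limitation recites.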

Prosecution Timeline

Apr 04, 2022
Application Filed
Nov 14, 2024
Non-Final Rejection — §103
Jan 30, 2025
Response Filed
May 15, 2025
Final Rejection — §103
Jul 18, 2025
Response after Non-Final Action
Aug 12, 2025
Request for Continued Examination
Aug 13, 2025
Response after Non-Final Action
Aug 19, 2025
Non-Final Rejection — §103
Nov 21, 2025
Response Filed
Dec 09, 2025
Final Rejection — §103
Feb 09, 2026
Response after Non-Final Action
Feb 27, 2026
Request for Continued Examination
Mar 04, 2026
Response after Non-Final Action
Mar 17, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597170
METHOD AND APPARATUS FOR IMMERSIVE VIDEO ENCODING AND DECODING, AND METHOD FOR TRANSMITTING A BITSTREAM GENERATED BY THE IMMERSIVE VIDEO ENCODING METHOD
2y 5m to grant; granted Apr 07, 2026
Patent 12579770
DETECTION SYSTEM, DETECTION METHOD, AND NON-TRANSITORY STORAGE MEDIUM
2y 5m to grant; granted Mar 17, 2026
Patent 12561949
CONDITIONAL PROCEDURAL MODEL GENERATION
2y 5m to grant; granted Feb 24, 2026
Patent 12555346
Automatic Working System, Automatic Walking Device and Control Method Therefor, and Computer-Readable Storage Medium
2y 5m to grant; granted Feb 17, 2026
Patent 12536636
METHOD AND SYSTEM FOR EVALUATING QUALITY OF A DOCUMENT
2y 5m to grant; granted Jan 27, 2026


Prosecution Projections

5-6
Expected OA Rounds
83%
Grant Probability
88%
With Interview (+5.2%)
3y 0m
Median Time to Grant
High
PTA Risk
Based on 887 resolved cases by this examiner. Grant probability derived from career allow rate.
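The relationship between the career data and the headline projections can be reconstructed with simple arithmetic. This is an assumption about the dashboard's model (an additive interview lift on top of the career allow rate), not a disclosed formula:

```python
# Hypothetical reconstruction of the dashboard's headline numbers from
# the examiner's career data; the additive interview lift is an
# assumption, not a disclosed model.
granted, resolved = 739, 887            # career grants / resolved cases
allow_rate = granted / resolved         # ~0.833 -> displayed as 83%
interview_lift = 0.052                  # +5.2% lift with an interview

with_interview = allow_rate + interview_lift
print(f"{allow_rate * 100:.1f}%")       # -> 83.3%
print(f"{with_interview * 100:.1f}%")   # -> 88.5% (displayed as 88%)
```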
