Prosecution Insights
Last updated: April 19, 2026
Application No. 18/951,050

METHOD OF DETECTING AND ACCOUNTING FOR TRUE MOTION OF IMAGERY SUBJECT TO ATMOSPHERIC TURBULENCE

Non-Final OA (§DP)
Filed: Nov 18, 2024
Examiner: BHUIYAN, FAYEZ A
Art Unit: 2638
Tech Center: 2600 — Communications
Assignee: Government of the United States as Represented by the Secretary of the Air Force
OA Round: 1 (Non-Final)
Grant Probability: 84% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
With Interview: 96%

Examiner Intelligence

Career Allow Rate: 84% (above average; 470 granted / 559 resolved; +22.1% vs TC avg)
Interview Lift: +12.0% (moderate), based on resolved cases with interview
Typical Timeline: 2y 6m average prosecution; 12 applications currently pending
Career History: 571 total applications across all art units

Statute-Specific Performance

§101: 2.6% (-37.4% vs TC avg)
§102: 43.5% (+3.5% vs TC avg)
§103: 36.4% (-3.6% vs TC avg)
§112: 7.4% (-32.6% vs TC avg)
Tech Center averages are estimates. Based on career data from 559 resolved cases.

Office Action

§DP (Nonstatutory Double Patenting)
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory obviousness-type double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the conflicting application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. A terminal disclaimer signed by the assignee must fully comply with 37 CFR 3.73(b).

Claims 1-20 of Application No. 18/579557 are rejected on the ground of nonstatutory obviousness-type double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 12192626 B2 (Application No. 17/861171).
Although the conflicting claims are not identical, they are not patentably distinct from each other because all the limitations of Claims 1-20 of the 18/579557 application are anticipated by claims 1-20 of U.S. Patent No. 12192626 B2 (17/861171). Claims 1-20 of 18/579557 are therefore not patentably distinct from the patented claims, and are unpatentable for obviousness-type double patenting.

Claim chart: Instant Application No. 18/579557 vs. Patent No. US 12192626 B2 (17/861171)

Instant Claim 1: A method of determining the probability a pixel in a video image subject to optical turbulence represents true motion, the method comprising the steps of: a. capturing incoming light from a scene with an image capture device to yield an original video having a plurality of frames; b. creating a prototype image from the plurality of frames, the prototype image being a geometrically accurate representation of the scene and substantially free of contamination caused by true scene motion; c. measuring a local atmospheric turbulence profile from statistical data; d. selecting a subject pixel location from each pixel location available in the prototype image; e. determining a turbulence tilt variance for the subject pixel location based upon the measured local atmospheric turbulence profile; f. extracting a finite window for each subject pixel location based on the turbulence tilt variance, whereby the subject pixel location is contained within the window; g. establishing a unique mode for each pixel in the finite window whereby the intensity of each pixel is a mean of the respective mode and a weight for each unique mode is based upon the turbulence tilt variance to yield a Gaussian mixture model; h. evaluating each pixel at the subject pixel location in the original image plurality of frames against the probability density function from the Gaussian mixture model to determine a probability that each pixel at the subject pixel location represents true motion within the original video; and i. repeating steps e through h until all pixel locations within the prototype image have been evaluated and probabilities therefor have been determined.

Patent Claim 1: A method of determining the probability a pixel in a video image subject to optical turbulence represents true motion, the method comprising the steps of: a. capturing incoming light from a scene with an image capture device to yield an original video having a sequence of frames; b. creating a prototype image from the sequence of frames, the prototype image being a geometrically accurate representation of the scene and substantially free of contamination caused by true scene motion; c. measuring a local atmospheric turbulence profile from local statistical data; d. selecting a subject pixel location from each pixel location available in the prototype image; e. determining a turbulence tilt variance for the subject pixel location based upon the measured local atmospheric turbulence profile; f. extracting a finite window for each subject pixel location based on the turbulence tilt variance, whereby the subject pixel location is contained within the window; g. establishing a unique mode for each pixel in the finite window whereby the intensity of each pixel is a mean of the respective mode and a weight for each unique mode is based upon the turbulence tilt variance to yield a Gaussian mixture model; h. evaluating each pixel at the subject pixel location in the original image sequence against the probability density function from the Gaussian mixture model to determine a probability that each pixel at the subject pixel location represents true motion within the original video; and i. repeating steps e through h until all pixels locations within the prototype image have been evaluated and probabilities therefor have been determined.

Instant Claim 2: The method according to claim 1 further comprising the step of using the probabilities that a plurality of pixels in the original image for subsequent image analysis or subsequent image processing.

Patent Claim 2: The method according to claim 1 further comprising the step of using the probabilities that a plurality of pixels in the original image for subsequent image analysis or image processing.

Instant Claim 3: The method according to claim 2 wherein the step of using the probabilities that a plurality of pixels in the original image for subsequent image or image processing analysis comprises at least one of turbulence mitigation, image restoration and target tracking.

Patent Claim 3: The method according to claim 2 wherein the step of using the probabilities that a plurality of pixels in the original image for subsequent image or image processing analysis comprises at least one of turbulence mitigation, image restoration and target tracking.

Instant Claim 4: A method of determining the probability a pixel in a video image subject to optical turbulence represents true motion, the method comprising the steps of: a. capturing incoming light from a scene with an image capture device to yield an original video having a sequence of frames; b. creating a prototype image from the sequence of frames, the prototype image being a geometrically accurate representation of the scene and substantially free of contamination caused by true scene motion; c. measuring a local atmospheric turbulence profile from local statistical data; d. selecting a subject pixel location from a plurality of pixel locations available in the prototype image; e. determining a turbulence tilt variance for the subject pixel location based upon the measured local atmospheric turbulence profile; f. extracting a finite window for each subject pixel location based on the turbulence tilt variance, whereby the subject pixel location is contained within the window; g. establishing a unique mode for each pixel in the finite window whereby the intensity of each pixel is a mean of the respective mode and a weight for each unique mode yields a Gaussian mixture model; h. evaluating a plurality of pixels at the respective subject pixel locations in the original image sequence against the probability density function from the Gaussian mixture model to determine a probability that each pixel of the plurality of pixels at the respective subject pixel location represents true motion within the original video; repeating steps e through h until all pixels locations within the prototype image have been evaluated and probabilities therefor have been determined and further comprising the step of converting the probability density function for each pixel of the plurality of pixels, to a negative log likelihood value proportional to the likelihood that the subject pixel represents true motion within the original image.

Patent Claim 4: A method of determining a probability a pixel in a video image subject to optical turbulence represents true motion, the method comprising the steps of: a. capturing incoming light from a scene with an image capture device to yield an original video having a sequence of frames; b. creating a prototype image from the sequence of frames, the prototype image being a geometrically accurate representation of the scene and substantially free of contamination caused by true scene motion; c. measuring a local atmospheric turbulence profile from local statistical data; d. selecting a subject pixel location from each pixel location available in the prototype image; e. determining a turbulence tilt variance for the subject pixel location based upon the measured local atmospheric turbulence profile; f. extracting a finite window for each subject pixel location based on the turbulence tilt variance, whereby the subject pixel location is contained within the window; g. establishing a unique mode for each pixel in the finite window whereby the intensity of each pixel is a mean of the respective mode and a weight for each unique mode is based upon the turbulence tilt variance to yield a Gaussian mixture model; h. evaluating each pixel at the subject pixel location in the original image sequence against the probability density function from the Gaussian mixture model to determine a probability that each pixel at the subject pixel location represents true motion within the original video; repeating steps e through h until all pixels locations within the prototype image have been evaluated and probabilities therefor have been determined and further comprising the step of converting the probability density function for each pixel, to a negative log likelihood value proportional to the likelihood that the subject pixel represents true motion within the original image.

Instant Claim 5: The method according to claim 4 wherein the negative log likelihood ranges from 50 to 1000.

Patent Claim 5: The method according to claim 4 wherein said negative log likelihood ranges from 50 to 1000.

Instant Claim 6: The method according to claim 1 further comprising the step of spatially registering the frames to a reference frame via a whole-pixel translation.

Patent Claim 6: The method according to claim 1 further comprising the step of spatially registering the frames within the sequence of frames to a reference frame via a whole-pixel translation.

Instant Claim 7: The method according to claim 6 wherein the translation is limited to 3X the tilt variance.

Patent Claim 7: The method according to claim 6 wherein the translation is limited to 3× the tilt variance.

Instant Claim 8: The method according to claim 7 wherein the tilt variance is a differential tilt variance.

Patent Claim 8: The method according to claim 7 wherein the tilt variance is a global tilt variance.

Instant Claim 9: The method according to claim 1 further comprising the step of combining modes having a common mean into a single mode, while preserving the mean.

Patent Claim 9: The method according to claim 1 further comprising the step of combining modes having a common mean into a single mode, while preserving the mean.

Instant Claim 10: The method according to claim 9 further comprising the step of determining the weighting of the single mode according to a summation of the modes.

Patent Claim 10: The method according to claim 9 further comprising the step of determining the weighting of the single mode according to a summation of the modes having the common mean.

Instant Claim 11: The method according to claim 10 further comprising the step of combining modes using Lloyd's quantization to determine the means of such combined modes.

Patent Claim 11: The method according to claim 1 further comprising the step of combining modes using Lloyd's quantization to determine the means of such combined modes.

Instant Claim 12: The method according to claim 11 further comprising the step of determining the weighting of each combined mode according to a summation of the modes from a common quantization bin.

Patent Claim 12: The method according to claim 11 further comprising the step of determining the weighting of each combined mode according to a summation of the modes from a common quantization bin.

Instant Claim 13: The method according to claim 1 whereby the window is a M x M matrix, the subject pixel is generally centered within the window wherein the M x M matrix has a minimum size of 9 x 9 and a maximum size of the entire image.

Patent Claim 13: The method according to claim 1 whereby the window is a M×M matrix, the subject pixel is generally centered within the window wherein the M×M matrix has a minimum size of 9×9 and a maximum size of the entire image.

Claim 14 is rejected for the same reasons as Claim 1.

Instant Claim 15: The method according to claim 14 wherein the size of the window is based upon the differential tilt variance.

Patent Claim 15: The method according to claim 14 wherein the size of the window is based upon the turbulence tilt variance.

Instant Claim 16: The method according to claim 14 further comprising the step of using the probabilities that a plurality of pixels in the original image for subsequent image analysis or image processing comprising at least one of turbulence mitigation, image restoration, target tracking and operating cueing systems.

Patent Claim 16: The method according to claim 14 further comprising the step of using the probabilities that a plurality of pixels in the original image for subsequent image analysis or image processing comprising at least one of turbulence mitigation, image restoration, target tracking and operating cueing systems.

Instant Claim 17: The method according to claim 14 further comprising the step of forming a finite superpixel for a subject frame, the superpixel defining a collection of pixels including the subject pixel and statistically combining the probabilities of all pixels within the superpixel to yield a new probability for the subject pixel.

Patent Claim 17: The method according to claim 14 further comprising the step of forming a finite superpixel for a subject frame, the superpixel defining a collection of pixels including the subject pixel and statistically combining the probabilities of all pixels within the superpixel to yield a new probability for the subject pixel.

Instant Claim 18: A method for determining the likelihood a pixel in a video image subject to optical turbulence represents true motion, the method comprising the steps of: capturing incoming light with an image capture device to yield an original image having a plural frames; creating a prototype image from the original image as a geometrically accurate representation of a scene, the prototype image being substantially free of contamination caused by true scene motion; creating a tilt variance-Gaussian mixture model using optical turbulence statistics from the scene; deriving a likelihood function from the tilt variance-Gaussian mixture model; and evaluating plural pixels in the original image against the prototype image using the likelihood function to yield a likelihood that the respective pixel represents true motion.

Patent Claim 18: A method for determining the likelihood a pixel in a video image subject to optical turbulence represents true motion, the method comprising the steps of: capturing incoming light from a scene with an image capture device to yield an original image having a sequence of frames; creating a prototype image from the original image as a geometrically accurate representation of the scene, the prototype image being substantially free of contamination caused by true scene motion; creating a tilt variance-Gaussian mixture model based upon optical turbulence statistics associated with the scene; deriving a likelihood function from the tilt variance-Gaussian mixture model; and evaluating each pixel in the original image sequence against the prototype image using the likelihood function to yield a likelihood that the respective pixel represents true motion.

Instant Claim 19: The method according to claim 18 further comprising the step of using the likelihood to perform at least one of turbulence mitigation, image restoration, target tracking and operating cueing systems.

Patent Claim 19: The method according to claim 18 further comprising the step of using said likelihood to perform at least one of turbulence mitigation, image restoration, target tracking and operating cueing systems.

Instant Claim 20: The method according to claim 18 wherein the step of evaluating plural pixels in the original image against the prototype image comprises evaluating each pixel in the original image against the prototype image.

Patent Claim 20: ……and evaluating each pixel in the original image sequence against the prototype image using the likelihood function to yield a likelihood that the respective pixel represents true motion, wherein the likelihood is negative log likelihood.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to FAYEZ A BHUIYAN, whose telephone number is (571) 270-1562. The examiner can normally be reached 9:00-6:00, M-F.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Lin Ye, can be reached at 571-272-7372. The fax number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/FAYEZ BHUIYAN/
Examiner, Art Unit 2639

/LIN YE/
Supervisory Patent Examiner, Art Unit 2638
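The independent claims describe, in essence, a per-pixel statistical test: build a Gaussian mixture model from a finite window of a turbulence-free prototype image, weight the modes by how far a turbulence tilt would have to displace the pixel (scaled by the tilt variance), and score each observed pixel against that mixture, optionally as a negative log likelihood. The following is a rough, unofficial sketch of that idea, not the patented implementation; the function name, the Gaussian spatial weighting, and all parameter defaults (`tilt_sigma`, `noise_sigma`) are assumptions chosen for illustration.

```python
import numpy as np

def turbulence_motion_nll(prototype, frame, tilt_sigma=1.5, noise_sigma=4.0):
    """Sketch of a tilt-variance Gaussian mixture test (illustrative only).

    Each intensity in a finite window of the prototype image becomes one
    mixture mode; its weight falls off with distance from the subject pixel
    per a Gaussian of width tilt_sigma. The observed frame pixel is then
    scored against the mixture, and the score is returned as a negative
    log likelihood (high NLL suggests true motion, not turbulence).
    """
    h, w = prototype.shape
    half = max(4, int(np.ceil(3 * tilt_sigma)))   # window radius; 9x9 minimum
    pad = np.pad(prototype, half, mode="edge")
    # Spatial weights: a mode farther from the subject pixel is less likely
    # to be reached by a turbulence-induced tilt.
    yy, xx = np.mgrid[-half:half + 1, -half:half + 1]
    sp_w = np.exp(-(xx**2 + yy**2) / (2.0 * tilt_sigma**2))
    sp_w /= sp_w.sum()
    nll = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            # Window intensities from the prototype serve as the mode means.
            modes = pad[i:i + 2 * half + 1, j:j + 2 * half + 1]
            obs = frame[i, j]
            # Mixture PDF: each window pixel contributes one Gaussian mode.
            pdf = (sp_w * np.exp(-(obs - modes) ** 2 / (2.0 * noise_sigma**2))
                   / (noise_sigma * np.sqrt(2.0 * np.pi))).sum()
            nll[i, j] = -np.log(max(pdf, 1e-300))
    return nll
```

A pixel whose observed intensity is not well explained by any mode in its window receives a large negative log likelihood, which the claims treat as evidence of true scene motion rather than turbulence-induced warping.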

Prosecution Timeline

Nov 18, 2024
Application Filed
Mar 06, 2026
Non-Final Rejection — §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12598380
ELECTRONIC DEVICE AND CONTROL METHOD THEREOF
2y 5m to grant Granted Apr 07, 2026
Patent 12585169
IMAGING APPARATUS
2y 5m to grant Granted Mar 24, 2026
Patent 12554146
THIN DUAL-APERTURE ZOOM DIGITAL CAMERA
2y 5m to grant Granted Feb 17, 2026
Patent 12549857
SHAKE CORRECTION DEVICE, IMAGING APPARATUS, SHAKE CORRECTION METHOD, AND SHAKE CORRECTION PROGRAM
2y 5m to grant Granted Feb 10, 2026
Patent 12542981
PHOTORECEPTOR MODULE AND SOLID-STATE IMAGING DEVICE
2y 5m to grant Granted Feb 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 84%
With Interview: 96% (+12.0% lift)
Median Time to Grant: 2y 6m
PTA Risk: Low
Based on 559 resolved cases by this examiner. Grant probability derived from career allow rate.
