Prosecution Insights
Last updated: April 19, 2026
Application No. 18/449,634

Geodetic Motion Compensated Integration (GMCI) Target Enhancement Filters

Status: Non-Final OA (§103)
Filed: Aug 14, 2023
Examiner: SAFAIPOUR, BOBBAK
Art Unit: 2665
Tech Center: 2600 — Communications
Assignee: The Boeing Company
OA Round: 3 (Non-Final)
Grant Probability: 86% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 8m
With Interview: 97%

Examiner Intelligence

Career Allow Rate: 86% — above average (933 granted / 1085 resolved; +24.0% vs TC avg)
Interview Lift: +10.7% for resolved cases with an interview (moderate lift)
Typical Timeline: 2y 8m average prosecution; 30 applications currently pending
Career History: 1115 total applications across all art units
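The headline figures follow directly from the reported counts; a quick check of the arithmetic (all inputs are from the panel above, and the "with interview" figure is read as the career allow rate plus the reported lift, which matches the displayed 97%):

```python
# Sanity-check of the examiner statistics shown above (all inputs from the panel).
granted = 933
resolved = 1085
pending = 30

allow_rate = granted / resolved        # career allow rate
total_apps = resolved + pending        # total applications on record
interview_lift = 0.107                 # reported lift when an interview is held
with_interview = allow_rate + interview_lift

print(f"{allow_rate:.1%}")             # 86.0%
print(f"{total_apps}")                 # 1115
print(f"{with_interview:.1%}")         # 96.7%, displayed as 97%
```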

Statute-Specific Performance

§101: 8.7% (-31.3% vs TC avg)
§102: 26.6% (-13.4% vs TC avg)
§103: 43.6% (+3.6% vs TC avg)
§112: 6.6% (-33.4% vs TC avg)

Tech Center averages are estimates. Based on career data from 1085 resolved cases.

Office Action

§103
DETAILED ACTION

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/28/2026 has been entered.

Information Disclosure Statement

The information disclosure statements submitted on 01/28/2026 have been considered by the Examiner and made of record in the application file.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Bandwar (US 2024/0193789 A1) in view of Lundgren (US 2010/0014769 A1) and in further view of Paxton (US 2021/0049773 A1).
Regarding claims 1, 12 and 20, Bandwar discloses a computer-implemented method comprising: [claim 12: A system comprising: a memory comprising executable instructions; and a processor in data communication with the memory and configured to execute the executable instructions to perform an operation comprising: (paragraphs 55-61)] [claim 20: A computer-readable storage medium having computer-readable program code embodied therewith for performing an operation comprising: (paragraph 57)]

obtaining a geodetic motion-compensated integrated (GMCI) output image of a target (paragraphs 8 and 77: the temporal filtering process applied for correcting the image frame is a motion-compensated temporal filtering (MCTF) process. MCTF reduces motion artifacts in video by filtering areas of motion based on global or local motion determinations for a current frame relative to a previous frame. Selectively performing MCTF filtering only on identified regions meeting certain criteria may reduce core power consumption when processing scenes with limited motion within the field of view);

generating an enhanced image (read as the corrected image frame), based at least in part on the approximation and the GMCI output image (paragraphs 74 and 85: the MCTF engine receives the motion hotspots and the image transform matrix to generate a corrected image frame. The MCTF engine applies a temporal filtering process to the second image frame to generate the corrected image frame. The image transformer applies a temporal filtering process to portions of the second image frame located within motion hotspots to generate the corrected image frame. The corrected image frame may then be added to the output image frames); and

detecting and tracking (paragraph 16: frames for object tracking) the target using the enhanced image (paragraphs 66 and 75: the corrected image frame serves as a reference image frame for correcting future image frames. The first image frame may be an image frame that was previously corrected).

Although Bandwar discloses obtaining a geodetic motion-compensated integrated (GMCI) output image of a target (paragraphs 8 and 77), Bandwar fails to specifically disclose that the target has a low contrast-to-noise ratio in an image, and generating an approximation to a matched filter, based on information associated with the target, wherein the information associated with the target includes one or more of: a peak intensity response associated with the target, or a trough intensity response associated with the target.

In related art, Lundgren discloses obtaining a motion-compensated integrated output image by using a motion-compensated integration processor to re-register (stabilize) minor frames with inertial measurement data and combine them into a higher-resolution master frame/combined image (see paragraphs 17-21, 27-28 and 30), and operates on data with low signal-to-background ratios, i.e., low contrast-to-noise targets in the original imagery (see paragraphs 5 and 30). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the teachings of Lundgren into the teachings of Bandwar to effectively improve the resolution of a digitally stabilized image.

Furthermore, in related art, Paxton discloses generating an approximation to a matched filter, based on information associated with the target (paragraphs 2, 18, 20, 40 and 85: match filtering is used to compare potential targets detected in a series of images to reduce false detections), wherein the information associated with the target includes one or more of: a peak intensity response associated with the target, or a trough intensity response associated with the target.
(Paragraphs 3, 20, 40 and 85: Paxton discloses the peak intensity response by stating that linear filtering reduces peak target intensity in the image and that its motion-compensated stacking approach reduces clutter without reducing a peak target intensity.) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the teachings of Paxton into the teachings of Bandwar and Lundgren to effectively preserve the target's peak response for detection and tracking.

Regarding claims 2 and 13, Bandwar, as modified by Lundgren and Paxton, discloses the claimed invention, further comprising: prior to generating the approximation, determining that a portion of the information associated with the target is unavailable; and in response to the determination, generating an estimate of the portion of the information associated with the target (paragraphs 68-69: global motion estimates may be calculated based on sensor data; local motion estimates may be captured by comparing the first and second image frames).

Regarding claims 3 and 14, Bandwar, as modified by Lundgren and Paxton, discloses the claimed invention wherein the portion of the information associated with the target comprises a heading of the target (paragraphs 68-69: the global motion estimates may include one or both of a magnitude and direction of movement for the image capture device).

Regarding claims 4 and 15, Bandwar, as modified by Lundgren and Paxton, discloses the claimed invention wherein the heading of the target is estimated based on a first gradient of the GMCI output image along a first direction and a second gradient of the GMCI output image along a second direction (paragraph 70: the motion map may be computed by the motion map engine by comparing the second image frame to the first image frame to generate an estimate of motion vectors between the image frames).

Regarding claims 5 and 16, Bandwar, as modified by Lundgren and Paxton, discloses the claimed invention wherein the approximation comprises a direction-aided gradient filter (DAGF) (paragraphs 69 and 71: motion map and direction).

Regarding claims 6 and 17, Bandwar, as modified by Lundgren and Paxton, discloses the claimed invention wherein the approximation comprises a gradient magnitude filter (GMF) (paragraph 80).

Regarding claims 7 and 18, Bandwar, as modified by Lundgren and Paxton, discloses the claimed invention wherein generating the enhanced image comprises applying the approximation to the GMCI output image when the information associated with the target satisfies a set of predetermined conditions (paragraphs 80-81: the motion hotspots may be identified as locations within the second image frame corresponding to portions of the motion map indicating motion that exceeds a predetermined threshold).

Regarding claims 8 and 19, Bandwar, as modified by Lundgren and Paxton, discloses the claimed invention wherein generating the enhanced image comprises: generating a decimated GMCI image based on the GMCI output image, when the information associated with the target does not satisfy a set of predetermined conditions; and applying the approximation to the decimated GMCI image (paragraphs 75 and 81-83: the MCTF engine analyzes the motion hotspots and determines corresponding locations (e.g., corresponding pixels) of the second image frame that are contained within the motion hotspots. The MCTF engine applies the image transform matrix to the corresponding locations to generate the corrected image frame).
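Claims 4 and 15, discussed above, recite estimating the target's heading from a first and a second gradient of the GMCI output image along two directions. As an illustration only (not taken from the application or the cited art), one common reading of that limitation is taking the heading as the local gradient orientation:

```python
import numpy as np

def estimate_heading(image: np.ndarray, row: int, col: int) -> float:
    """Return a heading angle (radians) from the image gradient at (row, col).

    Uses the gradient along two orthogonal directions, mirroring the
    claimed first-direction and second-direction gradients.
    """
    gy, gx = np.gradient(image.astype(float))   # gradients along rows and columns
    return float(np.arctan2(gy[row, col], gx[row, col]))

# Toy example: a ramp increasing along +x has gradient (1, 0), so heading is 0 rad.
img = np.tile(np.arange(8, dtype=float), (8, 1))
print(estimate_heading(img, 4, 4))  # 0.0
```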
Regarding claim 9, Bandwar, as modified by Lundgren and Paxton, discloses the claimed invention wherein the set of predetermined conditions comprises a speed of the target being within predefined limits (paragraphs 69, 73 and 80).

Regarding claim 10, Bandwar, as modified by Lundgren and Paxton, discloses the claimed invention wherein a signal-to-noise ratio (SNR) of the target in the enhanced image is greater than a SNR of the target in the GMCI output image (paragraph 33: reduce noise).

Regarding claim 11, Bandwar, as modified by Lundgren and Paxton, discloses the claimed invention wherein the information associated with the target comprises at least one of a heading of the target or a speed of the target (paragraphs 69-70).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BOBBAK SAFAIPOUR whose telephone number is (571) 270-1092. The examiner can normally be reached Monday - Friday, 8:00am - 5:00pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Stephen Koziol, can be reached at (408) 918-7630. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/BOBBAK SAFAIPOUR/
Primary Examiner, Art Unit 2665
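The rejection turns on "an approximation to a matched filter" built from the target's peak intensity response. For orientation, here is a generic matched-filter sketch in numpy: cross-correlating a noisy scene with a unit-energy target template. The template shape, scene, and all names are illustrative assumptions, not taken from the application or the cited references.

```python
import numpy as np

def matched_filter_response(image: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Cross-correlate an image with a unit-energy template (a basic matched filter)."""
    t = template.astype(float)
    t /= np.sqrt((t ** 2).sum())                     # unit-energy normalisation
    th, tw = t.shape
    # Pad so the output stays aligned with the input (window centred on each pixel).
    padded = np.pad(image.astype(float),
                    ((th // 2, th // 2), (tw // 2, tw // 2)), mode="edge")
    out = np.empty(image.shape, dtype=float)
    for r in range(image.shape[0]):
        for c in range(image.shape[1]):
            out[r, c] = (padded[r:r + th, c:c + tw] * t).sum()
    return out

# Toy scene: a faint 3x3 target buried in unit-variance noise.
rng = np.random.default_rng(0)
scene = rng.normal(0.0, 1.0, (32, 32))
scene[14:17, 14:17] += 3.0                           # low-contrast target
template = np.ones((3, 3))                           # assumed peak-response template
response = matched_filter_response(scene, template)
peak = np.unravel_index(response.argmax(), response.shape)
print(peak)  # lands inside the 3x3 target region (rows/cols 14-16)
```

The filter output concentrates the target's energy at its location, which is the claimed effect of boosting the target's SNR in the enhanced image relative to the raw image (claim 10).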

Prosecution Timeline

Aug 14, 2023: Application Filed
Jul 27, 2025: Non-Final Rejection — §103
Oct 01, 2025: Interview Requested
Oct 02, 2025: Interview Requested
Oct 22, 2025: Examiner Interview Summary
Oct 22, 2025: Applicant Interview (Telephonic)
Oct 24, 2025: Response Filed
Nov 14, 2025: Final Rejection — §103
Jan 05, 2026: Interview Requested
Jan 13, 2026: Examiner Interview Summary
Jan 13, 2026: Applicant Interview (Telephonic)
Jan 14, 2026: Response after Non-Final Action
Jan 28, 2026: Request for Continued Examination
Jan 31, 2026: Response after Non-Final Action
Feb 06, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597155: TRACKING THREE-DIMENSIONAL GEOMETRIC SHAPES (granted Apr 07, 2026; 2y 5m to grant)
Patent 12597113: FABRIC DEFECT DETECTION METHOD (granted Apr 07, 2026; 2y 5m to grant)
Patent 12591987: System and Method for Simultaneously Registering Multiple Lung CT Scans for Quantitative Lung Analysis (granted Mar 31, 2026; 2y 5m to grant)
Patent 12586140: Automated Property Inspections (granted Mar 24, 2026; 2y 5m to grant)
Patent 12586240: IMAGE PROCESSING APPARATUS AND CONTROL METHOD FOR SAME (granted Mar 24, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 86%
With Interview: 97% (+10.7%)
Median Time to Grant: 2y 8m
PTA Risk: High

Based on 1085 resolved cases by this examiner. Grant probability derived from career allow rate.
