Prosecution Insights
Last updated: April 19, 2026
Application No. 18/425,873

DETERMINING A POSITION OF A COMPONENT OF AN AIRCRAFT LANDING GEAR ASSEMBLY

Status: Non-Final OA (§103)
Filed: Jan 29, 2024
Examiner: GARCIA, PAULO ANDRES
Art Unit: 2669
Tech Center: 2600 — Communications
Assignee: Airbus Operations Limited
OA Round: 1 (Non-Final)

Grant Probability: 83% (Favorable)
OA Rounds: 1-2
To Grant: 3y 2m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 83% (above average; 34 granted / 41 resolved; +20.9% vs TC avg)
Interview Lift: +17.2% among resolved cases with interview
Avg Prosecution: 3y 2m (13 currently pending)
Total Applications: 54 across all art units

Statute-Specific Performance

§101: 16.7% (-23.3% vs TC avg)
§103: 54.3% (+14.3% vs TC avg)
§102: 12.9% (-27.1% vs TC avg)
§112: 10.4% (-29.6% vs TC avg)

Tech Center average comparisons are estimates; based on career data from 41 resolved cases.
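The headline figures above are internally consistent and can be reproduced from the raw counts. Below is a quick sanity check; note that the Tech Center average is back-derived from the stated delta, so it is an estimate rather than a number reported by the dashboard.

```python
# Reproduce the examiner statistics shown above from the raw counts.
granted, resolved = 34, 41

career_allow_rate = granted / resolved          # 0.829... -> displayed as "83%"
tc_delta = 0.209                                # "+20.9% vs TC avg"
implied_tc_avg = career_allow_rate - tc_delta   # back-derived estimate, ~62%

print(f"Career allow rate: {career_allow_rate:.1%}")
print(f"Implied TC average (estimate): {implied_tc_avg:.1%}")
```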

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Notice to Applicants

2. This communication is in response to the application filed on 01/29/2024.

3. Claims 1-19 are pending.

4. Limitations appearing inside {} are intended to indicate the limitations not taught by said prior art(s)/combinations.

Information Disclosure Statement

5. The information disclosure statements (IDS) submitted on 01/29/2024 and 08/12/2024 have been considered by the examiner.

Claim Rejections - 35 USC § 103

6. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

7. Claims 1, 4-11, 14-19 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Publication No. 2022/0332437 to Shapoury et al. (hereinafter Shapoury) and further in view of GB-2563851-A to Garaygay et al. (hereinafter Garaygay).

8.
Regarding Claim 1, Shapoury discloses a computer-implemented method of determining a position of a component of an aircraft landing gear assembly, the method comprising ([Fig. 6], [par. 0025, ln. 2-4] “…an aircraft inspection method that is configured to inspect one or more components of an aircraft before a flight…”, [par. 0044, ln. 1-15] “Inspection and determination of aircraft components may be accomplished using one or more sensors including infrared, visible, and multispectral cameras, hydrocarbon sensors… the mobile robot is configured to use laser range sensors (LIDAR) to detect and identify aircraft components in the vicinity of the mobile robot (such as landing gear) where the physical geometry of these components at a given height above the tarmac (as determined by laser scanning) may be used to identify the components, while the relative distance between the mobile robot and these components may be used to determine the relative location of the mobile robot with respect to the aircraft as a whole.”): obtaining an image of the aircraft landing gear assembly ([Fig. 6], [par. 0081, ln. 1-15] “FIG. 6… diagram of the inspection robot 100 proximate to landing gear 24… shows a rectangular border box 11 in which… 24 is shown. Referring to FIGS. 1-3 and 6, the visual sensor 124b (such as a camera) may be used to acquire image(s) of… 24. The acquired image of… 24 is compared to prior data (such as prior images) of… 24 stored in the central database 204 in order to determine whether portions of… 24 are in alignment, wheels 26 have appropriate tire pressure, and/or the like… 124b may acquire images of the wheels 26, which may differ from those stored in the… 204, thereby indicating an anomaly in tire pressure.”); determining a region of interest within the image, the region of interest containing the component of the aircraft landing gear assembly ([Fig. 8, see 400’], [par. 0086, ln. 
1-5] “The horizontal plane 400 shown represents a path swept by the laser scanners 126a. The horizontal plane 400 includes portions 400′ that intersect the landing gear 24a, 24b, and 24c… 126a are able to detect… 24a, 24b, 24c via the intersecting portions 400′.”, [par. 0087, ln. 1-25] “The emitted laser energy from the laser scanners 126a is used to detect the positions of the landing gear 24a, 24b, and 24c. An optional image processing enhancement may be used to improve the ability to distinguish between landing gear and other objects that may be in the environment. Image processing techniques, such as sliding window scanning approach, may be used with video from an on-board camera along with on- or off-board computation… the inspection robot 100 may travel around… 24a, 24b, and 24c (or simply emit laser energy allowing a sweep angle from one position) to determine the three positions (for example, the centroids of the bounding boxes) of… 24a, 24b, and 24c. The detected positions of… 24a, 24b, and 24c are compared to stored data regarding… 24a, 24b, and 24c, which correlates the detected positions with a center point 25 of each landing gear 24a, 24b, and 24c… the central database 204 (shown in FIG. 3) may include a data map of… 24a, 24b, and 24c that correlates detected positions with the center points 25 of… 24a, 24b, and 24c. By detecting the three positions of… 24a, 24b, 24c, the detected positions may be mapped against data stored in… 204, which correlates the detected positions with the center points 25.”); calculating a coordinate of a centroid of the region of interest ([par. 0087, ln. 1-25]); comparing the coordinate of the centroid to a corresponding predetermined {threshold} coordinate ([par. 0087, ln. 1-25]), and determining a position {of the component of the aircraft landing gear assembly} based on the comparison of the coordinate of the centroid to the corresponding predetermined {threshold} coordinate ([Fig. 9], [par. 0088, ln. 
1-19] “…The center points 25 of the landing gear 24a, 24b, and 24c are correlated with the various components of the aircraft 10 to be inspected. For example, a component map of the aircraft 10 is stored in the central database 204 or the memory 118 (shown in FIG. 1). The three center points 25 of… 24a, 24b, and 24c are registered with the components to be inspected (such as the engines 14)… the systems and methods described herein may be configured to solve for alignment between measured and known values, also referred to as solving the pose problem. In general, the inspection control unit 116 of the inspection robot 100 is able to locate the components to be inspected, and move the inspection robot 100 to the various components by way of the navigation sub-system 114 locating the three center points 25 of… 24a, 24b, and 24c in relation to the inspection robot 100.”).

Shapoury does not specifically disclose a predetermined threshold coordinate, or determining a position of the component of the aircraft landing gear assembly based on the threshold coordinate, though one of ordinary skill in the art, before the effective filing date of the claimed invention, would recognize that Shapoury discloses comparing a coordinate of the centroid of the landing gear to a corresponding predetermined coordinate, and determining a position of the inspection robot based on the comparison of the coordinate to a predetermined coordinate. However, Garaygay teaches a predetermined threshold coordinate ([par. 0053, ln. 1-5] “At step 502, the controller 300 determines whether a predetermined condition is satisfied in relation to a position of the wheel arrangement 202. For example, the predetermined condition may be that the position of the wheel arrangement 202 is greater than a threshold value (e.g. 1.5 degrees or 2 degrees) away from a centre position (i.e. that the magnitude of the angle A shown in figure 2e is greater than a threshold value).”), and determining a position of the aircraft landing gear assembly based on a comparison of the coordinate of the centroid to the predetermined threshold coordinate ([Fig. 3], [par. 0053, ln. 1-5], [par. 0054, ln. 1-8] “If the predetermined condition is satisfied, the controller 300 performs an adjustment process to control the steering system 301 to adjust the position of the wheel arrangement 202 at step 504. This may involve implementing processes as described above in relation to figure 3, for example. The adjustment process may be a centering process to return the wheel arrangement 202 to or near the centre position. The controller 300 may monitor the position of the wheel arrangement 202 based on signals received from the sensors 202a, 202b during the adjustment process and end the process once the wheel arrangement 202 is at or near the centre position.”). One of ordinary skill in the art, before the effective filing date of the claimed invention, would recognize Shapoury and Garaygay as within the same field of aircraft landing gear inspection systems, and as analogous to the claimed invention. The motivation to combine would have been obvious to one of ordinary skill in the art, and is disclosed in Garaygay, wherein malfunctioning and/or misalignment of equipment can be determined by comparison to a predetermined threshold coordinate ([par. 0049, ln. 1-5] “The wheel arrangement 202 is typically centred prior to the NLG 102 being retracted into the landing gear bay 200, under the control of the controller 300, for example. However, the wheel arrangement 202 may move away from the centre position due to, for example, equipment malfunction or accidental use of the steering controller 314 during a flight…”).
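The steps mapped above — locating a region of interest, computing its centroid, and comparing that centroid to a predetermined threshold coordinate to characterize the component's position — can be sketched as follows. This is a hedged illustration only: the bounding-box values, threshold coordinate, tolerance, and position labels are hypothetical and appear in neither the claims nor the cited references.

```python
def centroid(roi):
    """Centroid (x, y) of an axis-aligned region of interest (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = roi
    return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)

def classify_position(roi, threshold_xy, tolerance=5.0):
    """Compare the ROI centroid to a predetermined threshold coordinate.

    If the centroid lies within `tolerance` pixels of the threshold
    coordinate, report the component as at its expected position;
    otherwise flag it as anomalous (e.g. for further inspection).
    """
    cx, cy = centroid(roi)
    tx, ty = threshold_xy
    distance = ((cx - tx) ** 2 + (cy - ty) ** 2) ** 0.5
    return "expected position" if distance <= tolerance else "anomalous position"

# Hypothetical bounding box for a detected landing gear component:
roi = (100, 200, 140, 260)
print(centroid(roi))                        # (120.0, 230.0)
print(classify_position(roi, (121, 232)))   # expected position
print(classify_position(roi, (200, 300)))   # anomalous position
```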
One of ordinary skill in the art, before the effective filing date of the claimed invention, would have combined the method of Shapoury with the predetermined threshold position determination of Garaygay through known means, with no change to their respective function, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the predetermined threshold position determination of Garaygay to obtain the invention as specified in claim 1.

9. Regarding Claim 4, a combination of Shapoury and Garaygay teaches the method of claim 1. Shapoury further discloses wherein the position of the component of the landing gear comprises at least one of: an extended position, a retracted position, an open position, a closed position, a locked position, or an unlocked position ([Fig. 6] see extended and locked landing gear). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the predetermined threshold position determination of Garaygay to obtain the invention as specified in claim 4.

10. Regarding Claim 5, a combination of Shapoury and Garaygay teaches the method of claim 1. Shapoury further discloses wherein the component of the aircraft landing gear assembly comprises at least one of: a torque link, a tyre, a landing gear bay door, or a lock ([Fig. 6] see tyre 26). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the predetermined threshold position determination of Garaygay to obtain the invention as specified in claim 5.

11. Regarding Claim 6, a combination of Shapoury and Garaygay teaches the method of claim 1.
Shapoury further discloses wherein the obtaining the image of the aircraft landing gear assembly comprises obtaining the image using a digital camera ([Fig. 6], see 124b, [par. 0044, ln. 1-15], [par. 0078, ln. 1-11] “The sensing sub-system 110 may also include a visual sensor 124b, such as a multispectral imaging system, thermal imaging system, or other such camera. The visual sensor 124b is configured to acquire one or more images of the engine 14 and/or other components of the aircraft. The acquired images from the visual sensor 124b are compared with stored images or other such data of the component(s) within the central database 204. The acquired images and the stored data within the central database 204 are analyzed by the monitoring control unit 202 to determine whether or not anomalies exist therebetween.”). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the predetermined threshold position determination of Garaygay to obtain the invention as specified in claim 6.

12. Regarding Claim 7, a combination of Shapoury and Garaygay teaches the method of claim 1. Shapoury further discloses wherein the obtaining the image of the aircraft landing gear assembly comprises scanning the aircraft landing gear assembly with an imaging device to determine a set of position data points indicative of a distance that features of the aircraft landing gear assembly are from the imaging device ([par. 0044, ln. 1-15], [par. 0082, ln. 1-23] “The laser scanners 126a are used to navigate the inspection robot 100 between various locations and avoid collisions with other objects. The laser scanner(s) 126b are used to detect distances to the engine 14 and/or surface features of the engine 14. The laser scanners 126a and/or 126b may be used for navigation and sensing of features of the aircraft 10.
That is, the laser scanners 126a and/or 126b may form part of the navigation sub-system 114 and the sensing sub-system 110. Optionally, the laser scanners 126a and 126b may be used exclusively by the navigation sub-system 114. Also, optionally, the location estimate may be continuously refined over time by using multiple scans acquired as the inspection robot 100 moves through an environment.”, [par. 0083, ln. 1-7] “The laser scanners 126a and 126b may be part of a LIDAR navigation sub-system. The inspection robot 100 may include general purpose camera for navigation work lights, and/or the like. The laser scanners 126a and 126b may be used in relation to three-dimensional optical template matching or laser-based triangulation to locate various components and features of the aircraft 10.”, [par. 0086, ln. 1-5], [par. 0087, ln. 1-25], [par. 0088, ln. 1-19]). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the predetermined threshold position determination of Garaygay to obtain the invention as specified in claim 7.

13. Regarding Claim 8, a combination of Shapoury and Garaygay teaches the method of claim 7. Shapoury further discloses wherein the imaging device is a lidar system and the step of determining the set of position data points comprises scanning the aircraft landing gear using the lidar system to determine the set of position data points ([par. 0044, ln. 1-15], [par. 0082, ln. 1-23], [par. 0083, ln. 1-7], [par. 0086, ln. 1-5], [par. 0087, ln. 1-25], [par. 0088, ln. 1-19]). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the predetermined threshold position determination of Garaygay to obtain the invention as specified in claim 8.

14.
Regarding Claim 9, a combination of Shapoury and Garaygay teaches the method of claim 1. Shapoury further discloses providing an indication of the position of the component of the aircraft landing gear assembly to an operator ([par. 0065, ln. 14-23] “The monitoring control unit 202 compares the acquired inspection data from the sensing sub-system 110 with the reference data stored in the central database 204 to determine if anomalies exist between the acquired inspection data and the stored reference data. If anomalies do not exist… 202 determines that the components have passed the inspection tests. If, however, anomalies do exist… 202 outputs an alert signal indicating that one or more components of the aircraft may need further inspection.”, [par. 0066, ln. 1-8] “The central monitoring station 200 outputs inspection test results to an operator of aircraft… 202 outputs inspection test results via the communication device 206 to a flight computer and/or a handheld device (for example, a smart phone, tablet, or the like) indicating the results of the various pre-flight inspection tests of the various components of the aircraft.”, [par. 0067, ln. 1-16] “…the inspection robot 100 outputs the inspection data to the central monitoring station as the inspection data is acquired from the sensor(s) 124… 100 may store the inspection data within the memory 118 until all of the pre-flight inspection checks and tests performed by… 100 are completed, and then output the inspection data covering all of the pre-flight inspection checks and tests to the central monitoring station 200… 100 may include the central database 204 (such as within the memory 118), or at least a portion therefore relating to a particular aircraft, which allows… 100 itself to compare the acquired inspection data to stored data regarding the aircraft and output test results to the operator of the aircraft.”). 
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the predetermined threshold position determination of Garaygay to obtain the invention as specified in claim 9.

15. Regarding Claim 10, a combination of Shapoury and Garaygay teaches the method of claim 1. Shapoury further discloses controlling operation of an {aircraft} system based on the position of the component of the landing gear assembly ([Fig. 9], [par. 0082, ln. 1-23], [par. 0083, ln. 1-7], [par. 0088, ln. 1-19]). Specifically, one of ordinary skill in the art, before the effective filing date of the claimed invention, would recognize that Shapoury controls the operation of the robot system based on the positions of the component of the landing gear assembly, though Shapoury does not specifically disclose a controlling operation for an aircraft. However, Garaygay teaches a controlling operation of an aircraft system based on the position of the component of the landing gear assembly ([par. 0054, ln. 1-8]). The motivation to combine would have been obvious to one of ordinary skill in the art, and is analogous to that provided in claim 1, in that misalignment can be corrected using a controlling operation ([par. 0049, ln. 1-5]). Further, one of ordinary skill in the art, before the effective filing date of the claimed invention, would recognize that this allows for on-the-fly correction wherein immediate action is not required or permitted to fix a misaligned or anomalous component of the landing gear assembly.
One of ordinary skill in the art, before the effective filing date of the claimed invention, would have combined the method of Shapoury with the predetermined threshold position determination and aircraft controlling operations of Garaygay through known means, with no change to their respective function, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the predetermined threshold position determination and aircraft controlling operations of Garaygay to obtain the invention as specified in claim 10.

16. Regarding Claim 11, a combination of Shapoury and Garaygay teaches the method of claim 1. Shapoury discloses based on the position of the component of the aircraft landing gear assembly, determining a remedial action to be taken ([par. 0065, ln. 14-23], [par. 0066, ln. 1-8], [par. 0067, ln. 1-16]) and providing an indication of the remedial action to an operator ([par. 0065, ln. 14-23], [par. 0066, ln. 1-8], [par. 0067, ln. 1-16]). Specifically, the examiner notes that the broadest reasonable interpretation of “remedial action” would encompass the indication to an operator for “further inspection” as disclosed in Shapoury. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the predetermined threshold position determination of Garaygay to obtain the invention as specified in claim 11.

17. Regarding Claim 14, the claim language is analogous to claim 1, with the exception of “An aircraft controller configured to:”. Shapoury does not disclose an aircraft controller.
However, one of ordinary skill in the art, before the effective filing date of the claimed invention, would recognize that Garaygay discloses an aircraft controller configured to operate an analogous inspection of a landing gear assembly to that in Shapoury ([Garaygay, Fig. 3], [Garaygay, par. 0053, ln. 1-5], [Garaygay, par. 0054, ln. 1-8]). Specifically, the motivation to combine is analogous to claims 1 and 10 ([Garaygay, par. 0049, ln. 1-5]). One of ordinary skill in the art, before the effective filing date of the claimed invention, would have combined the method of Shapoury with the aircraft controller and predetermined threshold position determination of Garaygay through known means, with no change to their respective function, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the aircraft controller and the predetermined threshold position determination of Garaygay to obtain the invention as specified in claim 14.

18. Regarding Claim 15, the claim language is analogous to claims 1 and 14. Specifically, Shapoury discloses “A system for determining a position of a component of an aircraft landing gear assembly, the system comprising: an imaging device” ([par. 0044, ln. 1-15]), and does not specifically disclose “an aircraft controller configured to:”. However, one of ordinary skill in the art, before the effective filing date of the claimed invention, would recognize that Garaygay discloses an aircraft controller configured to operate an analogous inspection of a landing gear assembly to that in Shapoury ([Garaygay, Fig. 3], [Garaygay, par. 0053, ln. 1-5], [Garaygay, par. 0054, ln. 1-8]). The motivation to combine is analogous to claims 1 and 10 ([Garaygay, par. 0049, ln. 1-5]).
One of ordinary skill in the art, before the effective filing date of the claimed invention, would have combined the method of Shapoury with the aircraft controller and predetermined threshold position determination of Garaygay through known means, with no change to their respective function, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the system of Shapoury with the aircraft controller and the predetermined threshold position determination of Garaygay to obtain the invention as specified in claim 15.

19. Regarding Claim 16, a combination of Shapoury and Garaygay teaches the system of claim 15. Rejections analogous to claim 6 are further applicable to claim 16 in view of the system of the combination of Shapoury and Garaygay. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the system of Shapoury with the aircraft controller and the predetermined threshold position determination of Garaygay to obtain the invention as specified in claim 16.

20. Regarding Claim 17, a combination of Shapoury and Garaygay teaches the system of claim 15. Rejections analogous to claims 7 and 8 are further applicable to claim 17 in view of the system of the combination of Shapoury and Garaygay. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the system of Shapoury with the aircraft controller and the predetermined threshold position determination of Garaygay to obtain the invention as specified in claim 17.

21. Regarding Claim 18, rejections analogous to claim 15 are further applicable to claim 18.
Specifically, Shapoury discloses a non-transitory computer-readable storage medium storing instructions that, when executed by a controller, cause the controller to carry out the computer-implemented method ([par. 0074, ln. 3-10] “It is to be understood that the processing or control units may represent circuits, circuitry, or portions thereof that may be implemented as hardware with associated instructions (e.g., software stored on a tangible and non-transitory computer readable storage medium, such as a computer hard drive, ROM, RAM, or the like) that perform the operations described herein.”). Shapoury does not disclose wherein the controller is an aircraft controller. However, Garaygay discloses wherein the controller can be an aircraft controller ([Garaygay, Fig. 3], [Garaygay, par. 0053, ln. 1-5], [Garaygay, par. 0054, ln. 1-8]). The motivation to combine is analogous to claims 1 and 10 ([Garaygay, par. 0049, ln. 1-5]). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the non-transitory computer-readable storage medium of Shapoury with the aircraft controller and the predetermined threshold position determination of Garaygay to obtain the invention as specified in claim 18.

22. Regarding Claim 19, a combination of Shapoury and Garaygay teaches the aircraft controller of claim 14. Garaygay teaches wherein the aircraft controller is part of the aircraft ([par. 0046, ln. 1-5] “The steering controller 314 is typically located in a flight deck of the aircraft 100 and enables, for example, manual control of the steering system 301. The steering controller 314 may comprise a wheel, tiller or joystick, for example, with mechanical, electrical or hydraulic connections transmitting the steering controller 314 input movement to a steering control unit.”, [par. 0047, ln.
1-3] “The controller 300 is typically a computerized device, and may form part of an avionic system of the aircraft 100, for example. The controller is connected via wired or unwired electronic connections to the meter valve 306 and the sensors 302a, 302b.”). The motivation to combine is analogous to claims 1 and 10. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the non-transitory computer-readable storage medium of Shapoury with the aircraft controller, the predetermined threshold position determination, and the aircraft of Garaygay to obtain the invention as specified in claim 19.

23. Claims 2-3 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Publication No. 2022/0332437 to Shapoury, in view of GB-2563851-A to Garaygay, and further in view of “A Real-Time Wrong-Way Vehicle Detection Based on YOLO and Centroid Tracking” to Rahman et al. (hereinafter Rahman).

24. Regarding Claim 2, a combination of Shapoury and Garaygay teaches the method of claim 1. Shapoury teaches obtaining a plurality of images of the aircraft landing gear assembly in operation ([Fig. 5-8] see deployed landing gear supporting aircraft frame, [par. 0044, ln. 1-15]) {and tracking movement of the region of interest shown in the plurality of images}. Specifically, the examiner notes that the broadest reasonable interpretation of “in operation” would encompass the continued deployment of the landing gear to support an aircraft, as opposed to only deployment/stowing or traversal of the landing gear. However, should a narrower interpretation of “in operation” be taken that only encompasses the active deployment/stowing or traversal using the landing gear, the examiner notes that Garaygay specifically teaches wherein the sensors operate to determine the threshold during operation of the landing gear ([Fig. 3], [par. 0049, ln. 1-5], [par. 0054, ln. 1-8]).
The motivation to combine the “in operation” sensing of Garaygay with the method of Shapoury is specifically disclosed in Garaygay, wherein it allows for active adjustment during operation using the landing gear to avoid misalignment ([par. 0049, ln. 1-5], [par. 0054, ln. 1-8]). One of ordinary skill in the art, before the effective filing date of the claimed invention, would have combined the method of Shapoury with the in-operation predetermined threshold position determination of Garaygay through known means, with no change to their respective function, and the combination would have yielded nothing more than predictable results. A combination of Shapoury and Garaygay, however, does not specifically disclose tracking movement of the region of interest shown in the plurality of images. Rahman specifically teaches tracking movement of the region of interest shown in the plurality of images ([pg. 916, col. 2, par. 2, ln. 1-11] “The system we present here has three stages. In the first stage, every vehicle in the video frame is detected using the YOLO object detection algorithm and a bounding box is generated for each detected vehicle. Then, the bounding boxes are fed to the centroid based moving object tracking algorithm. The algorithm tracks each vehicle independently in a specified region of interest (ROI). Finally, the direction of the vehicle is determined by calculating its centroid's height in each frame and detect whether it moves in the wrong direction or not. If the vehicle is on the wrong side, then the system will capture an image of the vehicle.”, [pg. 917, col. 2, B. Vehicle Tracking, par. 1, ln. 1 to pg. 918, col. 1, par. 1, ln. 13] “To track each vehicle, we use the centroid tracking algorithm. This algorithm takes the bounding box as the input. So first, the bounding boxes are generated using YOLO. Then, those boxes are fed to the centroid tracker.
When the center of each vehicle that means the center of the corresponding bounding box enters the region of interest, it is given a unique identification number which is shown in Fig. 4a. In the next frame, the center of all the objects move in another place or maybe not have any movement which is shown in Fig. 4b. The centroid tracking algorithm is based on an assumption which is that each object will move very little in between the subsequent frame. So, if we can relate any new centroid which has the minimum distance with an old centroid, we can say that this object is previously identified and the new centroid of that object will be updated. This is shown in Fig. 5a. To do this, all possible Euclidean distance between each pair of the new centroids (yellow color) and the old centroids (red color) are computed…”). One of ordinary skill in the art, before the effective filing date of the claimed invention, would recognize Shapoury and Rahman as within the same field of ROI detection for vehicles, and as analogous to the claimed invention. The motivation to combine the combination of Shapoury and Garaygay with the tracking of the region of interest in Rahman would have been obvious to one of ordinary skill in the art, and is disclosed in Rahman, in that it allows for determination of erroneous and/or unwanted movement within a series of images ([pg. 916, col. 2, par. 2, ln. 1-11], [pg. 918, col. 2, C. Wrong-way vehicle detection, par. 1, ln. 1 to pg. 919, col. 1, par. 1, ln. 1-14] “The final step of our proposed system is to detect the wrongway vehicle. Our system already can track each vehicle that is in our specified region of interest. The centroid of every tracked vehicle has a height from the top of the frame. When the vehicle is registered first and given an identity number, the height of the centroid H1 is computed and stored in a file corresponding to the identity number.
In the next frame, the height of the centroid H2 is computed and stored in another file along with its identity number. This H2 will be updated in each consecutive frame. If the vehicles move, the H1 and H2 of a vehicle will not be equal. By comparing these two heights, our system will predict the direction of the vehicle. In our system, we defined that if the vehicle moves away from the camera, it will be detected as a wrongway vehicle. So, if H1 < H2 then, the vehicle is coming towards the camera and is in the right way. If otherwise, our system will detect it as a wrong-way vehicle. The opposite can also be defined just by changing the condition. After the detection of such a vehicle, an image of the frame will be captured automatically for further inspection. The flowchart of the whole system is shown in Fig. 7.”). Specifically, one of ordinary skill in the art, before the effective filing date of the claimed invention, would recognize that by tracking landing gear in an analogous manner, one could detect erroneous movement and/or deployment based on the centroids already disclosed in Shapoury (e.g., landing gear getting stuck in a different position during deployment). One of ordinary skill in the art, before the effective filing date of the claimed invention, would have combined the method of Shapoury with the in-operation predetermined threshold position determination of Garaygay, and further combined the combination of Shapoury and Garaygay with the region of interest tracking of Rahman through known means, with no change to their respective functions, and the combination would have yielded nothing more than predictable results. 
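The centroid-tracking and direction-detection scheme that the rejection quotes from Rahman can be sketched as follows. This is an illustrative, hypothetical implementation only: the greedy nearest-neighbor matching, the `max_dist` gating threshold, and all function and variable names are simplifying assumptions, not details taken from the cited paper.

```python
from math import hypot

def track_centroids(prev, curr, max_dist=50.0):
    """Greedy nearest-centroid matching between two frames.

    prev: dict {track_id: (x, y)} of centroids from the previous frame.
    curr: list of (x, y) centroids detected in the current frame.
    Returns an updated {track_id: (x, y)} dict; unmatched detections
    get fresh ids, mirroring Rahman's register-on-entry step.
    """
    next_id = max(prev, default=-1) + 1
    updated, unmatched = {}, list(curr)
    # Relate each old centroid to the new centroid at minimum
    # Euclidean distance, per the quoted assumption that objects
    # move very little between subsequent frames.
    for tid, (px, py) in prev.items():
        if not unmatched:
            break
        nearest = min(unmatched, key=lambda c: hypot(c[0] - px, c[1] - py))
        if hypot(nearest[0] - px, nearest[1] - py) <= max_dist:
            updated[tid] = nearest
            unmatched.remove(nearest)
    # Register remaining detections as newly identified objects.
    for c in unmatched:
        updated[next_id] = c
        next_id += 1
    return updated

def moving_away(h1, h2):
    """Rahman's direction rule: heights are measured from the top of
    the frame; H1 < H2 means approaching the camera (right way),
    otherwise the object is flagged as moving the wrong way."""
    return not (h1 < h2)
```

A landing-gear analogue, as the rejection suggests, would flag a centroid whose frame-to-frame trajectory stalls or reverses during deployment.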
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the in-operation predetermined threshold position determination of Garaygay, and further combine the combination of Shapoury and Garaygay with the region of interest tracking of Rahman to obtain the invention as specified in claim 2. 25. Regarding Claim 3, a combination of Shapoury, Garaygay, and Rahman teaches the method of claim 2. Shapoury and Garaygay do not specifically disclose that tracking movement of the region of interest comprises tracking movement of the coordinate of the centroid between the plurality of images. However, Rahman specifically teaches wherein the tracking movement of the region of interest comprises tracking movement of the coordinate of the centroid between the plurality of images ([pg. 916, col. 2, par. 2, ln. 1-11], [pg. 918, col. 2, C. Wrong-way vehicle detection, par. 1, ln. 1 to pg. 919, col. 1, par. 1, ln. 1-14]). The motivation to combine remains analogous to claim 2, wherein tracking allows for determination of erroneous and/or unwanted movement within a series of images ([pg. 916, col. 2, par. 2, ln. 1-11], [pg. 918, col. 2, C. Wrong-way vehicle detection, par. 1, ln. 1 to pg. 919, col. 1, par. 1, ln. 1-14]). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the in-operation predetermined threshold position determination of Garaygay, and further combine the combination of Shapoury and Garaygay with the region of interest tracking of Rahman to obtain the invention as specified in claim 3. 26. Claims 12-13 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Publication No. 
2022/0332437 to Shapoury, in view of GB-2563851-A to Garaygay, and further in view of “Research on Algorithm of fighter landing gear in bad video image” to Miao Tang et al. (hereinafter Tang). 27. Regarding Claim 12, a combination of Shapoury and Garaygay teaches the method of claim 1. Shapoury and Garaygay do not specifically disclose performing image pre-processing to the image of the aircraft landing gear assembly to improve clarity of the image. However, Tang discloses performing image pre-processing to the image of the aircraft landing gear assembly to improve clarity of the image ([pg. 6, Fig. 2], [pg. 3, 2. Image enhancement algorithm, par. 1, ln. 1 to par. 4, ln. 4] “At present, several typical algorithms to change the image quality by adjusting the pixel value of the image mainly include histogram equalization, multi-scale retinal enhancement with color recovery (MSRCR), automatic color equalization (ACE) and so on. This paper mainly uses ACE to enhance the video image of fighter landing. ACE corrects the pixel value by calculating the difference between the target point and other pixel points to obtain the brightness difference information between pixel points. Firstly, the color difference correction of the image is completed by adjusting the color domain / spatial domain of the image, and the spatial domain reconstructed image is obtained. The calculation formula is R_c(p) = Σ_{j ∈ subject, j ≠ p} r(I_c(p) − I_c(j)) / d(p, j) (1), where R_c(p) is the intermediate result, I_c(p) − I_c(j) is the brightness difference of the two points, d(p, j) represents the distance measurement function, and r(*) is the brightness expression function; it needs to be an odd function. This step can adapt to the local image contrast: r(*) can enlarge small differences, enrich large differences, and expand or compress the dynamic range according to local content. 
In general, r(*) is: r(x) = −1 for x < −T; x/T for −T ≤ x ≤ T; 1 for x > T (2). Then the corrected image is dynamically expanded. Finally, the three RGB channels are linearly expanded and dynamically stretched to obtain the final image. ACE can better process image details, realize color correction and improve image brightness and contrast, and has a good image enhancement effect.”). One of ordinary skill in the art, before the effective filing date of the claimed invention, would recognize Shapoury and Tang as within the same field of aircraft landing gear image sensing, and as analogous to the claimed invention. The motivation to combine is obvious to one of ordinary skill in the art, and is further disclosed in Tang, wherein applying pre-processing improves the image quality and allows for better accuracy ([pg. 3, 2. Image enhancement algorithm, par. 1, ln. 1 to par. 4, ln. 4], [pg. 7, Table 2], [pg. 7, Fig. 3], [pg. 7, par. 5, ln. 1-7] “As can be seen from table 1, compared with the original image, the accuracy of the image enhanced by ACE is improved by 1.19%, the recall is improved by 4.17%, and the map is improved by 1.08%. It can be seen from the comparison diagram of detection effect in Figure 3 that the front landing gear of the fighter is missed in Figure 3 (a) and (b), which is mainly due to the loss and distortion of the front landing gear caused by the change of flight angle during the movement of the fighter; The rear landing gear can be well detected. The enhanced detection probability is higher and the detection effect is better.”). 
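The ACE formulation that Tang describes (Eqs. (1)-(2)) amounts to, for each pixel, summing distance-weighted saturated brightness differences against every other pixel, then linearly stretching the result. A minimal single-channel sketch follows; it assumes r(x) is the saturating odd function clip(x/T, −1, 1) and that d(p, j) is Euclidean pixel distance, and it is an O(N²) toy for illustration rather than an implementation of Tang's code.

```python
import numpy as np

def ace_enhance(img, T=20.0):
    """Toy single-channel ACE following Eqs. (1)-(2):
    R(p) = sum over j != p of r(I(p) - I(j)) / d(p, j),
    followed by a linear stretch of R back to [0, 255]."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([ys.ravel(), xs.ravel()], axis=1).astype(float)
    vals = img.ravel().astype(float)

    def r(x):
        # Eq. (2): odd, saturating at +/-1, slope 1/T near zero (assumed form)
        return np.clip(x / T, -1.0, 1.0)

    R = np.empty_like(vals)
    for i in range(vals.size):
        # Euclidean distance from pixel i to every pixel (d(p, j))
        d = np.hypot(coords[:, 0] - coords[i, 0], coords[:, 1] - coords[i, 1])
        d[i] = np.inf  # exclude the j == p term from the sum
        R[i] = np.sum(r(vals[i] - vals) / d)
    # Final dynamic stretch to the displayable range.
    R = (R - R.min()) / (R.max() - R.min() + 1e-12) * 255.0
    return R.reshape(h, w)
```

On a tiny checkerboard `np.array([[0., 255.], [255., 0.]])`, the bright pixels stretch to the top of the range and the dark ones to the bottom, which is the contrast-spreading behavior the quoted passage attributes to ACE.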
One of ordinary skill in the art, before the effective filing date of the claimed invention, would have combined the method of Shapoury with the predetermined threshold position determination of Garaygay, and further combined the combination of Shapoury and Garaygay with the image pre-processing of Tang, through known means, with no change to their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the predetermined threshold position determination of Garaygay and the image pre-processing of Tang to obtain the invention as specified in claim 12. 28. Regarding Claim 13, a combination of Shapoury and Garaygay teaches the method of claim 1. Shapoury and Garaygay do not specifically disclose wherein determining the region of interest comprises using a machine learning algorithm. However, Tang teaches wherein determining the region of interest comprises using a machine learning algorithm ([pg. 4, Fig. 1], [pg. 4, 3.2. K-means clustering adjusts a priori box size, par. 1, ln. 1 to pg. 5, par. 1, ln. 3] “The proposal of a priori frame transforms target detection into two problems at the same time, that is, whether there is a target in the fixed grid and judging the distance between the prediction frame and the real frame. The size of the a priori frame is very important for the YOLOv4 target detection network. The original YOLOv4 a priori frame is clustered on the VOC dataset. However, the type of target to be detected on the video image data set of fighter landing is only the landing gear. The direct use of the original a priori frame can not meet the recognition requirements. 
In order to improve the detection accuracy of landing gear, The K-means clustering algorithm is re used to cluster the width and height of the data set in this paper, and a new a priori frame size is obtained. Firstly, cluster the labeled data sets, continuously increase the number of cluster centers, and calculate the distance from each data to each cluster center through iteration. The distance calculation formula is d = 1 − IoU (3). The distance function is used as the optimization criterion function for clustering until the criterion function reaches the minimum, and the clustering ends. This will improve the coincidence rate between the prediction boundary box and the marking box of the real target to a certain extent.”, [pg. 5, 3.3. Box regression function, par. 1, ln. 1-6] “Yolov4 uses the Complete-IoU (CIoU) [9] as the box regression function. CIoU can obtain better convergence speed and accuracy on the rectangular box regression problem. CIoU takes into account the distance between the target and the center point, overlapping area, aspect ratio and penalty term, so that the regression of the target frame becomes more stable, and the function of the penalty term is to control the width and height of the prediction frame to be close to the width and height of the real frame as quickly as possible.”). The motivation to combine would have been obvious to one of ordinary skill in the art, and is disclosed in Tang, wherein it automates the process of landing gear detection, reduces human error, and improves detection accuracy ([pg. 3, par. 3, ln. 1-4] “Therefore, this paper combines automatic color equalization [6] (ACE) in image enhancement with YOLOv4 for fighter landing gear detection, so as to reduce the probability of human error and improve the detection accuracy.”). 
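The anchor-size clustering that the quoted Tang passage describes (K-means over box widths and heights, using the Eq. (3) distance d = 1 − IoU) can be sketched as follows. The origin-anchored IoU for (w, h) pairs and the plain mean update are standard simplifications in this kind of anchor clustering, and all names are illustrative rather than taken from the paper.

```python
import random

def iou_wh(a, b):
    """IoU of two boxes given only (w, h), both anchored at the origin,
    as is conventional when clustering anchor sizes."""
    inter = min(a[0], b[0]) * min(a[1], b[1])
    union = a[0] * a[1] + b[0] * b[1] - inter
    return inter / union

def kmeans_anchors(boxes, k, iters=100, seed=0):
    """K-means over (w, h) pairs using the Eq. (3) distance
    d = 1 - IoU instead of Euclidean distance.

    boxes: list of (w, h) tuples from the labeled dataset.
    Returns k anchor (w, h) sizes.
    """
    rng = random.Random(seed)
    centers = rng.sample(boxes, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for b in boxes:
            # Assign each box to the center minimizing d = 1 - IoU.
            ci = min(range(k), key=lambda c_i: 1.0 - iou_wh(b, centers[c_i]))
            clusters[ci].append(b)
        # Update each center to the mean width/height of its cluster.
        new_centers = [
            (sum(w for w, _ in c) / len(c), sum(h for _, h in c) / len(c))
            if c else centers[i]
            for i, c in enumerate(clusters)
        ]
        if new_centers == centers:
            break
        centers = new_centers
    return centers
```

For example, clustering `[(10, 10), (11, 9), (100, 100), (95, 105)]` with k = 2 separates the small and large boxes into two anchors, which is the effect the passage attributes to re-clustering the fighter landing-gear dataset.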
One of ordinary skill in the art, before the effective filing date of the claimed invention, would have combined the method of Shapoury with the predetermined threshold position determination of Garaygay, and further combined the combination of Shapoury and Garaygay with the machine learning region of interest determination of Tang, through known means, with no change to their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the method of Shapoury with the predetermined threshold position determination of Garaygay and the machine learning region of interest determination of Tang to obtain the invention as specified in claim 13. Conclusion 29. The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure. See PTO-892. Any inquiry concerning this communication or earlier communications from the examiner should be directed to PAULO ANDRES GARCIA whose telephone number is (703)756-5493. The examiner can normally be reached Mon-Fri, 8-4:30PM ET. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chan Park, can be reached at (571)272-7409. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. 
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PAULO ANDRES GARCIA/
Examiner, Art Unit 2669
/CHAN S PARK/
Supervisory Patent Examiner, Art Unit 2669

Prosecution Timeline

Jan 29, 2024
Application Filed
Jan 23, 2026
Non-Final Rejection — §103
Mar 31, 2026
Interview Requested
Apr 13, 2026
Applicant Interview (Telephonic)
Apr 13, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602823
RE-LOCALIZATION OF ROBOT
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12597280
IMAGE PROCESSING SYSTEM, IMAGE PROCESSING METHOD, AND PROGRAM
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12597161
SYSTEMS AND METHODS FOR OBJECT TRACKING AND LOCATION PREDICTION
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12586400
IMAGE PROCESSING APPARATUS, CONTROL METHOD THEREOF, AND STORAGE MEDIUM
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12586176
SYSTEMS AND METHODS FOR PREDICTING AN INCOMING ROTATIONAL BALANCE OF AN UNFINISHED WORKPIECE
Granted Mar 24, 2026 (2y 5m to grant)
Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
83%
Grant Probability
99%
With Interview (+17.2%)
3y 2m
Median Time to Grant
Low
PTA Risk
Based on 41 resolved cases by this examiner. Grant probability derived from career allow rate.
