Prosecution Insights
Last updated: April 19, 2026
Application No. 18/334,566

AIRCRAFT MAINTENANCE SYSTEM AND METHODS

Final Rejection: §101, §103
Filed: Jun 14, 2023
Examiner: ZHANG, WAYNE
Art Unit: 2672
Tech Center: 2600 — Communications
Assignee: Honeywell International Inc.
OA Round: 2 (Final)
Grant Probability: 50% (Moderate)
OA Rounds: 3-4
To Grant: 3y 3m
With Interview: 94%

Examiner Intelligence

Career Allow Rate: 50% (8 granted / 16 resolved; -12.0% vs TC avg)
Interview Lift: +43.6% (strong; resolved cases with vs. without interview)
Avg Prosecution: 3y 3m (typical timeline; 22 currently pending)
Total Applications: 38 (career history, across all art units)
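As a quick consistency check of the card figures above (assuming the dashboard computes allow rate as grants divided by resolved cases, and interview lift as the percentage-point gap between the with-interview and without-interview allowance rates; these definitions are inferred, not documented on the page):

```python
# Inferred dashboard arithmetic; the definitions of "allow rate" and
# "interview lift" below are assumptions, not taken from the tool.

granted, resolved = 8, 16
allow_rate = 100 * granted / resolved           # career allow rate
print(f"Career allow rate: {allow_rate:.0f}%")  # -> Career allow rate: 50%

# If the 94% "With Interview" figure and the +43.6 lift are both in
# percentage points, the implied without-interview rate follows:
with_interview = 94.0
interview_lift = 43.6
without_interview = with_interview - interview_lift
print(f"Implied without-interview rate: {without_interview:.1f}%")  # -> 50.4%
```

The implied without-interview rate (about 50%) matching the career allow rate suggests most of this examiner's resolved cases did not involve an interview.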

Statute-Specific Performance

§101: 19.2% (-20.8% vs TC avg)
§103: 42.4% (+2.4% vs TC avg)
§102: 11.0% (-29.0% vs TC avg)
§112: 25.1% (-14.9% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 16 resolved cases
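The per-statute deltas can be sanity-checked against the implied Tech Center baselines (assuming each "vs TC avg" figure is the examiner's rate minus the TC average, in percentage points; this convention is an assumption):

```python
# Back out the Tech Center average implied by each statute's overcome
# rate and its "vs TC avg" delta (delta assumed = examiner - TC avg).

stats = {
    "§101": (19.2, -20.8),
    "§103": (42.4, +2.4),
    "§102": (11.0, -29.0),
    "§112": (25.1, -14.9),
}
for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta
    print(f"{statute}: examiner {rate}% -> implied TC avg {tc_avg:.1f}%")
```

All four statutes back out to the same ~40.0% baseline, which suggests the tool may be comparing against a single Tech Center average rather than statute-specific ones.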

Office Action

Rejections: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed 09/25/2025 have been fully considered but they are not persuasive.

On page 9 of Remarks, the Applicant states: “Applicant respectfully submits that the claimed invention cannot be grouped as a mental process as alleged in the Office Action, because the claimed invention cannot be performed in the human mind. The MPEP defines the mental processes grouping as ‘concepts performed in the human mind, and examples of mental processes include observations, evaluations, judgments, opinions.’ MPEP § 2106.04(a)(2)(III). It further notes that ‘[c]laims do not recite a mental process when they do not contain limitations that can practically be performed in the human mind, for instance when the human mind is not equipped to perform the claim limitation.’ Id. Here, Applicant submits that the claimed invention cannot be performed in the human mind, due to, for example, requirements such as, inter alia, ‘generating, using the image data, isolated aircraft feature image data indicative of the aircraft, wherein generating the isolated aircraft feature image data comprises removing the environmental feature from the image data.’ That is, while a human mind may be able to focus on particular portions of an image, it is not possible for the human mind to remove particular features (e.g., environmental features) of an image. Accordingly, the claims do not recite a judicial exception and thus do not recite an abstract idea as alleged in the Office Action.”

The Applicant states that while a human mind is able to focus in on a portion of an image, it is not possible for the human mind to remove a particular feature, as recited in the amended claims.
However, MPEP 2106.04(a)(2) states: “The courts do not distinguish between mental processes that are performed entirely in the human mind and mental processes that require a human to use a physical aid (e.g., pen and paper or a slide rule) to perform the claim limitation. See, e.g., Benson, 409 U.S. at 67, 65, 175 USPQ at 674-75, 674 (noting that the claimed ‘conversion of [binary-coded decimal] numerals to pure binary numerals can be done mentally,’ i.e., ‘as a person would do it by head and hand.’); Synopsys, Inc. v. Mentor Graphics Corp., 839 F.3d 1138, 1139, 120 USPQ2d 1473, 1474 (Fed. Cir. 2016) (holding that claims to a mental process of ‘translating a functional description of a logic circuit into a hardware component description of the logic circuit’ are directed to an abstract idea, because the claims ‘read on an individual performing the claimed steps mentally or with pencil and paper’). Mental processes performed by humans with the assistance of physical aids such as pens or paper are explained further below with respect to point B.”

The MPEP does not distinguish between a mental process performed entirely in the human mind and one performed with the use of a physical aid. A human with the assistance of a physical aid in this scenario, such as a pair of scissors, is able to cut an image and remove an environmental feature from it. Accordingly, the amended limitation does not overcome the abstract idea/mental process grouping.

On page 10 of Remarks, the Applicant states: “Moreover, Applicant respectfully submits that even assuming, arguendo, the claims were deemed to recite an abstract idea under the first prong of the Step 2A analysis, the claims, as a whole, integrate the elements of the independent claims into a practical application under the second prong of the Step 2A analysis. First, the elements of the independent claims include specific additional elements that integrate any alleged abstract idea into a practical application.
For example, the elements of the independent claims recite one or more computing devices that a maintenance plan is provided to and using a deep neural network to determine one or more anomalies. Second, the generation of isolated aircraft feature image data by removing an environmental feature from image data reflects a transformation or reduction of a particular article (e.g., an original image) to a different state or thing (e.g., isolated aircraft features). Further, Applicant respectfully submits that even if assuming, arguendo, the claims were deemed directed to the abstract idea alleged in the Office Action, the specific claim elements recited would constitute ‘significantly more’ than the alleged abstract idea such that the claims recite an ‘inventive concept’ over the alleged abstract idea. Under Step 2B of the Alice/Mayo framework, the examiner should evaluate the additional elements individually and in combination to determine whether they provide an inventive concept, i.e., whether the claims recite significantly more than the judicial exception to which it is directed. See MPEP 2106.”

The Applicant states that, under the assumption the claims recite an abstract idea, they would still be integrated into a practical application by, for example, providing a maintenance plan to a computing device, using a neural network to determine one or more anomalies, and generating isolated data by removing an environmental feature from the image data. However, providing a maintenance plan to a computing device is a generically recited insignificant extra-solution activity of data outputting. A person can mentally determine an anomaly or defect in isolated image data, and the neural network amounts to adding the words “apply it” (or an equivalent) to the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. See MPEP 2106.05(f).
Generating isolated data by removing an environmental feature from the image data is a form of segmentation, because it involves finding the piece of an image that the user wants and, by nature, removing everything else around it. Thus, this limitation is an additional element of data gathering, and it is a well-understood, routine, and conventional activity of segmentation.

Applicant's arguments with respect to claim(s) 1-5, 7-15, and 17-20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-5, 7-15, and 17-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Claim 1 recites: “determining, using a deep neural network and based on a comparison of the isolated aircraft feature image data and the stored isolated aircraft feature image data, one or more anomalies in the isolated aircraft feature image data, wherein the anomaly corresponds to a possible maintenance requirement at a determined location,” which can be reasonably interpreted as a human observer mentally determining whether there is a defect on an aircraft feature and whether it is related to a maintenance requirement of the aircraft. The neural network is an additional element that merely adds the words “apply it” (or an equivalent) to the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea.
Claim 1 further recites: “comparing the determined location of the possible maintenance requirement to a maintenance standard to generate a maintenance plan including one or more steps for repairing the possible maintenance requirement,” which can be reasonably interpreted as a human observer mentally comparing a location's maintenance requirement with a maintenance standard and determining a plan to fix the issues.

This judicial exception is not integrated into a practical application because of the following additional elements: “obtaining image data indicative of an aircraft and an environmental feature of an environment surrounding the aircraft, wherein the image data is captured from one or more fixed locations in the environment surrounding the aircraft” is a generically recited insignificant extra-solution activity of data gathering; “generating, using the image data, isolated aircraft feature image data indicative of the aircraft, wherein generating the isolated aircraft feature image data comprises removing the environmental feature from the image data” is a generically recited insignificant extra-solution activity of data gathering; “obtaining stored isolated aircraft feature image data” is a generically recited insignificant extra-solution activity of data gathering; and “providing the maintenance plan to one or more computing devices” is a generically recited insignificant extra-solution activity of data outputting.

The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception: “obtaining image data indicative of an aircraft and an environmental feature of an environment surrounding the aircraft, wherein the image data is captured from one or more fixed locations in the environment surrounding the aircraft” is a well-understood, routine, and conventional insignificant extra-solution activity of data gathering.
“generating, using the image data, isolated aircraft feature image data indicative of the aircraft, wherein generating the isolated aircraft feature image data comprises removing the environmental feature from the image data” is a well-understood, routine, and conventional insignificant extra-solution activity of segmentation; “obtaining stored isolated aircraft feature image data” is a well-understood, routine, and conventional insignificant extra-solution activity of data gathering; and “providing the maintenance plan to one or more computing devices” is a well-understood, routine, and conventional insignificant extra-solution activity of data outputting.

Claim 2 recites: “wherein a pose of the isolated aircraft feature image data is adjusted to rectify the image of the isolated aircraft feature to match stored image data of one or more other similar aircraft features,” which can be reasonably interpreted as a human observer mentally imagining a pose of an aircraft feature that matches another pose of a similar aircraft feature. This judicial exception is not integrated into a practical application. There are no additional elements; therefore there is no claim limitation to amount to “significantly more.”

Claim 3 recites: “wherein the pose of the isolated aircraft feature image data is adjusted such that the image of the isolated aircraft feature is comparable to a maintenance plan image of a similar aircraft feature,” which can be reasonably interpreted as a human observer mentally imagining a pose of an aircraft feature that matches a pose of an aircraft feature in a maintenance plan. This judicial exception is not integrated into a practical application. There are no additional elements; therefore there is no claim limitation to amount to “significantly more.”
Claim 4 recites an additional element that does not integrate the judicial exception into a practical application: “wherein the image data further includes image data captured from one or more moveable cameras” is a generically recited insignificant extra-solution activity of data gathering. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception: “wherein the image data further includes image data captured from one or more moveable cameras” is a well-understood, routine, and conventional insignificant extra-solution activity of data gathering.

Claim 5 recites: “wherein the one or more moveable cameras are mounted on one or more drones or robotic arms,” which can be reasonably interpreted as a human observer mentally observing a camera mounted on a drone or a robotic arm. This judicial exception is not integrated into a practical application. There are no additional elements; therefore there is no claim limitation to amount to “significantly more.”

Claim 7 recites: “wherein the fixed locations are known with respect to a location of the aircraft based on one or more reference dimensions,” which can be reasonably interpreted as a human observer mentally estimating a location of an aircraft based on how far the aircraft is from an object. This judicial exception is not integrated into a practical application. There are no additional elements; therefore there is no claim limitation to amount to “significantly more.”

Claim 8 recites: “wherein the reference dimensions are referenced with respect to an immobile object and the aircraft is configured to park at a reference point with respect to the immobile object while image data is obtained,” which can be reasonably interpreted as a human observer mentally observing an aircraft parked in a place with an unmoving object in their sight. This judicial exception is not integrated into a practical application.
There are no additional elements; therefore there is no claim limitation to amount to “significantly more.”

Claim 9 recites: “wherein the maintenance plan identifies the determined location with respect to one or more maintenance zones of the aircraft,” which can be reasonably interpreted as a human observer mentally determining a location on an aircraft based on the maintenance zone in which it resides. This judicial exception is not integrated into a practical application. There are no additional elements; therefore there is no claim limitation to amount to “significantly more.”

Claim 10 recites: “wherein the deep neural network is trained using image data including false defects to reinforce the determination of actual defects,” which can be reasonably interpreted as a human observer mentally reinforcing their knowledge of defects by seeing false defects. This judicial exception is not integrated into a practical application. There are no additional elements; therefore there is no claim limitation to amount to “significantly more.”

Claims 11-15 additionally recite image data of an aircraft including one or more of visual, thermal, or x-ray image data, and comparing the isolated aircraft feature image data to a training set of data including stored isolated aircraft feature image data based on an aircraft model, and otherwise correspond to the steps recited in claims 1-6. Therefore, the recited elements of these claims are mapped to the analogous steps in the corresponding method claims and are rejected for the same reasons.

Claim 17 additionally recites a system for generating one or more maintenance plans comprising: one or more cameras at fixed locations in an environment surrounding an aircraft; a processing device; and a memory communicatively coupled to the processing device and storing one or more instructions. Therefore, the recited elements of this claim are mapped to the analogous steps in the corresponding method claim and are rejected for the same reasons.
Claim 18 recites an additional element that does not integrate the judicial exception into a practical application: “wherein the cameras include one or more visual, thermal, and x-ray cameras” is a generically recited insignificant extra-solution activity of data gathering. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception: “wherein the cameras include one or more visual, thermal, and x-ray cameras” is a well-understood, routine, and conventional activity of a camera.

Claim 19 recites: “wherein the cameras are fixed above and below the aircraft,” which can be reasonably interpreted as a human observer mentally observing a camera above or below the aircraft. This judicial exception is not integrated into a practical application. There are no additional elements; therefore there is no claim limitation to amount to “significantly more.”

Claim 20 recites: “wherein the aircraft is positioned in a hangar,” which can be reasonably interpreted as a human observer mentally observing an aircraft positioned in a hangar. This judicial exception is not integrated into a practical application. There are no additional elements; therefore there is no claim limitation to amount to “significantly more.”

Applicant's arguments with respect to claim(s) 1-5, 7-15, and 17-20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 1, 4-5, 7, 9, 11, 14-15, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Claybrough (US 20180170540 A1), in view of Hibi (US 20210350570 A1), Deng (US 20170262732 A1), and Summers (US 20050065842 A1).

Regarding claim 1, Claybrough discloses an automated method of detecting and reporting structural defects (Claybrough, paragraph [0020], “For this purpose, the invention relates to a system for automatically inspecting a surface of an object such as an aircraft, transport vehicle, building or engineering structure, said surface being liable to contain a defect”).
While Claybrough discloses obtaining image data indicative of an aircraft (Claybrough, paragraph [0085], “The robot 14 comprises a module 26 for acquiring images of the surface to be inspected and a module 28 for processing the images acquired”), wherein the image data is captured from one or more fixed locations in an environment surrounding the aircraft (Claybrough, paragraph [0126], "Each result is associated with a location provided by the location determination module 36. Said location is expressed according to a coordinate system relative to the surface to be inspected in order to be easily found by a human operator"), Claybrough does not explicitly teach “obtaining image data indicative of an environmental feature of an environment surrounding the aircraft”.

However, Hibi teaches obtaining image data indicative of an aircraft and an environmental feature of an environment surrounding the aircraft (Hibi, paragraph [0051], Fig. 2, "FIG. 2 is a diagram showing an example of a high-resolution image. The high-resolution image shown in FIG. 2 is an image captured by a camera installed in a remote control tower, and shows multiple aircraft parked in an apron of an airport. Moreover, although not shown, the high-resolution image of FIG. 2 also shows multiple aircraft flying over the airport.").

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention of the instant application to capture an image of Claybrough's plane and the environment around it, as taught by Hibi. The suggestion/motivation for doing so would have been that zooming out and capturing an image of the entire plane will potentially reveal more defects than close-up images of a single plane part.
Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results.

While Claybrough in view of Hibi discloses generating, using the image data, isolated aircraft feature image data indicative of the aircraft (Claybrough, paragraph [0110], “a third step of segmenting the image and extracting the contours of all of the shapes capable of representing potential defects, and of generating a sub-image containing the potential defect, also called a zone of interest, for each of said shapes. One image can lead to the generation of no zone of interest or of a plurality of zones of interest”), they do not explicitly state “wherein generating the isolated aircraft feature image data comprises removing the environmental feature from the image data”.

However, Deng teaches wherein generating the isolated aircraft feature image data comprises removing the environmental feature from the image data (Deng, paragraph [0150], "An aircraft exists in a foreground of an image, and in order to accurately capture the aircraft from the image, the background of the image needs to be removed first to eliminate interference.").

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention of the instant application to remove the background of Claybrough's (in view of Hibi) image, as taught by Deng. The suggestion/motivation for doing so would have been to reduce extraneous details of the image and process the image faster. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results.
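The "background removal" the rejection treats as well-understood, routine, and conventional segmentation can be illustrated with a minimal mask-based sketch. This is a hypothetical illustration only; it is not code from the application or from any reference of record:

```python
# Hypothetical sketch of mask-based "background removal": a binary
# foreground mask zeroes out environmental pixels, leaving only the
# aircraft-feature pixels in the output image.

def isolate_foreground(image, mask, background=0):
    """Return a copy of `image` with pixels outside `mask` removed.

    image: 2-D list of pixel intensities
    mask:  2-D list of 0/1 flags (1 = aircraft feature, 0 = environment)
    """
    return [
        [px if keep else background for px, keep in zip(row, mrow)]
        for row, mrow in zip(image, mask)
    ]

image = [[9, 9, 9],
         [9, 5, 9],
         [9, 5, 9]]        # 5s are the "aircraft", 9s the environment
mask  = [[0, 0, 0],
         [0, 1, 0],
         [0, 1, 0]]
print(isolate_foreground(image, mask))
# -> [[0, 0, 0], [0, 5, 0], [0, 5, 0]]
```

In practice the mask itself would come from a segmentation step (thresholding, contour extraction, or a learned model), which is the part the parties dispute under §101.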
Claybrough in view of Hibi and Deng discloses obtaining stored isolated aircraft feature image data (Claybrough, paragraph [0106], "The processing of the image by the processing module 28 consists of providing a result representative of the state of the inspected surface. The processing module 28 thus determines, from the acquired image, the presence of a potential defect on the surface, for example by comparing the image acquired with a previous image of the same surface (recovered from the web server or provided by the management module), or by detecting sudden variations in colour or appearance (fineness, grain, blurring, shine, etc.), etc."), determining, using a deep neural network (Claybrough, paragraph [0122], “The classification and characterisation can be performed by a known classifier, such as a linear classifier, a naïve Bayesian classifier, a support vector machine (SVM) classifier, or neural networks, etc") and based on a comparison of the isolated aircraft feature image data and the stored isolated aircraft feature image data (Claybrough, paragraph [0106], "The processing of the image by the processing module 28 consists of providing a result representative of the state of the inspected surface. The processing module 28 thus determines, from the acquired image, the presence of a potential defect on the surface, for example by comparing the image acquired with a previous image of the same surface (recovered from the web server or provided by the management module), or by detecting sudden variations in colour or appearance (fineness, grain, blurring, shine, etc.), etc."), one or more anomalies in the isolated aircraft feature image data, wherein the anomaly corresponds to a possible maintenance requirement at a determined location (Claybrough, paragraph [0121-0122], "The final step consists of classifying and characterising the zone of interest from the set of calculated parameters. 
The classification consists of determining the type of the potential defect, for example from the following categories: “oil stain”, “corrosion”, “missing element”, “lightning strike”, “scratch”, “not a defect”, “unknown”, etc. The characterisation consists of determining a category of the zone of interest from a predetermined set, for example; “acceptable defect, unacceptable defect”, in addition to the size of said potential defect. The classification and characterisation can be performed by a known classifier, such as a linear classifier, a naïve Bayesian classifier, a support vector machine (SVM) classifier, or neural networks, etc"; oil stains, corrosion, etc., and determining whether they are acceptable or unacceptable, are all forms of maintenance requirements).

Claybrough in view of Hibi and Deng does not teach “comparing the determined location of the possible maintenance requirement to a maintenance standard to generate a maintenance plan including one or more steps for repairing the possible maintenance requirement”. However, Summers teaches comparing the determined location of the possible maintenance requirement to a maintenance standard to generate a maintenance plan including one or more steps for repairing the possible maintenance requirement (Summers, paragraph [0104], "This will allow all AIMIS partners to benchmark their current Maintenance Baselines with the current AIMIS Aircraft Manufacturer's Standard Maintenance Baseline, which will have been under continuous improvement by S/AIIs from all other AIMIS members. Based upon this comparison the airlines have the opportunity to upgrade their Baselines by processing AIIs").
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention of the instant application to compare Claybrough's (in view of Hibi and Deng) maintenance requirement with a standard maintenance requirement and generate a maintenance plan, as taught by Summers. The suggestion/motivation for doing so would have been to flag any potentially overlooked maintenance items, leading to better safety and regulatory compliance. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results.

Claybrough in view of Hibi, Deng, and Summers discloses providing the maintenance plan to one or more computing devices (Summers, paragraph [0105], “AIMIS will also allow an airline to make a direct comparison of their Non-routines to AIMIS Non-routine data from all other maintenance providers. This comparison will allow the airlines to analyze the effectiveness of their maintenance processes, identifying opportunities for improvement.” When an Aircraft Improvement Initiative (AII) is shared with other airlines, different computer systems are involved). Therefore, it would have been obvious to combine Claybrough in view of Hibi, Deng, and Summers to obtain the invention as specified in claim 1.

Claim 17 corresponds to claim 1, additionally reciting a system (Claybrough, paragraph [0019], “The invention further aims to provide, in at least one embodiment, an inspection system and method providing an increased repeatability of the inspections and detection of defects”) for generating one or more maintenance plans, comprising: one or more cameras at fixed locations in an environment surrounding an aircraft (Claybrough, paragraph [0109], "a second step of locating the image which, based on a positioning of the robot (i.e.
the position of the robot in space and the angle of orientation of the robot), on the position and angle of the acquisition sensor of the acquisition module 26, for example a camera, and on the distance between the camera and the surface, determines the coordinates in a coordinate system relative to the surface of the set of points of the image"), a processing device (Claybrough, paragraph [0094], “The modules can also be implemented in the form of a computer programme executed by one or more electronic components, for example a computer processor, a microcontroller, a digital signal processor (DSP), or a field gate programmable array (FGPA), etc”), and a memory communicatively coupled to the processing device and storing one or more instructions (Claybrough, paragraph [0049], “Advantageously and according to the invention, each robot of the fleet comprises a buffer memory module, suitable for storing a plurality of processing results”). Thus, claim 17 is rejected for the same reasons of obviousness as claim 1.

Regarding claim 4, Claybrough in view of Hibi, Deng, and Summers discloses the method of claim 1, wherein the image data further includes image data captured from one or more moveable cameras (Claybrough, paragraph [0036], "Advantageously and according to the invention, the image acquisition module of at least one robot of said fleet comprises at least one camera suitable for acquiring images within the visible light spectrum"). The robot is a flying robot, as stated in paragraph [0080] (“The flying robots 14a, 14b, 14c are also commonly called drones or unmanned aerial vehicles (UAV) and take on the form of a helicopter, quadrotor or multi-rotor capable of stationary flight"); thus, the camera is moveable. Claim 14 corresponds to claim 4 and thus is rejected for the same reasons of obviousness as claim 4.
Regarding claim 5, Claybrough in view of Hibi, Deng, and Summers discloses the method of claim 4, wherein the one or more moveable cameras are mounted on one or more drones or robotic arms (Claybrough, paragraph [0080], "The flying robots 14a, 14b, 14c are also commonly called drones or unmanned aerial vehicles (UAV) and take on the form of a helicopter, quadrotor or multi-rotor capable of stationary flight"; drones are flying robots). Claim 15 corresponds to claim 5 and thus is rejected for the same reasons of obviousness as claim 5.

Regarding claim 7, Claybrough in view of Hibi, Deng, and Summers discloses the method of claim 1, wherein the fixed locations are known with respect to a location of the aircraft based on one or more reference dimensions (Claybrough, paragraph [0109], "a second step of locating the image which, based on a positioning of the robot (i.e. the position of the robot in space and the angle of orientation of the robot), on the position and angle of the acquisition sensor of the acquisition module 26, for example a camera, and on the distance between the camera and the surface, determines the coordinates in a coordinate system relative to the surface of the set of points of the image"; the fixed locations are the coordinates, and the distance between the camera and the surface is the reference dimension).

Regarding claim 9, Claybrough in view of Hibi, Deng, and Summers discloses the method of claim 1, wherein the maintenance plan identifies the determined location with respect to one or more maintenance zones of the aircraft (Claybrough, paragraph [0098], "the environment of the object: the human operator 24 can input whether the object is situated outdoors, or in a hangar, etc. in order to determine whether the robots are subjected to specific restrictions regarding the displacement thereof about the object (for example obstacles)"; a hangar is a maintenance zone).
Regarding claim 18, Claybrough in view of Hibi, Deng, Summers discloses the system of claim 17, wherein the cameras include one or more visual, thermal, and x-ray cameras (Claybrough, paragraph [0036], “Advantageously and according to the invention, the image acquisition module of at least one robot of said fleet comprises at least one camera suitable for acquiring images within the visible light spectrum”).

Regarding claim 19, Claybrough in view of Hibi, Deng, Summers discloses the system of claim 17, wherein the cameras are fixed above and below the aircraft (Claybrough, paragraph [0131], “For example, in the case of an aircraft with fixed wings, a flying robot situated above a wing will distance itself laterally before landing vertically, whereas a robot situated beneath the aircraft will immediately land”).

Regarding claim 20, Claybrough in view of Hibi, Deng, Summers discloses the system of claim 17, wherein the aircraft is positioned in a hangar (Claybrough, paragraph [0098], “the environment of the object: the human operator 24 can input whether the object is situated outdoors, or in a hangar, etc. in order to determine whether the robots are subjected to specific restrictions regarding the displacement thereof about the object (for example obstacles)”).

Claims 2 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Claybrough (US 20180170540 A1) in view of Hibi (US 20210350570 A1), Deng (US 20170262732 A1), Summers (US 20050065842 A1), and in further view of Cramblitt (US 20200202559 A1).

Regarding claim 2, Claybrough in view of Hibi, Deng, Summers discloses the method of claim 1. Claybrough in view of Hibi, Deng, Summers does not teach “wherein a pose of the isolated aircraft feature image data is adjusted to rectify the image of the isolated aircraft feature to match stored image data of one or more other similar aircraft features”.
However, Cramblitt teaches wherein a pose of the isolated aircraft feature image data is adjusted to rectify the image of the isolated aircraft feature to match stored image data of one or more other similar aircraft features (Cramblitt, paragraph [0045], "The image capture device 111 can acquire a real-time image 305 of the airfield runway. At block 306, the image comparator 104 can perform a comparison of the real-time image and the plurality of images to identify a best-match image from the plurality of images. The comparison can be based on a measure of mutual objects identified in the real-time image and the plurality of images such as the airfield runway, as described above. The refined pose estimator 105 can determine the best pose 307 as an updated current pose estimate of the aircraft 131 based on the comparison").

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention of the instant application to update the pose of Claybrough’s (in view of Hibi, Deng, Summers) aircraft feature image data with its stored data, as taught by Cramblitt. The suggestion/motivation for doing so would have been to use image data poses that are more standard and common, resulting in more comprehensible photos. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Claybrough in view of Hibi, Deng, Summers, and in further view of Cramblitt to obtain the invention as specified in claim 2. Claim 12 corresponds to claim 2 and is rejected for the same reasons of obviousness as claim 2.

Claims 3 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Claybrough (US 20180170540 A1) in view of Hibi (US 20210350570 A1), Deng (US 20170262732 A1), Summers (US 20050065842 A1), Cramblitt (US 20200202559 A1), and Engelbart (US 20180357613 A1).

Regarding claim 3, Claybrough in view of Hibi, Deng, Summers, and Cramblitt discloses the method of claim 2. While Claybrough in view of Hibi, Deng, Summers, and Cramblitt teaches adjusting the pose of the isolated aircraft feature image data (as shown in claim 2), they do not teach “a maintenance plan image of a similar aircraft feature”.

However, Engelbart teaches a maintenance plan image of a similar aircraft feature (Engelbart, paragraph [0048], Fig. 5 below, “As shown in FIG. 5 in which the structure 13 is an aircraft, several areas of the image of the aircraft are highlighted to represent respective maintenance actions, such as those areas 60 for which the inspection interval has been modified, those areas 62 in which a repair is scheduled to be made and those areas 64 in which the corresponding portion of the aircraft is scheduled to be replaced as indicated by the legend of FIG. 5”).

[Engelbart, Fig. 5: image of an aircraft with highlighted areas indicating respective maintenance actions]

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention of the instant application to use Engelbart’s maintenance images as a reference to adjust the pose of Claybrough’s (in view of Hibi, Deng, Summers, and Cramblitt) aircraft feature image data. The suggestion/motivation for doing so would have been to reduce the guesswork of maintenance operators on which planes correspond to which maintenance plan. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results.
Therefore, it would have been obvious to combine Claybrough in view of Hibi, Deng, Summers, Cramblitt, and in further view of Engelbart to obtain the invention as specified in claim 3. Claim 13 corresponds to claim 3 and thus is rejected for the same reasons of obviousness as claim 3.

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Claybrough (US 20180170540 A1) in view of Hibi (US 20210350570 A1), Deng (US 20170262732 A1), Summers (US 20050065842 A1), and in further view of Kirk (US 20150329217 A1).

Regarding claim 8, Claybrough in view of Hibi, Deng, Summers discloses the method of claim 7. Claybrough in view of Hibi, Deng, Summers does not teach “wherein the reference dimensions are referenced with respect to an immobile object and the aircraft is configured to park at a reference point with respect to the immobile object while image data is obtained”.

However, Kirk teaches wherein the reference dimensions are referenced with respect to an immobile object and the aircraft is configured to park at a reference point with respect to the immobile object while image data is obtained (Kirk, paragraph [0045], Fig. 2a/2b below, “For example, processor 14 may determine a distance range to a detected object using a stereo vision technique, in which cases two cameras 16 may be mounted side-by-side on wing 32 or another structure of aircraft 10 to generate the stereo images. In this example, the two cameras 16 may be mounted to capture the same region of interest from two different viewpoints points; the two images captured by the cameras at substantially the same time and from different viewpoints may be referred to as stereo images. Using the stereo images captured by the cameras, processor 14 can determine the location of a detected object, and, therefore, the approximate distance relative to aircraft 10, using triangulation”, the aircraft is parked on a ground surface, as shown by Fig. 2a).
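The stereo-ranging step Kirk describes reduces to standard rectified-stereo triangulation: the same object appears at slightly different horizontal positions in the two images, and that shift (the disparity) determines range. The sketch below is the editor's illustration of that standard formula; the function name and the numbers are assumptions, not Kirk's implementation:

```python
# Standard rectified-stereo triangulation, as a sketch of the technique
# Kirk's paragraph [0045] invokes. Names and values are illustrative.

def stereo_range(x_left, x_right, focal_px, baseline_m):
    """Distance to an object from its horizontal pixel positions in a
    rectified stereo pair.

    focal_px   : focal length in pixels (assumed equal for both cameras)
    baseline_m : separation between the two cameras, in metres
    """
    disparity = x_left - x_right  # pixels; larger disparity = closer object
    if disparity <= 0:
        raise ValueError("object must appear further left in the left image")
    return focal_px * baseline_m / disparity

# An obstacle seen at x=700 px (left camera) and x=660 px (right camera),
# with an assumed 800 px focal length and a 0.5 m wing-mounted baseline:
print(stereo_range(700, 660, focal_px=800, baseline_m=0.5))  # 10.0 (metres)
```

Range resolution degrades quadratically with distance for a fixed baseline, which is why the quoted passage mounts the cameras as far apart as the wing structure allows.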
[Kirk, Figs. 2a/2b: stereo cameras mounted on the wing of a parked aircraft]

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention of the instant application to mount two cameras on the wing of Claybrough’s (in view of Hibi, Deng, Summers) aircraft to determine the location of an object and calculate the distance from the aircraft to the object, as taught by Kirk. The suggestion/motivation for doing so would have been to understand the separation between the aircraft and obstacle, resulting in better safety for flight crew members and passengers. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Claybrough in view of Hibi, Deng, Summers, and Kirk to obtain the invention as specified in claim 8.

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Claybrough (US 20180170540 A1) in view of Hibi (US 20210350570 A1), Deng (US 20170262732 A1), Summers (US 20050065842 A1), and in further view of Laszlo (US 20230196541 A1).

Regarding claim 10, Claybrough in view of Hibi, Deng, Summers discloses the method of claim 1. Claybrough in view of Hibi, Deng, Summers does not teach wherein the deep neural network trained to determine one or more anomalies in the isolated aircraft feature image data is trained using image data including false defects to reinforce the determination of actual defects.
However, Laszlo teaches wherein the deep neural network trained to determine one or more anomalies in the isolated aircraft feature image data is trained using image data including false defects to reinforce the determination of actual defects (Laszlo, paragraph [0185], “In this specification, a false positive prediction is a prediction, generated by a neural network in response to processing a network input representing a manufactured article, that incorrectly predicts the presence of a defect. For example, if the neural network generates a network output that predicts the manufactured article as a whole does have a defect when the manufactured article does not have a defect, then the network output is a false positive. As another example, if the neural network generates a network output that includes a prediction that a particular element (e.g., a particular pixel) of the network input does represent a defect when the particular element does not represent a defect, then the prediction about the particular element is a false positive (e.g., in implementations in which the neural network generates a semantic segmentation of the network input)”).

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention of the instant application to train Claybrough’s (in view of Hibi, Deng, Summers) neural network model based on false defect images, as taught by Laszlo. The suggestion/motivation for doing so would have been to provide the neural network model with more training, allowing for better understanding of its input and achieving more accurate outputs. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results.
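Laszlo's use of false-defect images can be sketched as hard-negative augmentation: images that previously triggered a false "defect" prediction are fed back into the training set explicitly labelled as non-defects, so the model learns to separate real defects from look-alikes. The data shapes, names, and values below are the editor's assumptions for illustration:

```python
# Illustrative hard-negative augmentation in the spirit of Laszlo
# paragraph [0185]. Data structures and names are hypothetical.

def augment_with_false_positives(train_set, predictions):
    """Append previously misclassified 'false defect' images to the
    training set as explicit negatives.

    train_set   : list of (image_id, label) pairs, label 1 = defect
    predictions : list of (image_id, predicted_label, true_label)
    """
    hard_negatives = [
        (image_id, 0)                  # relabel as non-defect for retraining
        for image_id, pred, true in predictions
        if pred == 1 and true == 0     # false positive: predicted a defect
    ]
    return train_set + hard_negatives

base = [("img_a", 1), ("img_b", 0)]
preds = [("img_c", 1, 0), ("img_d", 1, 1), ("img_e", 0, 0)]
print(augment_with_false_positives(base, preds))
# [('img_a', 1), ('img_b', 0), ('img_c', 0)]
```

Only img_c qualifies: it was predicted defective but is actually clean, so it becomes a new labelled negative for the next training round.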
Therefore, it would have been obvious to combine Claybrough in view of Hibi, Deng, Summers, and Laszlo to obtain the invention as specified in claim 10.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to WAYNE ZHANG whose telephone number is (571) 272-0245. The examiner can normally be reached Monday-Friday 10:00-6:00 EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ms. Sumati Lefkowitz, can be reached on (571) 272-3638. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center.
Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/WAYNE ZHANG/
Examiner, Art Unit 2672

/SUMATI LEFKOWITZ/
Supervisory Patent Examiner, Art Unit 2672

Prosecution Timeline

Jun 14, 2023
Application Filed
Jun 26, 2025
Non-Final Rejection — §101, §103
Sep 23, 2025
Examiner Interview Summary
Sep 23, 2025
Applicant Interview (Telephonic)
Sep 25, 2025
Response Filed
Oct 20, 2025
Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591990
METHOD AND APPARATUS FOR GENERATING SPATIAL GEOMETRIC INFORMATION ESTIMATION MODEL
2y 5m to grant Granted Mar 31, 2026
Patent 12591958
INFRA-RED CONTRAST ENHANCEMENT FILTER
2y 5m to grant Granted Mar 31, 2026
Patent 12561843
METHOD FOR MANAGING IMAGE DATA, AND VEHICLE LIGHTING SYSTEM
2y 5m to grant Granted Feb 24, 2026
Patent 12536629
Image Processing Method and Electronic Device
2y 5m to grant Granted Jan 27, 2026
Patent 12536667
METHOD AND FACILITY FOR SEGMENTATION OF HIGH-CONTRAST OBJECTS IN X-RAY IMAGES
2y 5m to grant Granted Jan 27, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
50%
Grant Probability
94%
With Interview (+43.6%)
3y 3m
Median Time to Grant
Moderate
PTA Risk
Based on 16 resolved cases by this examiner. Grant probability derived from career allow rate.
