Prosecution Insights
Last updated: April 19, 2026
Application No. 18/179,691

AUTONOMOUS DRONE FOR RAILROAD TRACK INSPECTION

Status: Final Rejection (§103)
Filed: Mar 07, 2023
Examiner: MOLNAR, SIDNEY LEIGH
Art Unit: 3656
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: UNIVERSITY OF SOUTH CAROLINA
OA Round: 2 (Final)
Grant Probability: 54% (Moderate)
OA Rounds: 3-4
To Grant: 2y 4m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 54% (7 granted / 13 resolved; +1.8% vs TC avg)
Interview Lift: +85.7% (resolved cases with interview)
Avg Prosecution: 2y 4m (31 currently pending)
Total Applications: 44 (across all art units)

Statute-Specific Performance

§101: 8.7% (-31.3% vs TC avg)
§103: 42.2% (+2.2% vs TC avg)
§102: 22.3% (-17.7% vs TC avg)
§112: 26.1% (-13.9% vs TC avg)
Tech Center averages are estimates; based on career data from 13 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

This correspondence is in response to amendments filed on August 18, 2025. Drawing replacement sheets satisfy the drawing requirements, and as such the drawing objections have been withdrawn. Similarly, amendments to the specification obviate the specification objections. Claims 1, 9-11, and 16 have been amended. Claims 2-8, 12-15, and 18-20 remain as originally filed. Amendments to the claims obviate the claim objections, as well as those rejections made under 35 U.S.C. 112(b). Additionally, the amendments overcome the previous 35 U.S.C. 101 rejection, which has accordingly been withdrawn.

Response to Arguments

Applicant argues that Graetz and Dick do not teach the amended limitations (Page 11 of Remarks). Applicant’s arguments with respect to the claims have been considered but are moot because the new ground of rejection does not rely on any combination of references applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Additionally, in response to applicant's argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., onboard processing of images and specificities of the flight paths) are not recited in the rejected claims. Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 4-7, 11-12, and 14-17 are rejected under 35 U.S.C. 103 as being unpatentable over Graetz et al. (US 2019/0054937 A1; hereinafter “Graetz”) in view of Kurz et al. (US 2020/0143173 A1; hereinafter “Kurz”).
Regarding claim 1, Graetz teaches a drone-based railway track inspection system ("an unmanned aerial vehicle (UAV) system provides for inspecting railroad assets using an unmanned aerial vehicle"; [0007]) comprising: at least one drone comprising at least one computing platform (When regarding types of UAVs that may be used in the system, "aircraft share the same flight computer and flight control software as well as many of the same sub-systems" [0067]. Thus, the system comprises a UAV with a "flight computer", i.e., computing platform.); the at least one computing platform further comprising at least one surveillance unit ("the onboard sensors take high resolution precise location photos" [0040]. The sensor system images are used to monitor, or survey, the track for abnormalities thus making the sensor system an equivalent for the surveillance unit.); at least one communication unit (Aircraft systems include a "communications subsystem" [0039].); at least one computer vision unit ("the system performs rail vision" [0260], which is the system equivalent to computer vision); and at least one autonomous flight control unit configured to enable the at least one drone to have autonomous flight control ("the UAS autopilot (onboard the aircraft)"; [0074], additionally see [0045] for details of autonomous flight control when autopilot is engaged) and autonomous track selection during flight; wherein the at least one surveillance unit, at least one communication unit, the at least one computer vision unit, and the at least one autonomous flight control unit are in communication with one another ("The sensor system also has built-in local computational capability, its own navigation system, and independent communication capability for communicating with other onboard subsystems including the autopilot" [0040]. 
Thus, there is a communication network which connects aircraft subsystems which may be inclusive of those subsystems referenced above.); wherein the at least one computer vision unit is configured to: extract features of at least one track component during flight via obtaining at least one image of the at least one track component to process the image for both navigation heading as well as anomaly detection with respect to the at least one track component to detect at least one health condition of the at least one track component based on the anomaly detection of the at least one track component ("The rail vision includes processing images locally or remotely for the detection of obstructions or faults in the rail system" [0260]. "All of the faulty conditions 1502 are analyzed for changes in pixel coloring, pixel density, and amount of pixels between components indicating a distance, etc. A changed is identified when one of the changes occurs between successive images in a single flight and also identified when one of the changes occurs in images of the same rail from different UAV flights" [0202]. Pixel features are extracted and analyzed to evaluate the track conditions. Additionally, see [0270-0272] in which onboard software analyzes sensor data to output quantitative and qualitative data about what the sensors have seen inclusive of critical conditions, such as obstructions or faults in the rail system. Such image data which is extracted in this onboard image processing is used for navigation heading (sensor and software systems responsible for controlling pitch, yaw, and roll for proper focus over track when imaging, [0275]) as well as anomaly detection (recognizing critical conditions within the area of the flight path, [0271]). 
Critical conditions are inclusive of critical faults for track components, i.e., health condition of track components, based on the anomaly detection (see [0206]) and such critical conditions identified during flight are immediately communicated to the pilot (see [0280]).); and identify at least one railway in order for the at least one drone to navigate ("When the pilot actuates the autopilot, the system software takes over and flies the aircraft as close as possible once over track. The software system also automatically enables the sensors to start taking two pictures per second of the track. At the same time, the sensor and software systems control the pitch, yaw, and roll of aircraft and sensors such that the appropriate sensor or sensors remain focused and placed over the track to ensure the required resolution and overlapping imagery" [0045]. Sensor system generates images to be processed in order to detect the track to be followed and analyzed.). However, Graetz does not explicitly teach … at least one autonomous flight control unit configured to enable the at least one drone to have …autonomous track selection during flight… Kurz, in the same field of endeavor, teaches … at least one autonomous flight control unit configured to enable the at least one drone to have …autonomous track selection during flight (“The drone is controlled and/or configured to follow a path along the route in a manner by which the sensor data is suitable for use in determining a degree of curvature in the route based on a difference between a reference location and a location of interest, as explained herein. 
For example, if the route is a railway having two nominally parallel rails (i.e., for normal operation in an undamaged condition, the rails are configured to be parallel), the drone may be controlled and/or configured to follow a path along the midpoint between the rails, or to travel above and along one of the rails (e.g., a designated or chosen one of the rails), or to travel along one of the rails but a designated set lateral distance to the left or right of the rail” [0154]. Thus, the drone is configured to select which track to navigate along during autonomous flight.)… Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the autonomous flight control system of Graetz to include the track selection features of Kurz with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification such that for any rail system comprising multiple diverging rails, the UAV is able to autonomously track and follow the rail which the operator requires asset health assessments for. Motivation for such a feature can also be supported as a combination of known methods which yields predictable results for railway inspections using autonomous drone navigation systems (see MPEP 2143.I(A)). Regarding claim 2, Graetz as modified by Kurz (references made to Graetz) teaches the drone-based railway track inspection system of claim 1, wherein the drone navigates without access to GPS positioning ("Lost GPS: In the event of a GPS failure, the aircraft reverts to an inertial navigation system (INS). Attitude and heading are maintained. The heading is determined using a magnetometer. The aircraft position estimate is propagated" [0128]. Thus, the drone is able to navigate without access to the GPS positioning.). 
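The GPS-failure fallback Graetz describes at [0128] — maintaining heading from a magnetometer and propagating the aircraft position estimate via inertial navigation — amounts to dead reckoning. A minimal sketch of one propagation step, with hypothetical names and values not drawn from the reference:

```python
import math

def propagate_position(x, y, heading_deg, speed_mps, dt_s):
    """Dead-reckon a new (x, y) position from heading and speed.

    Heading is degrees clockwise from north (+y), as a magnetometer
    would report it; speed is ground speed in m/s. With GPS lost, the
    position estimate is simply advanced along the current heading.
    """
    theta = math.radians(heading_deg)
    x_new = x + speed_mps * dt_s * math.sin(theta)
    y_new = y + speed_mps * dt_s * math.cos(theta)
    return x_new, y_new

# Drone flying due east at 10 m/s for 2 s from the origin:
pos = propagate_position(0.0, 0.0, 90.0, 10.0, 2.0)
```

In practice the estimate drifts without GPS corrections, which is why the reference treats this purely as a degraded fallback mode.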
Regarding claim 4, Graetz as modified by Kurz (references made to Graetz) teaches the drone-based railway track inspection system of claim 1, further comprising an onboard processing unit, in communication with the at least one computing platform, that processes inspected data immediately to provide real-time track condition assessment without requiring the inspected data to transfer away from the drone ("The system performs rail vision. The rail vision includes processing images locally or remotely for the detection of obstructions or faults in the rail system" [0260]. "If during the course of flight a critical condition is identified, the UAS's sensor can utilize a secondary communications channel not connected to the primary to send immediate notification to the pilots" [0280]. Thus, image data is processed locally to the UAS and data/results are transferred to the command center only when an analyzed condition is deemed critical. Processing must happen in real time such that the pilot may be notified immediately of critical conditions of the railway.). Regarding claim 5, Graetz as modified by Kurz (references made to Graetz) teaches the drone-based railway track inspection system of claim 1, wherein the at least one health condition of the at least one track component includes identifying at least one item missing from the at least one track component or identifying at least one defect in a rail surface ("Faulty condition 1500 is called a broken rail or rail gap…. Faulty condition 1500 is called a fouled ballast…. Faulty condition 1502 is called a curved rail, wavy rail, or misaligned rail." [0198-0200]. These descriptions of faulty conditions embody missing components and defects in the rail.). 
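The image-comparison fault detection Graetz describes at [0201]-[0202] — flagging a change when pixel coloring or density shifts between successive images of the same rail — can be illustrated with a simple frame-differencing sketch. This is a toy stand-in with hypothetical thresholds, not the reference's actual algorithm:

```python
import numpy as np

def change_detected(prev_frame, curr_frame, pixel_thresh=30, frac_thresh=0.05):
    """Flag a change between two grayscale frames of the same rail segment.

    A pixel counts as changed when its intensity shifts by more than
    pixel_thresh; a change is reported when more than frac_thresh of all
    pixels changed — a crude proxy for the pixel-coloring/density
    comparison described in the reference.
    """
    diff = np.abs(prev_frame.astype(np.int16) - curr_frame.astype(np.int16))
    changed_fraction = float(np.mean(diff > pixel_thresh))
    return changed_fraction > frac_thresh

# An intact frame versus one with a simulated rail gap (dark band):
intact = np.full((64, 64), 200, dtype=np.uint8)
gapped = intact.copy()
gapped[28:36, :] = 40  # 8 of 64 rows altered -> 12.5% of pixels
```

Comparing `intact` against `gapped` trips the detector, while comparing a frame against itself does not.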
Regarding claim 6, Graetz as modified by Kurz (references made to Graetz) teaches the drone-based railway track inspection system of claim 1, further comprising a data repository, in communication with the at least one computing platform, wherein the data repository contains field track appearance data to compare with real-time data obtained by the drone-based railway track inspection system ("The aircraft preferably also includes onboard information storage media for local storage of gathered information" [0039]. Any data which is gathered or pertinent to the aircraft may thus be stored locally. "The faulty conditions 1500, 1501, and 1502 can be detected by comparing the image to an image of the previous rail and also by comparing the image to an image or series of images of the rail taken previously" [0201]. "The system performs rail vision. The rail vision includes processing images locally or remotely for the detection of obstructions or faults in the rail system" [0260]. Images may be processed locally, thus the comparison between prior data and real-time data for fault detection may occur locally to the UAV.). Regarding claim 7, Graetz as modified by Kurz (references made to Graetz) teaches the drone-based railway track inspection system of claim 1, further comprising an automatic landing control module, in communication with the at least one computing platform, to execute an emergency landing protocol ("[T]he system includes both onboard and external subsystems for facilitating emergency maneuvering and landing of the UAS on the flight corridor" [0039]. Further, "Upon entering the flight termination mode, the aircraft can automatically perform an emergency VTOL recovery" [0133]. The system thus is configured to enact emergency landings using automated performance.). 
Regarding claim 11, Graetz teaches a method for using a drone to inspect railway tracks ("a method for inspecting railroad assets using an unmanned aerial vehicle"; [0005]) comprising: flying at least one drone over at least one railway ("The UAS 2420 flies over the rail system monitoring for faults or obstructions" [0284].); wherein the at least one drone includes at least one computing platform (When regarding types of UAVs that may be used in the system, "aircraft share the same flight computer and flight control software as well as many of the same sub-systems" [0067]. Thus, the system comprises a UAV with a "flight computer", i.e., computing platform.); the at least one computing platform further comprising: at least one surveillance unit ("the onboard sensors take high resolution precise location photos" [0040]. The sensor system images are used to monitor, or survey, the track for abnormalities thus making the sensor system an equivalent for the surveillance unit.); at least one communication unit (Aircraft systems include a "communications subsystem" [0039].); at least one computer vision unit ("the system performs rail vision" [0260], which is the system equivalent to computer vision); and at least one autonomous flight control unit configured to enable the at least one drone to have autonomous flight control ("the UAS autopilot (onboard the aircraft)"; [0074], additionally see [0045] for details of autonomous flight control when autopilot is engaged) and autonomous track selection during flight; wherein the at least one surveillance unit, at least one communication unit, the at least one computer vision unit, and the at least one autonomous flight control unit are in communication with one another ("The sensor system also has built-in local computational capability, its own navigation system, and independent communication capability for communicating with other onboard subsystems including the autopilot" [0040]. 
Thus, there is a communication network which connects aircraft subsystems which may be inclusive of those subsystems referenced above.); wherein the at least one computer vision unit is configured to: extract features of at least one track component via obtaining at least one image of the at least one track component and process the image for both navigation heading as well as anomaly detection with respect to the at least one track component to detect while flying over the at least one railway to evaluate at least one health condition of the at least one track component ("The rail vision includes processing images locally or remotely for the detection of obstructions or faults in the rail system" [0260]. "All of the faulty conditions 1502 are analyzed for changes in pixel coloring, pixel density, and amount of pixels between components indicating a distance, etc. A changed is identified when one of the changes occurs between successive images in a single flight and also identified when one of the changes occurs in images of the same rail from different UAV flights" [0202]. Pixel features are extracted and analyzed to evaluate the track conditions. Additionally, see [0270-0272] in which onboard software analyzes sensor data to output quantitative and qualitative data about what the sensors have seen inclusive of critical conditions, such as obstructions or faults in the rail system. Such image data which is extracted in this onboard image processing is used for navigation heading (sensor and software systems responsible for controlling pitch, yaw, and roll for proper focus over track when imaging, [0275]) as well as anomaly detection (recognizing critical conditions within the area of the flight path, [0271]). 
Critical conditions are inclusive of critical faults for track components, i.e., health condition of track components, based on the anomaly detection (see [0206]) and such critical conditions identified during flight are immediately communicated to the pilot (see [0280]).); and identify the at least one railway while flying over the at least one railway in order for the at least one drone to navigate ("When the pilot actuates the autopilot, the system software takes over and flies the aircraft as close as possible once over track. The software system also automatically enables the sensors to start taking two pictures per second of the track. At the same time, the sensor and software systems control the pitch, yaw, and roll of aircraft and sensors such that the appropriate sensor or sensors remain focused and placed over the track to ensure the required resolution and overlapping imagery" [0045]. Sensor system generates images to be processed in order to detect the track to be followed and analyzed.). However, Graetz does not explicitly teach … at least one autonomous flight control unit configured to enable the at least one drone to have …autonomous track selection during flight… Kurz, in the same field of endeavor, teaches … at least one autonomous flight control unit configured to enable the at least one drone to have …autonomous track selection during flight (“The drone is controlled and/or configured to follow a path along the route in a manner by which the sensor data is suitable for use in determining a degree of curvature in the route based on a difference between a reference location and a location of interest, as explained herein. 
For example, if the route is a railway having two nominally parallel rails (i.e., for normal operation in an undamaged condition, the rails are configured to be parallel), the drone may be controlled and/or configured to follow a path along the midpoint between the rails, or to travel above and along one of the rails (e.g., a designated or chosen one of the rails), or to travel along one of the rails but a designated set lateral distance to the left or right of the rail” [0154]. Thus, the drone is configured to select which track to navigate along during autonomous flight.)… Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the autonomous flight control system of Graetz to include the track selection features of Kurz with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification such that for any rail system comprising multiple diverging rails, the UAV is able to autonomously track and follow the rail which the operator requires asset health assessments for. Motivation for such a feature can also be supported as a combination of known methods which yields predictable results for railway inspections using autonomous drone navigation systems (see MPEP 2143.I(A)). Regarding claim 12, Graetz as modified by Kurz (references made to Graetz) teaches the method for using a drone to inspect railway tracks of claim 11, wherein the drone navigates without access to GPS positioning ("Lost GPS: In the event of a GPS failure, the aircraft reverts to an inertial navigation system (INS). Attitude and heading are maintained. The heading is determined using a magnetometer. The aircraft position estimate is propagated" [0128]. Thus, the drone is able to navigate without access to the GPS positioning.). 
Regarding claim 14, Graetz as modified by Kurz (references made to Graetz) teaches the method for using a drone to inspect railway tracks of claim 11, wherein the at least one computing platform further comprises an onboard processing unit that processes inspected data immediately to provide real-time track condition assessment without requiring the inspected data to transfer away from the drone ("The system performs rail vision. The rail vision includes processing images locally or remotely for the detection of obstructions or faults in the rail system" [0260]. "If during the course of flight a critical condition is identified, the UAS's sensor can utilize a secondary communications channel not connected to the primary to send immediate notification to the pilots" [0280]. Thus, image data is processed locally to the UAS and data/results are transferred to the command center only when an analyzed condition is deemed critical. Processing must happen in real time such that the pilot may be notified immediately of critical conditions of the railway.). Regarding claim 15, Graetz as modified by Kurz (references made to Graetz) teaches the method for using a drone to inspect railway tracks of claim 11, wherein the at least one health condition of the at least one track component includes identifying at least one item missing from the at least one track component or identifying at least one defect in a rail surface ("Faulty condition 1500 is called a broken rail or rail gap…. Faulty condition 1500 is called a fouled ballast…. Faulty condition 1502 is called a curved rail, wavy rail, or misaligned rail." [0198-0200]. These descriptions of faulty conditions embody missing components and defects in the rail.). 
Regarding claim 16, Graetz as modified by Kurz (references made to Graetz) teaches the method for using a drone to inspect railway tracks of claim 11, wherein the at least one computing platform further comprises a data repository in communication with the at least one computing platform wherein the data repository contains field track appearance data to compare with real-time data obtained by a drone-based railway track inspection system ("The aircraft preferably also includes onboard information storage media for local storage of gathered information" [0039]. Any data which is gathered or pertinent to the aircraft may thus be stored locally. "The faulty conditions 1500, 1501, and 1502 can be detected by comparing the image to an image of the previous rail and also by comparing the image to an image or series of images of the rail taken previously" [0201]. "The system performs rail vision. The rail vision includes processing images locally or remotely for the detection of obstructions or faults in the rail system" [0260]. Images may be processed locally, thus the comparison between prior data and real-time data for fault detection may occur locally to the UAV.). Regarding claim 17, Graetz as modified by Kurz (references made to Graetz) teaches the method for using a drone to inspect railway tracks of claim 11, wherein the at least one computing platform further comprises an automatic landing control module to execute an emergency landing protocol ("[T]he system includes both onboard and external subsystems for facilitating emergency maneuvering and landing of the UAS on the flight corridor" [0039]. Further, "Upon entering the flight termination mode, the aircraft can automatically perform an emergency VTOL recovery" [0133]. The system thus is configured to enact emergency landings using automated performance.). Claims 3, 8, 13, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Graetz in view of Kurz and further in view of Dick et al. 
(US 2020/0160733 A1; hereinafter Dick). Regarding claim 3, Graetz as modified by Kurz teaches the drone-based railway track inspection system of claim 1. However, Graetz as modified does not explicitly teach the system further comprising the at least one flight control unit employing a real-time obstacle avoidance system (Disclosure leaves room for such technologies, but does not apply them to the system at hand: “Additional air traffic avoidance technologies can be utilized as they become accepted for operational use. Examples of such technology include alternative radars and on-board collision avoidance” [0094].). Dick, in the same field of endeavor, teaches a drone system for railway inspection further comprising the at least one flight control unit employing a real-time obstacle avoidance system ("The LIDAR sensor 150 and/or the SLAM sensor 152 are communicatively coupled to the flight controller 120 and are generally used to identify potential obstacles within the current flight path of the aerial vehicle 100…. Responsive to identifying a potential obstacle, the flight controller 120 updates the flight instructions to the propulsion system 110 to cause the aerial vehicle 100 to avoid the obstacle." [0040].). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the UAV as taught by Graetz to include the obstacle avoidance system of Dick with a reasonable expectation for success. One of ordinary skill in the art would have been motivated to make this modification because "preprogramming the flight path is often time consuming and requires reference waypoints and/or landmarks to control the flight path, which in turn requires knowledge of the surrounding terrain" so including real-time obstacle detection would be necessary to obtain real-time knowledge of UAV surroundings so as to avoid collisions (Dick, [0003]). 
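The avoidance behavior Dick describes at [0040] — updating flight instructions when a sensed obstacle falls within the current flight path — reduces to a range check against a safety envelope. A toy sketch with hypothetical commands and margins, not Dick's actual controller logic:

```python
def update_flight_command(obstacle_range_m, safety_margin_m=15.0):
    """Return a flight command given the nearest sensed obstacle range.

    Mimics, in toy form, a flight controller that holds course until a
    LIDAR/SLAM return falls inside the safety margin, then commands an
    evasive maneuver.
    """
    if obstacle_range_m is None:            # no return -> path is clear
        return "hold_course"
    if obstacle_range_m < safety_margin_m:  # obstacle inside the envelope
        return "climb_and_reroute"
    return "hold_course"
```

A real controller would plan a new trajectory rather than emit a discrete command, but the trigger condition is the same.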
Regarding claim 8, Graetz as modified by Kurz teaches the drone-based railway track inspection system of claim 1. Graetz as modified does not teach the system further comprising an object depth estimation module, in communication with the at least one computing platform, to detect at least one oncoming obstacle in a drone flight path as well as to determine a distance of the at least one oncoming obstacle to the drone. Dick, in the same field of endeavor, teaches a system for railway inspection further comprising an object depth estimation module, in communication with the at least one computing platform, to detect at least one oncoming obstacle in a drone flight path as well as to determine a distance of the at least one oncoming obstacle to the drone ("The LIDAR sensor 150 emits pulsed laser light and measures the reflected pulses to determine a distance to a target object (e.g., a potential obstacle in the current flight path of the aerial vehicle 100). In addition to detecting obstacles within the flight path of the aerial vehicle 100, the LIDAR sensor 150 can be used to generate three-dimensional images of a target object and/or its surroundings" [0041].). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the UAV as taught by Graetz to include the LIDAR distance detection of Dick with a reasonable expectation for success. One of ordinary skill in the art would have been motivated to make this modification because "preprogramming the flight path is often time consuming and requires reference waypoints and/or landmarks to control the flight path, which in turn requires knowledge of the surrounding terrain" so including distance information for detected obstacles would be essential for understanding UAV surroundings in real-time to avoid collisions (Dick, [0003]). 
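The pulsed-laser ranging Dick relies on at [0041] is a time-of-flight computation: the pulse travels to the target and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
C_M_PER_S = 299_792_458  # speed of light in vacuum, m/s

def lidar_distance_m(round_trip_time_s):
    """Distance to a target from a LIDAR pulse's round-trip time.

    The pulse covers the sensor-to-target distance twice, hence the
    division by two.
    """
    return C_M_PER_S * round_trip_time_s / 2.0

# A 1-microsecond round trip corresponds to roughly 150 m:
d = lidar_distance_m(1e-6)
```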
Regarding claim 13, Graetz as modified by Kurz teaches the method for using a drone to inspect railway tracks of claim 11. However, Graetz as modified does not teach the method further comprising the at least one flight control unit employing a real-time obstacle avoidance system (Disclosure leaves room for such technologies, but does not apply them to the system at hand: “Additional air traffic avoidance technologies can be utilized as they become accepted for operational use. Examples of such technology include alternative radars and on-board collision avoidance” [0094].). Dick, in the same field of endeavor, teaches a method for railway inspection further comprising the at least one flight control unit employing a real-time obstacle avoidance system ("The LIDAR sensor 150 and/or the SLAM sensor 152 are communicatively coupled to the flight controller 120 and are generally used to identify potential obstacles within the current flight path of the aerial vehicle 100…. Responsive to identifying a potential obstacle, the flight controller 120 updates the flight instructions to the propulsion system 110 to cause the aerial vehicle 100 to avoid the obstacle." [0040].). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the UAV as taught by Graetz to include the obstacle avoidance system of Dick with a reasonable expectation for success. One of ordinary skill in the art would have been motivated to make this modification because "preprogramming the flight path is often time consuming and requires reference waypoints and/or landmarks to control the flight path, which in turn requires knowledge of the surrounding terrain" so including real-time obstacle detection would be necessary to obtain real-time knowledge of UAV surroundings so as to avoid collisions (Dick, [0003]). 
Regarding claim 18, Graetz as modified by Kurz teaches the method for using a drone to inspect railway tracks of claim 11. Graetz as modified does not teach the method, wherein the at least one computing platform further comprises an object depth estimation module to detect at least one oncoming obstacle in a drone flight path as well as to determine a distance of the at least one oncoming obstacle to the drone. Dick, in the same field of endeavor, teaches a method for railway inspection, wherein the at least one computing platform further comprises an object depth estimation module to detect at least one oncoming obstacle in a drone flight path as well as to determine a distance of the at least one oncoming obstacle to the drone ("The LIDAR sensor 150 emits pulsed laser light and measures the reflected pulses to determine a distance to a target object (e.g., a potential obstacle in the current flight path of the aerial vehicle 100). In addition to detecting obstacles within the flight path of the aerial vehicle 100, the LIDAR sensor 150 can be used to generate three-dimensional images of a target object and/or its surroundings" [0041].). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the UAV as taught by Graetz to include the LIDAR distance detection of Dick with a reasonable expectation for success. One of ordinary skill in the art would have been motivated to make this modification because "preprogramming the flight path is often time consuming and requires reference waypoints and/or landmarks to control the flight path, which in turn requires knowledge of the surrounding terrain" so including distance information for detected obstacles would be essential for understanding UAV surroundings in real-time to avoid collisions (Dick, [0003]). Claims 9-10 and 19-20 are rejected under 35 U.S.C. 
103 as being unpatentable over Graetz in view of Kurz and further in view of Kalra et al. ("Image deblurring using saliency detection," 2014 Recent Advances in Engineering and Computational Sciences (RAECS), Chandigarh, India, 2014, pp. 1-4; hereinafter “Kalra”) and further in view of Deora et al. ("Salient image matting." arXiv preprint arXiv:2103.12337 (2021); hereinafter “Deora”). Regarding claim 9, Graetz as modified by Kurz (references made to Graetz) teaches the drone-based railway track inspection system of claim 1, … to detect a rail surface (“[T]he system also includes software focused on rail detection” [0041]). However, Graetz as modified does not teach the system further comprising at least one non-edge-aware saliency detection framework, in communication with the at least one computing platform, to detect … under blurred, incomplete, or incorrect … boundary visual information. Kalra, in a related field of endeavor, teaches a system further comprising at least one … saliency detection framework, in communication with the at least one computing platform, to detect … under blurred, incomplete, or incorrect … boundary visual information (Saliency detection is used by a computing system to segment a non-uniformly blurred image into foreground and background, detecting and separating the boundary visual information of the example flower in Fig. 2(a)-2(d).). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the detection software of Graetz used to detect rail boundaries to include the saliency detection framework for detecting boundary visual information over a non-uniformly blurred image as taught by Kalra with a reasonable expectation for success. 
One of ordinary skill in the art would have been motivated to make this modification because the image deblurring technique used by Kalra focuses on images primarily blurred by camera motion so as to not accidentally throw away essential details in the background (Kalra, Page 1), which directly applies to possible applications of the drone imaging system in which images may be blurred by drone movements or weather which interfere with the camera alignment. However, Graetz as modified by Kurz and further modified by Kalra fails to teach a non-edge-aware saliency detection framework. Deora, in a related field of endeavor, teaches a non-edge-aware saliency detection framework (Salient image matting which, instead of known edge detection methods, uses a trimap coloring process and a multi-scale matting network in order to perform saliency detection in defining the borders of the foreground image. See Fig. 3 for “Overview of Salient Image Matting”. Further, Saliency Detection for the method utilizes U2NET, which is commonly non-edge-aware [Page 5].). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the detection system of Graetz to include a non-edge-aware saliency detection method as taught by Deora with a reasonable expectation for success. One of ordinary skill in the art would have been motivated to make this modification because the use of salient image matting allows capturing of both high- and low-level features while reducing the computational resources necessary to implement full trimap coloring. Regarding claim 10, Graetz as modified by Kurz, Kalra, and Deora teaches the drone-based railway track inspection system of claim 9, further comprising the at least one non-edge-aware saliency detection framework (see above). 
Kalra further teaches the saliency detection framework configured to reproduce the rail surface in real time (Convolution and Deconvolution process of Fig. 3-5 shows the salient foreground object which is blurred by the convolution process before being returned to the original image. Once returned, the deconvolution process deblurs the whole image based on an inverse filtering technique and outputs the corrected image.). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the detection software of Graetz used to detect rail boundaries to include the corrected output image as taught by Kalra with a reasonable expectation for success. One of ordinary skill in the art would have been motivated to make this modification because the image deblurring technique used by Kalra outputs a corrected image in which both the foreground and background show clear object edges, therefore improving the feasibility of detection needed for each image (Kalra, Fig. 5). Regarding claim 19, Graetz as modified by Kurz (references made to Graetz) teaches the method for using a drone to inspect railway tracks of claim 11, … to detect a rail surface (“[T]he system also includes software focused on rail detection” [0041]. The system performs the steps of the method, inclusive of detecting the rail.). However, Graetz as modified does not teach the method wherein the at least one computing platform further comprises at least one … saliency detection framework to detect … under blurred, incomplete, or incorrect … boundary visual information. 
Kalra, in a related field of endeavor, teaches a system wherein the at least one computing platform further comprises at least one … saliency detection framework, in communication with the at least one computing platform, to detect … under blurred, incomplete, or incorrect … boundary visual information (Saliency detection is used by a computing system to segment a non-uniformly blurred image into foreground and background, detecting and separating the boundary visual information of the example flower in Fig. 2(a)-2(d).). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the detection software of Graetz used to detect rail boundaries to include the saliency detection framework for detecting boundary visual information over a non-uniformly blurred image as taught by Kalra with a reasonable expectation for success. One of ordinary skill in the art would have been motivated to make this modification because the image deblurring technique used by Kalra focuses on images primarily blurred by camera motion so as to not accidentally throw away essential details in the background (Kalra, Page 1), which directly applies to possible applications of the drone imaging system in which images may be blurred by drone movements or weather which interfere with the camera alignment. However, Graetz as modified by Kurz and further modified by Kalra fails to teach a non-edge-aware saliency detection framework. Deora, in a related field of endeavor, teaches a non-edge-aware saliency detection framework (Salient image matting which, instead of known edge detection methods, uses a trimap coloring process and a multi-scale matting network in order to perform saliency detection in defining the borders of the foreground image. See Fig. 3 for “Overview of Salient Image Matting”. Further, Saliency Detection for the method utilizes U2NET, which is commonly non-edge-aware [Page 5].). 
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the detection system of Graetz to include a non-edge-aware saliency detection method as taught by Deora with a reasonable expectation for success. One of ordinary skill in the art would have been motivated to make this modification because the use of salient image matting allows capturing of both high- and low-level features while reducing the computational resources necessary to implement full trimap coloring. Regarding claim 20, Graetz as modified by Kurz, Kalra, and Deora teaches the method for using a drone to inspect railway tracks of claim 19, further comprising the at least one non-edge-aware saliency detection framework (see above). Kalra further teaches the saliency detection framework configured to reproduce the rail surface in real time (Convolution and Deconvolution process of Fig. 3-5 shows the salient foreground object which is blurred by the convolution process before being returned to the original image. Once returned, the deconvolution process deblurs the whole image based on an inverse filtering technique and outputs the corrected image.). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the detection software of Graetz used to detect rail boundaries to include the corrected output image as taught by Kalra with a reasonable expectation for success. One of ordinary skill in the art would have been motivated to make this modification because the image deblurring technique used by Kalra outputs a corrected image in which both the foreground and background show clear object edges, therefore improving the feasibility of detection needed for each image (Kalra, Fig. 5). Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Fowe et al. 
(US 2020/0066142 A1) additionally provides a drone-camera system for road selection during monitoring tasks. THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to SIDNEY L MOLNAR whose telephone number is (571)272-2276. The examiner can normally be reached 8 A.M. to 3 P.M. EST Monday-Friday. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jonathan (Wade) Miles can be reached on (571) 270-7777. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. 
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /S.L.M./Examiner, Art Unit 3656 /WADE MILES/Supervisory Patent Examiner, Art Unit 3656

Prosecution Timeline

Mar 07, 2023
Application Filed
Mar 12, 2025
Non-Final Rejection — §103
Aug 18, 2025
Response Filed
Oct 08, 2025
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600039
ROBOT, CONVEYING SYSTEM, AND ROBOT-CONTROLLING METHOD
2y 5m to grant · Granted Apr 14, 2026
Patent 12533807
ROBOTIC APPARATUS AND CONTROL METHOD THEREOF
2y 5m to grant · Granted Jan 27, 2026
Patent 12479098
SURGICAL ROBOTIC SYSTEM WITH ACCESS PORT STORAGE
2y 5m to grant · Granted Nov 25, 2025
Patent 12384048
TRANSFER APPARATUS
2y 5m to grant · Granted Aug 12, 2025
Patent 12376922
TOOL HEAD POSTURE ADJUSTMENT METHOD, APPARATUS AND READABLE STORAGE MEDIUM
2y 5m to grant · Granted Aug 05, 2025
Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
54%
Grant Probability
99%
With Interview (+85.7%)
2y 4m
Median Time to Grant
Moderate
PTA Risk
Based on 13 resolved cases by this examiner. Grant probability derived from career allow rate.
