Prosecution Insights
Last updated: April 19, 2026
Application No. 18/167,710

REMOTE ORDNANCE IDENTIFICATION AND CLASSIFICATION SYSTEM UTILIZING ARTIFICIAL INTELLIGENCE AND UNMANNED AERIAL VEHICLE FUNCTIONALITY

Non-Final OA §103
Filed: Feb 10, 2023
Examiner: BAGHDASARYAN, HOVHANNES
Art Unit: 3645
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Howard G Buffett Foundation
OA Round: 1 (Non-Final)
Grant Probability: 78% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 1m
Grant Probability With Interview: 94%

Examiner Intelligence

Career Allow Rate: 78% (759 granted / 971 resolved), +26.2% vs TC avg, above average
Interview Lift: +16.1% among resolved cases with interview (strong)
Typical Timeline: 3y 1m average prosecution; 85 applications currently pending
Career History: 1,056 total applications across all art units

Statute-Specific Performance

§101: 2.6% (-37.4% vs TC avg)
§103: 45.7% (+5.7% vs TC avg)
§102: 21.5% (-18.5% vs TC avg)
§112: 23.9% (-16.1% vs TC avg)
"vs TC avg" compares against the Tech Center average estimate. Based on career data from 971 resolved cases.
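The "vs TC avg" deltas are simply the examiner's rejection rate minus the Tech Center average. A quick sketch (assuming the displayed figures, which are rounded) recovers the implied TC averages from the numbers above:

```python
# Recover the implied Tech Center average from each statute's
# examiner rejection rate and its displayed "vs TC avg" delta.
# Figures are taken from the table above; rounding is assumed.
rates = {          # statute: (examiner rate %, delta vs TC avg %)
    "§101": (2.6, -37.4),
    "§103": (45.7, +5.7),
    "§102": (21.5, -18.5),
    "§112": (23.9, -16.1),
}

for statute, (rate, delta) in rates.items():
    tc_avg = rate - delta  # delta = examiner rate - TC average
    print(f"{statute}: TC avg ≈ {tc_avg:.1f}%")
```

With these rounded figures every statute's implied TC average comes out to 40.0%, suggesting the tool uses a single Tech Center baseline across statutes.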

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 17 and the claims below are rejected under 35 U.S.C. 103 as being unpatentable over D1 (US 2019/0064362 A1).

Regarding claims 1 and 17, D1 teaches: A system for detecting potential locations of unexploded ordnance in near real-time in a geographic area, the system comprising: at least one unmanned aerial vehicle (primary ISSM device, fig. 6) configured to gather sensor data using multiple types of sensors regarding locations included in the geographic area during a survey of the geographic area [0057]; at least one ground control processor configured to communicate with a flight controller included in the at least one unmanned aerial vehicle to enable remote control of the at least one unmanned aerial vehicle to complete the survey of the geographic area (abstract and [0035]; implicit, as the user controls the UAV, hence a ground controller is inherent); and at least one survey analytics processor (7) configured to communicate with the at least one unmanned aerial vehicle to receive sensor data generated by the multiple types of sensors regarding locations included in the geographic area and location data generated by the flight controller [0059], analyze the received data and identify potential locations of unexploded ordnance based on analysis of the received data [0062-0076], wherein the analysis includes comparing the received data ([0075]; correlation includes comparing to a reference) with reference data indicating a plurality of characteristics (surface and subsurface features) known to correspond to unexploded ordnance.

D1 does not explicitly teach communicating between at least one ground control processor and a flight controller. Although D1 does not explicitly say "communicating between at least one ground control processor and a flight controller," D1 teaches saving data on a SanDisk card for further post-processing [0068], and therefore communicating between at least one ground control processor and a flight controller is just another obvious modification in order to post-process data while it is gathered in the field.

Claims 2, 18: The system of claim 1, wherein the at least one unmanned aerial vehicle includes a plurality of sensors of multiple types (1, 2, 12) configured to generate sensor data regarding locations included in the geographic area during the survey of the geographic area.

Claim 3: The system of claim 2, wherein the plurality of sensors of multiple types includes an electro-optical sensor (2) and a synthetic aperture radar sensor. [0070]

Claim 4: The system of claim 2, wherein the plurality of sensors of multiple types includes an infra-red sensor (obvious modification in order to provide vision at night) and a synthetic aperture radar sensor. [0070]

Claim 5: The system of claim 2, wherein the plurality of sensors of multiple types includes a LiDAR sensor and a synthetic aperture radar sensor. [0070]

Claims 6, 19: The system of claim 2, wherein the plurality of sensors of multiple types includes an electro-optical sensor (3), an infra-red sensor (obvious modification in order to provide vision at night), a LiDAR sensor and a synthetic aperture radar sensor. [0070]

Claims 7, 20: The system of claim 1, wherein the generated sensor data images terrain of the geographic area to analyze the terrain and detect anomalies that indicate potential locations of unexploded ordnance. (fig. 10)

Claims 8, 21: The system of claim 1, wherein the generated sensor data images terrain of the geographic area to analyze the terrain and detect changes in sensor data that indicate potential locations of unexploded ordnance. [0075]

Claims 9, 22: The system of claim 8, wherein the changes are detected based on sensor data generated in at least two surveys, which are compared to detect changes therebetween. (obvious modification in order to provide reference for correlation data)

Claims 10, 23: The system of claim 1, wherein the at least one unmanned aerial vehicle includes at least one computer element (7) configured to process the sensor data to detect characteristics of the terrain at locations in the geographic area and associate the detected characteristics with data indicating the location at which the sensor data was generated. (fig. 10)

Claims 11, 24: The system of claim 10, wherein the terrain characteristic data and data indicating the location associated with that terrain characteristic data is transmitted to the at least one survey analytics processor during the survey of the geographic area for further analysis and output via a user interface of the at least one survey analytics processor. [0061-0070]

Claims 12, 25: The system of claim 10, wherein the terrain characteristic data and data indicating the location associated with that terrain characteristic data is downloaded to the at least one survey analytics processor following completion of scanning performed by the at least one unmanned aerial vehicle for further analysis and output via a user interface of the at least one survey analytics processor. [0003][0070] (obvious to do post-processing on a separate computer, as it has more processing power than the processor on the UAV)

Claims 13, 26: The system of claim 10, wherein the data indicating the location at which the sensor data was generated is provided by parsing message data from a data stream used by the at least one unmanned aerial vehicle to control guidance. (implicit, as the UAV is controlled and transmits images of terrain to the user)

Claims 14, 27: The system of claim 1, wherein data generated by the multiple types of sensors is analyzed to determine likelihood of accuracy based on analysis of the data indicating that at least a plurality of the multiple types of sensors indicate characteristic data that is in agreement regarding the potential presence of an unexploded ordnance. ([0075]; the correlation result can be interpreted as likelihood)

Claims 15, 28: The system of claim 1, wherein the comparison of the received data with reference data indicating a plurality of characteristics known to correspond to unexploded ordnance is used to generate an identification of a type of unexploded ordnance, and wherein the identification of the type of unexploded ordnance is output via the at least one survey analytics processor along with a photographic image of the location included in the received data and generated by one of the multiple sensors. [0075 and fig. 10]

Claims 16, 29 and the claims below are rejected under 35 U.S.C. 103 as being unpatentable over D1 (US 2019/0064362 A1) in view of D2 (US 10698104 B1).

Regarding the claims below, D1 does not teach, but D2 teaches: Claims 16, 29: The system of claim 1, wherein the comparison of the received data with reference data indicating a plurality of characteristics known to correspond to unexploded ordnance is associated with documentation indicating whether and what unexploded ordnance type was subsequently located at a particular location to provide a survey-neutralization profile for a particular location, wherein the survey-neutralization profile data is analyzed by machine learning algorithms to increase accuracy of analysis of sensor-generated data to detect the potential presence of unexploded ordnance and/or to identify ordnance type. (col. 10, lines 29-44) It would be obvious to one of ordinary skill in the art at the time of filing to modify the teachings of D1 with the teaching of D2 in order to identify possible problematic regions.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to HOVHANNES BAGHDASARYAN, whose telephone number is (571) 272-7845. The examiner can normally be reached Mon-Fri 7am - 5pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Isam Alsomiri, can be reached at 571-272-6970. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HOVHANNES BAGHDASARYAN/
Examiner, Art Unit 3645

Prosecution Timeline

Feb 10, 2023
Application Filed
Aug 18, 2025
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591059
OPTICAL RANGING DEVICE AND OPTICAL RANGING METHOD
2y 5m to grant · Granted Mar 31, 2026
Patent 12591047
OPTICAL SYSTEM FOR LIGHT DETECTION AND RANGING
2y 5m to grant · Granted Mar 31, 2026
Patent 12585000
RECEIVING DEVICE FOR AN OPTICAL MEASUREMENT APPARATUS FOR CAPTURING OBJECTS, LIGHT SIGNAL REDIRECTION DEVICE, MEASUREMENT APPARATUS AND METHOD FOR OPERATING A RECEIVING DEVICE
2y 5m to grant · Granted Mar 24, 2026
Patent 12569880
CMOS ULTRASONIC TRANSDUCERS AND RELATED APPARATUS AND METHODS
2y 5m to grant · Granted Mar 10, 2026
Patent 12560721
SPAD LIDAR SYSTEM WITH BINNED PIXELS
2y 5m to grant · Granted Feb 24, 2026
Based on this examiner's 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 78%
Grant Probability With Interview: 94% (+16.1%)
Median Time to Grant: 3y 1m
PTA Risk: Low
Based on 971 resolved cases by this examiner. Grant probability derived from career allow rate.
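As a sanity check, the headline figures above are internally consistent: 78% is the career allow rate (759 granted of 971 resolved), and 94% is that rate plus the +16.1% interview lift. A minimal sketch, assuming the tool simply adds the lift in percentage points before rounding:

```python
# Reproduce the dashboard's headline probabilities from the
# underlying career data shown on this page. The additive
# interview lift is an assumption about how the tool combines them.
granted, resolved = 759, 971
allow_rate = 100 * granted / resolved            # career allow rate, in %
interview_lift = 16.1                            # percentage points

print(f"Grant probability: {allow_rate:.0f}%")                    # ≈ 78%
print(f"With interview:    {allow_rate + interview_lift:.0f}%")   # ≈ 94%
```

Note this treats the lift as additive percentage points, not a relative multiplier; the page does not state which convention the tool actually uses.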
