Prosecution Insights
Last updated: April 19, 2026
Application No. 18/677,423

SMART SURVEILLANCE DATA PROCESSING SYSTEMS FOR SAFE UNCREWED AIRCRAFT AND SURFACE VESSEL OPERATION BEYOND VISUAL LINE OF SIGHT

Non-Final OA: §102, §112
Filed: May 29, 2024
Examiner: BROSH, BENJAMIN J
Art Unit: 3658
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Accipiter Radar Technologies Inc.
OA Round: 1 (Non-Final)
Grant Probability: 73% (Favorable)
OA Rounds: 1-2
To Grant: 2y 7m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 73% (56 granted / 77 resolved) • +20.7% vs TC avg, above average
Interview Lift: +29.5% among resolved cases with interview
Avg Prosecution: 2y 7m typical timeline • 40 currently pending
Total Applications: 117 across all art units (career history)

Statute-Specific Performance

§101: 13.6% (-26.4% vs TC avg)
§103: 39.6% (-0.4% vs TC avg)
§102: 20.9% (-19.1% vs TC avg)
§112: 20.5% (-19.5% vs TC avg)
Deltas are measured against a Tech Center average estimate • Based on career data from 77 resolved cases
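The percentages above can be reproduced from the raw counts on this page. A minimal sketch (how the dashboard rounds its figures and derives each "vs TC avg" delta is an assumption, not documented on the page):

```python
# Reproduce the dashboard's examiner statistics from the raw counts
# shown above (56 granted of 77 resolved). The rounding convention and
# the delta derivation are assumptions about how the page computes them.

granted, resolved = 56, 77
career_allow_rate = granted / resolved
print(f"Career allow rate: {career_allow_rate:.1%}")  # 72.7%, displayed as 73%

# Each statute row shows the examiner's rate and a delta vs the Tech
# Center average, so the implied TC average is rate minus delta.
examiner_rate = {"101": 0.136, "103": 0.396, "102": 0.209, "112": 0.205}
delta_vs_tc = {"101": -0.264, "103": -0.004, "102": -0.191, "112": -0.195}

for statute, rate in examiner_rate.items():
    tc_avg = rate - delta_vs_tc[statute]
    print(f"§{statute}: examiner {rate:.1%}, implied TC avg {tc_avg:.1%}")
```

Notably, the implied Tech Center average works out to 40.0% for every statute row, suggesting the deltas are all measured against a single TC-wide baseline.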

Office Action

§102 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Joint Inventors

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Domestic Benefit

Claim to domestic benefit is acknowledged, as the requirements of 37 CFR 1.78 and 35 U.S.C. 119(e) are met.

Information Disclosure Statement

The information disclosure statement (IDS) filed on 30 May 2024 complies with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Response to Election/Restriction and Status of Claims

The examiner issued an election/restriction requirement on 24 November 2025, identifying two groups of inventions: group 1 pertaining to claims 1-4 and 11-14, and group 2 pertaining to claims 5-10 and 15-20. Applicant filed a reply dated 23 January 2026 electing group 1 (claims 1-4 and 11-14) without traverse.
As this election was made without traverse, the previously made restriction requirement is hereby made FINAL. The most recent revision of the claim set is dated 23 January 2026. Claims 1-20 are pending, of which claims 5-10 and 15-20 are withdrawn from consideration. Of the claims under consideration (claims 1-4 and 11-14), claims 1 and 11 are independent claims. All pending claims currently under consideration (claims 1-4 and 11-14) are rejected for the reasons provided below.

Drawings

The examiner notes that Figures 3-8 provide exemplary reports generated by the system/method of the instant application. The examiner notes that many of the details of the figures are obscured due to the quality of the image provided. As the figures appear to be provided merely to show exemplary formats, the examiner notes that claimed invention details are not obscured (and therefore an objection is withheld at this time), but if display details are crucial/directly pertinent to the claimed invention, the examiner recommends revision of the aforementioned figures to increase readability. An example of Figure 8, showing portions of the figure that are difficult to read, is provided below:

[Greyscale image excerpt of Figure 8 omitted]

Specification

The disclosure is objected to because of the following informalities: Paragraphs [0022] and [0023] appear to have an inadvertent space, placing "their respective loss.." onto a new line, assigning a new paragraph. Page 21 of the specification includes a reference to Leonardi et al. through recitation of a footnote and link. This is generally improper form for a patent application and, per MPEP 608.01.VII., hyperlinks are not permitted. Applicant is respectfully requested to remove all hyperlinks from the instant application.
Rather than footnote-type references, the examiner recommends reference in the following manner (paragraph [0079] for instance): "…otherwise regional ADS-B coverage volume [[1]] as is known in the art and discussed, for example, in the teachings of Leonardi et al., 'ADS-B vulnerability to low cost jammers: Risk assessment and possible solutions'". Appropriate correction is required.

While not objections, the examiner makes the following notes: Paragraph [0087] states "Second, where TAX-TC processing is available..". The examiner notes that best practice is to initiate the term at its first usage. However, TAX-TC is not defined until paragraph [0089]. The examiner recommends providing the full name (rather than just the abbreviation) in paragraph [0087]. Similarly, paragraph [0088] states "…(ii) it supports smart downstream data processing included with the DUNE described below…". In a similar manner, DUNE is not defined until later paragraph [0090]. While neither is necessarily an objection (as the terms are defined in the specification, just not at first use), consideration is respectfully requested.

Claim Objections

Claims 1-2 and 11-14 are objected to because of the following informalities:

Claims 1 and 11: Claims 1 and 11 recite in a) "associated with the same physical target"; however, "the same physical target" lacks antecedent basis and should read "a same physical target".

Claims 1 and 11: Claims 1 and 11 recite in c) "and unified target location information which update continuously over time", missing an "s" on "update", and should read "and unified target location information which updates continuously over time".

Claims 1 and 11: Claims 1 and 11 recite in c) "associated with the same target" but should state "the same physical target" to properly relate to the initiated term.

Claims 1 and 11: Claims 1 and 11 recite in c) "the best target location information available", which lacks antecedent basis.
Claims 2 and 12: Claims 2 and 12 recite in a) "the best location information", which improperly refers to "the best target location information" of claim 1. The word "target" should be included, or initiation of a new term with "a" should be performed.

Claim 11: Claim 11 states "beyond their visual line of sigh" and is missing the "t" at the end of "sight".

Claim 11: Claim 11 states "automobiles, comprising" and is missing a colon ":" prior to the limitations of note.

Claim 13: Claim 13 is dependent upon "The method of Claim 1"; however, claim 1 is not a method claim. Additionally, if recited to depend upon the system of claim 1, it would be a substantial duplicate of claim 3. Instead, the examiner believes that claim 13 is intended to depend upon the method of claim 11. Attention is required.

Claim 14: Claim 14 is a substantial duplicate of claim 4. The examiner believes that claim 14 intends to say "The method of Claim 11" rather than "The system of Claim 1". Appropriate correction is required.

While not objections, the examiner makes the following claim notes: The claim set generally lacks the use of Oxford commas. For example, claim 1 states "…where said vehicles are aircraft, vessels or automobiles, comprising:" in a plurality of areas/instances and should state "…where said vehicles are aircraft, vessels, or automobiles, comprising:"; the use of an Oxford comma more clearly delineates the finality of a list and reduces the possibility of incorrectly interpreting a list. The examiner recommends amending the claim language with the use of Oxford commas to reduce the possibility of incorrectly interpreting claim elements.

Claim Interpretation

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art.
Regarding "beyond visual line of sight", the examiner notes that this term is generally well understood by those having ordinary skill in the art, in addition to the description in paragraph [0045]. However, the range/distance constituting beyond visual range/line of sight is not necessarily explicitly defined, and so the language is instead interpreted broadly to not be particularly/significantly limiting.

Regarding cooperative versus noncooperative targets, the examiner is using the description from the independent claims for purposes of interpretation: "…cooperative targets in the environment taken from the group of crewed and uncrewed aircraft, vessels and automobiles, and who broadcast their respective identity and precise location regularly over time…" and "…said noncooperative targets not actively participating in being identified and precisely located in the environment over time…". Put plainly, the examiner considers any target that continues to broadcast identity and location to be a cooperative target and any target that does not broadcast its identity and location to be a noncooperative target.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-4 and 11-14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Issues are summarized in list form below:

Claims 1 and 11 (all pending independent claims under consideration) utilize preamble verbiage "A smart, real-time radar data processing system…" (claim 1) / "A method of smart, real-time…" (claim 11). The examiner is unsure if "smart" is intended to have a limiting effect of sorts, but at this time, the examiner is unsure what may or may not constitute a "smart" system or method, as this is a term of relative degree per MPEP 2173.05(b). Thus, the term renders the overall intended invention indefinite.

Claims 1, 2, 11, and 12 utilize the phrase "such as". The examiner is unsure if the exemplary language is intended to be limiting, falling under the guidance of MPEP 2173.05(d).

Claims 1 and 11 state "said sensors taken from the group of radar, cameras, C-UAS receivers…". The examiner reviewed the specification and found that C-UAS in the context of the application means "Counter-UAS" (paragraph [0012]) and that C-UAS sensor examples are given in paragraph [0026] as "C-UAS sensors (e.g., radio frequency (RF) receivers, camera technologies, et cetera)". As radar and cameras are also disclosed in the list (as mentioned above) and the two are merely exemplary (hence "e.g."), the examiner must consider what other sensors may be included in the group of "C-UAS" receivers/sensors. The examiner was unable to find a definition in the specification, and a search in an online browser did not reveal a well-known sensor or group of sensors generally known as C-UAS receivers/sensors. Therefore, as the examiner is unsure what sensors may reasonably read upon the claim set, the term renders the claim indefinite. For prior art purposes, the examiner will consider any sensing device which is capable of detecting a UAS to read upon the claim language.

Claims 1 and 11 state in c) "the best target location information available".
The word "best" implies a measure of how something can be better than another thing; however, the basis for understanding how "best" is quantified or chosen is not provided. Is it best because it has the highest degree of match? Lowest processing time? Lowest processing cost? Most reliable data set? Thus, it is a term of relative degree that is not properly supported, rendering the claim indefinite.

Therefore, the examiner notes that the above phrases/terms are indefinite and fail to particularly point out and distinctly claim the invention of the instant application. Consistent with USPTO examination practices, for purposes of compact prosecution, the claim limitations will be treated as best understood by the Examiner, which, according to broadest reasonable interpretation (BRI), means that the examiner could follow any one or more of the interpretations discussed above. As all pending independent claims under consideration (1 and 11) are rejected under 35 U.S.C. 112(b), and the dependent claims do not cure their deficiencies, all pending claims under consideration are rejected under 35 U.S.C. 112(b) (claims 1-4 and 11-14).

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-4 and 11-14 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Bageshwar et al. (US 2025/0363898 A1; filed 22 Jun 2022; hereinafter Bageshwar).
Regarding independent claims 1 (system) and 11 (method):

Bageshwar discloses A smart, real-time radar data processing system for assisting remote pilots in command of uncrewed vehicles beyond their visual line of sight to avoid collisions with other crewed or uncrewed vehicles in their vicinity where said vehicles are aircraft, vessels or automobiles, comprising: (per claim 1) / A method of smart, real-time radar processing for assisting remote pilots in command of uncrewed vehicles beyond their visual line of sigh to avoid collisions with other crewed or uncrewed vehicles in their vicinity where said vehicles are aircraft, vessels or automobiles, comprising (per claim 11)

(First, the examiner notes that "smart" is not particularly limiting, as any system may be "smart" by comparison to another system or process. Second, "beyond visual line of sight", while its meaning is generally understood in the art, does not impose meaningful limits, as line of sight is a variable value depending upon a multitude of factors relying on a user's vision, weather, cockpit sightlines, etc. Third, "in their vicinity" does not impose any meaningful limits, as a particular range or limiting factor to connote range is not provided. Finally, "for assisting…" recites an intended use and also does not convey patentable significance, as a system that is not explicitly taught to be used in this manner may still read upon the claims. Abstract, Paragraphs [0001, 0005-0006, 0022, 0038, 0064, 0089-0091] and Figures [1-2, 6]: Bageshwar discloses systems and methods for detecting and tracking cooperative and non-cooperative targets to be used for object/collision avoidance.)

a. a track data interface configured to receive continuously in real-time target track data from a surveillance network and making said target track data available for subsequent data processing, (per claim 1) / a.
receiving continuously in real-time target track data from a surveillance network and making said target track data available for subsequent data processing, (per claim 11)

(Paragraphs [0021-0022, 0024, 0078]: Bageshwar discloses a DAA system (an interface system) that receives information pertaining to tracks for objects or targets in an environment around an ego vehicle from cooperative sensors (a surveillance network) for further processing (such as estimating optimal tracks of targets), the system operating in real-time.)

said target track data originating from at least two sensors associated with said surveillance network with each of said sensors generating a respective track data stream and said track data streams collectively making up said target track data, said sensors taken from the group of radar, cameras, C-UAS receivers, ADS-B receivers, RemoteID receivers, and AIS receivers, (per claims 1 and 11)

(Paragraphs [0022, 0024, 0026, 0031, 0040] and Figure [2]: Bageshwar discloses sensor fusion from a plurality of sensors providing "track data", such as ADS-B, TCAS (broadly a C-UAS), radar, cameras, etc.)
said ADS-B, RemoteID and AIS receivers generating target track data from cooperative targets in the environment taken from the group of crewed and uncrewed aircraft, vessels and automobiles, and who broadcast their respective identity and precise location regularly over time, and (per claims 1 and 11)

(Paragraphs [0003, 0031, 0046-0047, 0054, 0078, 0082] and Figure [2]: Bageshwar discloses generating target track data from cooperative targets using ADS-B sensors (the examiner notes that "from the group of" implies that only one was necessary), providing identification/ID numbers and track/position data at time periods.)

said radar, cameras and C-UAS receivers generating target track data from the group of noncooperative targets, cooperative targets, and clutter generating phenomenon in the environment, said clutter generating phenomenon taken from the group of birds, precipitation and surface features such as trees and ocean waves, said noncooperative targets not actively participating in being identified and precisely located in the environment over time, said noncooperative targets also taken from the group of crewed and uncrewed aircraft, vessels and automobiles, (per claims 1 and 11)

(First, the examiner notes that "from the group of" implies that only one is necessary amongst noncooperative targets, cooperative targets, and clutter.
Paragraphs [0026, 0031, 0042-0043, 0046] and Figure [2]: Bageshwar discloses ADS-B, TCAS, ICAO, radar, and vision/camera sensors (broadly, any can be a "C-UAS" receiver) tracking cooperative and non-cooperative targets (including identifying background/stationary targets, reasonably including "surface features" with vision sensors), the targets constituting aircraft.)

each said track data stream providing a continuous stream of track updates from said cooperative targets, noncooperative targets and clutter generating phenomenon tracked by said surveillance network and sent to said track data interface, (per claims 1 and 11)

(Paragraphs [0042-0043, 0051, 0063, 0078, 0081]: Bageshwar discloses continuous sampling of track updates in real time for the cooperative and noncooperative targets, used amongst the system.)

each respective said track update including a track identifier, date/time stamp, and location information associated with said respective target or said respective clutter generating phenomenon, said target track data thereby including a multiplicity of seemingly unique target tracks which are in fact associated with the same physical target that is active in the environment, said multiplicity increasing as the number of said sensors with overlapping coverage in said surveillance network grows and as the number of active targets in the environment grows, making said target track data more and more confounding to interpret; (per claims 1 and 11)

(First, "seemingly unique target tracks" is a term of relative degree, as something may "seem" one way or another, and therefore the limitation does not impose meaningful limits.
Second, the examiner considers "said multiplicity increasing as the number of said sensors with overlapping coverage in said surveillance network grows as the number of active targets in the environment grows, making said target track data more and more confounding to interpret" to also be functional language including terms of degree, merely claiming the obvious fact that as more sensors and targets are present, the more complex the computation; however, this does not directly claim a component/process/etc. and merely claims a byproduct/environmental condition. Therefore the aforementioned language is interpreted to not provide patentable significance. Paragraphs [0026, 0047, 0054, 0069-0072, 0077, 0082] and Figures [2-3, 7]: Bageshwar discloses correlation of data obtained, including track statistics made up of track ID, measurement time, and position.)

b. a surveillance data processor operatively connected to said track data interface and configured to receive said target track data from said track data interface in real-time; (per claim 1) / b. receiving said target track data in real-time; (per claim 11)

(Paragraphs [0025, 0030-0031, 0078] and Figure [1]: Bageshwar discloses a system comprising a processor (110, for example) to process data, the tracking system disclosed to operate in real-time.)

c. said surveillance data processor further configured to automatically process said received target track data in real-time by unifying the collection of said respective track data streams contained in said received target track data into a single unified track feed, (per claim 1) / c.
processing said received target track data in real-time by unifying the collection of said respective track data streams contained in said received target track data into a single unified track feed, (per claim 11)

(Paragraphs [0048, 0074, 0078-0079, 0087]: Bageshwar discloses processing the tracks as merged tracks into one framework.)

said unified track feed representing each said active target with a unique, unified target identifier, and each track update associated with said unified target identifier including a unified target date/time stamp and unified target location information which update continuously over time, (per claims 1 and 11)

(Paragraphs [0046-0048, 0074, 0078-0079, 0087]: Bageshwar discloses processing the tracks as merged tracks into one framework for each target with target IDs, time information, and position data, the process being continuous and real-time as discussed above.)

said unified track feed generated by said surveillance data processor by the automatic (i) determination of the said multiplicity of unique tracks associated with each physical target using multi-sensor processing algorithms taken from the group of cross-correlation algorithms and fusion algorithms on a time update by time update basis to identify those track updates from said received target track data which are associated with the same target, and (per claims 1 and 11)

(Paragraphs [0022, 0024, 0040, 0047, 0069, 0081]: Bageshwar discloses determination of tracks of the same physical target using sensor fusion (multi-sensor processing) and correlation (cross-correlation) elements, rejecting false tracks.)

(ii) combining or joining of said associated track updates in order to form each unified track feed update by extracting and assembling the best target location information available from the said associated track updates; (per claims 1 and 11)

(First, the examiner notes that "best" is relative.
Paragraphs [0022, 0024, 0026, 0040, 0047-0048, 0069, 0074, 0081, 0087]: Bageshwar discloses determination of tracks of the same physical target using sensor fusion (multi-sensor processing) and correlation (cross-correlation) elements, rejecting false tracks, correlating data to estimate one statistically optimal (best) track of an object or target.)

d. said surveillance data processor further configured to automatically process said unified track feed by performing additional integration, filtering or enhancement functions whose purpose is to improve said unified track feed data quality; and (per claim 1) / d. automatically processing said unified track feed by performing additional integration, filtering or enhancement functions whose purpose is to improve said unified track feed data quality; and (per claim 11)

(First, the examiner notes that "whose purpose…" recites an intended use, and is therefore not afforded significant patentable weight. Paragraphs [0048, 0070, 0081, 0087]: Bageshwar discloses performing further filtering for the purpose of generic quality increase for downstream use.)

e. said surveillance data processor further configured to send its unified track feed in real-time to a downstream system or user. (per claim 1) / e. sending the unified track feed in real-time to a downstream system or user. (per claim 11)

(Paragraphs [0026, 0053, 0055, 0082, 0087]: Bageshwar discloses sending to downstream systems and users.)

Regarding claims 2 and 12:

Parent claims 1 and 11 are anticipated by Bageshwar. Bageshwar further discloses where said additional integration, filtering or enhancement functions includes at least one of the following functions:

a. when assembling the best location information from the said associated track updates involving a 2D radar sensor and an ADS-B sensor, the altitude information is taken from the associated ADS-B track update;

b.
filtering out of said unified track feed track updates generated from said clutter generating phenomenon determined automatically based on track attributes selected to differentiate said updates from those associated with targets of interest such as aircraft or vessels;

c. automatically determining when a target of interest enters one or more defined geo-fences and generating and sending out an alert when this occurs to a remote pilot, display system or service, database, file system, storage device or another post-processor;

d. in said combining or joining of said associated track updates in order to form each unified track feed update, adding to said unified track feed update a field that records:

i. the respective, contributing sensor names associated with said associated track updates in order to allow for the downstream, automated separation and analysis of cooperative and noncooperative targets as well as automated system performance assessment and system performance health monitoring;

ii. the aircraft ICAO Identifier and flight number if an ADS-B sensor is contributing;

iii. the MMSI number and vessel name if an AIS sensor is contributing; and

iv. the drone serial number if a C-UAS sensor is contributing;

e. if said surveillance network is providing said target track data from multiple ADS-B sensors with overlapping coverage, said ADS-B track data streams are first integrated together into a single composite ADS-B track data stream by associating the same ICAO identifier in order to increase update rates and remove multiplicity before generating said unified track feed;

f. if said surveillance network is providing said target track data from multiple AIS sensors with overlapping coverage, said AIS track data streams are first integrated together into a single composite AIS track data stream by associating the same MMSI number in order to increase update rates and remove multiplicity before generating said unified track feed; and

g.
if said surveillance network is providing said target track data from multiple C-UAS sensors with overlapping coverage, said C-UAS track data streams are first integrated together into a single composite C-UAS track data stream by associating the same drone serial number in order to increase update rates and remove multiplicity before generating said unified track feed. (per claims 2 and 12)

(First, the examiner notes that "at least one of the following functions" implies that only one is necessary. Paragraphs [0041, 0079, 0088] and Figure [9A]: Bageshwar discloses that fusion is performed when a surveillance sensor provides only two-dimensional measurement statistics. Further, Bageshwar discloses ADS-B sensors providing measurement tracks of objects/targets including sensor measurements, shown as including altitude in Figure [9B].)

Regarding claims 3 and 13:

Parent claims 1 and 11 are anticipated by Bageshwar. Bageshwar further discloses where said downstream system or user is taken from the group of: i. a remote pilot through a data interface; ii. a display system; iii. a service including a UTM service or vessel traffic service; iv. a database, file system, or storage device; or v. another post-processor. (per claims 3 and 13)

(First, the examiner notes that "taken from the group of" implies that only one is necessary. Paragraphs [0026, 0053, 0055, 0082, 0087]: Bageshwar discloses sending to downstream systems and users, the systems reasonably constituting a vessel traffic service, database, or post-processor.)

Regarding claims 4 and 14:

Parent claims 1 and 11 are anticipated by Bageshwar. Bageshwar further discloses where said track data interface is a target information system. (The examiner notes that "a target information system" is not particularly limiting as it has no apparent explicit definition.
Paragraphs [0001, 0005-0006, 0022, 0038, 0064, 0089-0091] and Figures [1-2, 6]: Bageshwar discloses systems and methods for detecting and tracking cooperative and non-cooperative targets to be used for object/collision avoidance, and as the system tracks targets and provides track information, it reasonably constitutes a "target information system".)

References

Further references that discuss prior art, but were not relied upon for creation of this office action, are provided below:

1. US 2019/0383936 A1, "SYSTEM AND METHOD FOR DETECTION AND REPORTING OF TARGETS WITH DATA LINKS", Bartone et al. (filed 23 Jan 2018; published 19 Dec 2019): Discusses a system of detecting aerial targets (birds, drones, etc.) using sensor fusion and reporting of the detections.

2. US 2019/0204434 A1, "SYSTEM AND METHOD FOR INTEGRATION OF DATA RECEIVED FROM GMTI RADARS AND ELECTRO OPTICAL SENSORS", Novoselsky et al. (filed 30 Nov 2018; published 04 Jul 2019): Discusses sensor fusion of data received from radar and additional sensor data to combine for tracking a ground target.

3. US 2010/0039310 A1, "SYSTEMS AND METHODS FOR AIR TRAFFIC SURVEILLANCE", Smith et al. (filed 02 May 2008; published 18 Feb 2010): Discloses a system of tracking (through sensor fusion of a multitude of sensor types) cooperative and non-cooperative aircraft in order to prevent collision.

4. US 2021/0383708 A1, "SYSTEM AND METHOD FOR COMMUNITY PROVIDED WEATHER UPDATES FOR AIRCRAFT", Gibbons, II et al. (filed 03 Jun 2021; published 09 Dec 2021): Discusses a tracking system for aircraft that displays weather data, indication of corresponding time, and trajectory of the weather (such as wind direction).

5. US 2014/0028485 A1, "AIRSPACE RISK MITIGATION SYSTEM", Nordlie et al. (filed 08 Nov 2012; published 30 Jan 2014): Discloses a system that tracks broadcasting (cooperative) aircraft, non-cooperative aircraft, and weather data using sensor fusion.
6. US 11,854,415 B1, "RADAR-BASED DISCOVERY TECHNOLOGIES FOR MANAGING AIR TRAFFIC", Robertson et al. (filed 09 Apr 2019; issued 26 Dec 2023): Discloses an air traffic monitoring system that collects data from cooperative targets with ADS-B data and non-cooperative target data (including objects such as birds) with radar data, for the purpose of routing including collision avoidance.

7. US 2015/0130618 A1, "DYNAMIC ALARM ZONES FOR BIRD DETECTION SYSTEMS", Hamminga et al. (filed 14 Oct 2014; published 14 May 2015): Discusses an aviation system of monitoring bird location and trajectory for avoidance.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BENJAMIN J BROSH, whose telephone number is (571) 270-0105. The examiner can normally be reached M-F 0730-1700.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, THOMAS WORDEN, can be reached at (571) 272-4876. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/B.J.B./ Examiner, Art Unit 3658
/JASON HOLLOWAY/ Primary Examiner, Art Unit 3658
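The unification step at the core of claims 1(c)/11(c), and the composite-stream functions of claims 2(e)-(g), amount to joining per-sensor track streams that describe the same physical target. A minimal, hypothetical sketch of that idea (all names are illustrative; joining cooperative tracks on their broadcast identifier stands in for the full cross-correlation/fusion step, which for noncooperative radar/camera tracks would also correlate by position and time):

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrackUpdate:
    sensor: str                     # e.g. "adsb-1", "radar-1" (hypothetical names)
    track_id: str                   # sensor-local track identifier
    timestamp: float                # seconds since some epoch
    lat: float
    lon: float
    identity: Optional[str] = None  # broadcast ID (ICAO/MMSI/serial) if cooperative

def unify(updates: list[TrackUpdate]) -> dict[str, list[TrackUpdate]]:
    """Group per-sensor track updates into one unified track per target.

    Cooperative updates are joined on their broadcast identifier, which
    collapses duplicate tracks produced by sensors with overlapping
    coverage; noncooperative tracks keep a per-sensor key here, standing
    in for the cross-correlation/fusion step the claims describe.
    """
    unified: dict[str, list[TrackUpdate]] = defaultdict(list)
    for u in sorted(updates, key=lambda u: u.timestamp):
        key = u.identity if u.identity else f"{u.sensor}:{u.track_id}"
        unified[key].append(u)
    return dict(unified)

feed = unify([
    TrackUpdate("adsb-1", "t1", 0.0, 43.10, -79.20, identity="ICAO:A1B2C3"),
    TrackUpdate("adsb-2", "t9", 0.5, 43.10, -79.21, identity="ICAO:A1B2C3"),
    TrackUpdate("radar-1", "r7", 0.3, 43.20, -79.30),  # noncooperative target
])
assert len(feed) == 2  # two ADS-B tracks of one aircraft collapse into one
```

This is the same reason the office action reads the composite ADS-B/AIS/C-UAS limitations on sensor-fusion disclosures: once updates share a key, the "multiplicity" of seemingly unique tracks reduces to one track per target.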
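Claim 2(c), mapped above, recites automatically determining when a target of interest enters a defined geo-fence and sending an alert. A minimal sketch of one such check (circular fences and a great-circle distance test are assumptions for illustration; real deployments would likely use polygonal fences and richer alert routing):

```python
import math

# Hypothetical sketch of the geo-fence alerting of claim 2(c): raise an
# alert the first time a unified track enters a defined fence, and do
# not repeat the alert while the target stays inside.

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def check_geofence(track_pos, fences, already_inside):
    """Return alert strings for fences the target has just entered."""
    alerts = []
    for name, (lat, lon, radius_m) in fences.items():
        inside = haversine_m(track_pos[0], track_pos[1], lat, lon) <= radius_m
        if inside and name not in already_inside:
            alerts.append(f"ALERT: target entered geo-fence '{name}'")
        if inside:
            already_inside.add(name)
        else:
            already_inside.discard(name)
    return alerts

fences = {"runway-27": (43.1718, -79.4930, 1_000)}  # hypothetical fence
state: set = set()
print(check_geofence((43.1719, -79.4931), fences, state))  # fires once
print(check_geofence((43.1719, -79.4931), fences, state))  # no repeat: []
```

Tracking entry state per fence is what makes this an "enters" event rather than a continuous "is inside" condition, matching the claim's alert-on-entry wording.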

Prosecution Timeline

May 29, 2024
Application Filed
Mar 11, 2026
Non-Final Rejection — §102, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12576850
LANE CHANGE SYSTEM OF AUTONOMOUS VEHICLE
Granted Mar 17, 2026 • 2y 5m to grant
Patent 12575997
EXOSUIT CONTROL USING MOVEMENT PRIMITIVES FROM EMBEDDINGS OF UNSTRUCTURED MOVEMENTS
Granted Mar 17, 2026 • 2y 5m to grant
Patent 12552017
OPTIMIZING ROBOTIC DEVICE PERFORMANCE
Granted Feb 17, 2026 • 2y 5m to grant
Patent 12552533
INFORMATION PROCESSING SYSTEM, NOTIFICATION METHOD, AND UNMANNED AERIAL VEHICLE
Granted Feb 17, 2026 • 2y 5m to grant
Patent 12536918
AIRSPACE TRAFFIC PREDICTION METHOD BASED ON ENSEMBLE LEARNING ALGORITHM
Granted Jan 27, 2026 • 2y 5m to grant
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 73%
With Interview (+29.5%): 99%
Median Time to Grant: 2y 7m
PTA Risk: Low
Based on 77 resolved cases by this examiner. Grant probability derived from career allow rate.
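The projection figures are stated rather than derived on this page. Purely as an illustration of one arithmetic that reproduces the displayed numbers (the additive interview lift and the sub-100% display cap are assumptions, not a documented methodology):

```python
# Loudly-hedged illustration: treat grant probability as the career
# allow rate (56/77), and the with-interview figure as that base plus
# the +29.5-point lift shown above, capped at 99% for display. The
# additive model and the cap are assumptions about the dashboard.

granted, resolved = 56, 77
base = granted / resolved        # ~0.727, displayed as 73%
interview_lift = 0.295           # +29.5 percentage points

with_interview = min(base + interview_lift, 0.99)
print(f"Grant probability: {base:.0%}")           # 73%
print(f"With interview:    {with_interview:.0%}") # 99%
```

Under this reading the uncapped sum exceeds 100%, which is why a display cap (or some other saturation) must be assumed to reach the 99% shown.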
