DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-20 have been examined.
P = paragraph; e.g., P[0001] = paragraph [0001].
Examiner’s Note: The 12/10/2025 claim amendments have rendered moot the rejections under 35 U.S.C. 112(b), as the antecedent basis issues have been corrected by the amendments.
Response to Arguments
Applicant's arguments filed 12/10/2025 have been fully considered but they are not persuasive.
Regarding the rejections under 35 U.S.C. 101, the Applicant argues
“The Claims expressly require the computer apparatus to automatically aggregate and correlate a set of heterogeneous aviation data sources, including airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps. These data sources correspond to distinct hardware subsystems and communication interfaces, and their fusion requires machine-implemented synchronization and correlation. Further, the Claims require projecting predicted aircraft trajectories onto airport-surface geometry defined by runway and taxiway structures in the airport maps. This operation necessarily involves a geospatial computation tied directly to physical airport infrastructure and cannot be carried out using mental thought or pen-and-paper techniques.
These limitations satisfy Step 2A, Prong II of the 2019 Subject Matter Eligibility Guidance because the claims integrate any alleged abstract idea into a practical application. The Claims are limited to an on-aircraft computing environment, utilizing data sources unique to aviation operations, including airport operations databases, ADS-B broadcasts, ATC voice communications, and onboard sensors. The projections of predicted trajectories occur within the specific geospatial constraints of an airport surface, and the determinations are used to identify real-time conflict conditions between taxiing and runway-entering aircraft. This represents a clear application of the alleged abstract idea to a technological field, using domain-specific structures and constraints, rather than an abstract result”.
The arguments are not persuasive.
The Examiner first notes for the record that the claimed word “projecting” (or “project”, or “projects”) is not present in the specification of the present application.
Furthermore, the “computer apparatus” and the “at least one processor” are recited at a high level of generality and amount to nothing more than generic computer components used to apply the exception. The argument “These data sources correspond to distinct hardware subsystems and communication interfaces, and their fusion requires machine-implemented synchronization and correlation” is not persuasive, as gathering data from data sources amounts to mere data gathering. Furthermore, a person may simply review such data from data sources, such as by looking at the data on paper. Regarding the argument “Further, the Claims require projecting predicted aircraft trajectories onto airport-surface geometry defined by runway and taxiway structures in the airport maps. This operation necessarily involves a geospatial computation tied directly to physical airport infrastructure and cannot be carried out using mental thought or pen-and-paper techniques”, the Applicant provides no evidence that a user cannot mentally perform the claimed “projecting”, and the argument is not persuasive. The Examiner strongly disagrees that a user could not mentally perform the “projecting”: a user could mentally determine or imagine how “predicted future positions” may interact with “airport-surface geometry including runway and taxiway structures as defined by the airport maps” as a simple mental exercise of “projecting”, or mentally envision each position on the entire runway, which would in fact be a trivial mental exercise for a user. The Applicant’s argument implies that a user could not mentally perform the “projecting” and would therefore be incapable of drawing a predicted trajectory or predicted positions on a map of an airport including its runways, an implication that is not supported by any facts. The Examiner notes that even FIGS. 2A-2B of the present application (which were presumably conceived mentally by a person before the drawings were created) would be trivial for a user to determine and create mentally. Therefore, the arguments are not persuasive.
The Applicant further argues
“Even under Step 2B of the 2019 Guidance, the Claims recite significantly more than well-understood, routine, or conventional activity. There is no indication in the cited art, or in conventional avionics systems, of automatically aggregating and correlating ATC voice-recognition output, airport operations data, ADS-B data, onboard sensor data, and airport maps into a unified predictive model. Nor is projecting predicted trajectories onto airport-surface geometry a conventional or generic operation performed by known systems such as TAWS, TCAS, or ADS-B IN. The ordered combination of multi-source correlation, intent-informed prediction, and geospatial projection is not taught or suggested in the art”.
The arguments are not persuasive. The Applicant appears to be arguing patentability over the prior art. The Examiner disagrees that the claims are patentable over the prior art, as the prior art teaches or suggests the claimed limitations, as set forth in the rejections under 35 U.S.C. 103. The Applicant provides no specific argument against any specific citation of the prior art or any specific portion of the rejections under 35 U.S.C. 103. Furthermore, because the claimed steps can be performed mentally and the additional elements do not amount to significantly more than the abstract idea, the claimed invention does not require any improvement to any technology or technical field. Therefore, the arguments are not persuasive.
The Applicant further argues
“Finally, the Claims address a technological problem rooted in airport- surface operations; specifically, reliable prediction of taxiway and runway incursions using multiple imperfect data sources. The solution recited in the claims is likewise technological and cannot be implemented without the specific avionics components and data relationships identified in the specification. Because the Claims now recite a concrete, technical application that improves the functioning of an aircraft-surface collision detection system, the §101 rejection is no longer applicable. Applicant respectfully requests that the rejection be withdrawn”.
The arguments are not persuasive. The additional limitations directed to the use of a generic computer component to apply the exception, mere instructions to apply the exception using a generic computer component, and mere data gathering, do not amount to significantly more than the abstract idea, and the claimed invention does not require any improvement to any technology or technical field. Therefore, the arguments are not persuasive.
Regarding the rejections under 35 U.S.C. 103, these arguments are moot in view of the new grounds of rejection.
However, regarding the argument
“Second, the Claims require identifying a lost aircraft when the observed movement does not match the intended route data, a capability wholly absent from the cited art. None of the references detect when an aircraft's real-world movements diverge from an expected taxi or runway path, nor do they classify an aircraft as "lost." Bharadwaja merely overlays positional data on a display. Coles predicts airborne trajectories but never compares actual paths against intended ground routes. Doyen checks procedural compliance, not spatial adherence to an intended path. Roberts and Baladhandapani are unrelated to monitoring conformance to surface taxi routes. The cited art lacks any teaching or suggestion of detecting mismatches between intended and observed behavior to classify an aircraft as lost”,
with respect to Claim 1, Claim 1 does not require that the system “classify an aircraft as lost”, as no classification step is claimed. The limitation “identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data” encompasses simply identifying whether “movement of one of the one or more other aircraft does not match the corresponding intended route data”. Although the claim labels such an aircraft as “lost”, under the broadest reasonable interpretation “identify a lost aircraft” encompasses identifying that occurs as a result of the condition that “movement of one of the one or more other aircraft does not match the corresponding intended route data”; no separate step of labeling or classifying is required, as only an “identify” step is recited. The Examiner also notes that, under an interpretation in which the “when” condition does not occur, the limitation “identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data” does not occur and is therefore not a step required of the prior art, and the claims as interpreted in the present rejection interpret the “when” condition as not occurring.
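For illustration only, the conditional interpretation discussed above may be sketched as a short hypothetical example. The function name and return values below are invented solely for illustration and appear nowhere in the claims or the specification:

```python
# Illustrative only: a hypothetical sketch of the conditional "identify"
# limitation. If the "when" condition never occurs, the identify step is
# never performed; the present rejection interprets the claims in that manner.

def monitor_aircraft(observed_matches_intended):
    """Return "lost" only when observed movement does not match the route."""
    if not observed_matches_intended:  # the claimed "when" condition
        return "lost"                  # the "identify a lost aircraft" step
    return None                        # condition absent: no step performed

# When movement matches the intended route, no identification occurs.
result = monitor_aircraft(True)  # None
```

The sketch shows only that the “identify” step is reached solely as a consequence of the “when” condition, with no separate classifying operation.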
Furthermore, regarding the argument “Doyen checks procedural compliance, not spatial adherence to an intended path”, Doyen et al. recites “…the disclosed schemes may detect deviations from intended operations, operating parameters or operating values that may be indicative of, for example, a deviation from an intended flight planned route…” (emphasis added) (Doyen et al.; see col. 6, particularly lines 31-53; also see col. 9, particularly lines 7-18). Therefore, the argument that Doyen et al. does not check “spatial adherence to an intended path” is clearly incorrect and is not persuasive. This citation from Doyen et al. was provided in the previous rejection, and the Applicant’s arguments against Doyen et al. are a conclusory statement that does not address this cited and quoted aspect of Doyen et al. Therefore, the arguments are not persuasive.
All other arguments are moot in view of the new grounds of rejection.
All claims are rejected under 35 U.S.C. 101 and 35 U.S.C. 103.
See the new grounds of rejection.
Claim Interpretation
Regarding the limitation “projecting” of Claims 1, 8 and 14 of the 12/10/2025 claim amendments, the word “projecting” (or “project”, or “projects”) is not present in the specification of the present application. Rather than apply a rejection under 35 U.S.C. 112(a) for new matter, the Examiner has determined that the limitation “projecting” encompasses any determination of any position of an aircraft relative to any surface of an airport. The Examiner also notes that there is nothing in the specification to contradict this interpretation, as the word “projecting” is not present in the specification of the present application, as already mentioned.
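For illustration only, one computation that would fall within this construction of “projecting” (a determination of an aircraft position relative to an airport surface) may be sketched as follows. The coordinates, function name, and geometry below are hypothetical and are not drawn from the claims or the specification:

```python
# Illustrative only: determining an aircraft's position relative to a
# hypothetical runway centerline, one computation encompassed by the
# construction of "projecting" applied above. All values are invented.

def project_onto_segment(p, a, b):
    """Return the point on segment a-b nearest to point p (2D tuples)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        return a
    # Parameter t of the perpendicular foot, clamped to the segment ends.
    t = ((px - ax) * dx + (py - ay) * dy) / seg_len_sq
    t = max(0.0, min(1.0, t))
    return (ax + t * dx, ay + t * dy)

# A runway centerline from (0, 0) to (100, 0); aircraft at (30, 40).
nearest = project_onto_segment((30.0, 40.0), (0.0, 0.0), (100.0, 0.0))
# nearest == (30.0, 0.0); the aircraft is 40 units from the centerline.
```

The sketch is offered only to show the breadth of the interpretation; any determination of any position of an aircraft relative to any surface of an airport, however computed, is encompassed.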
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. See below.
Claim 1 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
101 Analysis – Step 1
Claim 1 is directed to an apparatus (i.e., a machine). Therefore, claim 1 is within at least one of the four statutory categories.
101 Analysis – Step 2A, Prong I
Regarding Prong I of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes.
Independent claim 1 includes limitations that recite an abstract idea (emphasized below) and will be used as a representative claim for the remainder of the 101 rejection. Claim 1 recites:
A computer apparatus comprising:
at least one processor in data communication with a memory storing processor executable code for configuring the at least one processor to:
determine a current location, velocity, acceleration, and route for a present aircraft including the computer apparatus;
determine a future position of the present aircraft based on the current location, velocity, acceleration, and route;
receive position and velocity data for one or more other aircraft, distinct from the present aircraft;
automatically aggregate and correlate airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps;
receive intended route data for the one or more other aircraft;
predict a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data;
identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent; and
determine if any predicted future position intersects with the future position of the present aircraft by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps.
The examiner submits that the foregoing bolded limitation(s) constitute a “mental process” because under its broadest reasonable interpretation, the claim covers performance of the limitation in the human mind. Specifically, regarding the “determine a current location, velocity, acceleration, and route for a present aircraft” step, a user may mentally determine a current location, velocity, acceleration, and route for a present aircraft. Regarding the “determine a future position of the present aircraft based on the current location, velocity, acceleration, and route” step, a user may mentally determine a future position of the present aircraft based on the current location, velocity, acceleration, and route. Regarding the “receive position and velocity data for one or more other aircraft, distinct from the present aircraft” step, a user may mentally receive position and velocity data for one or more other aircraft, distinct from the present aircraft. Regarding the “automatically aggregate and correlate airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps” step, a user may mentally automatically aggregate and correlate airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps. Regarding the “receive intended route data for the one or more other aircraft” step, a user may mentally receive intended route data for the one or more other aircraft. Regarding the “predict a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data” step, a user may mentally predict a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data. 
Regarding the “identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent” step, a user may mentally identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent. Regarding the “determine if any predicted future position intersects with the future position of the present aircraft by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps” step, a user may mentally determine if any predicted future position intersects with the future position of the present aircraft by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps. Accordingly, the claim recites at least one abstract idea.
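For illustration only, the kinematic character of the recited “determine a future position” and “determine if any predicted future position intersects” steps may be sketched as follows. The names, units, coordinates, and threshold below are hypothetical and are not drawn from the specification:

```python
# Illustrative only: a minimal dead-reckoning sketch of predicting a future
# position from location, velocity, and acceleration, and of checking whether
# two predicted positions intersect. All names and values are invented.

def predict_position(pos, vel, acc, t):
    """Constant-acceleration prediction: p + v*t + 0.5*a*t^2 (2D tuples)."""
    return tuple(p + v * t + 0.5 * a * t * t for p, v, a in zip(pos, vel, acc))

def positions_conflict(p1, p2, threshold):
    """Treat two predicted positions as intersecting if within threshold."""
    dist_sq = sum((x - y) ** 2 for x, y in zip(p1, p2))
    return dist_sq <= threshold ** 2

# Two aircraft closing head-on along a hypothetical surface path.
own_future = predict_position((0.0, 0.0), (10.0, 0.0), (0.0, 0.0), 5.0)      # (50.0, 0.0)
other_future = predict_position((100.0, 0.0), (-10.0, 0.0), (0.0, 0.0), 5.0)  # (50.0, 0.0)
conflict = positions_conflict(own_future, other_future, 25.0)  # True
```

The brevity of the sketch is consistent with the analysis above: the underlying determinations amount to simple extrapolation and comparison of positions.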
101 Analysis – Step 2A, Prong II
Regarding Prong II of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”):
A computer apparatus comprising:
at least one processor in data communication with a memory storing processor executable code for configuring the at least one processor to:
determine a current location, velocity, acceleration, and route for a present aircraft including the computer apparatus;
determine a future position of the present aircraft based on the current location, velocity, acceleration, and route;
receive position and velocity data for one or more other aircraft, distinct from the present aircraft;
automatically aggregate and correlate airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps;
receive intended route data for the one or more other aircraft;
predict a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data;
identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent; and
determine if any predicted future position intersects with the future position of the present aircraft by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps.
For the following reason(s), the examiner submits that the above identified additional limitations do not integrate the above-noted abstract idea into a practical application.
Regarding the additional limitation “A computer apparatus comprising: at least one processor”, the “computer apparatus” and the “at least one processor” are recited at a high level of generality and amount to nothing more than a generic computer component used to apply the exception. Regarding the additional limitation “a memory storing processor executable code for configuring the at least one processor”, the “memory” is recited at a high level of generality and amounts to nothing more than mere instructions to apply the exception using a generic computer component. Furthermore, using the “at least one processor in data communication with a memory storing processor executable code” to perform the two “receive” steps amounts to mere data gathering, which is a form of insignificant extra-solution activity, and mere instructions to apply the exception using a generic computer component.
Thus, taken alone, the additional elements do not integrate the abstract idea into a practical application. Further, looking at the additional limitations as an ordered combination or as a whole, the limitations add nothing that is not already present when looking at the elements taken individually. For instance, there is no indication that the additional elements, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field, apply or use the above-noted judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition, implement/use the above-noted judicial exception with a particular machine or manufacture that is integral to the claim, effect a transformation or reduction of a particular article to a different state or thing, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is not more than a drafting effort designed to monopolize the exception (MPEP § 2106.05). Accordingly, the additional limitations do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
101 Analysis – Step 2B
Regarding Step 2B of the Revised Guidance, representative independent claim 1 does not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for the same reasons as those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application. As discussed above with respect to integration of the abstract idea into a practical application, the additional limitation “at least one processor” is recited at a high level of generality and amounts to nothing more than a generic computer component used to apply the exception; the additional limitation “a memory storing processor executable code for configuring the at least one processor” amounts to nothing more than mere instructions to apply the exception using a generic computer component; and the two “receive” limitations amount to mere data gathering, which is a form of insignificant extra-solution activity, and mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. Hence, the claim is not patent eligible.
Dependent claim(s) 2-7 do not recite any further limitations that cause the claim(s) to be patent eligible. Rather, the limitations of dependent claims 2-7 are directed toward additional aspects of the judicial exception and/or well-understood, routine and conventional additional elements that do not integrate the judicial exception into a practical application. Therefore, dependent claims 2-7 are similarly rejected as being directed towards non-statutory subject matter.
Therefore, claim(s) 1-7 are ineligible under 35 USC § 101.
See below regarding the dependent claims.
As per Claim 2, said claim is rejected as it fails to correct the deficiency of Claim 1. A user may mentally apply “voice recognition” to communications between the one or more other aircraft and a ground control to derive intended route data. Therefore, the claim does not amount to significantly more than the judicial exception.
As per Claim 3, said claim is rejected as it fails to correct the deficiency of Claim 1. A user may mentally confirm that the one or more other aircraft are adhering to the corresponding intended route data based on discrepancies identified during aggregation and correlation. Therefore, the claim does not amount to significantly more than the judicial exception.
As per Claim 4, said claim is rejected as it fails to correct the deficiency of Claim 1. A user may mentally determine a confidence metric associated with each of the predicted future positions. Therefore, the claim does not amount to significantly more than the judicial exception.
As per Claim 5, said claim is rejected as it fails to correct the deficiency of Claim 1. A user may mentally at least partially base each confidence metric on observations of the one or more other aircraft over time including correlations of historical movement patterns. Therefore, the claim does not amount to significantly more than the judicial exception.
As per Claim 6, said claim is rejected as it fails to correct the deficiency of Claim 1. The claim is directed to describing data, which does not amount to significantly more than the judicial exception.
As per Claim 7, said claim is rejected as it fails to correct the deficiency of Claim 1. A user may mentally produce an alert if any predicted future position is determined to intersect with the future position of the present aircraft, where the broadest reasonable interpretation of “produce” encompasses merely generating data. Therefore, the claim does not amount to significantly more than the judicial exception.
Claim 8 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
101 Analysis – Step 1
Claim 8 is directed to a method (i.e., a process). Therefore, claim 8 is within at least one of the four statutory categories.
101 Analysis – Step 2A, Prong I
Regarding Prong I of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes.
Independent claim 8 includes limitations that recite an abstract idea (emphasized below). Claim 8 recites:
A method comprising:
determining a current location, velocity, acceleration, and route for a present aircraft;
determining a future position of the present aircraft based on the current location, velocity, acceleration, and route;
receiving position and velocity data for one or more other aircraft, distinct from the present aircraft;
automatically aggregating and correlating airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps;
receiving intended route data for the one or more other aircraft;
predicting a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data;
determining a confidence metric associated with each predicted future position;
identifying a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent; and
determining if any predicted future position intersects with the future position of the present aircraft by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps.
The examiner submits that the foregoing bolded limitation(s) constitute a “mental process” because under its broadest reasonable interpretation, the claim covers performance of the limitation in the human mind. Specifically, regarding the “determining a current location, velocity, acceleration, and route for a present aircraft” step, a user may mentally determine a current location, velocity, acceleration, and route for a present aircraft. Regarding the “determining a future position of the present aircraft based on the current location, velocity, acceleration, and route” step, a user may mentally determine a future position of the present aircraft based on the current location, velocity, acceleration, and route. Regarding the “receiving position and velocity data for one or more other aircraft, distinct from the present aircraft” step, a user may mentally receive position and velocity data for one or more other aircraft, distinct from the present aircraft. Regarding the “automatically aggregating and correlating airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps” step, a user may mentally automatically aggregate and correlate airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps. Regarding the “receiving intended route data for the one or more other aircraft” step, a user may mentally receive intended route data for the one or more other aircraft. Regarding the “predicting a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data” step, a user may mentally predict a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data.
Regarding the “determining a confidence metric associated with each predicted future position” step, a user may mentally determine a confidence metric associated with each predicted future position. Regarding the “identifying a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent” step, a user may mentally identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent. Regarding the “determining if any predicted future position intersects with the future position of the present aircraft by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps” step, a user may mentally determine if any predicted future position intersects with the future position of the present aircraft by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps. Accordingly, the claim recites at least one abstract idea.
101 Analysis – Step 2A, Prong II
Regarding Prong II of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
In the present case, there are no additional limitations beyond the above-noted abstract idea. Had there been any additional limitations, such additional limitations would appear as underlined portions, while the bolded portions continue to represent the “abstract idea”, in the following:
A method comprising:
determining a current location, velocity, acceleration, and route for a present aircraft;
determining a future position of the present aircraft based on the current location, velocity, acceleration, and route;
receiving position and velocity data for one or more other aircraft, distinct from the present aircraft;
automatically aggregating and correlating airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps;
receiving intended route data for the one or more other aircraft;
predicting a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data;
determining a confidence metric associated with each predicted future position;
identifying a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent; and
determining if any predicted future position intersects with the future position of the present aircraft by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps.
The examiner submits that there are no additional limitations that integrate the above-noted abstract idea into a practical application. The Examiner notes that even if the two “receiving” steps were interpreted as additional limitations (which they are not for this rejection), receiving data amounts to mere data gathering, which is a form of insignificant extra-solution activity.
Thus, taken alone, the additional elements do not integrate the abstract idea into a practical application. Further, there are no additional elements that add anything that is not already present when looking at the elements taken individually. For instance, there are no additional elements that, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field, apply or use the above-noted judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition, implement/use the above-noted judicial exception with a particular machine or manufacture that is integral to the claim, effect a transformation or reduction of a particular article to a different state or thing, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is not more than a drafting effort designed to monopolize the exception (MPEP § 2106.05). Accordingly, there are no additional limitation(s) that integrate the abstract idea into a practical application because there are no additional limitations that impose any meaningful limits on practicing the abstract idea, as no additional limitations are claimed.
101 Analysis – Step 2B
Regarding Step 2B of the Revised Guidance, independent claim 8 does not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for the same reasons as those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application, as no additional limitations are claimed. Hence, the claim is not patent eligible. The Examiner notes that even if it were assumed that the method of Claim 8 were performed by a generic computer, such an assumption would amount to no more than mere instructions to apply the exception using a generic computer component, where mere instructions to apply an exception using a generic computer component cannot provide an inventive concept.
Dependent claim(s) 9-13 do not recite any further limitations that cause the claim(s) to be patent eligible. Rather, the limitations of dependent claims 9-13 are directed toward additional aspects of the judicial exception and/or well-understood, routine and conventional additional elements that do not integrate the judicial exception into a practical application. Therefore, dependent claims 9-13 are similarly rejected as being directed towards non-statutory subject matter.
Therefore, claim(s) 8-13 are ineligible under 35 U.S.C. § 101.
See below regarding the dependent claims.
As per Claim 9, said claim is rejected as it fails to correct the deficiency of Claim 8. A user may mentally apply “voice recognition” to communications between the one or more other aircraft and a ground control to derive intended route data. Therefore, the claim does not amount to significantly more than the judicial exception.
As per Claim 10, said claim is rejected as it fails to correct the deficiency of Claim 8. A user may mentally confirm that the one or more other aircraft are adhering to the corresponding intended route data based on discrepancies identified during aggregation and correlation. Therefore, the claim does not amount to significantly more than the judicial exception.
As per Claim 11, said claim is rejected as it fails to correct the deficiency of Claim 8. A user may mentally receive airport operations data, and may mentally at least partially base the intended route data on the airport operations data. Furthermore, the “receiving” step of Claim 11 is directed to mere data gathering, which is a form of insignificant extra-solution activity. Therefore, the claim does not amount to significantly more than the judicial exception.
As per Claim 12, said claim is rejected as it fails to correct the deficiency of Claim 8. A user may mentally at least partially base each confidence metric on observations of the one or more other aircraft over time including correlations of historical movement patterns. Therefore, the claim does not amount to significantly more than the judicial exception.
As per Claim 13, said claim is rejected as it fails to correct the deficiency of Claim 8. A user may mentally produce an alert if any predicted future position is determined to intersect with the future position of the present aircraft, where the broadest reasonable interpretation of “produce” encompasses merely generating data. Therefore, the claim does not amount to significantly more than the judicial exception.
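Purely for illustration of the breadth of the recited prediction-and-intersection steps, and forming no part of the Applicant's disclosure, the kind of computation the claims describe can be reduced to a short sketch. Every name in the sketch, the constant-acceleration (dead-reckoning) model, and the 50-meter separation threshold are assumptions introduced here, not elements of the claims or the cited art:

```python
def project_position(pos, vel, accel, t):
    """Project a 2-D airport-surface position forward by t seconds,
    assuming constant acceleration (simple dead reckoning)."""
    x, y = pos
    vx, vy = vel
    ax, ay = accel
    return (x + vx * t + 0.5 * ax * t * t,
            y + vy * t + 0.5 * ay * t * t)

def positions_conflict(p1, p2, separation=50.0):
    """Flag a conflict when two projected surface positions fall within
    an assumed minimum separation distance (meters)."""
    dx, dy = p1[0] - p2[0], p1[1] - p2[1]
    return (dx * dx + dy * dy) ** 0.5 < separation

# Two aircraft taxiing toward each other on the same taxiway centerline:
own = project_position((0.0, 0.0), (10.0, 0.0), (0.0, 0.0), 30.0)
other = project_position((600.0, 0.0), (-10.0, 0.0), (0.0, 0.0), 30.0)
print(positions_conflict(own, other))  # True
```

The sketch illustrates that, absent additional limitations, the recited projection reduces to elementary kinematics and a distance comparison.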
Claim 14 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
101 Analysis – Step 1
Claim 14 is directed to a system (i.e., a machine). Therefore, claim 14 is within at least one of the four statutory categories.
101 Analysis – Step 2A, Prong I
Regarding Prong I of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes.
Independent claim 14 includes limitations that recite an abstract idea (emphasized below). Claim 14 recites:
An on-aircraft system comprising:
at least one processor in data communication with a memory storing processor executable code for configuring the at least one processor to:
determine a current location, velocity, acceleration, and route for a present aircraft including the computer apparatus;
determine a future position of the present aircraft based on the current location, velocity, acceleration, and route;
receive position and velocity data for one or more other aircraft, distinct from the present aircraft;
automatically aggregate and correlate airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps;
receive intended route data for the one or more other aircraft;
predict a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data;
identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent; and
determine if the intended route intersects with the route of the present aircraft by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps.
The examiner submits that the foregoing bolded limitation(s) constitute a “mental process” because under its broadest reasonable interpretation, the claim covers performance of the limitation in the human mind. Specifically, regarding the “determine a current location, velocity, acceleration, and route for a present aircraft” step, a user may mentally determine a current location, velocity, acceleration, and route for a present aircraft. Regarding the “determine a future position of the present aircraft based on the current location, velocity, acceleration, and route” step, a user may mentally determine a future position of the present aircraft based on the current location, velocity, acceleration, and route. Regarding the “receive position and velocity data for one or more other aircraft, distinct from the present aircraft” step, a user may mentally receive position and velocity data for one or more other aircraft, distinct from the present aircraft. Regarding the “automatically aggregate and correlate airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps” step, a user may mentally aggregate and correlate airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps. Regarding the “receive intended route data for the one or more other aircraft” step, a user may mentally receive intended route data for the one or more other aircraft. Regarding the “predict a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data” step, a user may mentally predict a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data.
Regarding the “identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent” step, a user may mentally identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent. Regarding the “determine if the intended route intersects with the route of the present aircraft by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps” step, a user may mentally determine if the intended route intersects with the route of the present aircraft by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps. Accordingly, the claim recites at least one abstract idea.
101 Analysis – Step 2A, Prong II
Regarding Prong II of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”):
An on-aircraft system comprising:
at least one processor in data communication with a memory storing processor executable code for configuring the at least one processor to:
determine a current location, velocity, acceleration, and route for a present aircraft including the computer apparatus;
determine a future position of the present aircraft based on the current location, velocity, acceleration, and route;
receive position and velocity data for one or more other aircraft, distinct from the present aircraft;
automatically aggregate and correlate airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps;
receive intended route data for the one or more other aircraft;
predict a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data;
identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent; and
determine if the intended route intersects with the route of the present aircraft by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps.
For the following reason(s), the examiner submits that the above identified additional limitations do not integrate the above-noted abstract idea into a practical application.
Regarding the additional limitation “at least one processor”, the “at least one processor” is recited at a high level of generality and amounts to nothing more than a generic computer component used to apply the exception. Regarding the additional limitation “a memory storing processor executable code for configuring the at least one processor”, the “memory” is recited at a high level of generality and amounts to nothing more than mere instructions to apply the exception using a generic computer component. Furthermore, using the “at least one processor in data communication with a memory storing processor executable code” to perform the two “receive” steps amounts to mere data gathering, which is a form of insignificant extra-solution activity.
Thus, taken alone, the additional elements do not integrate the abstract idea into a practical application. Further, looking at the additional limitation(s) as an ordered combination or as a whole, the limitation(s) add nothing that is not already present when looking at the elements taken individually. For instance, there is no indication that the additional elements, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field, apply or use the above-noted judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition, implement/use the above-noted judicial exception with a particular machine or manufacture that is integral to the claim, effect a transformation or reduction of a particular article to a different state or thing, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is not more than a drafting effort designed to monopolize the exception (MPEP § 2106.05). Accordingly, the additional limitation(s) do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
101 Analysis – Step 2B
Regarding Step 2B of the Revised Guidance, independent claim 14 does not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for the same reasons as those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application. As discussed above with respect to integration of the abstract idea into a practical application, the additional limitation “at least one processor” is recited at a high level of generality and amounts to nothing more than a generic computer component used to apply the exception, the additional limitation “a memory storing processor executable code for configuring the at least one processor” amounts to nothing more than mere instructions to apply the exception using a generic computer component, and the two “receive” limitations amount to mere data gathering, which is a form of insignificant extra-solution activity. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. Hence, the claim is not patent eligible.
Dependent claim(s) 15-20 do not recite any further limitations that cause the claim(s) to be patent eligible. Rather, the limitations of dependent claims 15-20 are directed toward additional aspects of the judicial exception and/or well-understood, routine and conventional additional elements that do not integrate the judicial exception into a practical application. Therefore, dependent claims 15-20 are similarly rejected as being directed towards non-statutory subject matter.
Therefore, claim(s) 14-20 are ineligible under 35 U.S.C. § 101.
See below regarding the dependent claims.
As per Claim 15, said claim is rejected as it fails to correct the deficiency of Claim 14. A user may mentally apply “voice recognition” to communications between the one or more other aircraft and a ground control to derive intended route data. Furthermore, the claim does not require any step of executing a communication between the one or more other aircraft and a ground control, and is directed to merely analyzing data, where the analysis may be performed mentally as already stated. Therefore, the claim does not amount to significantly more than the judicial exception.
As per Claim 16, said claim is rejected as it fails to correct the deficiency of Claim 14. A user may mentally confirm that the one or more other aircraft are adhering to the corresponding intended route data based on discrepancies identified during aggregation and correlation. Therefore, the claim does not amount to significantly more than the judicial exception.
As per Claim 17, said claim is rejected as it fails to correct the deficiency of Claim 14. A user may mentally determine a confidence metric associated with each predicted future position. Therefore, the claim does not amount to significantly more than the judicial exception.
As per Claim 18, said claim is rejected as it fails to correct the deficiency of Claim 14. The claim is directed to describing data, which does not amount to significantly more than the judicial exception.
As per Claim 19, said claim is rejected as it fails to correct the deficiency of Claim 14. A user may mentally at least partially base each confidence metric on observations of the one or more other aircraft over time including correlations of historical movement patterns. Therefore, the claim does not amount to significantly more than the judicial exception.
As per Claim 20, said claim is rejected as it fails to correct the deficiency of Claim 14. A user may mentally produce an alert if any predicted future position is determined to intersect with the future position of the present aircraft, where the broadest reasonable interpretation of “produce” encompasses merely generating data. Therefore, the claim does not amount to significantly more than the judicial exception.
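Purely for illustration of the contingent “lost aircraft” limitation discussed above, and forming no part of the Applicant's disclosure, the claimed branching logic can be sketched as follows; the function name, return values, and the “lost” label are assumptions introduced here:

```python
def prediction_basis(intended_route, movement_matches_route):
    """Select a prediction basis under the claimed contingent logic:
    use the intended route when it is available and consistent with
    observed movement; otherwise revert to position-and-velocity
    prediction, labeling the aircraft 'lost' when its observed
    movement departs from its intended route."""
    if intended_route is None:
        return ("position_velocity", None)    # route data unavailable
    if not movement_matches_route:
        return ("position_velocity", "lost")  # route data inconsistent
    return ("route_based", None)

print(prediction_basis(None, True))          # ('position_velocity', None)
print(prediction_basis(["A", "B7"], False))  # ('position_velocity', 'lost')
print(prediction_basis(["A", "B7"], True))   # ('route_based', None)
```

The sketch makes explicit that, when neither “when” condition is satisfied, the recited identify-and-revert operations do not occur.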
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 6, 7, 14-16 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Bharadwaja et al. (2025/0269978) in view of Coles et al. (5,596,332) further in view of Baladhandapani et al. (2023/0215431), further in view of Doyen et al. (9,830,829).
Regarding Claim 1, Bharadwaja et al. teaches the claimed computer apparatus comprising:
at least one processor in data communication with a memory storing processor executable code (see P[0027]-P[0028]) for configuring the at least one processor to:
determine a current location, velocity, acceleration, and route for a present aircraft including the computer apparatus (“The position sensors 116 output signals indicative of one or more of the position, altitude, heading, acceleration, velocity, and/or the like of the various aircraft 102. The signals are received by the tracking sub-system 118”, see P[0029]);
determine a future position of the present aircraft based on the current location…and route (“In at least one example, the control unit 120 can also provide an intended path graphic in relation to the aircraft model 124 for the aircraft 102 shown on the display 112. The intended path graphic can be a polygon that shows a future location of the aircraft 102 continuing on a current course for a predetermined period of time, such as 30 seconds or less. Optionally, the predetermined period of time can be less than 30 seconds (such as 10 seconds), or greater than 30 seconds (such as 1 minute or more). If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict, thereby allowing the operator of the aircraft 102 to take a corrective action”, see P[0038]);
receive position and velocity data for one or more other aircraft, distinct from the present aircraft (“The position sensors 116 output signals indicative of one or more of the position, altitude, heading, acceleration, velocity, and/or the like of the various aircraft 102. The signals are received by the tracking sub-system 118”, see P[0029]);
automatically aggregate and correlate airport operations data (“The tracking sub-system 118 receives the transmitted position signal from the position receivers to determine a current and real time position, heading, velocity, and the like of the aircraft 102”, see P[0031]),…ADS-B data (“…the tracking sub-system is an automatic dependent surveillance-broadcast (ADS-B) tracking sub-system”, see P[0029]), onboard sensor data (“The tracking sub-system 118 receives the transmitted position signal from the position receivers to determine a current and real time position, heading, velocity, and the like of the aircraft 102”, see P[0031] and FIG. 2), and airport maps (“The tracking sub-system 118 receives the transmitted position signal from the position receivers to determine a current and real time position, heading, velocity, and the like of the aircraft 102”, see P[0031] and FIG. 2);
receive intended route data for the one or more other aircraft (“The tracking sub-system 118 receives the transmitted position signal from the position receivers to determine a current and real time position, heading, velocity, and the like of the aircraft 102”, see P[0031]);
predict a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data (“The position sensors 116 output signals indicative of one or more of the position, altitude, heading, acceleration, velocity, and/or the like of the various aircraft 102. The signals are received by the tracking sub-system 118”, see P[0029] and “In at least one example, the control unit 120 can also provide an intended path graphic in relation to the aircraft model 124 for the aircraft 102 shown on the display 112. The intended path graphic can be a polygon that shows a future location of the aircraft 102 continuing on a current course for a predetermined period of time, such as 30 seconds or less. Optionally, the predetermined period of time can be less than 30 seconds (such as 10 seconds), or greater than 30 seconds (such as 1 minute or more). If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict, thereby allowing the operator of the aircraft 102 to take a corrective action”, see P[0038]);
…; and
determine if any predicted future position intersects with the future position of the present aircraft (“In at least one example, the control unit 120 can also provide an intended path graphic in relation to the aircraft model 124 for the aircraft 102 shown on the display 112. The intended path graphic can be a polygon that shows a future location of the aircraft 102 continuing on a current course for a predetermined period of time, such as 30 seconds or less. Optionally, the predetermined period of time can be less than 30 seconds (such as 10 seconds), or greater than 30 seconds (such as 1 minute or more). If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict, thereby allowing the operator of the aircraft 102 to take a corrective action”, see P[0038]) by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps (see FIG. 2 and “…intersection of an intended path graphic with a structure, or another intended path graphic, and/or the like”, see P[0046]).
Bharadwaja et al. does not expressly recite the bolded portions of the claimed
determine a future position of the present aircraft based on the current location, velocity, acceleration, and route.
However, Coles et al. (5,596,332) teaches determining a probability of path intersections between two aircraft based on relative position, velocities, and paths of travel of both aircraft (Coles et al.; see col.10, particularly lines 55-62).
Bharadwaja et al. does not expressly recite the bolded portions of the claimed
automatically aggregate and correlate airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps
or the claimed
identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent.
However, Baladhandapani et al. (2023/0215431) teaches determining aircraft context and determining aircraft actions based on “voice recognition output” (Baladhandapani et al.; “…natural language processing may be applied to the textual representations of the clearance communications that were directed to the ownship aircraft by ATC, provided by the ownship aircraft to ATC, broadcasted by ATIC or otherwise received from ATIS to identify the operational subject(s) of the clearance communications and any operational parameter value(s) and/or aircraft action(s) associated with the clearance communications, which are then stored or otherwise maintained in association with the transcribed audio content of the received audio communication in the clearance table 226”, see P[0031] and “…the command system 204 and/or the voice command recognition application 240 receives indicia of the current operational context for the aircraft (e.g., the current location of the aircraft with respect to a taxi clearance, a flight plan, or other defined route or manner of operation of the aircraft, the current flight phase, the current geographic location of the aircraft, the current altitude of the aircraft, the current physical configuration of the aircraft, and/or the like) from one or more onboard system(s) 208 in addition to retrieving or otherwise obtaining the current conversational context associated with the aircraft (e.g., the subset of ATC clearance communications directed to and/or sent from the ownship aircraft) from the clearance table 226”, see P[0043]).
Furthermore, the limitation “identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent” is a contingent limitation that does not need to occur if neither “when” condition is satisfied. Furthermore, the step “and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent” is not required to occur, as the claim already requires “receive intended route data for the one or more other aircraft”, meaning the “intended route data” is required to be received and is therefore “available”; therefore, the condition that “the intended route data is unavailable or inconsistent” is not required to occur and therefore is not required by the prior art. For compact prosecution, this limitation is considered in view of the prior art. Doyen et al. (9,830,829) teaches determining an intended route using voice communications and also determining a deviation from an intended route (Doyen et al.; “The exemplary system 200 may include at least one voice recognition device/unit 240, which may be available to convert voice communications received from any source to data elements, or conversely to convert data elements received from any source to voice communications. An objective of the inclusion of such a voice recognition device/unit 240 may be to provide an automated transcription of the voice communications into a format that is usable by the exemplary system 240 to supplement other source data to generate and/or modify indications of intent for the operation of the aircraft”, see col.11, particularly lines 6-21 and “…the disclosed schemes may detect deviations from intended operations, operating parameters or operating values that may be indicative of, for example, a deviation from an intended flight planned route…”, see col.6, particularly lines 31-53, also see col.9, particularly lines 7-18), where the Examiner has determined that the limitation “lost aircraft” is simply a label that encompasses an aircraft that has deviated from an intended route.
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Coles et al., Baladhandapani et al. and Doyen et al., and determine a future position of the present aircraft based on the current location, velocity, acceleration, and route, and automatically aggregate and correlate airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps, and identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent, as rendered obvious by Coles et al., Baladhandapani et al. and Doyen et al., in order to determine if “there is an intersection between the probabilistic volumes of two aircraft in the near future” (Coles et al.; see col.5, lines 48-52), in order to “identify or otherwise determine a conversational context” (Baladhandapani et al.; see P[0032]), and in order to “identify deviations from intended operation of a particular aircraft” (Doyen et al.; see col.2, lines 14-30).
Regarding Claim 2, Bharadwaja et al. does not expressly recite the claimed computer apparatus of Claim 1, wherein the intended route data is derived via voice recognition applied to communications between the one or more other aircraft and a ground control.
However, Baladhandapani et al. (2023/0215431) teaches wherein the intended route data is derived via voice recognition applied to communications between the one or more other aircraft and a ground control (Baladhandapani et al.; “…natural language processing may be applied to the textual representations of the clearance communications that were directed to the ownship aircraft by ATC, provided by the ownship aircraft to ATC, broadcasted by ATIC or otherwise received from ATIS to identify the operational subject(s) of the clearance communications and any operational parameter value(s) and/or aircraft action(s) associated with the clearance communications, which are then stored or otherwise maintained in association with the transcribed audio content of the received audio communication in the clearance table 226”, see P[0031] and “…the command system 204 and/or the voice command recognition application 240 receives indicia of the current operational context for the aircraft (e.g., the current location of the aircraft with respect to a taxi clearance, a flight plan, or other defined route or manner of operation of the aircraft, the current flight phase, the current geographic location of the aircraft, the current altitude of the aircraft, the current physical configuration of the aircraft, and/or the like) from one or more onboard system(s) 208 in addition to retrieving or otherwise obtaining the current conversational context associated with the aircraft (e.g., the subset of ATC clearance communications directed to and/or sent from the ownship aircraft) from the clearance table 226”, see P[0043]).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Baladhandapani et al., and wherein the intended route data is derived via voice recognition applied to communications between the one or more other aircraft and a ground control, as rendered obvious by Baladhandapani et al., in order to “identify or otherwise determine a conversational context” (Baladhandapani et al.; see P[0032]).
Further regarding Claim 2, Bharadwaja et al. does not expressly recite the claimed computer apparatus of Claim 1, wherein the intended route data is derived via voice recognition applied to communications between the one or more other aircraft and a ground control.
However, Doyen et al. (9,830,829) teaches wherein the intended route data is derived via voice recognition applied to communications between the one or more other aircraft and a ground control (Doyen et al.; “The exemplary system 200 may include at least one voice recognition device/unit 240, which may be available to convert voice communications received from any source to data elements, or conversely to convert data elements received from any source to voice communications. An objective of the inclusion of such a voice recognition device/unit 240 may be to provide an automated transcription of the voice communications into a format that is usable by the exemplary system 240 to supplement other source data to generate and/or modify indications of intent for the operation of the aircraft”, see col.11, particularly lines 6-21 and “…the disclosed schemes may detect deviations from intended operations, operating parameters or operating values that may be indicative of, for example, a deviation from an intended flight planned route…”, see col.6, particularly lines 31-53, also see col.9, particularly lines 7-18).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Doyen et al., and wherein the intended route data is derived via voice recognition applied to communications between the one or more other aircraft and a ground control, as rendered obvious by Doyen et al., in order to “identify deviations from intended operation of a particular aircraft” (Doyen et al.; see col.2, lines 14-30).
Regarding Claim 3, Bharadwaja et al. does not expressly recite the claimed computer apparatus of Claim 1, wherein the at least one processor is further configured to confirm that the one or more other aircraft are adhering to the corresponding intended route data based on discrepancies identified during aggregation and correlation.
However, Doyen et al. (9,830,829) teaches wherein the at least one processor is further configured to confirm that the one or more other aircraft are adhering to the corresponding intended route data and also determining a deviation from an intended route (Doyen et al.; “The exemplary system 200 may include at least one voice recognition device/unit 240, which may be available to convert voice communications received from any source to data elements, or conversely to convert data elements received from any source to voice communications. An objective of the inclusion of such a voice recognition device/unit 240 may be to provide an automated transcription of the voice communications into a format that is usable by the exemplary system 240 to supplement other source data to generate and/or modify indications of intent for the operation of the aircraft”, see col.11, particularly lines 6-21 and “…the disclosed schemes may detect deviations from intended operations, operating parameters or operating values that may be indicative of, for example, a deviation from an intended flight planned route…”, see col.6, particularly lines 31-53, also see col.9, particularly lines 7-18).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Doyen et al., and wherein the at least one processor is further configured to confirm that the one or more other aircraft are adhering to the corresponding intended route data based on discrepancies identified during aggregation and correlation, as rendered obvious by Doyen et al., in order to “identify deviations from intended operation of a particular aircraft” (Doyen et al.; see col.2, lines 14-30).
Regarding Claim 6, Bharadwaja et al. teaches the claimed computer apparatus of Claim 1, wherein:
the intended route data is at least partially based on the airport operations data including airport flow and routing constraints incorporated via the airport operations data (“In at least one example, the control unit 120 can also provide an intended path graphic in relation to the aircraft model 124 for the aircraft 102 shown on the display 112. The intended path graphic can be a polygon that shows a future location of the aircraft 102 continuing on a current course for a predetermined period of time, such as 30 seconds or less. Optionally, the predetermined period of time can be less than 30 seconds (such as 10 seconds), or greater than 30 seconds (such as 1 minute or more). If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict, thereby allowing the operator of the aircraft 102 to take a corrective action”, see P[0038]).
Regarding Claim 7, Bharadwaja et al. teaches the claimed computer apparatus of Claim 1, wherein the at least one processor is further configured to produce an alert if any predicted future position is determined to intersect with the future position of the present aircraft (“If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict…”, see P[0038]).
Regarding Claim 14, Bharadwaja et al. teaches the claimed on-aircraft system comprising:
at least one processor in data communication with a memory storing processor executable code (see P[0027]-P[0028]) for configuring the at least one processor to:
determine a current location, velocity, acceleration, and route for a present aircraft including the computer apparatus (“The position sensors 116 output signals indicative of one or more of the position, altitude, heading, acceleration, velocity, and/or the like of the various aircraft 102. The signals are received by the tracking sub-system 118”, see P[0029]);
determine a future position of the present aircraft based on the current location…and route (“In at least one example, the control unit 120 can also provide an intended path graphic in relation to the aircraft model 124 for the aircraft 102 shown on the display 112. The intended path graphic can be a polygon that shows a future location of the aircraft 102 continuing on a current course for a predetermined period of time, such as 30 seconds or less. Optionally, the predetermined period of time can be less than 30 seconds (such as 10 seconds), or greater than 30 seconds (such as 1 minute or more). If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict, thereby allowing the operator of the aircraft 102 to take a corrective action”, see P[0038]);
receive position and velocity data for one or more other aircraft, distinct from the present aircraft (“The position sensors 116 output signals indicative of one or more of the position, altitude, heading, acceleration, velocity, and/or the like of the various aircraft 102. The signals are received by the tracking sub-system 118”, see P[0029]);
automatically aggregate and correlate airport operations data (“The tracking sub-system 118 receives the transmitted position signal from the position receivers to determine a current and real time position, heading, velocity, and the like of the aircraft 102”, see P[0031])…, ADS-B data (“…the tracking sub-system is an automatic dependent surveillance-broadcast (ADS-B) tracking sub-system”, see P[0029]), onboard sensor data (“The tracking sub-system 118 receives the transmitted position signal from the position receivers to determine a current and real time position, heading, velocity, and the like of the aircraft 102”, see P[0031] and FIG. 2), and airport maps (“The tracking sub-system 118 receives the transmitted position signal from the position receivers to determine a current and real time position, heading, velocity, and the like of the aircraft 102”, see P[0031] and FIG. 2);
receive intended route data for the one or more other aircraft (“The tracking sub-system 118 receives the transmitted position signal from the position receivers to determine a current and real time position, heading, velocity, and the like of the aircraft 102”, see P[0031]);
predict a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data (“The position sensors 116 output signals indicative of one or more of the position, altitude, heading, acceleration, velocity, and/or the like of the various aircraft 102. The signals are received by the tracking sub-system 118”, see P[0029] and “In at least one example, the control unit 120 can also provide an intended path graphic in relation to the aircraft model 124 for the aircraft 102 shown on the display 112. The intended path graphic can be a polygon that shows a future location of the aircraft 102 continuing on a current course for a predetermined period of time, such as 30 seconds or less. Optionally, the predetermined period of time can be less than 30 seconds (such as 10 seconds), or greater than 30 seconds (such as 1 minute or more). If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict, thereby allowing the operator of the aircraft 102 to take a corrective action”, see P[0038]);
…; and
determine if the intended route intersects with the route of the present aircraft (“In at least one example, the control unit 120 can also provide an intended path graphic in relation to the aircraft model 124 for the aircraft 102 shown on the display 112. The intended path graphic can be a polygon that shows a future location of the aircraft 102 continuing on a current course for a predetermined period of time, such as 30 seconds or less. Optionally, the predetermined period of time can be less than 30 seconds (such as 10 seconds), or greater than 30 seconds (such as 1 minute or more). If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict, thereby allowing the operator of the aircraft 102 to take a corrective action”, see P[0038]) by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps (see FIG. 2 and “…intersection of an intended path graphic with a structure, or another intended path graphic, and/or the like”, see P[0046]).
Bharadwaja et al. does not expressly recite the bolded portions of the claimed
determine a future position of the present aircraft based on the current location, velocity, acceleration, and route.
However, Coles et al. (5,596,332) teaches determining a probability of path intersections between two aircraft based on relative position, velocities, and paths of travel of both aircraft (Coles et al.; see col.10, particularly lines 55-62).
Bharadwaja et al. does not expressly recite the bolded portions of the claimed
automatically aggregate and correlate airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps
or the claimed
identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent.
However, Baladhandapani et al. (2023/0215431) teaches determining aircraft context and determining aircraft actions based on “voice recognition output” (Baladhandapani et al.; “…natural language processing may be applied to the textual representations of the clearance communications that were directed to the ownship aircraft by ATC, provided by the ownship aircraft to ATC, broadcasted by ATIC or otherwise received from ATIS to identify the operational subject(s) of the clearance communications and any operational parameter value(s) and/or aircraft action(s) associated with the clearance communications, which are then stored or otherwise maintained in association with the transcribed audio content of the received audio communication in the clearance table 226”, see P[0031] and “…the command system 204 and/or the voice command recognition application 240 receives indicia of the current operational context for the aircraft (e.g., the current location of the aircraft with respect to a taxi clearance, a flight plan, or other defined route or manner of operation of the aircraft, the current flight phase, the current geographic location of the aircraft, the current altitude of the aircraft, the current physical configuration of the aircraft, and/or the like) from one or more onboard system(s) 208 in addition to retrieving or otherwise obtaining the current conversational context associated with the aircraft (e.g., the subset of ATC clearance communications directed to and/or sent from the ownship aircraft) from the clearance table 226”, see P[0043]).
Furthermore, the limitation “identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent” is a contingent limitation that does not need to occur if neither “when” condition is satisfied. Moreover, the step “and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent” is not required to occur, as the claim already requires “receive intended route data for the one or more other aircraft”, meaning the “intended route data” is required to be received and is therefore “available”; accordingly, the condition that “the intended route data is unavailable or inconsistent” is not required to occur and therefore is not required by the prior art. For compact prosecution, this limitation is considered in view of the prior art. Doyen et al. (9,830,829) teaches determining an intended route using voice communications and also determining a deviation from an intended route (Doyen et al.; “The exemplary system 200 may include at least one voice recognition device/unit 240, which may be available to convert voice communications received from any source to data elements, or conversely to convert data elements received from any source to voice communications. An objective of the inclusion of such a voice recognition device/unit 240 may be to provide an automated transcription of the voice communications into a format that is usable by the exemplary system 240 to supplement other source data to generate and/or modify indications of intent for the operation of the aircraft”, see col.11, particularly lines 6-21 and “…the disclosed schemes may detect deviations from intended operations, operating parameters or operating values that may be indicative of, for example, a deviation from an intended flight planned route…”, see col.6, particularly lines 31-53, also see col.9, particularly lines 7-18), where the Examiner has determined that the limitation “lost aircraft” is simply a label that encompasses an aircraft that has deviated from an intended route.
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Coles et al., Baladhandapani et al. and Doyen et al., and determine a future position of the present aircraft based on the current location, velocity, acceleration, and route, and automatically aggregate and correlate airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps, and identify a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent, as rendered obvious by Coles et al., Baladhandapani et al. and Doyen et al., in order to determine if “there is an intersection between the probabilistic volumes of two aircraft in the near future” (Coles et al.; see col.5, lines 48-52), in order to “identify or otherwise determine a conversational context” (Baladhandapani et al.; see P[0032]), and in order to “identify deviations from intended operation of a particular aircraft” (Doyen et al.; see col.2, lines 14-30).
Regarding Claim 15, Bharadwaja et al. does not expressly recite the claimed system of Claim 14, wherein the intended route data is derived via voice recognition applied to communications between the one or more other aircraft and a ground control.
However, Baladhandapani et al. (2023/0215431) teaches wherein the intended route data is derived via voice recognition applied to communications between one or more other aircraft and a ground control (Baladhandapani et al.; “…natural language processing may be applied to the textual representations of the clearance communications that were directed to the ownship aircraft by ATC, provided by the ownship aircraft to ATC, broadcasted by ATIC or otherwise received from ATIS to identify the operational subject(s) of the clearance communications and any operational parameter value(s) and/or aircraft action(s) associated with the clearance communications, which are then stored or otherwise maintained in association with the transcribed audio content of the received audio communication in the clearance table 226”, see P[0031] and “…the command system 204 and/or the voice command recognition application 240 receives indicia of the current operational context for the aircraft (e.g., the current location of the aircraft with respect to a taxi clearance, a flight plan, or other defined route or manner of operation of the aircraft, the current flight phase, the current geographic location of the aircraft, the current altitude of the aircraft, the current physical configuration of the aircraft, and/or the like) from one or more onboard system(s) 208 in addition to retrieving or otherwise obtaining the current conversational context associated with the aircraft (e.g., the subset of ATC clearance communications directed to and/or sent from the ownship aircraft) from the clearance table 226”, see P[0043]).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Baladhandapani et al., and wherein the intended route data is derived via voice recognition applied to communications between the one or more other aircraft and a ground control, as rendered obvious by Baladhandapani et al., in order to “identify or otherwise determine a conversational context” (Baladhandapani et al.; see P[0032]).
Regarding Claim 16, Bharadwaja et al. does not expressly recite the claimed system of Claim 14, wherein the at least one processor is further configured to confirm that the one or more other aircraft are adhering to the corresponding intended route data based on discrepancies identified during aggregation and correlation.
However, Doyen et al. (9,830,829) teaches wherein the at least one processor is further configured to confirm that the one or more other aircraft are adhering to the corresponding intended route data and also determining a deviation from an intended route (Doyen et al.; “The exemplary system 200 may include at least one voice recognition device/unit 240, which may be available to convert voice communications received from any source to data elements, or conversely to convert data elements received from any source to voice communications. An objective of the inclusion of such a voice recognition device/unit 240 may be to provide an automated transcription of the voice communications into a format that is usable by the exemplary system 240 to supplement other source data to generate and/or modify indications of intent for the operation of the aircraft”, see col.11, particularly lines 6-21 and “…the disclosed schemes may detect deviations from intended operations, operating parameters or operating values that may be indicative of, for example, a deviation from an intended flight planned route…”, see col.6, particularly lines 31-53, also see col.9, particularly lines 7-18).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Doyen et al., and wherein the at least one processor is further configured to confirm that the one or more other aircraft are adhering to the corresponding intended route data based on discrepancies identified during aggregation and correlation, as rendered obvious by Doyen et al., in order to “identify deviations from intended operation of a particular aircraft” (Doyen et al.; see col.2, lines 14-30).
Regarding Claim 20, Bharadwaja et al. teaches the claimed system of Claim 14, wherein the at least one processor is further configured to produce an alert if any predicted future position is determined to intersect with the future position of the present aircraft (“If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict…”, see P[0038]).
Claims 4, 5 and 17-19 are rejected under 35 U.S.C. 103 as being unpatentable over Bharadwaja et al. (2025/0269978) in view of Coles et al. (5,596,332) further in view of Baladhandapani et al. (2023/0215431), further in view of Doyen et al. (9,830,829), further in view of Roberts et al. (8,255,147).
Regarding Claim 4, Bharadwaja et al. does not expressly recite the claimed computer apparatus of Claim 1, wherein the at least one processor is further configured to determine a confidence metric associated with each of the predicted future positions.
However, Roberts et al. (8,255,147) teaches wherein the at least one processor is further configured to determine a confidence metric associated with each of the predicted future positions (Roberts et al.; “…the uncertainty regions of the two aircraft…”, see col.8, particularly lines 20-43).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Roberts et al., and wherein the at least one processor is further configured to determine a confidence metric associated with each of the predicted future positions, as rendered obvious by Roberts et al., for the purpose of “aiding air traffic control” (Roberts et al.; see col.1, lines 14-15).
Regarding Claim 5, Bharadwaja et al. does not expressly recite the claimed computer apparatus of Claim 4, wherein each confidence metric is at least partially based on observations of the one or more other aircraft over time including correlations of historical movement patterns.
However, Roberts et al. (8,255,147) teaches wherein the at least one processor is further configured to determine a confidence metric associated with each of the predicted future positions (Roberts et al.; “…the uncertainty regions of the two aircraft…”, see col.8, particularly lines 20-43 and “…the current position…”, see col.4, particularly lines 24-39 and “The rate of change of position and each of the variables above is calculated, and from this, the state at future point (i+1) is calculated by moving forward in time to time (t.sub.i+1), applying the rates of change calculated”, see col.6, particularly lines 5-52).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Roberts et al., and wherein each confidence metric is at least partially based on observations of the one or more other aircraft over time including correlations of historical movement patterns, as rendered obvious by Roberts et al., for the purpose of “aiding air traffic control” (Roberts et al.; see col.1, lines 14-15).
Regarding Claim 17, Bharadwaja et al. does not expressly recite the claimed system of Claim 14, wherein the at least one processor is further configured to determine a confidence metric associated with each predicted future position.
However, Roberts et al. (8,255,147) teaches wherein the at least one processor is further configured to determine a confidence metric associated with each of the predicted future positions (Roberts et al.; “…the uncertainty regions of the two aircraft…”, see col.8, particularly lines 20-43).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Roberts et al., and wherein the at least one processor is further configured to determine a confidence metric associated with each predicted future position, as rendered obvious by Roberts et al., for the purpose of “aiding air traffic control” (Roberts et al.; see col.1, lines 14-15).
Regarding Claim 18, Bharadwaja et al. teaches the claimed system of Claim 17, wherein:
the intended route data is at least partially based on the airport operations data including airport flow and routing constraints incorporated via the airport operations data (“In at least one example, the control unit 120 can also provide an intended path graphic in relation to the aircraft model 124 for the aircraft 102 shown on the display 112. The intended path graphic can be a polygon that shows a future location of the aircraft 102 continuing on a current course for a predetermined period of time, such as 30 seconds or less. Optionally, the predetermined period of time can be less than 30 seconds (such as 10 seconds), or greater than 30 seconds (such as 1 minute or more). If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict, thereby allowing the operator of the aircraft 102 to take a corrective action”, see P[0038]).
Regarding Claim 19, Bharadwaja et al. does not expressly recite the claimed system of Claim 17, wherein each confidence metric is at least partially based on observations of the one or more other aircraft over time including correlations of historical movement patterns.
However, Roberts et al. (8,255,147) teaches wherein the at least one processor is further configured to determine a confidence metric associated with each of the predicted future positions (Roberts et al.; “…the uncertainty regions of the two aircraft…”, see col.8, particularly lines 20-43 and “…the current position…”, see col.4, particularly lines 24-39 and “The rate of change of position and each of the variables above is calculated, and from this, the state at future point (i+1) is calculated by moving forward in time to time (t.sub.i+1), applying the rates of change calculated”, see col.6, particularly lines 5-52).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Roberts et al., and wherein each confidence metric is at least partially based on observations of the one or more other aircraft over time including correlations of historical movement patterns, as rendered obvious by Roberts et al., for the purpose of “aiding air traffic control” (Roberts et al.; see col.1, lines 14-15).
Claims 8, 9 and 11-13 are rejected under 35 U.S.C. 103 as being unpatentable over Bharadwaja et al. (2025/0269978) in view of Coles et al. (5,596,332) further in view of Baladhandapani et al. (2023/0215431), further in view of Roberts et al. (8,255,147).
Examiner’s Note:
Regarding Claim 8, the limitation “identifying a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent” is a contingent limitation that does not need to occur if neither “when” condition is satisfied; therefore, this limitation is not required by the prior art under the present interpretation of Claim 8. Furthermore, the step “and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent” is not required to occur, as the claim already requires “receiving intended route data for the one or more other aircraft”, meaning the “intended route data” is required to be received and is therefore “available”; accordingly, the condition that “the intended route data is unavailable or inconsistent” is not required to occur and therefore is not required by the prior art.
Regarding Claim 8, Bharadwaja et al. teaches the claimed method comprising:
determining a current location, velocity, acceleration, and route for a present aircraft (“The position sensors 116 output signals indicative of one or more of the position, altitude, heading, acceleration, velocity, and/or the like of the various aircraft 102. The signals are received by the tracking sub-system 118”, see P[0029]);
determining a future position of the present aircraft based on the current location…and route (“In at least one example, the control unit 120 can also provide an intended path graphic in relation to the aircraft model 124 for the aircraft 102 shown on the display 112. The intended path graphic can be a polygon that shows a future location of the aircraft 102 continuing on a current course for a predetermined period of time, such as 30 seconds or less. Optionally, the predetermined period of time can be less than 30 seconds (such as 10 seconds), or greater than 30 seconds (such as 1 minute or more). If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict, thereby allowing the operator of the aircraft 102 to take a corrective action”, see P[0038]);
receiving position and velocity data for one or more other aircraft, distinct from the present aircraft (“The position sensors 116 output signals indicative of one or more of the position, altitude, heading, acceleration, velocity, and/or the like of the various aircraft 102. The signals are received by the tracking sub-system 118”, see P[0029]);
automatically aggregating and correlating airport operations data (“The tracking sub-system 118 receives the transmitted position signal from the position receivers to determine a current and real time position, heading, velocity, and the like of the aircraft 102”, see P[0031]),…ADS-B data (“…the tracking sub-system is an automatic dependent surveillance-broadcast (ADS-B) tracking sub-system”, see P[0029]), onboard sensor data (“The tracking sub-system 118 receives the transmitted position signal from the position receivers to determine a current and real time position, heading, velocity, and the like of the aircraft 102”, see P[0031] and FIG. 2), and airport maps (“The tracking sub-system 118 receives the transmitted position signal from the position receivers to determine a current and real time position, heading, velocity, and the like of the aircraft 102”, see P[0031] and FIG. 2);
receiving intended route data for the one or more other aircraft (“The tracking sub-system 118 receives the transmitted position signal from the position receivers to determine a current and real time position, heading, velocity, and the like of the aircraft 102”, see P[0031]);
predicting a future position of each of the one or more other aircraft based on the intended route data and the aggregated and correlated data (“The position sensors 116 output signals indicative of one or more of the position, altitude, heading, acceleration, velocity, and/or the like of the various aircraft 102. The signals are received by the tracking sub-system 118”, see P[0029] and “In at least one example, the control unit 120 can also provide an intended path graphic in relation to the aircraft model 124 for the aircraft 102 shown on the display 112. The intended path graphic can be a polygon that shows a future location of the aircraft 102 continuing on a current course for a predetermined period of time, such as 30 seconds or less. Optionally, the predetermined period of time can be less than 30 seconds (such as 10 seconds), or greater than 30 seconds (such as 1 minute or more). If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict, thereby allowing the operator of the aircraft 102 to take a corrective action”, see P[0038]);
…;
identifying a lost aircraft when observed movement of one of the one or more other aircraft does not match the corresponding intended route data, and revert to predictions based solely on position and velocity when the intended route data is unavailable or inconsistent; and
determining if any predicted future position intersects with the future position of the present aircraft (“In at least one example, the control unit 120 can also provide an intended path graphic in relation to the aircraft model 124 for the aircraft 102 shown on the display 112. The intended path graphic can be a polygon that shows a future location of the aircraft 102 continuing on a current course for a predetermined period of time, such as 30 seconds or less. Optionally, the predetermined period of time can be less than 30 seconds (such as 10 seconds), or greater than 30 seconds (such as 1 minute or more). If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict, thereby allowing the operator of the aircraft 102 to take a corrective action”, see P[0038]) by projecting the predicted future positions onto airport-surface geometry including runway and taxiway structures as defined by the airport maps (see FIG. 2 and “…intersection of an intended path graphic with a structure, or another intended path graphic, and/or the like”, see P[0046]).
Bharadwaja et al. does not expressly recite the bolded portions of the claimed
determining a future position of the present aircraft based on the current location, velocity, acceleration, and route.
However, Coles et al. (5,596,332) teaches determining a probability of path intersections between two aircraft based on relative position, velocities, and paths of travel of both aircraft (Coles et al.; see col.10, particularly lines 55-62).
Bharadwaja et al. does not expressly recite the bolded portions of the claimed
automatically aggregating and correlating airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps.
However, Baladhandapani et al. (2023/0215431) teaches determining aircraft context and determining aircraft actions based on “voice recognition output” (Baladhandapani et al.; “…natural language processing may be applied to the textual representations of the clearance communications that were directed to the ownship aircraft by ATC, provided by the ownship aircraft to ATC, broadcasted by ATIC or otherwise received from ATIS to identify the operational subject(s) of the clearance communications and any operational parameter value(s) and/or aircraft action(s) associated with the clearance communications, which are then stored or otherwise maintained in association with the transcribed audio content of the received audio communication in the clearance table 226”, see P[0031] and “…the command system 204 and/or the voice command recognition application 240 receives indicia of the current operational context for the aircraft (e.g., the current location of the aircraft with respect to a taxi clearance, a flight plan, or other defined route or manner of operation of the aircraft, the current flight phase, the current geographic location of the aircraft, the current altitude of the aircraft, the current physical configuration of the aircraft, and/or the like) from one or more onboard system(s) 208 in addition to retrieving or otherwise obtaining the current conversational context associated with the aircraft (e.g., the subset of ATC clearance communications directed to and/or sent from the ownship aircraft) from the clearance table 226”, see P[0043]).
Bharadwaja et al. does not expressly recite the claimed
determining a confidence metric associated with each predicted future position.
However, Roberts et al. (8,255,147) teaches wherein the at least one processor is further configured to determine a confidence metric associated with each of the predicted future positions (Roberts et al.; “…the uncertainty regions of the two aircraft…”, see col.8, particularly lines 20-43).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Coles et al., Baladhandapani et al. and Roberts et al., and determining a future position of the present aircraft based on the current location, velocity, acceleration, and route, and automatically aggregating and correlating airport operations data, ATC voice-recognition output, ADS-B data, onboard sensor data, and airport maps, and determining a confidence metric associated with each predicted future position, as rendered obvious by Coles et al., Baladhandapani et al. and Roberts et al., in order to determine if "there is an intersection between the probabilistic volumes of two aircraft in the near future" (Coles et al.; see col.5, lines 48-52), in order to "identify or otherwise determine a conversational context" (Baladhandapani et al.; see P[0032]), and in order to provide a system for "aiding air traffic control" (Roberts et al.; see col.1, lines 14-15).
Regarding Claim 9, Bharadwaja et al. does not expressly recite the claimed method of Claim 8, wherein the intended route data is derived via voice recognition applied to communications between the one or more other aircraft and a ground control.
However, Baladhandapani et al. (2023/0215431) teaches wherein the intended route data is derived via voice recognition applied to communications between the one or more other aircraft and a ground control (Baladhandapani et al.; “…natural language processing may be applied to the textual representations of the clearance communications that were directed to the ownship aircraft by ATC, provided by the ownship aircraft to ATC, broadcasted by ATIC or otherwise received from ATIS to identify the operational subject(s) of the clearance communications and any operational parameter value(s) and/or aircraft action(s) associated with the clearance communications, which are then stored or otherwise maintained in association with the transcribed audio content of the received audio communication in the clearance table 226”, see P[0031] and “…the command system 204 and/or the voice command recognition application 240 receives indicia of the current operational context for the aircraft (e.g., the current location of the aircraft with respect to a taxi clearance, a flight plan, or other defined route or manner of operation of the aircraft, the current flight phase, the current geographic location of the aircraft, the current altitude of the aircraft, the current physical configuration of the aircraft, and/or the like) from one or more onboard system(s) 208 in addition to retrieving or otherwise obtaining the current conversational context associated with the aircraft (e.g., the subset of ATC clearance communications directed to and/or sent from the ownship aircraft) from the clearance table 226”, see P[0043]).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Baladhandapani et al., and wherein the intended route data is derived via voice recognition applied to communications between the one or more other aircraft and a ground control, as rendered obvious by Baladhandapani et al., in order to “identify or otherwise determine a conversational context” (Baladhandapani et al.; see P[0032]).
Regarding Claim 11, Bharadwaja et al. teaches the claimed method of Claim 8, further comprising receiving airport operations data (“…the aircraft 102 is tracked by the tracking sub-system 118. The control unit 120 receives the tracking data from the tracking sub-system 118 to determine a position of the aircraft 102 on the ground path(s) 104 at the airport 106”, see P[0034]), wherein the intended route data is at least partially based on the airport operations data (“In at least one example, the control unit 120 can also provide an intended path graphic in relation to the aircraft model 124 for the aircraft 102 shown on the display 112. The intended path graphic can be a polygon that shows a future location of the aircraft 102 continuing on a current course for a predetermined period of time, such as 30 seconds or less. Optionally, the predetermined period of time can be less than 30 seconds (such as 10 seconds), or greater than 30 seconds (such as 1 minute or more). If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict, thereby allowing the operator of the aircraft 102 to take a corrective action”, see P[0038]).
Regarding Claim 12, Bharadwaja et al. does not expressly recite the claimed method of Claim 8, wherein each confidence metric is at least partially based on observations of the one or more other aircraft over time including correlations of historical movement patterns.
However, Roberts et al. (8,255,147) teaches wherein the at least one processor is further configured to determine a confidence metric associated with each of the predicted future positions (Roberts et al.; “…the uncertainty regions of the two aircraft…”, see col.8, particularly lines 20-43 and “…the current position…”, see col.4, particularly lines 24-39 and “The rate of change of position and each of the variables above is calculated, and from this, the state at future point (i+1) is calculated by moving forward in time to time (t.sub.i+1), applying the rates of change calculated”, see col.6, particularly lines 5-52).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Roberts et al., and wherein each confidence metric is at least partially based on observations of the one or more other aircraft over time including correlations of historical movement patterns, as rendered obvious by Roberts et al., in order to provide a system for "aiding air traffic control" (Roberts et al.; see col.1, lines 14-15).
Regarding Claim 13, Bharadwaja et al. teaches the claimed method of Claim 8, wherein the at least one processor is further configured to produce an alert if any predicted future position is determined to intersect with the future position of the present aircraft (“If the intended path graphic intersects with an intended path graphic of another aircraft 102 or ground vehicle as shown on the display through respective models, the control unit 120 can output an alert to the aircraft 102 regarding a potential conflict…”, see P[0038]).
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Bharadwaja et al. (2025/0269978) in view of Coles et al. (5,596,332) further in view of Baladhandapani et al. (2023/0215431), further in view of Roberts et al. (8,255,147), further in view of Doyen et al. (9,830,829).
Regarding Claim 10, Bharadwaja et al. does not expressly recite the claimed method of Claim 8, further comprising confirming that the one or more other aircraft are adhering to the corresponding intended route data based on discrepancies identified during aggregation and correlation.
However, Doyen et al. (9,830,829) teaches wherein the at least one processor is further configured to confirm that the one or more other aircraft are adhering to the corresponding intended route data and also determining a deviation from an intended route (Doyen et al.; “The exemplary system 200 may include at least one voice recognition device/unit 240, which may be available to convert voice communications received from any source to data elements, or conversely to convert data elements received from any source to voice communications. An objective of the inclusion of such a voice recognition device/unit 240 may be to provide an automated transcription of the voice communications into a format that is usable by the exemplary system 240 to supplement other source data to generate and/or modify indications of intent for the operation of the aircraft”, see col.11, particularly lines 6-21 and “…the disclosed schemes may detect deviations from intended operations, operating parameters or operating values that may be indicative of, for example, a deviation from an intended flight planned route…”, see col.6, particularly lines 31-53, also see col.9, particularly lines 7-18).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bharadwaja et al. with the teachings of Doyen et al., and confirming that the one or more other aircraft are adhering to the corresponding intended route data based on discrepancies identified during aggregation and correlation, as rendered obvious by Doyen et al., in order to “identify deviations from intended operation of a particular aircraft” (Doyen et al.; see col.2, lines 14-30).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ISAAC G SMITH whose telephone number is (571)272-9593. The examiner can normally be reached Monday-Thursday, 8AM-5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, ANISS CHAD can be reached at 571-270-3832. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ISAAC G SMITH/ Primary Examiner, Art Unit 3662