DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:
“a data input module configured to obtain object data…”
“a data output module configured to output the contextualized list of detected objects…”
“external surveillance systems and services configured to detect data regarding objects on the aerodrome surface”
“the non-cooperative sensing systems which, using the contextualised list of detected objects, are configured to support internal detection”
“the external surveillance systems and services which, using the contextualised list of detected objects, are configured to improve the external surveillance systems” in claims 1-10.
A review of the specification shows that the following appears to be the corresponding structure for the above limitations described in the specification: (see at least Applicant Specification, para. [0062-0064]: The processing system 130 comprise the data input module 132, a processor 134 and a data output module 136. The processor 134 processes, at the data input module 132, the data and information collected by the input systems 110 and the support systems 120 to derive a contextualised list of objects detected and their features, including, but not limited to, threat levels for each object. This provides a unified situational awareness of the objects detected close to the ownship or along its path, particularly of objects which may present a danger to the ownship. The processor 134 is configured to carry out the various processes or methods described in the present disclosure….The output systems 140 comprise downstream consumers which receive output information from the data output module 136 of the processing system 130 in the form of the contextualised list of objects detected and their features, including, but not limited to, threat levels for each object. The downstream consumers include human-machine interfaces (HMIs) 142, ownship guidance systems 144, the non-cooperative sensing systems 112 and the external surveillance systems and services 116….The HMIs 142 comprise dedicated HMIs in the flight deck, and provide information to the pilot and/or flight crew. The HMIs communicate information via audio, visual and/or tactile means, e.g., via one or more of a screen, a dashboard, an audio alert and a vibrating seatback.).
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
The term “near the ground” in claims 1 & 11 is a relative term which renders the claim indefinite. The term “near” is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention.
The term “near-field and far field” in claim 10 is a relative term which renders the claim indefinite. The terms “near-field” and “far field” are not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-14 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
A claim that recites an abstract idea, a law of nature, or a natural phenomenon is directed to a judicial exception. Abstract ideas include the following groupings of subject matter, when recited as such in a claim limitation: (a) Mathematical concepts – mathematical relationships, mathematical formulas or equations, mathematical calculations; (b) Certain methods of organizing human activity – fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions); and (c) Mental processes – concepts performed in the human mind (including an observation, evaluation, judgment, opinion). See the 2019 Revised Patent Subject Matter Eligibility Guidance.
Even when a judicial exception is recited in the claim, an additional claim element(s) that integrates the judicial exception into a practical application of that exception renders the claim eligible under §101. A claim that integrates a judicial exception into a practical application will apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the judicial exception. The following examples are indicative that an additional element or combination of elements may integrate the judicial exception into a practical application:
the additional element(s) reflects an improvement in the functioning of a computer, or an improvement to other technology or technical field;
the additional element(s) applies or uses a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition;
the additional element(s) implements a judicial exception with, or uses a judicial exception in conjunction with, a particular machine or manufacture that is integral to the claim;
the additional element(s) effects a transformation or reduction of a particular article to a different state or thing; and
the additional element(s) applies or uses the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception.
Examples in which the judicial exception has not been integrated into a practical application include:
the additional element(s) merely recites the words “apply it” (or an equivalent) with the judicial exception, or merely includes instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea;
the additional element(s) adds insignificant extra-solution activity to the judicial exception; and
the additional element does no more than generally link the use of a judicial exception to a particular technological environment or field of use.
See the 2019 Revised Patent Subject Matter Eligibility Guidance.
Claims 1 & 11 recite: obtain object data and contextual data, wherein the object data relates to objects detected around the aircraft and the contextual data relates to information about the aircraft's route and environment; combine the data into an aggregated list of detected objects; and label the aggregated list of detected objects using the contextual data to form a contextualised list of detected objects for use in determining collision avoidance. As drafted, these limitations, under their broadest reasonable interpretation, cover performance in the mind but for the recitation of generic computer elements. The claims can practically be performed in the mind. For example, but for the “A collision avoidance system for aggregating and processing data when an aircraft is on or near the ground, the collision avoidance system comprising, a data input module, a processor configured to, a data output module” and “A method of aggregating and processing data when an aircraft is on or near the ground, the method comprising” language, the recited steps of obtaining object data and contextual data, combining the data into an aggregated list of detected objects, and labelling the aggregated list of detected objects using the contextual data to form a contextualised list of detected objects for use in determining collision avoidance, in the context of these claims, encompass a user mentally detecting objects and ranking their threat levels based on whether the aircraft's route would collide with those objects. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components, then it falls within the “Mental Processes” grouping of abstract ideas. Accordingly, the claims recite an abstract idea.
This judicial exception is not integrated into a practical application. In particular, the claims recite only the additional elements of “A collision avoidance system for aggregating and processing data when an aircraft is on or near the ground, the collision avoidance system comprising, a data input module, a processor configured to, a data output module” and “A method of aggregating and processing data when an aircraft is on or near the ground, the method comprising”. These elements are recited at a high level of generality (i.e., a device configured to detect objects around the aircraft) such that they amount to no more than mere instructions to apply the exception using generic computer components. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claims are directed to an abstract idea.
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of “A collision avoidance system for aggregating and processing data when an aircraft is on or near the ground, the collision avoidance system comprising, a data input module, a processor configured to, a data output module” and “A method of aggregating and processing data when an aircraft is on or near the ground, the method comprising” amount to no more than mere instructions to apply the exception using generic computer components. Mere instructions to apply an exception using generic computer components cannot provide an inventive concept. The claims are not patent eligible.
Similarly, claims 2-10 & 12-14 recite limitations that, under their broadest reasonable interpretation, cover performance in the mind but for the recitation of generic computer components. In the context of these claims, the limitations encompass a user collecting and combining object data and measuring the ranges of the detected objects relative to the aircraft. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components, then it falls within the “Mental Processes” grouping of abstract ideas. Accordingly, these claims recite an abstract idea.
This judicial exception is not integrated into a practical application, and the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. The additional elements are recited at a high level of generality (i.e., a device configured to detect objects around the aircraft) such that they amount to no more than mere instructions to apply the exception using generic computer components. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Mere instructions to apply an exception using generic computer components cannot provide an inventive concept. The claims are not patent eligible.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 3-7, & 11-12 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by US 2025/0131836 A1 (“Gu”).
As per claim 1, Gu discloses:
A collision avoidance system for aggregating and processing data when an aircraft is on or near the ground (see at least Gu, para. [0149]: In a further example of the system 2…the risks associated with ground operations (Bl to B16) in the aviation environment near and at the airport can be more particularly monitored including taxiing collision/near collision, foreign object damage/debris 55, objects falling from aircraft 56, jet blast/propeller/rotor wash 57, fire/fume/smoke 58, fuel leaks 59, damage to aircraft fuselage/wings/empennage 60…), the collision avoidance system comprising:
a data input module configured to obtain object data and contextual data from a plurality of aircraft systems (see at least Gu, para. [0150-0153]: Further the system 2 in step 304 is configured to use the fused information to detect and identify at least one object, such as the aircraft(s) 16, ground vehicles and crew 18, 20, and airport infrastructure such as the boarding gates and the like, to calculate the at least one objects' physical properties, and to predict the at least one objects' physical properties. For example, the aircrafts' position, travel direction, velocity, acceleration, altitude and attitude is monitored as well as the distance between aircraft of interest and object of interest, e.g. the ground vehicles and crew, boarding gates, gate boundaries and the like.), wherein:
the object data relates to objects detected around the aircraft (see at least Gu, para. [0150-0153]: Further the system 2 in step 304 is configured to use the fused information to detect and identify at least one object, such as the aircraft(s) 16, ground vehicles and crew 18, 20, and airport infrastructure such as the boarding gates and the like, to calculate the at least one objects' physical properties, and to predict the at least one objects' physical properties.); and
the contextual data relates to information about the aircraft's route and environment (see at least Gu, para. [0150-0153]: For example, the aircrafts' position, travel direction, velocity, acceleration, altitude and attitude is monitored as well as the distance between aircraft of interest and object of interest, e.g. the ground vehicles and crew, boarding gates, gate boundaries and the like.);
a processor configured to:
combine the object data into an aggregated list of detected objects (see at least Gu, para. [0137]: In the right column of Table 3, there are shown detection and tracking multiple objects data processing capability and brief safe operation criteria that are required for each occurrence group, including object types, classes, different physical (both current and predicted) properties of each monitored object, and the types of risks and accompanying safe operation criteria that is associated with each occurrence type.); and
label the aggregated list of detected objects using the contextual data to form a contextualised list of detected objects for use in determining collision avoidance (see at least Gu, para. [0137]: Table 4 provides additional details into the particular safe and unsafe operation criteria (left column) for each of the occurrence types and in the right hand column there is provided the examples of assessment criteria/method for each of the safety operation criteria.); and
a data output module configured to output the contextualised list of detected objects to a set of output systems (see at least Gu, para. [0108]: Once installed, the software application of the host service 4 provides an interface that enables the host service 4 to facilitate communication of information and/or alerts, including sensor information, raw or processed, to a predetermined user 16, 18, 20. & para. [0133] & para. [0171]: The system can provide real-time monitoring of aviation activities, detection of unsafe aviation activity and generation of alerts, which can be displayed on at least one standalone screen or can be integrated with existing systems located in at least a cockpit of said aircraft, air traffic control towers/centres, ground control locations and airport emergency response team locations. The display format may include 3-D map and panoramic view.).
As per claim 3, Gu discloses:
wherein the plurality of aircraft systems comprises input systems configured to provide the object data, optionally wherein the plurality of aircraft systems comprises support systems configured to provide the contextual data (see at least Gu, para. [0150-0153]: Further the system 2 in step 304 is configured to use the fused information to detect and identify at least one object, such as the aircraft(s) 16, ground vehicles and crew 18, 20, and airport infrastructure such as the boarding gates and the like, to calculate the at least one objects' physical properties, and to predict the at least one objects' physical properties. For example, the aircrafts' position, travel direction, velocity, acceleration, altitude and attitude is monitored as well as the distance between aircraft of interest and object of interest, e.g. the ground vehicles and crew, boarding gates, gate boundaries and the like.).
As per claim 4, Gu discloses:
wherein the input systems comprise one or more of:
non-cooperative sensing systems comprising sensors on board the aircraft configured to detect objects not actively providing information about themselves;
cooperative sensing systems comprising sensors configured to detect data transmitted by other vehicles relating to the position and velocity of the other vehicles; and
external surveillance systems and services configured to detect data regarding objects on the aerodrome surface (see at least Gu, para. [0111]: Referring particularly to FIGS. 1, 2 and 4, there are provided about 10 or more monitoring units 22 which each include one of each of the least two sensors 26, 28, 30 and which are provided in multiple locations throughout the aviation environment near and at airport including runway, taxiway, apron, ramp areas, passenger boarding bridges, ground service vehicles, ground support vehicles, ground crew, airport building structures including gates. Monitoring units 22 are mounted on aircraft 16 and/or in locations on ground service vehicles and/or ground support vehicles 18 and equipment, ground personnel 20.).
As per claim 5, Gu discloses:
wherein the support systems comprise one or more of:
navigation systems configured to provide information about one or more of the aircraft's position, velocity, and heading (see at least Gu, para. [0150-0153]: Further the system 2 in step 304 is configured to use the fused information to detect and identify at least one object, such as the aircraft(s) 16, ground vehicles and crew 18, 20, and airport infrastructure such as the boarding gates and the like, to calculate the at least one objects' physical properties, and to predict the at least one objects' physical properties. For example, the aircrafts' position, travel direction, velocity, acceleration, altitude and attitude is monitored as well as the distance between aircraft of interest and object of interest, e.g. the ground vehicles and crew, boarding gates, gate boundaries and the like.);
taxi navigation and management systems configured to provide information about one or more of the aircraft's position, taxi route, and the trajectory of other vehicles;
databases configured to provide information about one or more of airport runways, airport taxiways, non-movement area layouts, and aerodrome structures (see at least Gu, para. [0140]: Additional information 54 can be received by the processing system 4 to assist and/or facilitate calculation of the objects' physical properties and estimation/prediction of their physical properties and/or safe operation criteria, including runway data, such as length, boundaries, entries and exits, surface characteristics such as material or friction coefficients, and/or surface conditions such as wet, ice/snow, metrological data such as wind, temperature and the like, and aircraft data, such air craft type and capabilities/characteristics, weight, flying phase and/or intended or reference position or motion.); and
the non-cooperative sensing systems.
As per claim 6, Gu discloses:
wherein the processor is configured to aggregate data from multiple sensing systems using one or more of heuristic algorithms, machine learning models, and neural networks (see at least Gu, para. [0129]: In the example 'Stage 3' summarised in Table 2, the processing system 4 can first process the camera sensor information received from the camera images/video frames to detect and/or identify the objects. In particular, the artificial intelligence-based data processing system 4 employs machine- or deep-learning-based object detection and/or classification models, which are trained, validated, verified and optimised for detection and classification of objects involved in aviation activities in an aviation environment near and at airport. The object detection and/or classification models that can be utilised include You Only Look Once (YOLO) or Fully Convolutional One-Stage (FCOS) models although it is expected that other artificial intelligence-based models could equally be used instead for similar effect.).
As per claim 7, Gu discloses:
wherein the output systems comprise one or more of:
human-machine interfaces configured to provide information to the pilot and/or flight crew (see at least Gu, para. [0108]: Once installed, the software application of the host service 4 provides an interface that enables the host service 4 to facilitate communication of information and/or alerts, including sensor information, raw or processed, to a predetermined user 16, 18, 20. & para. [0133] & para. [0171]: The system can provide real-time monitoring of aviation activities, detection of unsafe aviation activity and generation of alerts, which can be displayed on at least one standalone screen or can be integrated with existing systems located in at least a cockpit of said aircraft, air traffic control towers/centres, ground control locations and airport emergency response team locations. The display format may include 3-D map and panoramic view.);
ownship guidance systems comprising taxi guidance systems configured to provide automated control for movement of the aircraft on the aerodrome surface;
the non-cooperative sensing systems which, using the contextualised list of detected objects, are configured to support internal detection and/or tracking and
resolving ambiguities in their detection algorithms; and
the external surveillance systems and services which, using the contextualised list of detected objects, are configured to improve the external surveillance systems and services' situational awareness of connected clients, and/or improving the situational awareness of the connected clients.
As per claim 11, Gu discloses:
A method of aggregating and processing data when an aircraft is on or near the ground (see at least Gu, para. [0149]: In a further example of the system 2…the risks associated with ground operations (Bl to B16) in the aviation environment near and at the airport can be more particularly monitored including taxiing collision/near collision, foreign object damage/debris 55, objects falling from aircraft 56, jet blast/propeller/rotor wash 57, fire/fume/smoke 58, fuel leaks 59, damage to aircraft fuselage/wings/empennage 60…), the method comprising:
obtaining, using a data input module, object data and contextual data from a plurality of aircraft systems (see at least Gu, para. [0150-0153]: Further the system 2 in step 304 is configured to use the fused information to detect and identify at least one object, such as the aircraft(s) 16, ground vehicles and crew 18, 20, and airport infrastructure such as the boarding gates and the like, to calculate the at least one objects' physical properties, and to predict the at least one objects' physical properties. For example, the aircrafts' position, travel direction, velocity, acceleration, altitude and attitude is monitored as well as the distance between aircraft of interest and object of interest, e.g. the ground vehicles and crew, boarding gates, gate boundaries and the like.), wherein:
the object data relates to objects detected around the aircraft (see at least Gu, para. [0150-0153]: Further the system 2 in step 304 is configured to use the fused information to detect and identify at least one object, such as the aircraft(s) 16, ground vehicles and crew 18, 20, and airport infrastructure such as the boarding gates and the like, to calculate the at least one objects' physical properties, and to predict the at least one objects' physical properties.); and
the contextual data relates to information about the aircraft's route and environment (see at least Gu, para. [0150-0153]: For example, the aircrafts' position, travel direction, velocity, acceleration, altitude and attitude is monitored as well as the distance between aircraft of interest and object of interest, e.g. the ground vehicles and crew, boarding gates, gate boundaries and the like.);
combining, at a processor, the object data into an aggregated list of detected objects (see at least Gu, para. [0137]: In the right column of Table 3, there are shown detection and tracking multiple objects data processing capability and brief safe operation criteria that are required for each occurrence group, including object types, classes, different physical (both current and predicted) properties of each monitored object, and the types of risks and accompanying safe operation criteria that is associated with each occurrence type.);
labelling, at the processor, the aggregated list of detected objects using the contextual data to form a contextualised list of detected objects for use in determining collision avoidance (see at least Gu, para. [0137]: Table 4 provides additional details into the particular safe and unsafe operation criteria (left column) for each of the occurrence types and in the right hand column there is provided the examples of assessment criteria/method for each of the safety operation criteria.); and
outputting, using a data output module, the contextualised list of detected objects to a set of output systems (see at least Gu, para. [0108]: Once installed, the software application of the host service 4 provides an interface that enables the host service 4 to facilitate communication of information and/or alerts, including sensor information, raw or processed, to a predetermined user 16, 18, 20. & para. [0133] & para. [0171]: The system can provide real-time monitoring of aviation activities, detection of unsafe aviation activity and generation of alerts, which can be displayed on at least one standalone screen or can be integrated with existing systems located in at least a cockpit of said aircraft, air traffic control towers/centres, ground control locations and airport emergency response team locations. The display format may include 3-D map and panoramic view.).
As per claim 12, Gu discloses
wherein the combining comprises aggregating data from the aircraft sensors, optionally wherein the combining comprises aggregating data from sensors external to the aircraft (see at least Gu, para. [0111]: Referring particularly to FIGS. 1, 2 and 4, there are provided about 10 or more monitoring units 22 which each include one of each of the at least two sensors 26, 28, 30 and which are provided in multiple locations throughout the aviation environment near and at airport including runway, taxiway, apron, ramp areas, passenger boarding bridges, ground service vehicles, ground support vehicles, ground crew, airport building structures including gates. Monitoring units 22 are mounted on aircraft 16 and/or in locations on ground service vehicles and/or ground support vehicles 18 and equipment, ground personnel 20.).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Gu, in view of US 2013/0110323A1 (“Knight”).
As per claim 2, Gu does not explicitly disclose
wherein the collision avoidance system is configured to, when the aircraft is within a specified range of the aerodrome surface, commence operation as the aircraft approaches the ground, remain active during aerodrome surface operations, and cease operation after takeoff.
Knight teaches
wherein the collision avoidance system is configured to, when the aircraft is within a specified range of the aerodrome surface, commence operation as the aircraft approaches the ground, remain active during aerodrome surface operations, and cease operation after takeoff (see at least Knight, para. [0047-0048]: At block/task/step 315, the processor 220 determines whether the aircraft 100 is on the ground and moving below a threshold ground speed. When the processor 220 determines that the aircraft 100 is either (1) not on the ground, or (2) is not moving or (3) is moving above a threshold ground speed, method 300 loops back to block/task/step 315. This way, when the aircraft is in the air (i.e., not on the ground), or alternatively is on the ground and not moving, the system is effectively disabled to prevent the cockpit display from being activated and displaying the video images in cases where it would not be useful…By contrast, when the processor 220 determines that the aircraft 100 is both on the ground and moving below the threshold ground speed, the system 200 is enabled and the method 300 proceeds to block/task/step 320. At 320, the video imagers acquire video images of various regions around the aircraft 100 that correspond to each of the video imagers. In some operational scenarios, the video imagers will already be enabled and in use for other purposes (e.g., to display views outside the aircraft to the crew or passengers).).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Gu to incorporate the teaching of wherein the collision avoidance system is configured to, when the aircraft is within a specified range of the aerodrome surface, commence operation as the aircraft approaches the ground, remain active during aerodrome surface operations, and cease operation after takeoff of Knight, with a reasonable expectation of success, in order to provide methods, systems and apparatus that can reduce the likelihood of and/or prevent collisions with the detected obstacles (see at least Knight, para. [0005]).
Claims 8-9 and 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Gu, in view of US 2015/0194060A1 (“Mannon”).
As per claim 8, Gu does not explicitly disclose
wherein the aggregated list of detected objects is provided with georeferenced information regarding the position, velocity, and heading of each object.
Mannon teaches
wherein the aggregated list of detected objects is provided with georeferenced information regarding the position, velocity, and heading of each object (see at least Mannon, para. [0035]: a ground obstacle collision alert indicative of a ground obstacle collision condition, which can include, for example, a condition in which there is a potential for a collision between the aircraft and an obstacle while the aircraft is on the ground, e.g., due to the distance between the aircraft and the obstacle, due to the velocity and direction of the aircraft relative to the obstacle, or any combination thereof. & para. [0095]: As discussed above, in some examples, processor 16 is configured to determine a location of a detected obstacle based on a radial coordinate system, which may be determined relative to one or more fixed points on aircraft 12, which can be, for example, on the two wings of aircraft 12. Thus, processor 16 can determine both the distance between aircraft 12 and a detected obstacle, as well as the angular direction of the detected obstacle relative to aircraft 12, and position graphical representation of detected obstacle 40 relative to graphical representation of aircraft 38 based on the determined distance and angular direction.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Gu to incorporate the teaching of wherein the aggregated list of detected objects is provided with georeferenced information regarding the position, velocity, and heading of each object of Mannon, with a reasonable expectation of success, in order to help improve crew member awareness of obstacles (see at least Mannon, para. [0026]).
As per claim 9, Gu does not explicitly disclose
wherein the aircraft's environment is divided into proximity zones based on proximity to the aircraft, such that the collision avoidance system is configured to track any one object as the object moves through different proximity zones.
Mannon teaches
wherein the aircraft's environment is divided into proximity zones based on proximity to the aircraft, such that the collision avoidance system is configured to track any one object as the object moves through different proximity zones (see at least Mannon, para. [0064]: For example, processor 16 can characterize detected obstacles as one of primary targets, intermediate targets, and secondary targets, based on the proximity of the detected aircraft to aircraft 12. The characterization of a detected obstacle as one of these types of targets may indicate a threat level of the detected obstacle, e.g., as a function of the possibility aircraft 12 will collide with the detected obstacle. & para. [0067-0068]: Memory 24 (FIG. 1) of aircraft 12 or another memory can store the parameters (e.g., vertical heights and lateral distances) with which processor 16 determines a threat level of a detected obstacle, e.g., the parameters with which processor 16 characterizes a detected obstacle as a primary, an intermediate, or a secondary target. In some examples, a primary target is an object on the ground within direct strike zone of a structure of aircraft 12, such as a wing, wingtip or nacelle. The direct zone is a zone in which the aircraft 12 will strike the obstacle if aircraft 12 continues on its current heading. In addition, in some examples, an intermediate target is an object on the ground located just outside the direct strike zone of a structure of aircraft 12, such as up to 10 feet or up to 3 meters laterally relative to the aircraft wing, where the lateral direction is in a direction substantially perpendicular to the heading of aircraft 12.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Gu to incorporate the teaching of wherein the aircraft's environment is divided into proximity zones based on proximity to the aircraft, such that the collision avoidance system is configured to track any one object as the object moves through different proximity zones of Mannon, with a reasonable expectation of success, in order to help improve crew member awareness of obstacles (see at least Mannon, para. [0026]).
As per claim 13, Gu does not explicitly disclose
wherein the combining of the object data into an aggregated list of detected objects further comprises providing georeferenced information regarding the position, velocity, and heading of each object.
Mannon teaches
wherein the combining of the object data into an aggregated list of detected objects further comprises providing georeferenced information regarding the position, velocity, and heading of each object (see at least Mannon, para. [0035]: a ground obstacle collision alert indicative of a ground obstacle collision condition, which can include, for example, a condition in which there is a potential for a collision between the aircraft and an obstacle while the aircraft is on the ground, e.g., due to the distance between the aircraft and the obstacle, due to the velocity and direction of the aircraft relative to the obstacle, or any combination thereof. & para. [0095]: As discussed above, in some examples, processor 16 is configured to determine a location of a detected obstacle based on a radial coordinate system, which may be determined relative to one or more fixed points on aircraft 12, which can be, for example, on the two wings of aircraft 12. Thus, processor 16 can determine both the distance between aircraft 12 and a detected obstacle, as well as the angular direction of the detected obstacle relative to aircraft 12, and position graphical representation of detected obstacle 40 relative to graphical representation of aircraft 38 based on the determined distance and angular direction.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Gu to incorporate the teaching of wherein the combining of the object data into an aggregated list of detected objects further comprises providing georeferenced information regarding the position, velocity, and heading of each object of Mannon, with a reasonable expectation of success, in order to help improve crew member awareness of obstacles (see at least Mannon, para. [0026]).
As per claim 14, Gu discloses
wherein the labelling comprises using the contextual data to determine the relevance of the detected objects to the aircraft, further comprising determining one or more of threat level information, alerts, and indications for the detected objects (see at least Gu, para. [0137]: In the right column of Table 3, there are shown detection and tracking multiple objects data processing capability and brief safe operation criteria that are required for each occurrence group, including object types, classes, different physical (both current and predicted) properties of each monitored object, and the types of risks and accompanying safe operation criteria that is associated with each occurrence type. Table 4 provides additional details into the particular safe and unsafe operation criteria (left column) for each of the occurrence types and in the right hand column there is provided the examples of assessment criteria/method for each of the safety operation criteria. & para. [0155-0157]: Alternatively, the system 2 is configured to determine that the comparison shows that risk of collision or near collision is medium or high, i.e. runway excursion may occur in the next 120 seconds, or in the next 20 seconds, and the system is further configured to transmit at least one alert to at least one user accordingly i.e. yellow alert or red alert.).
However, Gu does not explicitly disclose
for each of the detected objects.
Mannon teaches
further comprising determining one or more of threat level information, alerts, and indications for each of the detected objects (see at least Mannon, para. [0064]: For example, processor 16 can characterize detected obstacles as one of primary targets, intermediate targets, and secondary targets, based on the proximity of the detected aircraft to aircraft 12. The characterization of a detected obstacle as one of these types of targets may indicate a threat level of the detected obstacle, e.g., as a function of the possibility aircraft 12 will collide with the detected obstacle. & para. [0067-0068]: Memory 24 (FIG. 1) of aircraft 12 or another memory can store the parameters (e.g., vertical heights and lateral distances) with which processor 16 determines a threat level of a detected obstacle, e.g., the parameters with which processor 16 characterizes a detected obstacle as a primary, an intermediate, or a secondary target. In some examples, a primary target is an object on the ground within direct strike zone of a structure of aircraft 12, such as a wing, wingtip or nacelle. The direct zone is a zone in which the aircraft 12 will strike the obstacle if aircraft 12 continues on its current heading. In addition, in some examples, an intermediate target is an object on the ground located just outside the direct strike zone of a structure of aircraft 12, such as up to 10 feet or up to 3 meters laterally relative to the aircraft wing, where the lateral direction is in a direction substantially perpendicular to the heading of aircraft 12.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Gu to incorporate the teaching of further comprising determining one or more of threat level information, alerts, and indications for each of the detected objects of Mannon, with a reasonable expectation of success, in order to help improve crew member awareness of obstacles (see at least Mannon, para. [0026]).
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Gu, in view of Mannon, and further in view of US 2025/0058700A1 (“Nagaraja”).
As per claim 10, Gu does not explicitly disclose
wherein the proximity zones comprise a near-field zone and a far-field zone, optionally wherein the near-field zone is represented using occupancy grid maps and the far-field zone provides information and predictions about object position and/or velocity