Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
Claims 1-20 are pending.
Claims 1, 6, 14, and 19 are amended.
Response to Arguments
35 U.S.C. § 103
Applicant’s arguments with respect to claims 1 and 14 have been considered but are moot in view of the new grounds of rejection necessitated by applicant's amendments.
Please see the 35 U.S.C. § 103 rejection below.
Objections to the Claims
Given applicant's amendments to claims 6 and 19, the examiner agrees that the claim objections are overcome. The objections to claims 6 and 19 are withdrawn.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
The term “necessary information” in claims 1 and 14 is a relative term which renders the claims indefinite. The term “necessary information” is not defined by the claims, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. The term is broad and cannot be further narrowed by a person having ordinary skill in the art because the word “necessary” is highly subjective.
The examiner understands applicant’s intent to be that the displayed data is filtered to only what is required for the different flight modes, consistent with ¶ 0073. Further defining what the “necessary information” entails for each flight mode, consistent with the specification, may overcome the 35 U.S.C. 112(b) rejection. Alternatively, the rejection may be overcome by defining the “necessary information” in terms of the dynamic display behavior of ¶ 0073, wherein “when the parameters approach the limits of their respective normal operating ranges, more details and warning indicators may be displayed to the pilot indicating that action needs to be taken”.
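For illustration only, the dynamic display behavior of ¶ 0073 referenced above can be sketched as threshold logic over a parameter's normal operating range; the function and parameter names below are hypothetical and do not appear in the specification.

```python
# Illustrative sketch only: one way the dynamic behavior of ¶ 0073 could be
# expressed, where detail and warning indicators appear as a parameter
# approaches the limits of its normal operating range. All names are
# hypothetical and are not part of the record.

def display_level(value: float, low: float, high: float, margin: float = 0.1) -> str:
    """Return how much detail to display for one flight parameter.

    low/high bound the normal operating range; margin is the fraction of
    the range treated as "approaching a limit".
    """
    span = high - low
    if value < low or value > high:
        return "warning"    # outside the normal range: warn the pilot
    if value < low + margin * span or value > high - margin * span:
        return "detailed"   # approaching a limit: display more detail
    return "minimal"        # nominal: display only the necessary information

# e.g., a parameter with a normal operating range of 200-900 units
assert display_level(550, 200, 900) == "minimal"
assert display_level(870, 200, 900) == "detailed"
assert display_level(950, 200, 900) == "warning"
```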
Claims 2-13 and 15-20 are rejected under 35 U.S.C. 112(b) due to their dependence on claims 1 and 14, respectively.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.
Claims 1-7, 9, 10, and 14-20 are rejected under 35 U.S.C. 103 as being unpatentable over MARGOLIN (US 5904724 A) in view of KNEUPER (US 20170344181 A1), in further view of TIANA (US 20200279494 A1).
Regarding claim 1:
MARGOLIN discloses:
A system for providing synthetic vision to a human operator, the system comprising: (see at least MARGOLIN, Col 1 lines 49-55, “The aircraft uses a communications link to send its location, attitude, and other operating conditions to the remote pilot station. The remote pilot station receives the data and uses a database describing the terrain and manmade structures in the remote aircrafts environment to produce a 3D view of the remote aircraft environment and present it to the remote human pilot.”)
a display device disposed at a control station remote from a movable object capable of translational and rotational movement; (see at least MARGOLIN, Col 2 lines 35-39, “In the first embodiment, the remote pilot is provided with a standard video display. Additional display channels can be provided to give the remote pilot a greater field of view. There can even be a display channel to give a rearward facing view.”; Col 3 lines 20-27, “FIG. 1 is a general illustration showing a remote pilot at a remote pilot station operating a remote aircraft according to one embodiment of the invention. FIG. 1 shows Remote Pilot 102 interacting with Remote Pilot Station 101 and controlling Remote Aircraft 103. Remote Pilot Station 101 and Remote Aircraft 103 respectively include an Antenna 104 and an Antenna 105 for communicating Information 106.”; Col 3 lines 40-44, “A second embodiment uses a head mounted display for the remote pilot instead of a standard display. This permits the remote station to be made more compact so that it can be used in a wider variety of installations. An example would be in a manned aircraft flying several hundred miles away.”)
one or more processors configured to perform operations comprising: (see at least MARGOLIN, Col 7 lines 45-56, “Computer mediated control systems use a computer between the pilot controls and the control surfaces. The pilot controls are read by the computer, the data are modified in a particular way, and the computer sends control signals to the control surfaces. The computer may also sense the forces on the control surface and use it to control force feedback to the pilot controls. This type of computer mediated control may be used to fly an airplane that would otherwise be unstable, such as the F16 or the F117. Aircraft such as the F16 and F117 are also second order systems because the position of the pilot's joystick represents rate of rotation.”)
ii) real-time data to be transmitted to the control station, wherein (see at least MARGOLIN, Col 1 lines 49-55)
accessing data stored in a local or cloud storage device to construct a virtual view, (see at least MARGOLIN, Col 5 lines 35-54, “As previously described, the status information received by Computer 405 includes the three dimensional position and the orientation of Remote Aircraft 103. The status information may also include information concerning the flight surfaces, flight sensors, the engine, an additional altitude reading, etc. Computer 405 uses this status information to retrieve data from Digital Database 107 which contains a three-dimensional description of terrain and manmade structures over which Remote Aircraft 103 is flying. The composition and creation of the Digital Database 107 is further described later. Based on the three dimensional data retrieved from Digital Database 107, Computer 405 performs the mathematical operations to transform and project the three dimensional data to generate video data representing a synthesized three-dimensional projected view of the terrain (and, if desired, manmade structures) in the vicinity or environment of Remote Aircraft 103. This video data is transmitted to Graphics System 406, which displays the synthesized three-dimensional projected view on Video Display 407.”)
wherein the virtual view displays different necessary information for the human operator to operate the movable object (see at least MARGOLIN, Col 5 Lines 35-54; Col 5 lines 55-67, “Since the image is generated from the digital database, virtually any image of the environment of the Remote Aircraft 103 can be generated. As examples, the pilot may select the environment to be: 1) a simulated image of what would be seen out of the cockpit of a manned aircraft on a similar flight path; 3) a simulated image of what would be seen when looking in any direction (e.g., backwards, out a side window, etc.); 3) a simulated image of what would be seen if a camera were tailing the remotely piloted aircraft; etc. In addition, the simulated image may be set to any magnification. Thus, the phrase environment of Remote Aircraft 103 is intended to include any image generated with reference to the remote aircraft's position.”)
wherein the virtual view comprises a first-person view (FPV) or a third-person view (TPV) and wherein either the FPV or the TPV (see at least MARGOLIN, Col 5 lines 55-67)
MARGOLIN does not disclose, but KNEUPER teaches:
(a) determining, based at least in part on a real-time visibility condition or an operating rule, (see at least KNEUPER, ¶ 0167, “The menu along the right side of the panel in FIG. 4A includes options to select alternate views for mapping interface 429 including views based on high instrument flight rules (IFR) 424, low IFR 425, visual flight rules (VFR) 426, satellite imagery (SAT) 427, and terrain representation (TERR) 428 for example. Panel 400 may be configured to display greater or fewer menu items along the right of the panel or to arrange items differently without departing from the scope hereof.”)
i) necessary information to be displayed to the human operator and (see at least KNEUPER, ¶ 0167; ¶ 0207, “FIG. 5A provides an exemplary graphical user interface (GUI) 501 illustrating a flight guide application. A real-time image 502 is provided via the TSIP 210 and the flight guide application is provided such that it is overlaying the real-time image 502. The flight guide application is embodied in GUI 300 as a flight path 503 comprising a plurality of planes, or path indicators. The plurality of planes/path indicators may be used to highlight the flight path 503 of an aircraft. The plurality of planes may each be associated with various coordinates (e.g., physical locations in space), glide slopes, and the like. In an embodiment, the information associated with each plane/path indicator (e.g., glide slope, etc.) is displayed to a user upon an indication such as selection of the plane, hovering over the plane/path indicator, etc.”; ¶ 0213, “FIG. 5E provides an exemplary GUI 520 of an exemplary descent screen. As in FIG. 5C, a flight guide 521 is provided with one or more path indicators illustrated. The concept described with reference to FIG. 5C is applicable in this example as well but is directed to a descent, specifically. As previously described, the one or more path indicators may be configured to convey information based on a distance to or from a waypoint, the aircraft, or the like. In a descent situation, the one or more path indicators proximate to the destination will indicate a descent is approaching and may be proximate to a waypoint 522 (e.g., destination airport). Similar to previous examples, this may be illustrated by displaying the path indicators differently to draw attention to them by, for example, using different colors, flashing the path indicators, etc. It is noted that the flight guides provided in FIGS. 5A-5E are overlaying a three-dimensional real-time image on the TSIP.”)
the operating rule is at least one of a Visual Flight Rules (VFR) condition or an Instrument Flight Rules (IFR) condition, (see at least KNEUPER, ¶ 0151, “FIG. 4A depicts an exemplary panel 400 of the flight planning system. Panel 400 is configured to show a mapping interface 429 based on high instrument flight rules (IFR). Mapping interface 429 includes a displayed image of a map, which may be manipulated by a user with touch gestures, such as zooming and dragging, to view maps of various areas of Earth. Panel 400 includes menus listed, for example, along the bottom, top and sides of the panel. The menus may include icons, names or abbreviations that may be activated by touch, thus serving as links or shortcuts to various features of the flight planning system. The menu along the bottom of panel 400 includes, for example, a title indicator 401, a proximity icon 402, a favorites icon 403, a weather link (WX) 404, a sky track link 405, a waypoints link 406, a procedures link 407, a direct-to link 408, and a standby-plan link 409. Panel 400 may be configured to display greater or fewer menu items along the bottom or to arrange items differently without departing from the scope hereof.”; ¶ 0171, “FIG. 4C depicts flight planning panel 432, which is an example of flight planning panel 400 of FIG. 4A, that is configured to show a mapping interface 433 based on VFR 426. VFR is a set of FAA rules and regulations for flying an aircraft using outside visual cues, wherein reliance on instruments is optional for pilots. VFR 426 illustrates an aeronautical map showing routes based on available visual cues for efficient flight planning.”)
wherein the VFR condition and the IFR condition require different data sources from one or more data sources and (see at least KNEUPER, ¶ 0202, “Embodiments of the present invention are directed to providing navigational aids. Navigational aids have been used in aircraft to assist users in navigation and to improve situational awareness. However, the aids are typically separate components and sometimes multiple sources need to be referenced to gain access to necessary information. Additionally, the displays of previous navigational aid systems were limited and not able to display detailed information related to the navigational aid. For example, the previous displays were typically very small so including detailed information was not feasible since there was no room on the screen to display the information.”)
wherein the necessary information to be displayed under the VFR condition and the IFR condition is different (see at least KNEUPER, ¶ 0151; ¶ 0202)
(b) receiving real-time data from the one or more data sources corresponding to the operation condition in (a) and (see at least KNEUPER, ¶ 0109, “Databases 230 are digital databases stored in memory of computer 201 on-board the aircraft. Databases 230 include charts, manuals, historical aircraft component data, and checklists. Databases 230 allow pilots to quickly access and search information via computer 201. TSIP 210 displays the information such that pilots maintain a heads-up view while piloting an aircraft. Historical aircraft component data is for example updated during flight with data from aircraft flight equipment 250 (e.g., sensors) via computer 201.”; ¶ 0150, “On-board computer 201 includes a manager for providing navigational views on TSIP 210. The navigational views on TSIP 210 include a mapping interface for displaying one or more maps (see FIGS. 4A-4E), a charts component for displaying one or more aeronautical charts (see FIGS. 4G-4J), a radio frequency component for receiving and displaying one or more radio frequencies (see FIG. 4K), a weather component for displaying one or more weather representations overlaid on the map (see FIGS. 4A-4E), and a virtual flight plan component for displaying one or more simulated flight plans.”)
wherein the one or more data sources comprise at least a real-time data source selected from the group consisting of a weather data source and a traffic data source; and (see at least KNEUPER, ¶ 0022. “In yet another embodiment, one or more computer-storage media having embodied thereon computer-usable instructions that, when executed, facilitate a method for providing navigational aids is provided. The claim recites identifying a location of a first aircraft; identifying any traffic within a predetermined distance of the first aircraft, wherein traffic includes other aircraft; determining that a second aircraft is within the predetermined distance of the first aircraft; generating a traffic user interface panel that includes information associated with the second aircraft including an airspeed of the second aircraft, wherein the traffic user interface panel is provided via a touch-screen instrument panel overlaying a real-time image; and monitoring the predetermined distance from the first aircraft and updating according to an updating location of the first aircraft.”; ¶ 0154, “Weather link (WX) 404 is configured such that selection thereof activates or deactivates a weather component of the flight planning system for displaying real-time and forecasted weather representations overlaid on mapping interface 429. For example, real-time weather is determined from radar 270 and forecasted weather is determined from external communication sources 265, such as the National Weather Service, and depicted on mapping interface 429. Weather may be represented by shaded regions, contour lines or other illustrations, with different shades or colors illustrating rain, snow and heaviness of precipitation, for example. Weather representation 423 is depicted along the bottom and in the bottom right corner of mapping interface 429 of FIGS. 4A-4E. Weather link (WX) 404 provides a convenient one-touch link to display information for flight planning based on real-time and forecasted weather.”)
(c) rendering the virtual view constructed in (b) to the human operator via the display device for controlling an operation of the movable object under the operation condition in (a), (see at least KNEUPER, ¶ 0023, “In an embodiment, a method for displaying a real-time view within an aircraft is provided. The method comprises receiving an indication of a synthetic vision application, wherein the indication enables the synthetic vision application for the real-time view; identifying a synthetic vision application value to apply to the real-time view; applying a synthetic vision enhancement to the real-time view according to the synthetic vision application value; and generating a modified real-time view where the modified real-time view is enhanced by synthetic vision as indicated by the synthetic vision application value.”; ¶ 0207)
under the operating condition in (a), (see at least KNEUPER, ¶ 0151; ¶ 0202)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the 3D remote flight system of MARGOLIN to incorporate the IFR and VFR flight displays with selectable data sources of KNEUPER, yielding a remote flight controller capable of rendering IFR/VFR information in a 3D space for a remote pilot.
EXAMINER'S NOTE: Even though MARGOLIN does not explicitly state the use of real-time data, it anticipates the need for real-time data to provide proper control of the aircraft. This is especially highlighted in its accounting for the transmission delay caused by data relays (Col 8 lines 14-36).
MARGOLIN in view of KNEUPER does not disclose, but TIANA teaches:
comprises at least a rendering of a natural object serving as a reference point to the human operator. (see at least TIANA, ¶ 0008, “Data received by the SVS sensor must be processed and formatted for human consumption. Such processing may require extensive processing power as well as filtering of some information in order to display accurately to the human pilot. Traditional sensor system design is driven by the need to produce “natural looking” images to pilots, requiring the complex and often information-occluding task of forming visually pleasant imagery.”; ¶ 0051, "The MS system 100 may include a plurality of optical sensors included within the VS 132. The VS 132 may include a plurality of components and capabilities. One component of the VS 132 may include a Synthetic Vision System (SVS) configured to receive data from a database and provide database generated attributes to the object ID and positioning system 150 for use in positioning. Another component of the VS 132 may include an Enhanced Vision System (EVS) including a camera sensor of a plurality of wavelengths and providing those camera sensed attributes to the object ID and positioning system 150. Additionally contemplated herein, a Combined Vision System (CVS) may incorporate within the VS 132 to provide a synthesis of both database attributes with camera sensed attributes offered to the object ID and positioning system 150 for analysis and autonomous aircraft 120 position determination."; ¶ 0052, “For example, the enhanced SVS 132 may be capable of imaging a specific pattern of terrain such as a mountain range, a runway pattern, a river, or a river valley. In one embodiment, the enhanced SVS 132 may function receiving data from the MS database 174 coupled with additional positioning sensors, offering object attributes to the object ID and positioning system 150 for precise positioning of the autonomous aircraft 120. In additional embodiments, the enhanced SVS 132 may employ a camera to image surrounding objects and offer the sensed data via a video stream data 232 to the object ID and positioning system 150.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the 3D remote flight system with IFR/VFR displays of MARGOLIN in view of KNEUPER to include the detection, identification, and visualization of sensed objects taught by TIANA, yielding a safer, more realistic remote 3D flight control system that tracks nearby landmarks for the pilot.
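For illustration only of the examiner's note above regarding real-time data and transmission delay (MARGOLIN, Col 8 lines 14-36), a minimal dead-reckoning sketch follows; the extrapolation method and all names are assumptions for illustration, not MARGOLIN's disclosed implementation.

```python
# Illustrative sketch only: compensating for link delay by projecting the
# last reported aircraft state forward along its last known velocity.

from dataclasses import dataclass

@dataclass
class AircraftState:
    position: tuple  # (x, y, z) in meters
    velocity: tuple  # (vx, vy, vz) in m/s

def extrapolate(state: AircraftState, link_delay_s: float) -> AircraftState:
    """Estimate the current state from a delayed report by dead reckoning."""
    p, v = state.position, state.velocity
    predicted = tuple(pi + vi * link_delay_s for pi, vi in zip(p, v))
    return AircraftState(position=predicted, velocity=v)

# A report 0.5 s old from an aircraft moving 100 m/s along +y:
now = extrapolate(AircraftState((0.0, 0.0, 1000.0), (0.0, 100.0, 0.0)), 0.5)
assert now.position == (0.0, 50.0, 1000.0)
```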
Regarding claim 2:
MARGOLIN in view of KNEUPER in further view of TIANA discloses the limitations of claim 1, and MARGOLIN further discloses:
The system of claim 1, wherein the movable object comprises a fly-by-wire control system for controlling an actuator of the movable object in response to a command received from the control station. (see at least MARGOLIN, Col 6 lines 0-17, "The User Flight controls with Force Feedback 408 are used by the remote pilot to input flight path information. The User Flight Controls may be of any number of different types, some of which are further described later herein. The status information received by Computer 405 also includes information received from Aircraft Flight Surfaces and Sensors 310. This information is used to actuate force feedback circuitry in User Flight Controls With Force Feedback 408. Remote Pilot 102 observes the synthesized three-dimensional environment displayed on Video Display 407, feels the forces on User Flight Controls With Force Feedback 408 and moves the controls accordingly. This flight control information is sent through the communications link, to Computer 308, and is used to control the aircraft flight surfaces in Aircraft Flight Surfaces and Sensors 310. Remote Pilot 102 also receives data from Aircraft Engine and Sensors 309 through the communications link and is able to send data back to control the engine.")
Regarding claim 3:
MARGOLIN in view of KNEUPER in further view of TIANA discloses the limitations of claim 2, and MARGOLIN further discloses:
The system of claim 2, wherein the movable object is a helicopter. (see at least MARGOLIN, Col 3 lines 50-58, "In one embodiment, the Remote Aircraft 103 is a remote controlled plane or helicopter used for recreational purposes. Since remote controlled planes and helicopters tend to be small in size, the circuitry in such remote aircraft to generate and receive Information 106 is minimized. In such systems, the Remote Pilot Station 101 may be implemented by including additional attachments to an existing portable computer. This allows the user to easily transport the remote aircraft and pilot station to an appropriate location for flight.")
Regarding claim 4:
MARGOLIN in view of KNEUPER in further view of TIANA discloses the limitations of claim 1, and MARGOLIN further discloses:
The system of claim 1, wherein the virtual view is displayed based on measurements of a movement of the human operator's head and/or eyes. (see at least MARGOLIN, Col 9 lines 19-33, "FIG. 5 is a block diagram of a remote pilot station according to another embodiment of the invention. FIG. 5 shows Remote Pilot Station 500. Remote Pilot Station 500 is similar to Remote Pilot Station 400 of FIG. 4, except Video Display 407 is replaced by Head Mounted Display 501. In addition, Head Mounted Display Attitude Sensors 502 are coupled to Computer 405. Head Mounted Display Attitude Sensors 502 measure the attitude of Head Mounted Display 501. This information is used by Computer 405 to produce an additional three dimensional transformation of the data from Digital Database 107 to account for the attitude of the remote pilots Head Mounted Display 501. This does not require any additional data from the remote aircraft. Of course, alternative embodiments could include both a video display and a head mounted display.")
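For illustration only, the head-tracking behavior cited above (MARGOLIN, Col 9 lines 19-33) amounts to composing an additional rotation for the head-mounted display with the aircraft attitude when transforming terrain-database points; the yaw-only geometry, the use of numpy, and all names below are simplifying assumptions.

```python
# Illustrative sketch only: composing the remote pilot's head attitude with
# the aircraft attitude to transform a terrain-database point.

import numpy as np

def yaw_matrix(deg: float) -> np.ndarray:
    """Rotation about the vertical axis by `deg` degrees."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def view_transform(db_point, aircraft_yaw_deg: float, head_yaw_deg: float):
    """Apply the aircraft attitude, then the additional head-mounted-display
    attitude; no extra data from the remote aircraft is needed."""
    R = yaw_matrix(head_yaw_deg) @ yaw_matrix(aircraft_yaw_deg)
    return R @ np.asarray(db_point, dtype=float)

# Looking 90 degrees to the side undoes a 90-degree aircraft heading change:
p = view_transform([1.0, 0.0, 0.0], aircraft_yaw_deg=90.0, head_yaw_deg=-90.0)
assert np.allclose(p, [1.0, 0.0, 0.0])
```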
Regarding claim 5:
MARGOLIN in view of KNEUPER in further view of TIANA discloses the limitations of claim 1. MARGOLIN does not disclose, but TIANA teaches:
wherein the real-time data comprise video stream captured by an imaging device onboard the movable object and (see at least TIANA, ¶ 0052, “For example, the enhanced SVS 132 may be capable of imaging a specific pattern of terrain such as a mountain range, a runway pattern, a river, or a river valley. In one embodiment, the enhanced SVS 132 may function receiving data from the MS database 174 coupled with additional positioning sensors, offering object attributes to the object ID and positioning system 150 for precise positioning of the autonomous aircraft 120. In additional embodiments, the enhanced SVS 132 may employ a camera to image surrounding objects and offer the sensed data via a video stream data 232 to the object ID and positioning system 150.”)
wherein the natural object is not visible in the video stream. (see at least TIANA, ¶ 0008, “Data received by the SVS sensor must be processed and formatted for human consumption. Such processing may require extensive processing power as well as filtering of some information in order to display accurately to the human pilot. Traditional sensor system design is driven by the need to produce “natural looking” images to pilots, requiring the complex and often information-occluding task of forming visually pleasant imagery.”; ¶ 0048, “The object ID and positioning system 150 may be configured to receive sensor data from one or more sensors of the sensor suite 130 where the sensor data may include the plurality of attributes associated with a sensed object. Here, a sensed object may be defined as any object within a field of view (FOV) of one of the sensors and able to be imaged and therefore measured by the sensor. Also, a sensed object may include a terrain object, a geographical object, a natural object, a man-made object, an airport prepared surface, and a landing surface. An attribute of the sensed object may include characteristics of the sensed object which may highlight the object to the specific sensor.”; ¶ 0051, "The MS system 100 may include a plurality of optical sensors included within the VS 132. The VS 132 may include a plurality of components and capabilities. One component of the VS 132 may include a Synthetic Vision System (SVS) configured to receive data from a database and provide database generated attributes to the object ID and positioning system 150 for use in positioning. Another component of the VS 132 may include an Enhanced Vision System (EVS) including a camera sensor of a plurality of wavelengths and providing those camera sensed attributes to the object ID and positioning system 150. Additionally contemplated herein, a Combined Vision System (CVS) may incorporate within the VS 132 to provide a synthesis of both database attributes with camera sensed attributes offered to the object ID and positioning system 150 for analysis and autonomous aircraft 120 position determination."; ¶ 0052, "For example, the enhanced SVS 132 may be capable of imaging a specific pattern of terrain such as a mountain range, a runway pattern, a river, or a river valley. In one embodiment, the enhanced SVS 132 may function receiving data from the MS database 174 coupled with additional positioning sensors, offering object attributes to the object ID and positioning system 150 for precise positioning of the autonomous aircraft 120. In additional embodiments, the enhanced SVS 132 may employ a camera to image surrounding objects and offer the sensed data via a video stream data 232 to the object ID and positioning system 150."; ¶ 0114, “Referring now to FIG. 9, a diagram of a short final view exemplary of one embodiment of the inventive concepts disclosed herein is shown. The short final view 900 may include runway specific objects as well as distant objects sensed previously. Coghlan island 522 and Juneau hill 526 are still available for the object ID and positioning system 150 to determine a position. On short final, blast pad markings corner 528, aiming point markings 530, and hold short line 532 may be visible to the optical sensor when meteorological conditions may permit.”; ¶ 0115, “The object ID and positioning system 150 may function to verify the position of the autonomous aircraft 120 using installed position aids 224 such as an external physical marker tuned to be sensed by one or more sensors to verify the accuracy of the position solution. The installed position aids 224 may inherently possess data which, when interpreted by the object ID and positioning system 150, offer an indication of position. For example, a pattern of RCS objects in a unique formation may communicate to the object ID and positioning system 150 that the autonomous aircraft 120 is a specific position. Here, a left MS runway ident 920 and a right MS runway ident 922 may function to enable the object ID and positioning system 150 to verify the autonomous aircraft 120 is landing on the PAJN RW26 504.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the 3D remote flight system with IFR/VFR displays of MARGOLIN in view of KNEUPER to include the detection, identification, tracking, and visualization of sensed objects taught by TIANA, yielding a safer, more realistic remote 3D flight control system that tracks nearby landmarks for the pilot.
EXAMINER'S NOTE: Even though TIANA does not explicitly teach objects that are not visible in the video stream, it anticipates incorporating synthetic positional aids based on sensor data to align the aircraft with its location and to aid in landing.
Regarding claim 6:
MARGOLIN in view of KNEUPER in further view of TIANA discloses the limitations of claim 1. MARGOLIN does not disclose, but TIANA teaches:
wherein the natural object is not sensed by the real-time data. (see at least TIANA, ¶ 0048, “The object ID and positioning system 150 may be configured to receive sensor data from one or more sensors of the sensor suite 130 where the sensor data may include the plurality of attributes associated with a sensed object. Here, a sensed object may be defined as any object within a field of view (FOV) of one of the sensors and able to be imaged and therefore measured by the sensor. Also, a sensed object may include a terrain object, a geographical object, a natural object, a man-made object, an airport prepared surface, and a landing surface. An attribute of the sensed object may include characteristics of the sensed object which may highlight the object to the specific sensor.”; ¶ 0051, "The MS system 100 may include a plurality of optical sensors included within the VS 132. The VS 132 may include a plurality of components and capabilities. One component of the VS 132 may include a Synthetic Vision System (SVS) configured to receive data from a database and provide database generated attributes to the object ID and positioning system 150 for use in positioning. Another component of the VS 132 may include an Enhanced Vision System (EVS) including a camera sensor of a plurality of wavelengths and providing those camera sensed attributes to the object ID and positioning system 150. Additionally contemplated herein, a Combined Vision System (CVS) may incorporate within the VS 132 to provide a synthesis of both database attributes with camera sensed attributes offered to the object ID and positioning system 150 for analysis and autonomous aircraft 120 position determination."; ¶ 0052, "For example, the enhanced SVS 132 may be capable of imaging a specific pattern of terrain such as a mountain range, a runway pattern, a river, or a river valley. In one embodiment, the enhanced SVS 132 may function receiving data from the MS database 174 coupled with additional positioning sensors, offering object attributes to the object ID and positioning system 150 for precise positioning of the autonomous aircraft 120. In additional embodiments, the enhanced SVS 132 may employ a camera to image surrounding objects and offer the sensed data via a video stream data 232 to the object ID and positioning system 150."; ¶ 0133, “Referring now to FIG. 14, a diagram of a short final view in accordance with one embodiment of the inventive concepts disclosed herein is shown. KLAS short final view 1400 may indicate an additional example of a principal component analysis available to the object ID and positioning system 150 to verify accurate positioning. Here, a KLAS runway diagram 1450 may indicate a general layout of the runways. Here a KLAS RW 19L 652, a blast fence east 654 near a road intersection 656 may offer a perspective of objects available. KLAS RW 19R 658 may be a desired destination object here. A hotel NE corner 660 coupled with a hotel HVAC 662 unit on the roof may indicate objects available to the object ID and positioning system 150. KLAS taxiway H 664, blast fence west 666, hangar south 668, and casino south 670 may make up some principal components available for analysis by the object ID and positioning system 150.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the 3D remote flight system with IFR/VFR displays of MARGOLIN in view of KNEUPER to include position tracking with object identification and visualization of sensed objects taught by TIANA, yielding a safer, more realistic remote 3D flight control system that tracks nearby landmarks for the pilot.
EXAMINER'S NOTE: Even though TIANA does not explicitly teach objects that are not visible in the video stream, it anticipates incorporating synthetic positional aids based on sensor data to align the aircraft with its location and to aid in landing.
Regarding claim 7:
MARGOLIN in view of KNEUPER in further view of TIANA discloses the limitations of claim 1, and MARGOLIN further discloses:
The system of claim 1, wherein the TPV is configurable by changing a virtual TPV camera location. (see at least MARGOLIN, Col 5 lines 55-67, "Since the image is generated from the digital database, virtually any image of the environment of the Remote Aircraft 103 can be generated. As examples, the pilot may select the environment to be: 1) a simulated image of what would be seen out of the cockpit of a manned aircraft on a similar flight path; 3) a simulated image of what would be seen when looking in any direction (e.g., backwards, out a side window, etc.); 3) a simulated image of what would be seen if a camera were tailing the remotely piloted aircraft; etc. In addition, the simulated image may be set to any magnification. Thus, the phrase environment of Remote Aircraft 103 is intended to include any image generated with reference to the remote aircraft's position.")
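For illustration only, the configurable TPV cited above (a simulated camera tailing the aircraft, MARGOLIN, Col 5 lines 55-67) can be sketched as a chase-camera placement; the flat-heading geometry and all names are assumptions for illustration.

```python
# Illustrative sketch only: a configurable third-person "chase camera"
# placed behind and above the aircraft; changing the parameters
# reconfigures the virtual TPV camera location.

import math

def tpv_camera_position(aircraft_xyz, heading_deg, trail_distance, height):
    """Place the virtual TPV camera `trail_distance` behind the aircraft
    along its heading and `height` above it."""
    x, y, z = aircraft_xyz
    h = math.radians(heading_deg)
    # Heading measured from north (+y), so "behind" is opposite the heading vector.
    return (x - trail_distance * math.sin(h),
            y - trail_distance * math.cos(h),
            z + height)

# Aircraft at the origin heading due north: camera sits 50 m behind, 10 m up.
cam = tpv_camera_position((0.0, 0.0, 100.0), 0.0, 50.0, 10.0)
assert cam == (0.0, -50.0, 110.0)
```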
Regarding claim 9:
MARGOLIN in view of KNEUPER in further view of TIANA discloses the limitations of claim 1. MARGOLIN does not disclose, but TIANA teaches:
The system of claim 1, wherein the virtual view comprises a rendering of a dynamic obstacle. (see at least TIANA, ¶ 0008, "Data received by the SVS sensor must be processed and formatted for human consumption. Such processing may require extensive processing power as well as filtering of some information in order to display accurately to the human pilot. Traditional sensor system design is driven by the need to produce “natural looking” images to pilots, requiring the complex and often information-occluding task of forming visually pleasant imagery."; ¶ 0049, "For example, a desired object may include any object to which or from which an operator of the autonomous aircraft 120 may desire navigation or positioning. Here, an object may include a sensor significant object able to be sensed by any of the sensors within the sensor suite 130. For example, a desirable object may include a building, a road intersection, a RADAR significant object, a flight deck, an aircraft, and a target of interest. Each sensed object may inherently possess a plurality of attributes which may describe the object."; ¶ 0050, "For example, an attribute of a sensed object may include an object three-dimensional position relative to the datum (e.g., latitude, longitude, MSL altitude), a visibly distinct difference from surrounding terrain (e.g., color texture, size, terrain flow), a RADAR cross section (RCS), a specific map feature, a shape, a size, a reflectivity level, a radar cross section, and a frequency of RF radiation. Each sensor within the sensor suite 130 may sense a specific attribute of an object and operate solely (positioning) or in concert (hybrid positioning) to assist the object ID and positioning system 150 in determining a precise position of the autonomous aircraft 120.")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the 3D remote flight system with IFR/VFR displays of MARGOLIN in view of KNEUPER to include the detection, identification, tracking, and visualization of sensed objects taught by TIANA, yielding a safer, more realistic remote 3D flight control system that tracks nearby landmarks for the pilot.
Regarding claim 10:
MARGOLIN in view of KNEUPER in further view of TIANA discloses the limitations of claim 9. MARGOLIN does not disclose, but TIANA teaches:
wherein the dynamic obstacle is tracked by processing sensor data collected from movable object. (see at least TIANA, ¶ 0048, "The object ID and positioning system 150 may be configured to receive sensor data from one or more sensors of the sensor suite 130 where the sensor data may include the plurality of attributes associated with a sensed object. Here, a sensed object may be defined as any object within a field of view (FOV) of one of the sensors and able to be imaged and therefore measured by the sensor. Also, a sensed object may include a terrain object, a geographical object, a natural object, a man-made object, an airport prepared surface, and a landing surface. An attribute of the sensed object may include characteristics of the sensed object which may highlight the object to the specific sensor."; ¶ 0049, "For example, a desired object may include any object to which or from which an operator of the autonomous aircraft 120 may desire navigation or positioning. Here, an object may include a sensor significant object able to be sensed by any of the sensors within the sensor suite 130. For example, a desirable object may include a building, a road intersection, a RADAR significant object, a flight deck, an aircraft, and a target of interest. Each sensed object may inherently possess a plurality of attributes which may describe the object.")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the 3D remote flight system with IFR/VFR displays of MARGOLIN in view of KNEUPER to include the detection, identification, tracking, and visualization of sensed objects taught by TIANA, yielding a safer, more realistic remote 3D flight control system that tracks nearby landmarks for the pilot.
Regarding claim 14:
With regard to claim 14, this claim is the method claim corresponding to system claim 1, is substantially similar to claim 1, and is therefore rejected using the same references and rationale.
Regarding claim 15:
With regard to claim 15, this claim is substantially similar to claim 2 and is therefore rejected using the same references and rationale.
Regarding claim 16:
With regard to claim 16, this claim is substantially similar to claim 3 and is therefore rejected using the same references and rationale.
Regarding claim 17:
With regard to claim 17, this claim is substantially similar to claim 4 and is therefore rejected using the same references and rationale.
Regarding claim 18:
With regard to claim 18, this claim is substantially similar to claim 5 and is therefore rejected using the same references and rationale.
Regarding claim 19:
With regard to claim 19, this claim is substantially similar to claim 6 and is therefore rejected using the same references and rationale.
Regarding claim 20:
With regard to claim 20, this claim is substantially similar to claim 7 and is therefore rejected using the same references and rationale.
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over MARGOLIN (US 5904724 A) in view of KNEUPER (US 20170344181 A1), in further view of TIANA (US 20200279494 A1), and in further view of HERMAN (US 10896335 B2).
Regarding claim 8:
MARGOLIN in view of KNEUPER in further view of TIANA discloses the limitations of claim 1. MARGOLIN does not disclose, but TIANA teaches:
when the movable object is approaching a destination. (see at least TIANA, ¶ 0012, “For control of the object ID and positioning system, the system may include a tangible, non-transitory memory within the storage configured to communicate with the processor, the tangible, non-transitory memory having instructions stored therein that, in response to execution by the processor, cause the processor to execute commands. The object ID and positioning system may receive a flight plan, the flight plan including a desired path and a desired destination object and receive sensor data from sensor of the autonomous aircraft sensor suite, the sensor data including an attribute of a sensed object.”; ¶ 0113, “In one embodiment, the object ID and positioning system 150 may analyze motion flow (structure from motion) using a single visual sensor mounted on the autonomous aircraft 120 detecting features within the image but utilizing information about the motion of the autonomous aircraft 120 over time to provide a 3D representation of the approaching runway environment. This may require a hybrid solution from GPS/IRS to provide accurate information regarding the motion flow over time but may allow the object ID and positioning system 150 to determine position information without prior knowledge of runway dimensions.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the 3D remote flight system with IFR/VFR displays of MARGOLIN in view of KNEUPER to include the detection, identification, and tracking of sensed objects, including tracking of the runway, taught by TIANA, yielding a safer, more realistic remote 3D flight control system that tracks nearby landmarks for the pilot.
MARGOLIN in view of KNEUPER in further view of TIANA does not disclose, but HERMAN teaches:
The system of claim 1, wherein the operations further include activating a transparency mode in the TPV (see at least HERMAN, Col 3 lines 3-24, "The invention preferably monitors motion of the objects of interest and evaluates the probability of impacts of the objects with the vehicle. When there is significant relative motion, then the transparent section of the virtual vehicle image can be expanded to reveal the motion path of the object. When the probability of an impact becomes high, then the viewpoint (i.e., virtual camera position and perspective) can be alerted to a location when the moving object and its motion path are more directly visible. In other cases, the host vehicle's own present or expected motion may result in a hazard. Similarly, portions of the virtual vehicle may be made transparent to bring into better view nearby objects, present wheel angles, or virtual path lines which may be augmented in the image as a guide to the driver. In another example, the virtual camera view may show future path lines of objects based on current relative motion, such as the expected path of vehicle, in order to guide the user. Such object path lines may similarly be blocked by the virtual 3D model of the host vehicle. In such cases, portions of the host vehicle 3D model may be made transparent to maintain the visibility of such lines.")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the third-person view and sensor-based landing system of MARGOLIN in view of KNEUPER and TIANA with the transparency sections of HERMAN, yielding a safer third-person landing view that avoids obstructions created in the presented view by rendering the vehicle model.
EXAMINER'S NOTE: Even though HERMAN does not explicitly name a third-person view, the referenced figure (FIG. 5) shows that the transparency mode is actuated from a third-person point of view.
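For illustration only, the transparency behavior of HERMAN cited above (Col 3 lines 3-24) can be sketched as opacity gated by the probability of impact with a tracked object; the thresholds and names below are assumptions for illustration, not HERMAN's disclosed values.

```python
# Illustrative sketch only: sections of the rendered vehicle model become
# transparent as the evaluated probability of impact with a tracked
# object rises, revealing the object and its motion path.

def vehicle_alpha(impact_probability: float) -> float:
    """Return the opacity (1.0 opaque .. 0.0 fully transparent) for the
    portion of the 3D vehicle model occluding a tracked object's path."""
    if impact_probability >= 0.7:
        return 0.0   # high risk: fully reveal the object and its motion path
    if impact_probability >= 0.3:
        # significant relative motion: fade the occluding section proportionally
        return 1.0 - (impact_probability - 0.3) / 0.4
    return 1.0       # low risk: render the vehicle model normally

assert vehicle_alpha(0.1) == 1.0
assert abs(vehicle_alpha(0.5) - 0.5) < 1e-9
assert vehicle_alpha(0.9) == 0.0
```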
Claims 11-13 are rejected under 35 U.S.C. 103 as being unpatentable over MARGOLIN (US 5904724 A) in view of KNEUPER (US 20170344181 A1), in further view of TIANA (US 20200279494 A1), and in further view of LUO (US 20190147372 A1).
Regarding claim 11:
MARGOLIN in view of KNEUPER in further view of TIANA discloses the limitations of claim 10. The combination does not disclose, but LUO teaches:
wherein a location of the dynamic obstacle is tracked by applying a feed-forward model to the sensor data. (see at least LUO, ¶ 0141, “At 902, the method 900 can include determining one or more traveled paths of the one or more objects (e.g., the one or more objects of the method 700) based at least in part on one or more locations of the one or more objects over a sequence of the one or more time intervals including a last time interval associated with a current time and the one or more time intervals prior to the current time. For example, the computing system 112 can determine one or more traveled paths of the one or more objects based at least in part on sensor data (e.g., the sensor data of the method 700) including one or more locations of the one or more objects over a sequence of the one or more time intervals including a last time interval associated with a current time and the one or more time intervals prior to the current time.”; ¶ 0151, “At 1102, the method 1100 can include generating the machine-learned model (e.g., the machine-learned model of the method 700) based at least in part on training data including a plurality of training objects associated with a plurality of classified features and/or a plurality of classified object labels. For example, the computing system 1210 and/or the machine-learning computing system 1250 can include, employ, and/or otherwise leverage a machine-learned object detection and prediction model. The machine-learned object detection and prediction model can be or can otherwise include one or more various models including, for example, neural networks (e.g., deep neural networks), or other multi-layer non-linear models. Neural networks can include convolutional neural networks, recurrent neural networks (e.g., long short-term memory recurrent neural networks), feed-forward neural networks, and/or other forms of neural networks.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the object identification and sensor system of MARGOLIN in view of KNEUPER and TIANA with the feed-forward machine-learning model of LUO to yield an effective machine-learning-based object recognition system.
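For illustration only, the feed-forward model of LUO cited above (¶ 0151) can be sketched as a small fully connected network mapping a short history of sensed obstacle locations to a predicted next location; the architecture, shapes, and names below are assumptions, and training is omitted.

```python
# Illustrative sketch only: a minimal feed-forward network (one hidden
# layer, no recurrence) predicting the next (x, y) location of a tracked
# obstacle from its three most recent sensed locations. Weights are
# random; a real system would train them on labeled tracks.

import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(6, 16))   # input: 3 past (x, y) fixes, flattened
b1 = np.zeros(16)
W2 = rng.normal(size=(16, 2))   # output: predicted next (x, y)
b2 = np.zeros(2)

def predict_next(history_xy: np.ndarray) -> np.ndarray:
    """Feed-forward pass: flatten history, apply ReLU hidden layer."""
    x = history_xy.reshape(-1)            # (3, 2) -> (6,)
    h = np.maximum(0.0, x @ W1 + b1)      # hidden activations
    return h @ W2 + b2                    # predicted (x, y)

track = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 1.0]])  # three past fixes
print(predict_next(track))               # untrained output, shape (2,)
```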
Regarding claim 12:
MARGOLIN in view of KNEUPER in further view of TIANA discloses the limitations of claim 11. The combination does not disclose, but LUO teaches:
The system of claim 11, wherein an identity of the dynamic obstacle is determined by applying a machine learning algorithm trained model to the sensor data. (see at least LUO, ¶ 0025, “The sensor data can then be quantized into a voxel representation that can be used as input for a machine-learned model. The voxel representation can be used to filter out the sparse areas of the environment (e.g., the areas that do not include objects). The voxel representation can be used as an input representation that is input into a machine-learned mode that is trained to detect (e.g., detect the class of an object), track (e.g., track the motion of an object), and determine the predicted travel path of the one or more objects. Significantly, the computing system associated with a device (e.g., an autonomous vehicle and/or a robotic system) can perform the detection, tracking, and motion path prediction of the objects in a single stage (e.g., simultaneously), which can reduce the accumulation of errors that can occur when the detection, tracking, and motion path prediction are performed sequentially. The computing system associated with the device (e.g., an autonomous vehicle) can then generate one or more bounding shapes (e.g., bounding polygons) that can be used to identify the one or more objects (e.g., size, shape, and/or type) and furthermore to indicate an orientation or travel direction of the one or more objects. Accordingly, the disclosed technology allows for an improvement in operational safety through faster, more accurate, and precise object detection, tracking, and motion prediction that more efficiently utilizes computing resources.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the object identification and sensor system of MARGOLIN in view of KNEUPER and TIANA with the feed-forward machine-learning model of LUO to yield an effective machine-learning-based object recognition system.
Regarding claim 13:
MARGOLIN in view of KNEUPER in further view of TIANA discloses the limitations of claim 12. MARGOLIN does not disclose, but TIANA teaches:
wherein the rendering of the dynamic obstacle (see at least TIANA, ¶ 0008, "Data received by the SVS sensor must be processed and formatted for human consumption. Such processing may require extensive processing power as well as filtering of some information in order to display accurately to the human pilot. Traditional sensor system design is driven by the need to produce “natural looking” images to pilots, requiring the complex and often information-occluding task of forming visually pleasant imagery."; ¶ 0048, "The object ID and positioning system 150 may be configured to receive sensor data from one or more sensors of the sensor suite 130 where the sensor data may include the plurality of attributes associated with a sensed object. Here, a sensed object may be defined as any object within a field of view (FOV) of one of the sensors and able to be imaged and therefore measured by the sensor. Also, a sensed object may include a terrain object, a geographical object, a natural object, a man-made object, an airport prepared surface, and a landing surface. An attribute of the sensed object may include characteristics of the sensed object which may highlight the object to the specific sensor."; ¶ 0049, "For example, a desired object may include any object to which or from which an operator of the autonomous aircraft 120 may desire navigation or positioning. Here, an object may include a sensor significant object able to be sensed by any of the sensors within the sensor suite 130. For example, a desirable object may include a building, a road intersection, a RADAR significant object, a flight deck, an aircraft, and a target of interest. Each sensed object may inherently possess a plurality of attributes which may describe the object."; ¶ 0050, "For example, an attribute of a sensed object may include an object three-dimensional position relative to the datum (e.g., latitude, longitude, MSL altitude), a visibly distinct difference from surrounding terrain (e.g., color texture, size, terrain flow), a RADAR cross section (RCS), a specific map feature, a shape, a size, a reflectivity level, a radar cross section, and a frequency of RF radiation. Each sensor within the sensor suite 130 may sense a specific attribute of an object and operate solely (positioning) or in concert (hybrid positioning) to assist the object ID and positioning system 150 in determining a precise position of the autonomous aircraft 120.")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the 3D remote flight system with IFR/VFR displays of MARGOLIN in view of KNEUPER to include the detection, identification, tracking, and visualization of sensed objects taught by TIANA, yielding a safer, more realistic remote 3D flight control system that tracks nearby landmarks for the pilot.
MARGOLIN in view of KNEUPER and TIANA does not disclose, but LUO further teaches:
is based at least in part on a certainty of the identity and/or the location. (see at least LUO, ¶ 0025, “The sensor data can then be quantized into a voxel representation that can be used as input for a machine-learned model. The voxel representation can be used to filter out the sparse areas of the environment (e.g., the areas that do not include objects). The voxel representation can be used as an input representation that is input into a machine-learned mode that is trained to detect (e.g., detect the class of an object), track (e.g., track the motion of an object), and determine the predicted travel path of the one or more objects. Significantly, the computing system associated with a device (e.g., an autonomous vehicle and/or a robotic system) can perform the detection, tracking, and motion path prediction of the objects in a single stage (e.g., simultaneously), which can reduce the accumulation of errors that can occur when the detection, tracking, and motion path prediction are performed sequentially. The computing system associated with the device (e.g., an autonomous vehicle) can then generate one or more bounding shapes (e.g., bounding polygons) that can be used to identify the one or more objects (e.g., size, shape, and/or type) and furthermore to indicate an orientation or travel direction of the one or more objects. Accordingly, the disclosed technology allows for an improvement in operational safety through faster, more accurate, and precise object detection, tracking, and motion prediction that more efficiently utilizes computing resources.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the object identification, sensor, and synthetic vision system of MARGOLIN in view of KNEUPER and TIANA with the feed-forward machine-learning model of LUO to yield an effective object recognition system for mapping nearby obstacles.
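For illustration only, rendering conditioned on identity certainty, as claim 13 recites, can be sketched as a display style chosen from a classifier's top-class confidence; the softmax certainty measure and the style names below are assumptions for illustration.

```python
# Illustrative sketch only: choose how to render a dynamic obstacle based
# on the certainty of its identity (here, the top softmax probability
# over hypothetical classifier scores).

import math

def softmax(scores):
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def obstacle_render_style(class_scores) -> str:
    """Pick a render style from the classifier's top-class confidence."""
    certainty = max(softmax(class_scores))
    if certainty > 0.9:
        return "solid-labeled"     # confident identity: draw model with label
    if certainty > 0.5:
        return "solid-unlabeled"   # probable identity: draw model, omit label
    return "ghosted"               # uncertain: draw a translucent placeholder

assert obstacle_render_style([8.0, 0.0, 0.0]) == "solid-labeled"
assert obstacle_render_style([0.1, 0.0, 0.0]) == "ghosted"
```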
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
SCHNELL (US 20080262664 A1)
¶ 0012, “The present invention is an improved SVS that provides increased situation awareness information. For purposes of this application, situation awareness information is that information and data that relates to three dimensional recognition of terrain conflicts, obstacle conflicts, flight path and trajectory, location and orientation of navigation aids, and maintenance of spatial orientation. For purposes of this application, the term "terrain conflicts" includes the third or vertical dimension of land surface, for example, land formations and bodies of water. "Obstacle conflicts" includes any physical impediment, for example, manmade obstructions, towers and buildings. The term pathway, or flight path, means the course, route, or way of the aircraft in three dimensions. Navigation aid, or Navaid, is any sort of marker which aids in navigation, for example a Very High Frequency Omni-bearing Range ("VOR"), Global Position System ("GPS") waypoint, airway intersection, airport, etc.”
¶ 0061, “An embodiment of the Display Options 652 of the improved SVS 100 may include various modes, for example, Visual Flight Rules ("VFR") sectional chart colors 670, topographically ("TOPO") enhanced colors 672, and Instrument Flight Rules ("IFR") chart colors 674, shown in FIG. 8. VFR charts may be color coded to convey altitudes such as green for low areas and brown for high areas. The MFD may be set to display the same color scheme as is found on VFR charts. The IFR color scheme eliminates clutter due to colors if the pilot is flying under positive control from air traffic control under an IFR flight plan. The TOPO mode is useful in relatively flat areas where the color scheme is more sensitive to subtle changes in terrain elevation.”
Any inquiry concerning this communication or earlier communications from the examiner should be directed to RAFAEL VELASQUEZ VANEGAS whose telephone number is (571)272-6999. The examiner can normally be reached M-F 8 - 4.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, VIVEK KOPPIKAR can be reached at (571) 272-5109. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/RAFAEL VELASQUEZ VANEGAS/Patent Examiner, Art Unit 3667
/JOAN T GOODBODY/Examiner, Art Unit 3667