DETAILED ACTION
This is the First Office Action on the Merits and is directed towards claims 1-20 as originally presented and filed on 05/09/2024.
Notice of Pre-AIA or AIA Status
No priority is apparently claimed; accordingly, the earliest effective filing date is 09 May 2024 (20240509).
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
As required by M.P.E.P. 609 [R-07.2022], Applicant's 06/28/2024 submission of an Information Disclosure Statement (IDS) is acknowledged by the Examiner, and the references cited therein have been considered in the examination of the claims now pending. A copy of the submitted IDS, initialed and dated by the Examiner, is attached to the instant Office action.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-3, 5-15, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over WO 2012065681 A2 to MARTINKAT; NORBERT et al. (MARTINKAT) in view of US 20100008515 A1 to Fulton; David Robert et al. (Fulton).
Regarding claim 1, MARTINKAT teaches in, for example, the Figure reproduced immediately below:
[MARTINKAT Figure 1 reproduced here (media_image1.png, greyscale)]
and associated descriptive texts a method of guiding an airborne vehicle towards a designated 3-dimensional location by collecting sound from a sound source associated with the airborne vehicle with a group of three microphones with a known 3-dimensional vector representing the relative 3-dimensional position of the group of three microphones in relation to the designated 3-dimensional location (as shown in Fig. 1 above, wherein it is understood that “unmanned flying object 2” connotes “an airborne vehicle”, “landing surface 16” connotes “a designated 3-dimensional location”, sensors 10 and 12 connote “a sound source associated with the airborne vehicle”, and sensors 10 and 12 also connote “a group of microphones” as claimed, as explained in, for example, the paragraphs:
“ The system 1 comprises an unmanned flying object 2 and a ground unit 3. The flying object 2 has actuators 4, a flight computer 5, a communication receiver 6, a reflecting surface 7 and an antenna 8. The antenna 8 is connected to the receiver 6, the receiver 6 in turn to the flight computer 5 and the flight computer 5 with the actuators 4. In this embodiment, each connection can be a direct connection or for example via a bus system. The flying object 2 is preferably able to land vertically, i.e. in particular to lower itself in parallel to a landing area.
The ground unit 3 has an evaluation computer 9, three transmit and receive sensors (shown in FIG. 1 are only the sensors 10 and 12), three measurement computers (only the measurement computers 11 and 13 are shown in FIG. 1), a communication transmitter 14, a Antenna 15 and a landing area 16. The sensor 10 is connected to the measuring computer 11, the sensor 12 to the measuring computer 13 and the third sensor to the third measuring computer. All three measuring computers 11 and 13 are connected to the evaluation computer 13. The evaluation computer 9 is connected to the transmitter 14 and the transmitter 14 to the antenna 15. The three sensors are spaced apart on or near the landing surface 16.“)
the method comprising:
receiving signals, from the sound source, at the group of three microphones, wherein the three microphones are logically connected with a computing device having a processor, nonvolatile storage, and a wireless transmitter, wherein the three microphones are configured as three microphones pairs for jointly processing each pair during distance difference estimations and calculating 3-dimensional locations (as explained in for example paras:
“The sensors each emit an electromagnetic, optical or acoustic signal, which is reflected back from the reflecting surface 7 on the flying object 2 to the respective sensor. The Measuring computers receive the output signals of the assigned sensors and generate from the evaluation computer 9 evaluable signals or information. These signals are transmitted to the evaluation computer 9, which calculates therefrom an approach vector which points from a target point, not shown, on the landing area in the direction of the flying object 2 and whose length represents the distance between the ground unit 3 and the flying object 2. The approach vector is determined by calculating the respective distance of the flying object 2 from the three sensors from the transit time of the signal emitted by each sensor and reflected by the flying object 2. The respective distance is the radius of an imaginary spherical surface around the respective sensor on which the flying object is located. From the intersection of the three balls, two points result as a possible position of the flying object 2, wherein the position below the landing area 16 can be excluded. The approach vector is now the vector from the target point on the landing area to the determined position of the flying object 2.”);
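Examiner's note (illustration only): the three-sphere position determination described in the passage quoted above may be sketched as the following minimal Python example. All sensor coordinates, ranges, and names are hypothetical and form no part of MARTINKAT's disclosure; the sketch merely demonstrates how three measured ranges yield two mirror-image candidate positions, of which the one below the landing plane is excluded.

    import numpy as np

    def trilaterate(p1, p2, p3, r1, r2, r3):
        # Intersect three spheres centered at p1, p2, p3 with radii r1, r2, r3.
        ex = (p2 - p1) / np.linalg.norm(p2 - p1)        # local x-axis
        i = ex.dot(p3 - p1)
        ey = (p3 - p1) - i * ex
        ey = ey / np.linalg.norm(ey)                    # local y-axis
        ez = np.cross(ex, ey)                           # local z-axis
        d = np.linalg.norm(p2 - p1)
        j = ey.dot(p3 - p1)
        # Standard trilateration solution in the local frame.
        x = (r1**2 - r2**2 + d**2) / (2 * d)
        y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
        z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))      # two mirror solutions
        base = p1 + x * ex + y * ey
        return base + z * ez, base - z * ez

    # Hypothetical sensors spaced apart on the landing surface (z = 0 plane).
    s1 = np.array([0.0, 0.0, 0.0])
    s2 = np.array([2.0, 0.0, 0.0])
    s3 = np.array([0.0, 2.0, 0.0])
    true_pos = np.array([0.7, 0.9, 5.0])                # demo "flying object"
    r1, r2, r3 = (np.linalg.norm(true_pos - s) for s in (s1, s2, s3))

    cand_a, cand_b = trilaterate(s1, s2, s3, r1, r2, r3)
    # Exclude the candidate below the landing area, as the reference explains.
    position = cand_a if cand_a[2] >= 0 else cand_b
    # With the target point at the origin, the approach vector is the position itself.
    print(position)                                      # ~ [0.7 0.9 5.0]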
calculating, with the computing device, a 3-dimensional vector representing the relative position of the airborne vehicle in relation to the group of microphones from the signals collected by the group of three microphones by way of distance difference estimation (wherein it is understood that the calculated approach vector connotes the claimed “3-dimensional vector” as explained in for example para:
“The calculated approach vector, together with the orientation of the landing area 16, is transmitted by the evaluation computer 9 to the transmitter 14 and transmitted by the antenna 15 of the ground unit, the antenna 8 of the flying object 2 and an intermediate air interface to the receiver 6 of the flying object 2. The transmission is preferably by radio, but can in principle be based on any wireless technology done. The transmitter 14 and the receiver 6 are to be designed accordingly and the antennas 15 and 8 replaced by corresponding devices. The receiver 6 forwards the received approach vector and the orientation of the landing area 16 to the flight computer 5. The flight computer 5 calculates from the approach vector a target vector which points from the flying object 2 in the direction of the destination point on the landing area 16. In order to approach this destination point, the flight computer 5 controls the actuators, for example propellers and / or rotors, in such a way that the flying object 2 moves in the direction of the destination point. Optionally, the flight computer 5 controls the actuators 4 without prior calculation of the target vector. Shortly before landing the flight computer evaluates the orientation of the landing area 16 and controls the actuators 4 in addition so that the position of the flying object 2 of the orientation of the landing area 16 is adjusted.”);
calculating, with the computing device, a 3-dimensional vector representing the relative position of the airborne vehicle in relation to the designated 3-dimensional location by applying vector addition to the vector representing the known relative position of the group of microphones in relation to the designated 3-dimensional location and the vector representing the relative position of the airborne vehicle in relation to the group of three microphones (as explained in for example paras:
“Preferably, the bottom unit 3 in the figure 1, not shown means to determine the orientation of the landing surface 16, in particular with respect to the horizontal. Thus, it is known whether the landing surface is tilted out of the horizontal plane and optional as the landing surface 16 is rotated about a vertical axis. From the output signals of the measuring computer and the orientation of the landing area 16, the evaluation computer 9 can calculate an approach vector, which is specified in an earth-related coordinate system.”);
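Examiner's note (illustration only): the vector-addition step recited in this limitation reduces to summing two displacement vectors. A minimal hypothetical illustration follows; the numerical values are invented for the example and are not taken from either reference.

    import numpy as np

    # Known survey offset: microphone group relative to the designated location.
    mics_rel_to_target = np.array([1.0, -2.0, 0.0])     # hypothetical
    # Acoustically measured: vehicle relative to the microphone group.
    vehicle_rel_to_mics = np.array([0.7, 0.9, 5.0])     # hypothetical
    # Claimed vector addition: vehicle relative to the designated location.
    vehicle_rel_to_target = mics_rel_to_target + vehicle_rel_to_mics
    print(vehicle_rel_to_target)                        # [ 1.7 -1.1  5. ]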
sending, by the wireless transmitter, the 3-dimensional vector representing the relative 3-dimensional position of the airborne vehicle in relation to the designated 3-dimensional location of the airborne vehicle to the airborne vehicle for the purpose of calculating, at the airborne vehicle, the control parameters to decrease the distance between the airborne vehicle and the designated 3-dimensional location, and for applying these calculated control parameters to the vehicle management system (as explained in for example paras:
“The calculated approach vector, together with the orientation of the landing area 16, is transmitted by the evaluation computer 9 to the transmitter 14 and transmitted by the antenna 15 of the ground unit, the antenna 8 of the flying object 2 and an intermediate air interface to the receiver 6 of the flying object 2. The transmission is preferably by radio, but can in principle be based on any wireless technology done. The transmitter 14 and the receiver 6 are to be designed accordingly and the antennas 15 and 8 replaced by corresponding devices. The receiver 6 forwards the received approach vector and the orientation of the landing area 16 to the flight computer 5. The flight computer 5 calculates from the approach vector a target vector which points from the flying object 2 in the direction of the destination point on the landing area 16. In order to approach this destination point, the flight computer 5 controls the actuators, for example propellers and / or rotors, in such a way that the flying object 2 moves in the direction of the destination point. Optionally, the flight computer 5 controls the actuators 4 without prior calculation of the target vector. Shortly before landing the flight computer evaluates the orientation of the landing area 16 and controls the actuators 4 in addition so that the position of the flying object 2 of the orientation of the landing area 16 is adjusted.”);
and repeating the receiving, calculating, and sending after the airborne vehicle has changed position (as explained in for example paras:
“The approach vector is determined at the ground unit and usually transmitted wirelessly to the flying object, for example by radio, infrared, acoustic or as a light signal. Viewed from the flying object, the target point of the landing area is in the direction opposite to the approach vector. Thus, the flying object can be automatically moved in the direction of the target point. The approach vector is preferably (periodically) repeatedly determined and transmitted, for example every second, every 2, 5, 10, 15, 20, 30 or more seconds, for example 2, 5, 10, 15, 25, 25 or 50 times per second or more often.
The calculated approach vector, together with the orientation of the landing area 16, is transmitted by the evaluation computer 9 to the transmitter 14 and transmitted by the antenna 15 of the ground unit, the antenna 8 of the flying object 2 and an intermediate air interface to the receiver 6 of the flying object 2. The transmission is preferably by radio, but can in principle be based on any wireless technology done. The transmitter 14 and the receiver 6 are to be designed accordingly and the antennas 15 and 8 replaced by corresponding devices. The receiver 6 forwards the received approach vector and the orientation of the landing area 16 to the flight computer 5. The flight computer 5 calculates from the approach vector a target vector which points from the flying object 2 in the direction of the destination point on the landing area 16. In order to approach this destination point, the flight computer 5 controls the actuators, for example propellers and / or rotors, in such a way that the flying object 2 moves in the direction of the destination point. Optionally, the flight computer 5 controls the actuators 4 without prior calculation of the target vector. Shortly before landing the flight computer evaluates the orientation of the landing area 16 and controls the actuators 4 in addition so that the position of the flying object 2 of the orientation of the landing area 16 is adjusted.”).
While MARTINKAT appears to teach the invention as claimed and explained above, including using microphones to calculate an approach vector, MARTINKAT does not appear to expressly disclose wherein the three microphones are configured as three microphones pairs for jointly processing each pair during distance difference estimations and calculating 3-dimensional locations;
calculating, with the computing device, a 3-dimensional vector representing the relative position of the airborne vehicle in relation to the group of microphones from the signals collected by the group of three microphones by way of distance difference estimation;
calculating, with the computing device, a 3-dimensional vector representing the relative position of the airborne vehicle in relation to the designated 3-dimensional location by applying vector addition to the vector representing the known relative position of the group of microphones in relation to the designated 3-dimensional location.
In analogous art, Fulton teaches in, for example, the figures below:
[Fulton figures reproduced here (media_image2.png through media_image5.png, greyscale)]
and associated descriptive texts that three microphones are configured as three microphones pairs for jointly processing each pair during distance difference estimations and calculating 3-dimensional locations (as explained in, for example, para:
“[0082] As previously set forth, it is contemplated the acoustic sensor 20 can incorporate a second pair of microphones at a set angle to the first pair of microphones, such as at 90.degree. angle to the first microphone axis MA. The second set of microphones could be used in a similar manner as the described above to calculate a height above the ground plane, completing the 3-dimensional solution of the source location. Alternatively, if boundary finding algorithms will be used to solve the 3-dimensional cone problem, the 2-dimensional solution S can be used as a seed value to speed the calculations.”;
calculating, with the computing device, a 3-dimensional vector representing the relative position of the airborne vehicle in relation to the group of microphones from the signals collected by the group of three microphones by way of distance difference estimation (as explained in for example para:
“[0100] The present system 10 thus provides a method for 360.degree. directional acoustic event detection. In the method, selected frequencies or ranges within the spectra of acoustic events in front or behind the acoustic sensor 20 can be selectively filtered. Separate front acoustic spectra and back acoustic spectra from the acoustic sensor 20 are communicated to the command unit 120, wherein the command unit employs a combination of the monitored sound pressure within the monitored frequencies and a continuously variable degree of heterodyne sum and difference of the sound pressure received by the individual microphones of the acoustic sensor at selected identical or similar frequencies and a common absolute or relative time of arrival of the acoustic event.”);
calculating, with the computing device, a 3-dimensional vector representing the relative position of the airborne vehicle in relation to the designated 3-dimensional location by applying vector addition to the vector representing the known relative position of the group of microphones in relation to the designated 3-dimensional location (as explained in for example para:
“[0100] The present system 10 thus provides a method for 360.degree. directional acoustic event detection. In the method, selected frequencies or ranges within the spectra of acoustic events in front or behind the acoustic sensor 20 can be selectively filtered. Separate front acoustic spectra and back acoustic spectra from the acoustic sensor 20 are communicated to the command unit 120, wherein the command unit employs a combination of the monitored sound pressure within the monitored frequencies and a continuously variable degree of heterodyne sum and difference of the sound pressure received by the individual microphones of the acoustic sensor at selected identical or similar frequencies and a common absolute or relative time of arrival of the acoustic event.”).
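Examiner's note (illustration only): one conventional way to obtain the pairwise distance difference discussed above is cross-correlation of the two microphone signals of a pair; the lag of the correlation peak gives the arrival-time difference, which the speed of sound converts to a distance difference (one hyperboloid constraint on the source location). The sketch below uses synthetic signals and hypothetical parameters and is not Fulton's implementation.

    import numpy as np

    fs = 48_000                 # sample rate [Hz] (hypothetical)
    c = 343.0                   # speed of sound [m/s]
    rng = np.random.default_rng(0)
    src = rng.standard_normal(4096)          # broadband source signal

    true_delay = 23                          # samples between the two mics
    mic_a = src
    mic_b = np.concatenate([np.zeros(true_delay), src])[:len(src)]

    # Full cross-correlation; the lag of its peak estimates the
    # arrival-time difference between the pair.
    corr = np.correlate(mic_b, mic_a, mode="full")
    lag = int(np.argmax(corr)) - (len(src) - 1)   # samples; positive: b lags a
    tdoa = lag / fs                               # seconds
    distance_difference = tdoa * c                # metres
    print(lag, round(distance_difference, 3))     # 23, ~0.164 m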
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the microphone pairs and methods disclosed in Fulton with the microphones and methods taught in MARTINKAT with a reasonable expectation of success because it would have “resolved acoustic event location ambiguities” as taught by Fulton in, for example, paragraphs:
“[0008] Acoustic techniques have been used to calculate potential acoustic source positions based on a time delay associated with an acoustic signal traveling along two different paths to reach two spaced-apart microphones. U.S. Pat. Nos. 6,600,824 and 7,039,1 98 disclose microphone arrays for locating a signal source, each of these patents being expressly incorporated by reference. However, resolution of ambiguity in acoustic source positions and sensitivity of prior systems has limited applicability in real time, environment independent systems.
[0009] The need remains for an acoustic monitoring system that can resolve acoustic event location ambiguities as well as provide a sufficiently robust system that allows deployment in hostile operating environments.”.
Regarding claim 2 and the limitation the method of claim 1, wherein the three microphones are installed at or near a landing site (see the EXPRESS teachings of MARTINKAT fig. 1 and para:
“The three sensors are spaced apart on or near the landing surface 16.”).
Regarding claim 3 and the limitation the method of claim 1, wherein the sound source is a first sound source with a first sound signature (see the teachings of MARTINKAT, wherein it is understood that sensors 10, 12, and the third sensor each transmit their own sound signature, as explained in, for example, para:
“The sensors each emit an electromagnetic, optical or acoustic signal, which is reflected back from the reflecting surface 7 on the flying object 2 to the respective sensor. The Measuring computers receive the output signals of the assigned sensors and generate from the evaluation computer 9 evaluable signals or information. These signals are transmitted to the evaluation computer 9, which calculates therefrom an approach vector which points from a target point, not shown, on the landing area in the direction of the flying object 2 and whose length represents the distance between the ground unit 3 and the flying object 2. The approach vector is determined by calculating the respective distance of the flying object 2 from the three sensors from the transit time of the signal emitted by each sensor and reflected by the flying object 2. The respective distance is the radius of an imaginary spherical surface around the respective sensor on which the flying object is located. From the intersection of the three balls, two points result as a possible position of the flying object 2, wherein the position below the landing area 16 can be excluded. The approach vector is now the vector from the target point on the landing area to the determined position of the flying object 2.”)
and the airborne vehicle comprises a second sound source with a second sound signature (see the teachings of MARTINKAT para:
“The present invention further relates to a system for carrying out the method described above. The system includes a ground unit having a landing site having a landing point, a computing unit for detecting an approach vector pointing from the destination toward an unmanned flying object, and a transmitting unit for transmitting the approach vector. The system further includes an unmanned flying object having a receiver for receiving the approach vector and a flight computer for controlling the object of flight based on the received approach vector. In one embodiment of the invention, the system includes a transmitter on the ground unit to emit an electromagnetic or acoustic signal, a reflector on the flying object to reflect the signal, and a detector on the ground unit to receive the reflected signal. From the received signal, for example the time of arrival of the signal, the approach vector can be calculated. The reflector is in particular a retroreflector, which reflects an incident signal back in the direction of incidence. The transmitter and the detector may, for example, be a radar device. The transmitter may also be a laser deflectable in a spatial area such as a plane and the detector may be a light detector. The ground unit may comprise a plurality of transmitters and / or a plurality of detectors. In an alternative embodiment of the invention, the system includes a transmitter at the flying object to emit an electromagnetic or acoustic signal, and a detector at the ground unit for receiving the signal. The detector, analogous to the method described above, may include one or more microphones, one or more antennas or a lens, and a one or two dimensional array of photosensitive elements, such as a CCD chip.”),
and the 3-dimensional location of the first and second sound sources are calculated separately (see the teachings of MARTINKAT, wherein it is understood that, as explained immediately above, the sound emitted from a transmitter at the flying object would have to be calculated separately from the sounds being emitted from the speakers at the landing site in order to make the calculations necessary to determine the location. As expressly taught by MARTINKAT, a different computer (11, 13, etc.) is used for each sensor to calculate the dimensional location separately).
Regarding claim 5 and the limitation the method of claim 3, further comprising: constructing the 3-dimensional position of the airborne vehicle, wherein three microphones capture the signals from the first and second sound sources;
wherein the z-coordinate of one of the first or second sound sources cannot be calculated because of equidistance from the three microphones;
and wherein the 3-dimensional location of the airborne vehicle (X0, Y0, Z0) is calculated from the locations of the first and second sound sources by using a known z-coordinate to reconstruct the z-coordinate that cannot be calculated (see the teachings of MARTINKAT para:
“For example, the signal is transmitted by a transmitter and the reflected signal is received by a receiver. From the signal transit time between transmission and reception of the signal, the distance of the flying object from the receiver can be calculated, so that the position of the flying object on a spherical surface with the receiver as center and the calculated distance as radius. With two receivers, the position is on a circle, which results as the intersection of the two spherical surfaces around the two receivers. With three receivers, two possible positions of the flying object result from the intersection of three spherical surfaces, one of which can be excluded, for example, by prior knowledge of the half-space in which the flying object must be located. In one embodiment, a transmitter is assigned to each receiver, wherein transmitter and receiver are preferably arranged in a common housing. In another embodiment, a transmitter is used whose reflected signal is received by multiple receivers. In this case, the area around the receiver on which the flying object can be located is not a spherical surface but the surface of an ellipsoid. The position determination in two or more receivers is, however, analogous to the first embodiment by the formation of the intersection of two or more ellipsoids. Alternatively, a spatial region, for example a plane, can be scanned by means of a laser. Due to the short signal propagation time of the laser light at the relatively small distance of the flying object in the landing approach corresponds to the direction in which the laser is aligned in the detection of the reflected signal, the direction of the flying object. The distance of the flying object from the target point can be determined using two or more spatially-spaced lasers. In an alternative embodiment, the approach vector is determined by the fact that an electromagnetic or acoustic signal emitted by the flying object is detected at the ground unit. For this purpose, the signal is detected, for example, at two or more spatially separated positions, and the incident direction and thus the approach vector are determined from the transit time difference between the detection positions. An electromagnetic signal in the visible or infrared spectrum can be directed by means of a lens onto a one- or two-dimensional array of light-sensitive elements, such as a CCD chip, so that the direction of incidence and thus the approach vector are closed from the element hit by the incident signal can be.”).
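Examiner's note (illustration only): the z-reconstruction recited in claim 5 may be illustrated with a hypothetical two-source example: where one source's z-coordinate is ambiguous (e.g., that source is equidistant from the three microphones), the other source's resolved z-coordinate plus the known mounting offset between the two sources recovers it. All names and numbers below are invented for illustration and are not from MARTINKAT.

    import numpy as np

    KNOWN_VERTICAL_OFFSET = 0.30           # metres between the two sources (hypothetical)

    source2 = np.array([0.7, 0.9, 5.0])    # fully resolved 3-D location
    source1_xy = np.array([0.7, 0.9])      # x, y resolved; z ambiguous

    z1 = source2[2] + KNOWN_VERTICAL_OFFSET        # reconstructed z-coordinate
    vehicle = np.array([*source1_xy, z1])
    print(vehicle)                                 # [0.7 0.9 5.3]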
Regarding claim 6 and the limitation the method of claim 3, wherein the first sound source has a first sound signature and the second sound source has a second sound signature, and the first and second sound signatures are stored by the computing device and further comprising calculating the z-coordinate of the airborne vehicle using the stored first and second sound signatures when a position of airborne vehicle on the z-axis cannot be calculated because one of the first or second sound sources is equidistant to the three microphones (see the teachings of MARTINKAT wherein it is understood that each microphone sensor sends out its own specific signal as explained in for example paras:
“For example, the signal is transmitted by a transmitter and the reflected signal is received by a receiver. From the signal transit time between transmission and reception of the signal, the distance of the flying object from the receiver can be calculated, so that the position of the flying object on a spherical surface with the receiver as center and the calculated distance as radius. With two receivers, the position is on a circle, which results as the intersection of the two spherical surfaces around the two receivers. With three receivers, two possible positions of the flying object result from the intersection of three spherical surfaces, one of which can be excluded, for example, by prior knowledge of the half-space in which the flying object must be located. In one embodiment, a transmitter is assigned to each receiver, wherein transmitter and receiver are preferably arranged in a common housing. In another embodiment, a transmitter is used whose reflected signal is received by multiple receivers. In this case, the area around the receiver on which the flying object can be located is not a spherical surface but the surface of an ellipsoid. The position determination in two or more receivers is, however, analogous to the first embodiment by the formation of the intersection of two or more ellipsoids. Alternatively, a spatial region, for example a plane, can be scanned by means of a laser. Due to the short signal propagation time of the laser light at the relatively small distance of the flying object in the landing approach corresponds to the direction in which the laser is aligned in the detection of the reflected signal, the direction of the flying object. The distance of the flying object from the target point can be determined using two or more spatially-spaced lasers. In an alternative embodiment, the approach vector is determined by the fact that an electromagnetic or acoustic signal emitted by the flying object is detected at the ground unit. For this purpose, the signal is detected, for example, at two or more spatially separated positions, and the incident direction and thus the approach vector are determined from the transit time difference between the detection positions. An electromagnetic signal in the visible or infrared spectrum can be directed by means of a lens onto a one- or two-dimensional array of light-sensitive elements, such as a CCD chip, so that the direction of incidence and thus the approach vector are closed from the element hit by the incident signal can be.
The sensors each emit an electromagnetic, optical or acoustic signal, which is reflected back from the reflecting surface 7 on the flying object 2 to the respective sensor. The Measuring computers receive the output signals of the assigned sensors and generate from the evaluation computer 9 evaluable signals or information. These signals are transmitted to the evaluation computer 9, which calculates therefrom an approach vector which points from a target point, not shown, on the landing area in the direction of the flying object 2 and whose length represents the distance between the ground unit 3 and the flying object 2. The approach vector is determined by calculating the respective distance of the flying object 2 from the three sensors from the transit time of the signal emitted by each sensor and reflected by the flying object 2. The respective distance is the radius of an imaginary spherical surface around the respective sensor on which the flying object is located. From the intersection of the three balls, two points result as a possible position of the flying object 2, wherein the position below the landing area 16 can be excluded. The approach vector is now the vector from the target point on the landing area to the determined position of the flying object 2.”).
Regarding claim 7 and the limitation the method of claim 1, wherein the airborne vehicle receives 3-dimensional location information by way of a radio, a light, or a sound signal (see the express teachings of MARTINKAT Fig. 1, vehicle 2 via antenna 8 receives location information via radio 14 via antenna 15 as explained in for example paras:
“The system 1 comprises an unmanned flying object 2 and a ground unit 3. The flying object 2 has actuators 4, a flight computer 5, a communication receiver 6, a reflecting surface 7 and an antenna 8. The antenna 8 is connected to the receiver 6, the receiver 6 in turn to the flight computer 5 and the flight computer 5 with the actuators 4. In this embodiment, each connection can be a direct connection or for example via a bus system. The flying object 2 is preferably able to land vertically, i.e. in particular to lower itself in parallel to a landing area.
The ground unit 3 has an evaluation computer 9, three transmit and receive sensors (shown in FIG. 1 are only the sensors 10 and 12), three measurement computers (only the measurement computers 11 and 13 are shown in FIG. 1), a communication transmitter 14, a Antenna 15 and a landing area 16. The sensor 10 is connected to the measuring computer 11, the sensor 12 to the measuring computer 13 and the third sensor to the third measuring computer. All three measuring computers 11 and 13 are connected to the evaluation computer 13. The evaluation computer 9 is connected to the transmitter 14 and the transmitter 14 to the antenna 15. The three sensors are spaced apart on or near the landing surface 16.
The sensors each emit an electromagnetic, optical or acoustic signal, which is reflected back from the reflecting surface 7 on the flying object 2 to the respective sensor. The Measuring computers receive the output signals of the assigned sensors and generate from the evaluation computer 9 evaluable signals or information. These signals are transmitted to the evaluation computer 9, which calculates therefrom an approach vector which points from a target point, not shown, on the landing area in the direction of the flying object 2 and whose length represents the distance between the ground unit 3 and the flying object 2. The approach vector is determined by calculating the respective distance of the flying object 2 from the three sensors from the transit time of the signal emitted by each sensor and reflected by the flying object 2. The respective distance is the radius of an imaginary spherical surface around the respective sensor on which the flying object is located. From the intersection of the three balls, two points result as a possible position of the flying object 2, wherein the position below the landing area 16 can be excluded. The approach vector is now the vector from the target point on the landing area to the determined position of the flying object 2.”).
Regarding claim 8 and the limitation the method of claim 1, wherein a fourth microphone receives signals from the sound source (see the teachings of MARTINKAT: “The detector, analogous to the method described above, may include one or more microphones…“. And MPEP 2144.04(VI)(B), Duplication of Parts, wherein it is the Examiner's understanding that “mere duplication of parts has no patentable significance unless a new and unexpected result is produced.”).
Regarding claim 9 and the limitation the method of claim 8, wherein more than four microphones are used to receive sound signals from the sound source (see the teachings of MARTINKAT: “The detector, analogous to the method described above, may include one or more microphones…“. And MPEP 2144.04(VI)(B), Duplication of Parts, wherein it is the Examiner's understanding that “mere duplication of parts has no patentable significance unless a new and unexpected result is produced.”).
Regarding claim 10, the combination of MARTINKAT and Fulton teaches, as set forth in the rejection of corresponding parts of claim 1 above and incorporated herein by reference,
the limitations a method of guiding an airborne vehicle by collecting sound from a sound source associated with the airborne vehicle with four microphones, the method comprising:
receiving signals, from the sound source, at the four microphones positioned so that there is no point equidistant to the four microphones (per MPEP 2144.04(VI)(C), Rearrangement of Parts, wherein it is understood it would have been an obvious matter of design choice to rearrange the parts of MARTINKAT “because shifting the position of the microphones would not have modified the operation of the device”);
wherein the four microphones are logically connected with a computing device having a processor, nonvolatile storage, and a wireless transmitter and wherein the four microphones are configured as four or more microphone pairs for jointly processing each pair during distance difference estimations and calculating 3-dimensional locations (see the teachings of the Fulton with regard to, inter alia microphone pairs);
calculating, with the computing device, a 3-dimensional location of the airborne vehicle from the signals collected by the four microphones by way of distance difference estimation, the four microphones being configured as four or more microphone pairs for jointly processing each pair during distance difference estimation and for calculating the 3-dimensional location;
sending, by the wireless transmitter, the 3-dimensional location of the airborne vehicle to the airborne vehicle;
and repeating the receiving, calculating, and sending after the airborne vehicle has changed position (see the teachings of MARTINKAT in the rejection of corresponding parts of claim 1 above incorporated herein by reference.).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the microphone pairs and methods disclosed in Fulton with the microphones and methods taught in MARTINKAT with a reasonable expectation of success because it would have “resolved acoustic event location ambiguities” as taught by Fulton Paras:
“[0008] Acoustic techniques have been used to calculate potential acoustic source positions based on a time delay associated with an acoustic signal traveling along two different paths to reach two spaced-apart microphones. U.S. Pat. Nos. 6,600,824 and 7,039,1 98 disclose microphone arrays for locating a signal source, each of these patents being expressly incorporated by reference. However, resolution of ambiguity in acoustic source positions and sensitivity of prior systems has limited applicability in real time, environment independent systems.
[0009] The need remains for an acoustic monitoring system that can resolve acoustic event location ambiguities as well as provide a sufficiently robust system that allows deployment in hostile operating environments.”.
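Examiner's note (illustration only): the “no point equidistant to the four microphones” condition recited in claim 10 can be checked directly: a point equidistant from four microphones must satisfy three linear equations, and if that system is inconsistent no such point exists for the chosen placement. The following Python sketch (coordinates hypothetical, offered solely as an illustrative aid) shows one placement without an equidistant point and one with.

    import numpy as np

    def equidistant_point(mics):
        # Solve |x - m0| = |x - mi| for i = 1..3; returns x, or None if
        # no equidistant point exists. Expanding the squared distances
        # gives 2 x . (mi - m0) = |mi|^2 - |m0|^2, a linear system in x.
        m0 = mics[0]
        A = 2 * (mics[1:] - m0)                              # 3 x 3
        b = np.sum(mics[1:]**2, axis=1) - np.sum(m0**2)      # length-3 vector
        x, _, _, _ = np.linalg.lstsq(A, b, rcond=None)
        return x if np.allclose(A @ x, b) else None

    # Coplanar, non-concyclic placement: no equidistant point exists.
    coplanar = np.array([[0., 0., 0.], [2., 0., 0.], [0., 2., 0.], [3., 3., 0.]])
    print(equidistant_point(coplanar))      # None

    # Non-coplanar placement: a unique equidistant point (circumcenter) exists.
    tetra = np.array([[0., 0., 0.], [2., 0., 0.], [0., 2., 0.], [0., 0., 2.]])
    print(equidistant_point(tetra))         # [1. 1. 1.]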
Regarding claim 11 and the limitation the method of claim 10, wherein the four microphones are installed at or near a landing site (see the rejection of corresponding parts of claims 10 and 2 above incorporated herein by reference.).
Regarding claim 12 and the limitation the method of claim 10, wherein more than four microphones receive signals from the sound source (see the rejection of corresponding parts of claims 10 and 9 above incorporated herein by reference.).
Regarding claim 13 and the limitation the method of claim 10, wherein the airborne vehicle receives 3-dimensional location information by way of a radio, a light, or a sound signal (see the rejection of corresponding parts of claims 10 and 7 above incorporated herein by reference.).
Regarding claim 14, the combination of MARTINKAT and Fulton teaches, as set forth in the rejection of corresponding parts of claim 1 above and incorporated herein by reference,
the limitations a method of navigating an airborne vehicle by emitting sound from a sound source associated with the airborne vehicle for collection by three microphones, the method comprising:
emitting signals, from the sound source, for collection by the three microphones at or near a landing site (see MARTINKAT, Fig. 1 above items 10/11, 12/13, and 16 respectively),
wherein the three microphones are logically connected with a computing device having a processor, nonvolatile storage, and a wireless transmitter, wherein, with the computing device, a 3-dimensional location of the airborne vehicle is calculated from the signals collected by the three microphones by way of distance difference estimation (see MARTINKAT, Fig. 1 above items 10/11, 12/13, 9, 14 and 15 respectively);
receiving, at the airborne vehicle, the 3-dimensional location of the airborne vehicle from the computing device (see MARTINKAT, Fig. 1 above item 2 via items 9, 14 and 15 respectively);
changing the position of the airborne vehicle relative to the three microphones (see MARTINKAT para:
“The approach vector is determined at the ground unit and usually transmitted wirelessly to the flying object, for example by radio, infrared, acoustic or as a light signal. Viewed from the flying object, the target point of the landing area is in the direction opposite to the approach vector. Thus, the flying object can be automatically moved in the direction of the target point. The approach vector is preferably (periodically) repeatedly determined and transmitted, for example every second, every 2, 5, 10, 15, 20, 30 or more seconds, for example 2, 5, 10, 15, 25, 25 or 50 times per second or more often.”);
and repeating the emitting, receiving, and changing after the airborne vehicle has changed position (see MARTINKAT para:
“The approach vector is determined at the ground unit and usually transmitted wirelessly to the flying object, for example by radio, infrared, acoustic or as a light signal. Viewed from the flying object, the target point of the landing area is in the direction opposite to the approach vector. Thus, the flying object can be automatically moved in the direction of the target point. The approach vector is preferably (periodically) repeatedly determined and transmitted, for example every second, every 2, 5, 10, 15, 20, 30 or more seconds, for example 2, 5, 10, 15, 25, 25 or 50 times per second or more often.”).
Regarding claim 15 and the limitation the method of claim 14, wherein the sound source is a first sound source with a first sound signature and the airborne vehicle comprises a second sound source with a second sound signature (see the rejection of corresponding parts of claim 3 above incorporated herein by reference.).
Regarding claim 18 and the limitation the method of claim 14, further comprising emitting signals for collection by four microphones (see the rejection of corresponding parts of claim 8 above incorporated herein by reference.).
Regarding claim 19 and the limitation the method of claim 18, further comprising emitting signals for collection by a fourth microphone and wherein the four microphones are positioned so that no equidistant point exists between the four microphones (see the rejection of corresponding parts of claims 18 and 6 above incorporated herein by reference “obvious matter of design choice”).
Regarding claim 20 and the limitation the method of claim 15, further comprising emitting signals for collection by more than four microphones (see the rejection of corresponding parts of claim 9 above incorporated herein by reference.).
Claims 4, 16, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over WO 2012065681 A2 to MARTINKAT; NORBERT et al. (MARTINKAT) in view of US 20100008515 A1 to Fulton; David Robert et al. (Fulton) as applied to the claims above, and further in view of US 20130053056 A1 to Aggarwal; Alok et al. (Aggarwal).
Regarding claim 4 the combination of MARTINKAT and Fulton does not appear to expressly disclose the limitations further comprising sending, with the wireless transmitter, a request to the airborne vehicle to change position in any direction, when a position of the airborne vehicle on the z-axis cannot be calculated because the sound source is equidistant from the three microphones.
In analogous art, Aggarwal teaches in, for example, the figures below:
[Aggarwal figures reproduced here (media_image6.png through media_image10.png, greyscale)]
and associated descriptive texts sending, with the wireless transmitter, a request to the airborne vehicle to change position in any direction, when a position of the airborne vehicle on the z-axis cannot be calculated because the sound source is equidistant from the three microphones (as explained in, for example, para:
“[0077] For certain example implementations, indications of expected uncertainty reduction may be determined for transmitters 104. More specifically, a first indication of expected uncertainty reduction may be determined for one or more signals transmitted from first transmitter 104-1, or a second indication of expected uncertainty reduction may be determined for one or more signals transmitted from second transmitter 104-2. As shown in FIG. 6, first transmitter 104-1 is approximately equidistant from a first cluster of first uncertainty 114a and a second cluster of second uncertainty 114b. Furthermore, there are zero obstacles (i) between first transmitter 104-1 and a first cluster of first uncertainty 114a and (ii) between first transmitter 104-1 and a second cluster of a second uncertainty 114b. Consequently, a ranging value for a mobile device that is determined from a direct measurement with first transmitter 104-1 may not differentiate between first uncertainty 114a or second uncertainty 114b. In other words, a direct measurement with first transmitter 104-1 is not likely to yield a useful positioning indication in an example scenario. For example, an RSSI or an RTT for either or both clusters may be expected to be substantially equal. Applying direct measurement weights at phase [2] using one or more measurements from first transmitter 104-1 may not be likely to help distinguish between the first and second clusters. In contrast, second transmitter 104-2 is not equidistant from the first and second clusters. Applying direct measurement weights at phase [2] using one or more measurements from second transmitter 104-2 may be likely to help distinguish between the first and second clusters. Accordingly, a first indication of expected uncertainty reduction for acquiring one or more signals transmitted from first transmitter 104-1 may be determined to be lower than a second indication of expected uncertainty reduction for acquiring one or more signals transmitted from second transmitter 104-2.”).
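Examiner's note (illustration only): the claimed fallback maps naturally onto a simple guard condition. The hypothetical Python sketch below (names, threshold, and message format invented for illustration; not Aggarwal's implementation) issues a “change position” request when the measured distance differences indicate the degenerate, equidistant geometry, and a position fix otherwise.

    import numpy as np

    EPS = 0.05   # metres; equidistance tolerance (hypothetical)

    def guidance_message(distance_differences):
        # distance_differences: pairwise range differences for the 3 mics.
        if np.all(np.abs(distance_differences) < EPS):
            # Degenerate geometry: ask the vehicle to move in any direction,
            # per the ambiguity-reduction rationale discussed above.
            return {"type": "MOVE_REQUEST", "payload": "change position"}
        return {"type": "POSITION_FIX", "payload": "3-D vector follows"}

    print(guidance_message(np.array([0.01, -0.02, 0.015])))   # MOVE_REQUEST
    print(guidance_message(np.array([0.8, -0.3, 0.5])))       # POSITION_FIX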
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the method of navigating disclosed in Aggarwal with the method of navigating taught in the combination of MARTINKAT and Fulton with a reasonable expectation of success because it would have improved “navigational or other location-based services” as taught by Aggarwal Para(s):
“[0005] Electronic maps, web-based mapping services, and turn-by-turn directions focus on providing navigational aids in certain situations and in particular environments. Unfortunately, there are other situations or different environments for which they are not intended or have not been designed. Consequently, there remain a number of situations, environments, etc. in which navigational or other location-based services may be improved.”.
Regarding claim 16 and the limitation the method of claim 15, wherein the sound source is equidistant from the three microphones and the 3-dimensional position of the airborne vehicle received by the airborne vehicle comprises X, Y, and Z coordinates calculated from the locations of the first and second sound sources by using a known z-coordinate to reconstruct a z-coordinate that cannot be calculated (see the rejection of corresponding parts of claims 15 and 4 above incorporated herein by reference.).
Regarding claim 17 and the limitation the method of claim 14, wherein the sound source is equidistant from the three microphones and the airborne vehicle changes position in response to a request from the computing device (see the rejection of corresponding parts of claims 14 and 4 above incorporated herein by reference.).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure as teaching, inter alia, the state of the art of aerial vehicle landing systems at the time of the invention. For example:
US 20210150919 A1 to SCHAFERLEIN; Ulrich teaches, inter alia, a METHOD AND APPARATUS FOR MONITORING THE TAKE-OFF AND LANDING PROCEDURE OF AN AIRCRAFT AND SYSTEM in, for example, the ABSTRACT, Figures and/or Paragraphs below:
[SCHAFERLEIN figure reproduced here (media_image11.png, greyscale)]
“A method for monitoring the take-off and/or landing procedure of an aircraft (1), in particular for an electrical, vertical take-off and landing aircraft (1), in which a monitoring region of a take-off and landing site (2) is monitored by at least one microphone (4, 5) of a monitoring station to detect sound emission data of an aircraft (1) taking off or landing as it approaches or departs and the detected sound emission data are transmitted from the monitoring station to an evaluation unit. The detected sound emission data are evaluated by the evaluation unit by comparing the detected sound emission data to characteristic sound emission data.”.
US 11858625 B1 to Eisenmann; Shmuel et al. teaches, inter alia, Object detection using propeller noise in, for example, the ABSTRACT, Figures and/or Paragraphs below:
“Disclosed are systems and methods to detect objects within an environment by an aerial vehicle. An aerial vehicle may detect objects within an environment based on propeller noises emitted by the aerial vehicle that are reflected back to the aerial vehicle by objects in the environment. The propeller noise may be noise that is generated during normal operation of one or more propellers. The propeller noise emitted by the propellers of the aerial vehicle propagates into the environment around the aerial vehicle and reflects off any objects within the environment. Because the noise generated by each propeller is distinguishable, sets of solutions (distance and all directions) may be computed for each propeller. The intersections of those sets of solutions is representative of the actual distance and direction of the object with respect to the aerial vehicle.”.
US 20120326923 A1 to Oehler; Veit et al. teaches, inter alia, a Near Field Navigation System in, for example, the ABSTRACT, Figures and/or Paragraphs below:
[Oehler figure reproduced here (media_image12.png, greyscale)]
“A near field navigation system is equipped with a base segment provided on a base structure. The base segment includes at least four transmitters. Each transmitter is provided with a base antenna and the base antennas are positioned relative to each other at known distances. A user segment is provided on a user structure, the user segment including at least one receiver, at least one user antenna connected to the receiver, and a processing unit connected to the receiver. The receiver and each of the transmitters together form distance measuring units and the processing unit is adapted to calculate the relative three-dimensional position data of the user structure with respect to the base structure on the basis of distance data obtained from the distance measuring units.”.
US 5099456 A to Wells; Donald R. teaches, inter alia, a Passive locating system in, for example, the ABSTRACT, Figures and/or Paragraphs below:
[Wells figure reproduced here (media_image13.png, greyscale)]
“A passive surveillance system provides ranging and location capability of a signal source. A single receiver in a multipath environment or alternatively a plurality of receivers receive the signal from the source having different propagation delays along different paths. Selection of corresponding frequency components from different paths and mixing of the corresponding selected frequency components from each of the paths generates complex pseudo-noise signals that are suitable for correlation processing. Correlation processing of the mixed signals yields the time difference between the multiple paths. The maximum time difference parameters for each path are used to generate a locus line, either explicitly or implicitly. Range and location processing of the locus information identifies range and location of the signal source.”.
US 5339281 A to Narendra; Patrenahalli M. et al. teaches, inter alia, a Compact deployable acoustic sensor in, for example, the ABSTRACT, Figures and/or Paragraphs below:
[Narendra figure reproduced here (media_image14.png, greyscale)]
“A sensor for detecting acoustic energy emitted by a target is disclosed. The sensor comprises a housing including a plurality of transducers for receiving acoustic energy. The transducers are randomly deployed within a predetermined area around the housing, forming an array and a self-survey technique is utilized to determine the positions of the sensors. Beamforming techniques are utilized to analyze the acoustic energy detected by the transducers to provide the azimuth of a target and a class estimate of the target. A plurality of sensors may be utilized to provide target location, course and velocity.”.
US 5831936 A to Zlotnick; Gregory et al. teaches, inter alia, a System and method of noise detection in, for example, the ABSTRACT, Figures and/or Paragraphs below:
“A noise detection system is provided which includes a receiving unit, a processing unit and a user interface unit connected together. The receiving unit consists of a three-dimensional acoustical array for generally simultaneously receiving a multiplicity of sound signals from different directions. The sound signals have at least one sound source of interest. The processing unit processes the sound signals and consists of a three-dimensional spatial filter for identifying the elevation and azimuth of each of the sound signals, a sound database containing a multiplicity of soundprints of sound sources of interest, apparatus for classifying the sound signals with the soundprints, and apparatus for providing the azimuth and elevation of each of the classified sound signals. The user interface unit indicates to the user the azimuths and elevations of the classified sound signals. The processing unit further consists of filtering apparatus for filtering extraneous noise signals received by the acoustical array.”.
US 20050096845 A1 to Bergin, Jameson et al. teaches, inter alia, an INTELLIGENT PASSIVE NAVIGATION SYSTEM FOR BACK-UP AND VERIFICATION OF GPS in, for example, the ABSTRACT, Figures and/or Paragraphs below:
“A passive navigation system for an airborne platform includes an onboard computer having a database that contains preprogrammed information regarding pre-existing ground-based signal emitters (e.g. cell-phone, television and radio broadcast transmitters). For each emitter, the database includes the geolocation of the emitter and identifying signal characteristic(s) of each emitter's signal such as frequency, bandwidth and strength. An antenna array and digital receiver cooperate with the computer on the platform to passively receive signals from the emitters and determine a direction of arrival (DOA) for selected signals. The computer also extracts identifying signal characteristic(s) from selected received signals and matches them against the database information to ascertain the geolocation of the emitter that corresponds to the received signal. The platform location is then calculated from the DOA(s) and emitter geolocations using a triangulation-type algorithm. Also, preprogrammed site-specific terrain scattering information can be compared to observed scattered signals to enhance system accuracy.”.
US 10101196 B2 to Naguib; Ayman et al. teaches, inter alia, a Device for UAV detection and identification in, for example, the ABSTRACT, Figures and/or Paragraphs below:
“Apparatuses and methods are described herein for identifying a Unmanned Aerial Vehicle (UAV), including, but not limited to, determining a first maneuver type, determining a first acoustic signature of sound captured by a plurality of audio sensors while the UAV performs the first maneuver type, determining a second acoustic signature of sound captured by the plurality of audio sensors while the UAV performs a second maneuver type different from the first maneuver type, determining an acoustic signature delta based on the first acoustic signature and the second acoustic signature, and determining an identity of the UAV based on the acoustic signature delta.”.
US 20180306890 A1 to VATCHER; Robert H. et al. teaches, inter alia, a SYSTEM AND METHOD TO LOCATE AND IDENTIFY SOUND SOURCES IN A NOISY ENVIRONMENT in, for example, the ABSTRACT, Figures and/or Paragraphs below:
[VATCHER figures reproduced here (media_image15.png and media_image16.png, greyscale)]
“A system comprises at least three microphones for generating audio signals representing a sound generated by a sound source, each microphone having a respective identifier (ID), a memory, and a processor. The processor is configured for: storing records in the memory to be referenced using indexes, the indexes based on a time stamp when the audio signals are generated and frequency components of the audio signals, each record containing the respective ID of one of the at least three microphones and a time when the sound is first detected by the microphone corresponding to the ID; matching indexes of records from the memory corresponding to the sound for each of the at least three microphones; and computing a location of the sound source based on the respective arrival times of the sound stored in the records having matching indices by synthetic aperture passive lateration”.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANIEL LAWSON GREENE JR, whose telephone number is (571) 272-6876. The examiner can normally be reached MON-THUR, 7:00-5:30 PM (EST).
Examiner interviews are available via telephone and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hunter Lonsberry can be reached on (571) 272-7298. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DANIEL L GREENE/Primary Examiner, Art Unit 3665 20251227