Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
Claim Objections
Claim 68 is objected to because of the following informalities: claim 68, line 5, appears to be missing a semicolon following “…patrol aircraft”. Appropriate correction is required.
Claim Rejections - 35 USC § 112
Claim 66 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 66 recites the limitations "the defining of the plurality of unique sub-sets" in line 3 and “the determining of the position weight” in line 5. There is insufficient antecedent basis for these limitations in the claim. While these limitations appear to be introduced in claim 65, claim 66 is not dependent on claim 65, but is instead dependent on claim 60, which does not recite these limitations, nor does any claim preceding claim 60. For the sake of examination, the examiner will interpret claim 66 as being dependent upon claim 65, where the limitations are introduced.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 51-57, 61-62, and 67-70 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Barazovsky (U.S. Patent No. 10,642,284).
Regarding Claim 51, Barazovsky teaches: A computerized positioning system associated with a vehicle (Barazovsky, Col. 3 Lines 16-40 – “systems” of a “UAV”, or unmanned aerial vehicle, for “determining a location” of the UAV), the computerized positioning system comprising: a processing circuitry (Barazovsky, Col. 5 Lines 26-50 – “one or more processors” which are “capable of executing instructions”) configured to perform the following method:
receive first information indicative of at least one transmission (Barazovsky, Col. 3 Lines 20-34 – where the UAV receives “flight information” from a “service provider” by communication “via one or more wireless network(s)”),
wherein the transmission is associated with at least one object (Barazovsky, Col. 10 Lines 13-26 – the UAV “may determine a ground structure”, or object, having a “known location” in “route to the destination”, where the “latitude and longitude” of the ground structure is provided “accessibly from the service provider” to the UAV),
wherein the first information comprises at least one item of object first position information associated with the at least one object, wherein the at least one item of object first position information is indicative of an absolute position of the at least one object (Barazovsky, Col. 10 Lines 13-26 – a “known location” of the ground structure, or object, the “ground structure having a latitude and longitude known” by the UAV “accessibly from the service provider”; where the latitude and longitude constitute first position information);
receive second position information of the at least one object, the second position information being indicative of a second relative position of the at least one object with respect to the vehicle (Barazovsky, Col. 4 Lines 7-35 and Col. 10 Lines 33-42 – capturing “one or more depth camera image of the ground structure using the depth camera” and determining “distance and/or angular orientation of the ground structure relative to the camera” of the UAV); and
determine a derived position of the vehicle, based at least on the first information and on the second position information (Barazovsky, Col. 4 Lines 7-35 and Col. 10 Lines 33-42 – “the UAV 104 may determine a current location of the UAV” by calculating the “distance and/or angular orientation of the ground structure relative to the camera”, i.e., an “offset distance” and “angular offset”, and the “UAV may determine a location for the UAV by applying the offset to the known location of the ground structure”, where the calculated UAV location is stored as “an imaged location”, or derived location; where the known location of the ground structure is the “latitude/longitude”);
wherein the derived position of the vehicle is capable of being utilized to facilitate a correction in a reported position of the vehicle (Barazovsky, Col. 4 Lines 36-57 – determining “a GPS correction value (“loc correction”) as a difference between the GPS location”, or reported position, “and the imaged location”; where the “imaged location” is determined as cited above),
wherein the reported position of the vehicle is based on at least one Global Navigation Satellite Systems (GNSS) signal received by at least one GNSS receiver associated with the vehicle (Barazovsky, Col. 4 Lines 36-57 – “the UAV 104 may determine a GPS location of the UAV 104”).
In regards to Claim 52, Barazovsky teaches the computerized positioning system of Claim 51, and Barazovsky further teaches wherein the method further comprising:
determining a deviation between the derived position and the reported position of the vehicle (Barazovsky, Col. 4 Lines 36-57 – determining “a GPS correction value (“loc correction”) as a difference, or deviation, between the GPS location”, or reported position, “and the imaged location”, or derived location).
In regards to Claim 53, Barazovsky teaches the computerized positioning system of Claim 52, and Barazovsky further teaches wherein the method further comprising at least one of the following:
sending an alert indicative of the determined deviation (Barazovsky, Col. 2 Lines 53-67, Col. 8 Lines 4-15, and Col. 10 Line 64-Col. 11 Line 2 – wherein a “UAV communication module” may “send (transmit, broadcast)” the determined “GPS correction value”, or deviation, “to other UAVs”); or
sending a correction instruction to correct the reported position of the vehicle, thereby deriving a corrected reported position of the vehicle (Barazovsky, Col. 2 Lines 53-67, Col. 8 Lines 4-15, and Col. 10 Line 64-Col. 11 Line 2 – “the UAV 104 may determine a corrected location of the UAV” by applying the “GPS correction value” to a received GPS location, wherein a “UAV communication module” may “send (transmit, broadcast)” the determined “corrected location of the UAV”, or “true location”, to other UAVs).
In regards to Claim 54, Barazovsky teaches the computerized positioning system of Claim 53, and Barazovsky further teaches wherein the alert is sent to at least one of: a user interface associated with a human operator; an autonomous navigation system; or an external system (Barazovsky, Col. 8 Lines 4-15 and Col. 10 Line 64-Col. 11 Line 2 – wherein a “UAV communication module” may “send (transmit, broadcast)” the determined “GPS correction value”, or deviation, “to other UAVs”; where other UAVs are external to the current UAV),
wherein the instruction is sent to at least one of: a user interface associated with a human operator; an autonomous navigation system; or an external system (Barazovsky, Col. 8 Lines 4-15 and Col. 10 Line 64-Col. 11 Line 2 – wherein a “UAV communication module” may “determine a corrected location of the UAV” by applying the “GPS correction value”, such that an instruction is sent to the UAV system to apply the correction value, and wherein the “UAV communication module” may “send (transmit, broadcast)” the determined “corrected location of the UAV”, or “true location”, to other UAVs; where other UAVs are external to the current UAV).
In regards to Claim 55, Barazovsky teaches the computerized positioning system of Claim 53, and Barazovsky further teaches wherein the method further comprising at least one of the following:
navigating the vehicle based on the corrected reported position of the vehicle (Barazovsky, Col. 7 Line 36-Col. 8 Line 3 and Col. 10 Line 64-Col. 11 Line 2 – the “corrected location may be used as a true or utilized location of the UAV” and is determined by a “locating component”/”signal source analyzer”; where the UAV’s “flight controller” may “receive inputs from the signal source analyzer 226”, also known as the “locating component 226”, to “update the flight plan” and/or “make changes in a direction or conduct of flight based on the information from the signal source analyzer 226”); or
performing at least one repetition of the method, the repetition thereby enabling a tracking of the corrected reported position (Barazovsky, Col. 5 Lines 4-25, Col. 6 Lines 53-57, and Col. 10 Line 60-Col. 11 Line 2 – where the UAV may repeat the method above from “time to time” with different “ground structures having known locations” existing “along a flight plan” to “determine a location of the UAV”; where the UAV may apply the determined “GPS correction value” to subsequently received “GPS location[s]”, such as “a second GPS location of the UAV” not used in the calculation of the GPS correction value).
In regards to Claim 56, Barazovsky teaches the computerized positioning system of Claim 51, and Barazovsky further teaches wherein at least one of the following is true:
the first information comprises at least one item of object identification information of the at least one object (Barazovsky, Col. 2 Line 6-52 – “objects or structures that are stationary and uniquely identifiable by a UAV using an imaging device”, for example “image analysis algorithms may be used to identify the ground structure”);
the derived position is capable of being utilized in a case of a disruption associated with the at least one GNSS signal;
the at least one received GNSS signal and the at least one GNSS receiver are associated with at least one of the following technologies: Global Positioning System (GPS) (Barazovsky, Col. 4 Lines 36-57 – “the UAV 104 may determine a GPS location of the UAV 104”), Global Navigation Satellite System (GLONASS) and Galileo;
the transmission is received from at least one of a transmitter and a transponder that are associated with the at least one object;
the vehicle is associated with a receiver configured for receiving the transmission (Barazovsky, Col. 3 Lines 20-34, Col. 6 Line 58-Col. 7 Line 3 – where the UAV receives “flight information” through communication “via one or more wireless network(s)” by a “network interface 216, such as a transceiver”);
the object first position information associated with the at least one object comprises GNSS position information of the at least one object;
the object first position information associated with the at least one object (Barazovsky, Col. 4 Lines 7-35 – “a ground structure 116 having a known location, such as a known latitude and longitude”); the determining of the derived position of the vehicle in said step (c), based at least on the first information and on the second position information, comprises determining a derived absolute position of the vehicle (Barazovsky, Col. 4 Lines 7-57, Col. 10 Lines 33-42, and Col. 10 Line 60-Col. 11 Line 2 – “the UAV 104 may determine a current location of the UAV” by calculating the “distance and/or angular orientation of the ground structure relative to the camera”, i.e., an “offset distance” and “angular offset”, and the “UAV may determine a location for the UAV by applying the offset to the known location of the ground structure”, where the offset is a “GPS correction value” and the determined “corrected location of the UAV” is “a true or utilized location of the UAV”, or absolute position);
the receiving of the first information comprises receiving an internet feed indicative of the first information, wherein the determining of the derived position of the vehicle is based at least on the internet feed; or
the vehicle is associated with at least one sensor, wherein the second position information being based on sensor data obtained from the at least one sensor, wherein the second relative position comprising a range of the at least one object and at least one relative angle of the at least one object (Barazovsky, Col. 4 Lines 7-35 and Col. 10 Lines 33-42 – capturing “one or more depth camera image of the ground structure using the depth camera” and determining “distance and/or angular orientation of the ground structure relative to the camera” of the UAV; where the camera is “an imaging device”, or image sensor).
In regards to Claim 57, Barazovsky teaches the computerized positioning system of Claim 56, and Barazovsky further teaches wherein at least one of the following is true:
the disruption comprises at least one of: jamming, interference or spoofing of the at least one GNSS signal; GNSS receiver failure; GNSS antenna failure;
the transponder is an Automatic Identification System (AIS) transponder;
the object identification information of the at least one object comprises at least one of: a MMSI (Maritime Mobile Service Identity); a name of the at least one object (Barazovsky, Col. 12 Line 61-Col. 13 Line 5 – “the UAV 104 may determine an identifier of the ground structure”, for example based on an image, determining that the ground structure is “the Eiffel tower, Big Ben, etc.”, including use of “a look up table or other data repositories”, or a QR code, for identification); a call sign of the at least one object;
the internet feed comprises an internet update of AIS information;
the internet feed is received from at least one satellite;
the at least one sensor comprises at least one of: a Radio Detection and Ranging (RADAR) system; an Identification Friend or Foe (IFF) system; and an Automatic Dependent Surveillance-Broadcast (ADS-B) system; or
the at least one sensor comprising a range finder and at least one imaging sensor (Barazovsky, Col. 4 Lines 7-35 and Col. 10 Lines 33-42 – an “imaging device” may be a “depth camera” which may “determine a distance from an object, such as the ground structure”, such that the depth camera acts as a range finder and captures “one or more images”).
In regards to Claim 61, Barazovsky teaches the computerized positioning system of Claim 57, and Barazovsky further teaches wherein the at least one imaging sensor comprising at least one camera (Barazovsky, Col. 4 Lines 7-35 and Col. 10 Lines 33-42 – an “imaging device” may be a “depth camera”).
In regards to Claim 62, Barazovsky teaches the computerized positioning system of Claim 51, and Barazovsky further teaches wherein the at least one object comprises a plurality of objects, wherein the determining of the derived position of the vehicle is based on an intersection of second relative positions of objects of the plurality of objects (Barazovsky, Fig. 5 and Col. 11 Lines 7-38 – where a UAV 514 may determine its position based on the location information from “three or more UAVs”, as shown on Fig. 5, “using triangulation algorithms”, i.e. using the distances of the three or more UAVs relative to the UAV 514, see distances A through C on Fig. 5).
[Image: Barazovsky, Fig. 5 (greyscale)]
In regards to Claim 67, Barazovsky teaches the computerized positioning system of Claim 51, and Barazovsky further teaches wherein at least one of the following is true:
the determining of the derived position of the vehicle utilizes a geo-registration process;
the method further comprising, in a case where the second position information is indicative of an insufficient number of the at least one object, sending an instruction to the vehicle to increase the altitude; or
the method further comprising, in a case where the second position information is indicative of an insufficient number of the at least one object, sending an instruction to the vehicle to move to a geographical area comprising a larger number of objects (Barazovsky, Col. 5 Lines 4-25 and Col. 10 Lines 27-32 – wherein the UAV may determine it necessary to “deviate from the flight plan from time to time to capture imagery of a ground structure having a known location (e.g., the ground structure 116, etc.) to determine a location of the UAV 104” in order to “place the ground structure within a field of view of the UAV”, such that before the deviation, there is an insufficient number of ground structures within a field of view).
In regards to Claim 68, Barazovsky teaches the computerized positioning system of Claim 51, and Barazovsky further teaches wherein at least one of the following is true:
the at least one object is one object (Barazovsky, Col. 2 Lines 6-26 and Col. 10 Lines 13-26 – “a ground structure”, or object, such as “unique structures or objects”);
the vehicle is an airborne vehicle (Barazovsky, Col. 1 Line 66-Col. 2 Line 26 – “an unmanned aerial vehicle (UAV)”);
the airborne vehicle is a patrol aircraft;
the airborne vehicle is an Unmanned Aerial Vehicle (UAV) (Barazovsky, Col. 1 Line 66-Col. 2 Line 26 – “an unmanned aerial vehicle (UAV)”);
the at least one object comprises at least one water-borne vehicle;
the at least one water-borne vehicle comprises at least one ship; or
the at least one object comprises at least one fixed-position object (Barazovsky, Col. 2 Lines 6-26 and Col. 10 Lines 13-26 – “a ground structure”, or object, such as “unique structures or objects”, having “known locations”; where the “objects or structures… are stationary”).
Regarding Claim 69, Barazovsky teaches: A computerized method of positioning a vehicle (Barazovsky, Col. 3 Lines 16-40 and Claim 5 – “systems” of a “UAV”, or unmanned aerial vehicle, for “determining a location” of the UAV by performing a “computer-implemented method”), the computerized method configured to be performed by a computerized positioning system comprising a processing circuitry (Barazovsky, Col. 5 Lines 26-50 – “one or more processors” which are “capable of executing instructions”), the method comprising performing the following by the processing circuitry:
receive first information indicative of at least one transmission (Barazovsky, Col. 3 Lines 20-34 – where the UAV receives “flight information” from a “service provider” by communication “via one or more wireless network(s)”),
wherein the transmission is associated with at least one object (Barazovsky, Col. 10 Lines 13-26 – the UAV “may determine a ground structure”, or object, having a “known location” in “route to the destination”, where the “latitude and longitude” of the ground structure is provided “accessibly from the service provider” to the UAV),
wherein the first information comprises at least one item of object first position information associated with the at least one object, wherein the at least one item of object first position information is indicative of an absolute position of the at least one object (Barazovsky, Col. 10 Lines 13-26 – a “known location” of the ground structure, or object, the “ground structure having a latitude and longitude known” by the UAV “accessibly from the service provider”; where the latitude and longitude constitute first position information);
receive second position information of the at least one object, the second position information being indicative of a second relative position of the at least one object with respect to the vehicle (Barazovsky, Col. 4 Lines 7-35 and Col. 10 Lines 33-42 – capturing “one or more depth camera image of the ground structure using the depth camera” and determining “distance and/or angular orientation of the ground structure relative to the camera” of the UAV); and
determine a derived position of the vehicle, based at least on the first information and on the second position information (Barazovsky, Col. 4 Lines 7-35 and Col. 10 Lines 33-42 – “the UAV 104 may determine a current location of the UAV” by calculating the “distance and/or angular orientation of the ground structure relative to the camera”, i.e., an “offset distance” and “angular offset”, and the “UAV may determine a location for the UAV by applying the offset to the known location of the ground structure”, where the calculated UAV location is stored as “an imaged location”, or derived location; where the known location of the ground structure is the “latitude/longitude”);
wherein the derived position of the vehicle is capable of being utilized to facilitate a correction in a reported position of the vehicle (Barazovsky, Col. 4 Lines 36-57 – determining “a GPS correction value (“loc correction”) as a difference between the GPS location”, or reported position, “and the imaged location”; where the “imaged location” is determined as cited above),
wherein the reported position of the vehicle is based on at least one Global Navigation Satellite Systems (GNSS) signal received by at least one GNSS receiver associated with the vehicle (Barazovsky, Col. 4 Lines 36-57 – “the UAV 104 may determine a GPS location of the UAV 104”).
Regarding Claim 70, Barazovsky teaches: A non-transitory computer readable storage medium tangibly embodying a program of instructions (Barazovsky, Col. 5 Line 26-Col. 6 Line 11 – “a non-transitory computer readable media” configured to store “executable instructions/modules, data, flight paths, and/or data items accessible by the processor(s)”) that, when executed by a computerized positioning system, cause the computer to perform a computerized method of positioning a vehicle (Barazovsky, Col. 3 Lines 16-40 and Claim 5 – “systems” of a “UAV”, or unmanned aerial vehicle, for “determining a location” of the UAV by performing a “computer-implemented method”), the computerized method being performed by a processing circuitry of the computerized positioning system (Barazovsky, Col. 5 Lines 26-50 – “one or more processors” which are “capable of executing instructions”) and comprising performing the following actions:
receive first information indicative of at least one transmission (Barazovsky, Col. 3 Lines 20-34 – where the UAV receives “flight information” from a “service provider” by communication “via one or more wireless network(s)”),
wherein the transmission is associated with at least one object (Barazovsky, Col. 10 Lines 13-26 – the UAV “may determine a ground structure”, or object, having a “known location” in “route to the destination”, where the “latitude and longitude” of the ground structure is provided “accessibly from the service provider” to the UAV),
wherein the first information comprises at least one item of object first position information associated with the at least one object, wherein the at least one item of object first position information is indicative of an absolute position of the at least one object (Barazovsky, Col. 10 Lines 13-26 – a “known location” of the ground structure, or object, the “ground structure having a latitude and longitude known” by the UAV “accessibly from the service provider”; where the latitude and longitude constitute first position information);
receive second position information of the at least one object, the second position information being indicative of a second relative position of the at least one object with respect to the vehicle (Barazovsky, Col. 4 Lines 7-35 and Col. 10 Lines 33-42 – capturing “one or more depth camera image of the ground structure using the depth camera” and determining “distance and/or angular orientation of the ground structure relative to the camera” of the UAV); and
determine a derived position of the vehicle, based at least on the first information and on the second position information (Barazovsky, Col. 4 Lines 7-35 and Col. 10 Lines 33-42 – “the UAV 104 may determine a current location of the UAV” by calculating the “distance and/or angular orientation of the ground structure relative to the camera”, i.e., an “offset distance” and “angular offset”, and the “UAV may determine a location for the UAV by applying the offset to the known location of the ground structure”, where the calculated UAV location is stored as “an imaged location”, or derived location; where the known location of the ground structure is the “latitude/longitude”);
wherein the derived position of the vehicle is capable of being utilized to facilitate a correction in a reported position of the vehicle (Barazovsky, Col. 4 Lines 36-57 – determining “a GPS correction value (“loc correction”) as a difference between the GPS location”, or reported position, “and the imaged location”; where the “imaged location” is determined as cited above),
wherein the reported position of the vehicle is based on at least one Global Navigation Satellite Systems (GNSS) signal received by at least one GNSS receiver associated with the vehicle (Barazovsky, Col. 4 Lines 36-57 – “the UAV 104 may determine a GPS location of the UAV 104”).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 58-60 are rejected under 35 U.S.C. 103 as being unpatentable over Barazovsky in view of Abeywardena (U.S. Patent Application Pub. No. 2019/0080142).
In regards to Claim 58, Barazovsky teaches the computerized positioning system of Claim 56, and while Barazovsky teaches determining of the derived absolute position of the vehicle in said step (c) (Barazovsky, Col. 4 Lines 7-57, Col. 10 Lines 33-42, and Col. 10 Line 60-Col. 11 Line 2 – “the UAV 104 may determine a current location of the UAV” by determining a “corrected location of the UAV” that is “a true or utilized location of the UAV”, or absolute position), Barazovsky does not teach wherein the receiving of the second position information comprises receiving second position information indicative of an object second relative position of at least one second object with respect to the vehicle, wherein the determining of the derived absolute position of the vehicle in said step (c) comprises: i. perform a first matching of the object second relative position, of the at least one second object, with the at least one item of object first position information, comprised in the first information, associated with the at least one first object, thereby deriving absolute position information of the at least one second object; ii. setting each matched second object of the at least one second object to constitute a corresponding object of the at least one object; and iii. determine the derived absolute position of the vehicle based at least on absolute position information of the corresponding object and on the at least one object second relative position.
However, Abeywardena teaches wherein the receiving of the second position information comprises receiving second position information indicative of an object second relative position of at least one second object with respect to the vehicle (Abeywardena, Para. 0093-0094 – an “image capture device” of a UAV which can “periodically capture images” and capture and extract “a feature” of a structure as an “extracted visual feature”, or first object, and at a later time capture the same feature as a “detected visual feature”, or second object, with respect to the image capture device of the UAV), wherein the determining of the derived absolute position of the vehicle in said step (c) comprises:
perform a first matching of the object second relative position, of the at least one second object, with the at least one item of object first position information, comprised in the first information, associated with the at least one first object, thereby deriving absolute position information of the at least one second object (Abeywardena, Para. 0093-0094 – “comparing an extracted visual feature to the detected visual feature” by using “grayscale matching, gradient matching, histogram matching, and other feature matching algorithms” to determine “a location of the feature” by “triangulation to determine the location of the visual feature from the locations associated with the two or more images in which the feature was detected”);
setting each matched second object of the at least one second object to constitute a corresponding object of the at least one object (Abeywardena, Para. 0093-0094 – detecting a “match between an extracted visual feature and a detected visual feature”); and
determine the derived absolute position of the vehicle based at least on absolute position information of the corresponding object and on the at least one object second relative position (Abeywardena, Para. 0036, 0093-0094 and 0122-0127 – determining “a location of the UAV” based on the identified feature detected by matching; for example, a relationship between a detected feature 456 and a localized feature 426 is determined, therefore, a “backup navigation system” of the UAV can determine “the current location of the UAV” using detected feature 456 and a “preloaded map” containing “locations and/or image data of landmarks that are located in the environment of the flight path”, or “known locations of the localized features”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the computerized positioning system of Barazovsky to include wherein the receiving of the second position information comprises receiving second position information indicative of an object second relative position of at least one second object with respect to the vehicle, wherein the determining of the derived absolute position of the vehicle in said step (c) comprises: i. perform a first matching of the object second relative position, of the at least one second object, with the at least one item of object first position information, comprised in the first information, associated with the at least one first object, thereby deriving absolute position information of the at least one second object; ii. setting each matched second object of the at least one second object to constitute a corresponding object of the at least one object; and iii. determine the derived absolute position of the vehicle based at least on absolute position information of the corresponding object and on the at least one object second relative position, as taught by Abeywardena, in order to improve the accuracy of vehicle position detection by providing a method of verifying and confirming the location of an object used during vehicle localization.
In regards to Claim 59, Barazovsky in view of Abeywardena teaches the computerized positioning system of Claim 58, and Barazovsky in view of Abeywardena further teaches wherein at least one of the following is true:
the first matching is based on the at least one item of object first position information, and the object relative second position, being indicative of a same position, within a defined tolerance;
the at least one object comprises a plurality of objects, wherein said step (c)(iii) comprises determining the derived absolute position of the vehicle based at least on absolute position information of a plurality of corresponding objects and on object second relative positions of the plurality of corresponding objects (Abeywardena, Para. 0036, 0093-0094, 0110 and 0122-0127 – determining “a location of the UAV” based on the identified feature detected by matching; for example, a relationship between a detected feature 456 and a localized feature 426 is determined, therefore, a “backup navigation system” of the UAV can determine “the current location of the UAV” using detected feature 456 and a “preloaded map” containing “locations and/or image data of landmarks that are located in the environment of the flight path”, or “known locations of the localized features”; where the UAV camera may, for example, detect multiple “buildings” located in one area of a city over multiple images);
the first matching comprises comparing a first map and a second map, the first map being indicative of the at least one item of object first position information, the second map being indicative of the at least one object second relative position; or
the setting each matched second object comprises: performing a second matching of the at least one object, with the at least one second object, based on the first matching.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the computerized positioning system including the above limitations of Barazovsky in view of Abeywardena to include the at least one object comprises a plurality of objects, wherein said step (c)(iii) comprises determining the derived absolute position of the vehicle based at least on absolute position information of a plurality of corresponding objects and on object second relative positions of the plurality of corresponding objects, as taught by Abeywardena, in order to improve the accuracy of position determination by utilizing a plurality of detected objects.
In regards to Claim 60, Barazovsky in view of Abeywardena teaches the computerized positioning system of Claim 59, and Abeywardena further teaches wherein the setting each matched second object further comprising: associating the object identification information of the each matched second object with the corresponding object of the at least one object (Abeywardena, Para. 0089, 0093-0094, 0172 – “comparing an extracted visual feature to the detected visual feature” by using “grayscale matching, gradient matching, histogram matching, and other feature matching algorithms” to determine that the two features are the same feature, the feature having “a unique identifier”, or identification information; where the feature may be “a building, an oil rig”, “a park, a parking lot”, “a billboard, a road sign,” etc.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the computerized positioning system including the above limitations of Barazovsky in view of Abeywardena to include wherein the setting each matched second object further comprising: associating the object identification information of the each matched second object with the corresponding object of the at least one object, as taught by Abeywardena, in order to improve the accuracy of position determination by determining that the matched objects are the same object for use when determining the position and to prevent erroneous classification of the objects.
Claim(s) 63-64 are rejected under 35 U.S.C. 103 as being unpatentable over Barazovsky in view of O’Brien, et al., hereinafter O’Brien (U.S. Patent Application Pub. No. 2019/0187239).
In regards to Claim 63, Barazovsky teaches the computerized positioning system of Claim 56, and Barazovsky teaches wherein the at least one object comprises a plurality of objects (Barazovsky, Fig. 5 and Col. 11 Lines 7-38 – where a UAV 514 may determine its position based on the location information from “three or more UAVs”, or a plurality of objects, “using triangulation algorithms”), but Barazovsky does not teach wherein the determining of the derived position of the vehicle in said step (c) further comprises: iv. for each object of the plurality of objects, determining a quality metric associated with a corresponding item of object identification information; and v. performing at least one of the following: (1) select objects of the plurality of objects, to be utilized in the determining of the derived position, based on the quality metric of the each object; and (2) assigning an object weight of the each object, based on the quality metric, and determining the derived position at least based on the object weight.
However, O’Brien teaches wherein the determining of the derived position of the vehicle in said step (c) further comprises:
for each object of the plurality of objects, determining a quality metric associated with a corresponding item of object identification information (O’Brien, Para. 0022, 0031-0033 – the autonomous vehicle identifying “verified radio frequency beacons” and “verified visual beacons” collected in image data based on comparing radio/visual frequencies, or a quality metric, wherein frequency matching indicates beacons located within “a subset of the geographic area”; where image data may further include “LED lights, landmarks, etc.” associated with a specific range); and
performing at least one of the following:
select objects of the plurality of objects, to be utilized in the determining of the derived position, based on the quality metric of the each object (O’Brien, Para. 0022, 0033 – wherein “verified” beacons are determined by comparing the frequencies, or quality metric, to known beacons “within the subset of the geographic area”, such that only beacons within the “subset of the geographic area” are used; where in another embodiment, images containing landmarks, objects, etc. are selected based on “certain ranges”, for example only utilizing “images in the direction of motion of the vehicle, and within a 15° arc of that direction of motion”); and
assigning an object weight of the each object, based on the quality metric, and determining the derived position at least based on the object weight.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the computerized positioning system of Barazovsky to include wherein the determining of the derived position of the vehicle in said step (c) further comprises: iv. for each object of the plurality of objects, determining a quality metric associated with a corresponding item of object identification information; and v. performing at least one of the following: (1) select objects of the plurality of objects, to be utilized in the determining of the derived position, based on the quality metric of the each object, as taught by O’Brien, in order to limit the plurality of objects to objects within a specific geographic area to improve “processing efficiency” (O’Brien, Para. 0033).
In regards to Claim 64, Barazovsky in view of O’Brien teaches the computerized positioning system of Claim 63, and Barazovsky in view of O’Brien further teaches wherein the quality metric of the each object is based at least on a level of geographic reasonableness associated with a corresponding item of object first position information of the each object (O’Brien, Para. 0022, 0033 – wherein “verified” beacons are determined by comparing the frequencies, or quality metric, to known beacons “within the subset of the geographic area”, such that only beacons within the “subset of the geographic area” are used; where in another embodiment, images containing landmarks, objects, etc. are selected based on “certain ranges”, for example only utilizing “images in the direction of motion of the vehicle, and within a 15° arc of that direction of motion” while meeting “a threshold level of approximation”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the computerized positioning system including the above limitations of Barazovsky in view of O’Brien to include wherein the quality metric of the each object is based at least on a level of geographic reasonableness associated with a corresponding item of object first position information of the each object, as taught by O’Brien, in order to limit the plurality of objects to objects within a specific geographic area to improve “processing efficiency” (O’Brien, Para. 0033).
Claim(s) 65-66 are rejected under 35 U.S.C. 103 as being unpatentable over Barazovsky in view of O’Brien, and further in view of Arditi, et al., hereinafter Arditi (U.S. Patent Application Pub. No. 2020/0309960).
In regards to Claim 65, Barazovsky in view of O’Brien teaches the computerized positioning system of Claim 63, but Barazovsky in view of O’Brien does not teach wherein the determining of the derived position in said step (c) further comprises the following: vi. defining a plurality of unique sub-sets of objects of the plurality of objects; vii. performing an interim position determination, based on a sub-set of the plurality of unique sub-sets, viii. thereby obtaining an interim value of the derived position of the vehicle; ix. determining a position weight associated with the interim value; x. repeating said steps vi to viii for each sub-set of the plurality of unique sub-sets, thereby deriving a plurality of interim values associated with corresponding position weights; and xi. weight the plurality of interim values, based at least on the corresponding position weights, thereby deriving a final value of the derived position of the vehicle, the final value constituting the derived position of the vehicle.
However, Arditi teaches wherein the determining of the derived position in said step (c) further comprises the following:
defining a plurality of unique sub-sets of objects of the plurality of objects (Arditi, Para. 0022-0030 – where a “ML model” for “correcting GPS data” to determine a “ground truth” is trained on a plurality of “training samples”, or subsets, of a larger “data set”, and utilizes received input data to determine a “ground truth” position; for example “environmental model data”, wherein the model only utilizes “environmental model data of structures within a predetermined radius (e.g., 200 meters)”);
performing an interim position determination, based on a sub-set of the plurality of unique sub-sets (Arditi, Para. 0022-0023, 0029 – wherein the “ML model” for “correcting GPS data” includes “a number of layers of neurons” wherein each layer provides an “output”),
thereby obtaining an interim value of the derived position of the vehicle (Arditi, Para. 0022-0023, 0029 – wherein the “ML model” for “correcting GPS data” includes “a number of layers of neurons” where “the output of one layer”, or interim value, “may be provided as the input to another layer of neurons”);
determining a position weight associated with the interim value (Arditi, Para. 0023 – “each neuron may independently adjust a weighting coefficient and bias coefficient” applied between each layer);
repeating said steps vi to viii for each sub-set of the plurality of unique sub-sets, thereby deriving a plurality of interim values associated with corresponding position weights (Arditi, Para. 0022-0023, 0027-0028 – wherein the “ML model” for “correcting GPS data” includes “a number of layers of neurons”, which “independently adjust a weighting coefficient and bias coefficient” between each layer; wherein the ML model is trained on a plurality of “training samples” of a larger “data set”); and
weight the plurality of interim values, based at least on the corresponding position weights, thereby deriving a final value of the derived position of the vehicle, the final value constituting the derived position of the vehicle (Arditi, Para. 0022-0030 – wherein the “ML model” for “correcting GPS data” outputs “an output corresponding to an inferred ground truth”, i.e. “absolute coordinates (e.g., longitude/latitude) of the ground truth” or “a vector offset from the input GPS data 610 that indicates the ground truth when GPS data 610 was measured”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the computerized positioning system including the above limitations of Barazovsky in view of O’Brien to include wherein the determining of the derived position in said step (c) further comprises the following: vi. defining a plurality of unique sub-sets of objects of the plurality of objects; vii. performing an interim position determination, based on a sub-set of the plurality of unique sub-sets, viii. thereby obtaining an interim value of the derived position of the vehicle; ix. determining a position weight associated with the interim value; x. repeating said steps vi to viii for each sub-set of the plurality of unique sub-sets, thereby deriving a plurality of interim values associated with corresponding position weights; and xi. weight the plurality of interim values, based at least on the corresponding position weights, thereby deriving a final value of the derived position of the vehicle, the final value constituting the derived position of the vehicle, as taught by Arditi, in order to improve the accuracy of the derived position of the vehicle.
In regards to Claim 66, Barazovsky in view of O’Brien and Arditi teaches the computerized positioning system of Claim 60 [interpreted by the examiner to be claim 65], and Barazovsky in view of O’Brien and Arditi further teaches wherein at least one of the following is true:
the defining of the plurality of unique sub-sets is based at least on the selecting of the objects to be utilized (Arditi, Para. 0030 – “environmental model data of structures within a predetermined radius (e.g., 200 meters) may be provided to the ML model”, such that only structures within the predetermined radius are selected for the ML model when “correcting a GPS location”); or
the determining of the position weight is based at least on a corresponding object weight of each object in the sub-set.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the computerized positioning system including the above limitations of Barazovsky in view of O’Brien and Arditi to include wherein at least one of the following is true: (a) the defining of the plurality of unique sub-sets is based at least on the selecting of the objects to be utilized, as taught by Arditi, in order to improve processing efficiency and accuracy by only using sub-sets of relevant objects when positioning.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Govindillam, et al. (U.S. Patent Application Pub. No. 2021/0132233) teaches methods, systems, and non-transitory computer-readable medium for distributed vehicle navigation processing for a vehicle, including determining whether a GNSS signal is below a threshold, and if so, performing a position resolution process to determine and transmit a position of the vehicle by one or more functions via an edge node or a cloud node.
Moskowitz, et al. (U.S. Patent Application Pub. No. 2021/0063162) teaches systems and methods for vehicle navigation, including determining a first position of the vehicle relative to a road navigation model, determining a second position of the vehicle based on a satellite signal, and determining, based on a comparison of the first position and the second position, error information associated with the second position.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HELEN LI whose telephone number is (703)756-4719. The examiner can normally be reached Monday through Friday, from 9am to 5pm Eastern.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hunter Lonsberry, can be reached at (571) 272-7298. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/H.L./Examiner, Art Unit 3665
/HUNTER B LONSBERRY/Supervisory Patent Examiner, Art Unit 3665