DETAILED ACTION
This Office action is in response to the application filed on 6/12/2024. Claims 1-20 are pending.
Specification
The disclosure is objected to because of the following informalities:
[0212] appears to contain a typographical error: “observed from just on UAV pose”, where “on” should be “one”
[0214] appears to contain a typographical error: “the map graph 901 may be considered input to the graph optimizer 901”, where “the graph optimizer 901” should be “the graph optimizer 902”
Appropriate correction is required.
Claim Objections
Claim 9 is objected to because of the following informalities:
Claim 9 appears to contain a typographical error: “causing the aerial vehicle to operated” is grammatically confusing. The limitation should read, for example, “causing the aerial vehicle to be operated”.
Appropriate correction is required.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Regarding claims 1-13, the claims recite “A computing system” and thus are a machine. Therefore, the claims are within at least one of the four statutory categories.
Regarding Prong I of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes.
Independent claim 1 includes limitations that recite an abstract idea (emphasized below).
A computing system configured to perform operations comprising:
determining cluster-portion observation data based on an aerial image of a portion of a ground-based cluster of charging pads for aerial vehicles, wherein the ground-based cluster comprises the charging pads arranged in a layout and a plurality of fiducial markers distributed at positions across the layout, wherein the aerial image was captured by an aerial vehicle while hovering above a particular charging pad within the portion of the ground-based cluster, and wherein the cluster-portion observation data comprises (i) information indicating a position of the particular charging pad and (ii) positions of one or more fiducial markers within the portion of the ground-based cluster relative to the particular charging pad;
identifying at least one mapped fiducial marker in a stored reference map of the ground-based cluster that matches at least one of the one or more fiducial markers in the cluster-portion observation data;
identifying a mapped charging pad in the stored reference map as a match to the particular charging pad in the cluster-portion observation data, wherein the mapped charging pad is identified based on the at least one mapped fiducial marker and its position in the stored reference map relative to the mapped charging pad;
determining a geolocation and an orientation of the particular charging pad according to a recorded geolocation and orientation for the identified mapped charging pad; and
outputting location information, the location information indicating that the aerial vehicle is located at the geolocation of the particular charging pad.
The examiner submits that the foregoing bolded limitations constitute a “mental process” because, under its broadest reasonable interpretation, the claim covers performance of the limitations in the human mind. For example, the “determining...”, “identifying...”, “identifying...”, and “determining...” steps, in the context of this claim, encompass forming a judgment regarding a geolocation and orientation of a charging pad based on evaluating observed and stored data. Accordingly, the claim recites at least four abstract ideas.
Regarding Prong II of the Step 2A analysis of the 2019 PEG, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra solution activity, or generally linking use of the judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application”.
In the present case, the additional limitations beyond the above-noted abstract idea(s) are as follows (where the underlined portions are the “additional limitations” while bolded portions continue to represent the “abstract idea”).
A computing system configured to perform operations comprising:
determining cluster-portion observation data based on an aerial image of a portion of a ground-based cluster of charging pads for aerial vehicles, wherein the ground-based cluster comprises the charging pads arranged in a layout and a plurality of fiducial markers distributed at positions across the layout, wherein the aerial image was captured by an aerial vehicle while hovering above a particular charging pad within the portion of the ground-based cluster, and wherein the cluster-portion observation data comprises (i) information indicating a position of the particular charging pad and (ii) positions of one or more fiducial markers within the portion of the ground-based cluster relative to the particular charging pad;
identifying at least one mapped fiducial marker in a stored reference map of the ground-based cluster that matches at least one of the one or more fiducial markers in the cluster-portion observation data;
identifying a mapped charging pad in the stored reference map as a match to the particular charging pad in the cluster-portion observation data, wherein the mapped charging pad is identified based on the at least one mapped fiducial marker and its position in the stored reference map relative to the mapped charging pad;
determining a geolocation and an orientation of the particular charging pad according to a recorded geolocation and orientation for the identified mapped charging pad; and
outputting location information, the location information indicating that the aerial vehicle is located at the geolocation of the particular charging pad.
For the following reason(s), the examiner submits that the above identified additional limitations do not integrate the above-noted abstract idea into a practical application.
Regarding the additional limitation of “A computing system configured to perform operations comprising”, the examiner submits the limitation is merely a tool being used to perform the abstract idea (or instructions to implement the abstract idea on a computer). Further, the “computing system” is recited at a high level of generality and merely describes how to generally “apply” the otherwise mental judgment in a generic or general-purpose vehicle control environment. The component merely automates the functional steps of the judicial exception and thus does not integrate a judicial exception into a “practical application”. See MPEP 2106.05(f). These limitations can also be viewed as nothing more than an attempt to generally link the use of the judicial exception to the technological environment of a computer. It should be noted that because the courts have made it clear that mere physicality or tangibility of an additional element or elements is not a relevant consideration in the eligibility analysis, the physical nature of these computer components does not affect this analysis. See MPEP 2106.05(I).
Regarding the additional limitation of “outputting location information, the location information indicating that the aerial vehicle is located at the geolocation of the particular charging pad”, the examiner submits the limitation is insignificant extra-solution activity that merely uses a computer (“computing system”) to perform a nominal or tangential addition to the claim. In particular, the “location information” is recited at a high level of generality and amounts to a mere post-solution application, which is a form of insignificant extra-solution activity. Additional elements that are considered extra-solution activities do not integrate the claim into a “practical application”. See MPEP 2106.05(g).
Thus, taken alone, the additional elements do not integrate the abstract idea into a practical application. Further, looking at the additional limitation(s) as an ordered combination or as a whole, the limitation(s) add nothing that is not already present when looking at the elements taken individually. For instance, there is no indication that the additional elements, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field, implement/use the above-noted judicial exception with a particular machine or manufacture that is integral to the claim, effect a transformation or reduction of a particular article to a different state or thing, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is not more than a drafting effort designed to monopolize the exception (MPEP § 2106.05). Accordingly, the additional limitation(s) do/does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea.
Regarding Step 2B of the 2019 PEG, independent claim 1 does not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for the same reasons as those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application.
As discussed above with respect to integration of the abstract idea into a practical application, the additional limitation of the “computing system” is merely a means to apply the exception and does not amount to “significantly more”. Adding the words “apply it” (or an equivalent) to the judicial exception, or mere instructions to implement an abstract idea on a computer (e.g., a limitation indicating that a particular function such as creating and maintaining electronic records is performed by a computer, as discussed in Alice Corp., 573 U.S. at 225-26, 110 USPQ2d at 1984), are not sufficient to amount to significantly more than the judicial exception.
Further, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B to determine whether it is more than what is well-understood, routine, conventional activity in the field. The additional limitation of “outputting location information…” is a well-understood, routine, conventional activity; see MPEP 2106.05(d)(II) and the cases cited therein, including:
Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), which indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner.
Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015) and OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93, which indicate that storing and retrieving of data is a well-understood, routine, and conventional function when it is claimed in a merely generic manner.
Hence, the claim is not patent eligible.
Regarding claims 14-19, the claims recite “A computer-implemented method” and thus are a process. Therefore, the claims are within at least one of the four statutory categories. Independent claim 14 rises and falls with independent claim 1. Thus, the claim is not patent eligible for the same reasons as discussed above with respect to claim 1. Discussion is omitted for brevity.
Hence, the claim is not patent eligible.
Regarding claim 20, the claim recites “An article of manufacture” and thus is a manufacture. Therefore, the claim is within at least one of the four statutory categories. Independent claim 20 recites limitations similar to those indicated above with respect to claim 1. Hence, the claim is not patent eligible for the same reasons as discussed above with respect to claim 1. Additional elements present in the independent claim are discussed below; all other limitations not discussed are the same as those discussed above with respect to claim 1.
Additionally, the claim recites the additional elements of “a non-transitory computer-readable medium, having stored thereon program instructions that, upon execution by a computing device, cause the computing device to perform operations comprising”. When evaluated in Prong II of the Step 2A analysis in the 2019 PEG, these additional elements do not integrate the above-noted abstract idea into a practical application. The limitations merely describe how to generally “apply” the otherwise mental judgments in a generic or general-purpose environment, are recited at a high level of generality, and merely automate the functional steps of the claim. Further, when evaluated in Step 2B of the 2019 PEG, the additional limitations amount to nothing more than applying the exception using a generic computer component. Generally applying an exception using a generic computer component cannot provide an inventive concept.
Hence, the claim is not patent eligible.
Dependent claims 2-13 and 15-19 do not recite any further limitations that cause the claims to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or well-understood, routine, and conventional additional elements that do not integrate the judicial exception into a practical application.
Hence, the claims are not patent eligible.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1, 3-5, 7-14, and 16-20 are rejected under 35 U.S.C. 103 as being unpatentable over Jourdan et al. (US 20200301445 A1) in view of Schubert et al. (US 20220357753 A1).
Regarding claim 1, and similarly claims 14 and 20, Jourdan teaches A computing system (“The controller 212 comprises part of the navigation system 200 and forms part of both the FNS 204 and the NFNS 208. In some embodiments, the controller 212 is configured to control additional systems of the UAV, such as the propulsion units 116, 120, the power source 128, the control surfaces 124, etc. The controller 212 includes a processor 224 (e.g., general processing units, graphical processing units, application specific integrated circuits); a data store 228 (a tangible machine readable storage medium); and modules that may be implemented as software logic (e.g., executable software code), firmware logic, hardware logic, or various combinations thereof.”, [0043], Figs. 2, 7) configured to perform operations comprising:
determining cluster-portion observation data based on an aerial image of a portion of a ground-based cluster of charging pads for aerial vehicles (“UAVs of the present disclosure may exhibit the functionality of camera 140 with more than one camera. One purpose of the camera 140 is to image fiducial markers as part of a fiducial navigation sub-system”, [0034], “An array of UAV charging pads (e.g., 728a and 728b) is positioned inside the UAV storage facility 700 in order to receive and recharge a plurality of UAVs (e.g., 100a and 100b).”, [0067], “Each UAV charging pad 728 is similar to the embodiment of FIG. 5 in that it is surrounded by a first group of small fiducial markers (e.g., 720m) and a second group of larger fiducial markers (e.g., medium-sized fiducial markers) (e.g., 720n).”, [0068], Figs. 5, 7), wherein the ground-based cluster comprises the charging pads arranged in a layout and a plurality of fiducial markers distributed at positions across the layout (“Each UAV charging pad 728 is similar to the embodiment of FIG. 5 in that it is surrounded by a first group of small fiducial markers (e.g., 720m) and a second group of larger fiducial markers (e.g., medium-sized fiducial markers) (e.g., 720n).”, [0068], Fig. 7), wherein the aerial image was captured by an aerial vehicle while hovering above a particular charging pad within the portion of the ground-based cluster (“UAV 100a is illustrated executing a precision landing maneuver on charging pad 728c. As UAV 100a descends to a first altitude of about 6 meters, its camera 140 images the medium-sized fiducial markers 770a-770d positioned around the charging pad 728c.”, [0075], Fig. 9), and wherein the cluster-portion observation data comprises (i) information indicating a position of the particular charging pad (“By associating each imaged fiducial marker 770a-770d with the corresponding unique fiducial dataset (e.g., 304a) from the fiducial map 300, the FNS 204 can accurately determine the UAV's navigation solution based upon the known locations at the four sides of the charging pad 728c”, [0075]) and (ii) positions of one or more fiducial markers within the portion of the ground-based cluster relative to the particular charging pad (“a fiducial marker is a reference point that “marks” a known geographical location such as a latitude, longitude, and altitude, and may also represent other known information, including a heading, zip code, or other information.”, [0041]);
identifying at least one mapped fiducial marker in a stored reference map of the ground-based cluster that matches at least one of the one or more fiducial markers in the cluster-portion observation data (“The fiducial map 300 stores information associated with each fiducial marker, which may be represented by the table of FIG. 3. The fiducial map 300 includes information for a population of fiducial markers, including for each individual fiducial marker.”, [0046], “At step 404, the controller 212 compares the image(s) captured by the camera 140 to unique fiducial datasets in the fiducial map 300 (for example, to codes associated with fiducial marker images), and if a match is made, positively identifies the imaged fiducial marker(s) with unique fiducial dataset(s) in the fiducial map 300.”, [0048], Figs. 2, 4);
outputting location information, the location information indicating that the aerial vehicle is located at the geolocation of the particular charging pad (“At step 412, the UAV 100 determines its own navigation solution based at least in part upon the known position of the imaged fiducial marker, analysis of the image (e.g., analyzing the size and position of the fiducial marker in the image), and/or input from one or more instruments (e.g., pitch and roll of camera 140).”, [0048], Fig. 4).
Jourdan teaches “the UAV 100 may have a flight plan loaded onto its controller 212 that includes the position of the charging pad 500 (e.g., latitude, longitude, and altitude). Fiducial markers 504a-504h are positioned around the UAV charging pad 500 in a pattern that assists the UAV 100 to land on the UAV charging pad 500 using the FNS 204 to determine its navigation solution” [0051].
Jourdan does not teach the “fiducial map 300” (Fig. 2, corresponds to Applicant’s “stored reference map”) comprises a “mapped charging pad”, but rather that the “fiducial map 300” comprises saved information regarding each “fiducial marker” ([0046], corresponds to Applicant’s “mapped fiducial marker”). Jourdan further teaches outputting a “navigation solution” ([0048], corresponds to Applicant’s “location information”) such that the “fiducial markers may be positioned around a UAV landing pad or charging pad to facilitate precision landing” [0049], i.e., the location of the charging pad and its associated fiducial marker locations are known from the aerial vehicle’s flight plan and the sensed fiducial markers are used to land the aerial vehicle [0051].
However, Schubert teaches
identifying a mapped location in the stored reference map as a match to the particular location, wherein the mapped location is identified based on the at least one mapped fiducial marker and its position in the stored reference map relative to the mapped location (where the term “location” is not part of Applicant’s claim, “in block 404, sensor data may be received from one or more sensors on the delivery vehicle. The sensor data may be indicative of a second region of the delivery destination. The sensors may include a camera”, [0124], “FIG. 7A illustrates aerial UDV 180 capturing sensor data representing a portion of delivery destination 602. Aerial UDV 180 may hover about, pan the camera on the UDV, or a combination thereof to sweep the sensor field of view 702 over the delivery destination to acquire sensor data representing a region of the delivery destination (e.g., the front of the house 602).”, [0141], “Once sensor data is captured, a second virtual model 700 may be generated based on the sensor data, as illustrated in FIG. 7B. The second virtual model may reflect the perspective from which the sensor data has been captured.”, [0142], “A mapping may be determined between the first model 600 and the second model 700 to determine an overlapping region between the first and second models, as illustrated in FIG. 8A.”, [0143], Figs. 4, 7B, 8A);
determining a geolocation and an orientation of the particular location according to a recorded geolocation and orientation for the identified mapped location (“In block 410, a position of the target drop-off spot within the second virtual model may be determined based on the overlapping region.”, [0127], “Once an overlapping region is identified, the overlapping region may be used to determine a position of the target drop-off spot 818 in the second virtual model 700. Specifically, determining the mapping may include determining a geometric transformation between representations of the one or more of the first physical features and representations of the one or more of the second physical features. Determining the position of target drop-off spot 818 in the second virtual model 700 may include applying the determined geometric transform to coordinates of the target drop-off spot within the first virtual model to determine coordinates of the target drop-off spot within the second virtual model. By determining the position of the target drop-off spot within the second virtual model, the control system may determine a spatial relationship between the delivery vehicle and the target drop-off spot within the delivery vehicle's perception of the delivery destination (e.g., within the delivery vehicle's coordinate system).”, [0152], Figs. 4, 8A); and
outputting location information, the location information indicating that the aerial vehicle is located at the geolocation of the particular location (“In block 412, based on the position of the target drop-off spot within the second virtual model, instructions may be provided to navigate the delivery vehicle to the target drop-off spot to place the object at the target drop-off spot.”, [0128], “the delivery vehicle may navigate to the target drop-off spot to place the object at the target drop-off spot based on the determined spatial relationship between the delivery vehicle and the target drop-off location, as illustrated in FIG. 8B.”, [0153], “a UDV 200 carrying a payload could simply land on the ground at a delivery location”, [0095], Figs. 4, 8A-8B).
Both Jourdan and Schubert teach matching markers in image data to mapped markers in a stored reference map, and using the matching of the stored and sensed markers to aid in an aerial vehicle landing at a location. Both Jourdan and Schubert teach outputting location information which indicates the landing location. Schubert further teaches matching a landing location in image data to a mapped landing location in the stored reference map, where the mapped landing location is identified based on the matching of the stored and sensed markers. Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the invention of Jourdan with the teachings of Schubert such that the operations of Jourdan further comprised determining and outputting location information of a landing location, as suggested by Schubert, with a reasonable expectation of success. The motivation for doing so would be to associate “physical features” with a “location” which is the destination of an aerial vehicle [0120] such that if the “physical features” are altered slightly, the aerial vehicle is still able to navigate [0146], as taught by Schubert.
Regarding claim 3, and similarly claim 16, Jourdan in view of Schubert teaches The computing system of claim 1, and Jourdan further teaches wherein the aerial image was captured by the aerial vehicle while hovering above the particular charging pad at an altitude above ground level (AGL) that meets particular criteria, wherein the particular criteria comprises hovering below an AGL at which GPS-derived geolocation and/or orientation yields at least a threshold accuracy with at least a threshold likelihood (“As UAV 100a descends to a first altitude of about 6 meters, its camera 140 images the medium-sized fiducial markers 770a-770d positioned around the charging pad 728c.”, [0075], Fig. 9, see also “The fiducial navigation transition module 236 causes the UAV 100 to transition from the non-fiducial navigation mode in which the UAV 100 navigates without aid of the FNS 204, to the fiducial navigation mode in which the UAV 100 navigates at least partially based upon the FNS 204. A variety of triggers may initiate the fiducial navigation transition module 236, for example: when NFNS 208 performance falls below a certain threshold (e.g., when a GPS signal weakens in a GPS-denied or GPS-degraded environment)”, [0082]).
Regarding claim 4, and similarly claim 17, Jourdan in view of Schubert teaches The computing system of claim 1, and Jourdan further teaches wherein the cluster-portion observation data further comprises information indicating respective identifying and orientation markings on the one or more fiducial markers (“As used in this disclosure, a fiducial marker is a reference point that “marks” a known geographical location such as a latitude, longitude, and altitude, and may also represent other known information, including a heading, zip code, or other information. A fiducial marker is uniquely identifiable by the UAV navigation system 200, and the UAV 100 may utilize the information associated with a fiducial marker to triangulate or otherwise determine its own navigation solution with high accuracy. Using fiducial marker 220 as an example, each fiducial marker is associated with a unique set of spatial coordinates (such as an altitude, latitude, and longitude). In this sense each fiducial marker is a “geo” fiducial marker. Each fiducial marker may be associated with additional information, for example a zip code, a true heading, and/or other information. A fiducial marker may represent its associated geographical information in a variety of ways, for example as a code including a 2-D bar code, a 2-D quick response (“QR”) code, a 3-D code, a pattern of time-correlated flashing visible light or other electromagnetic signals, etc. Fiducial marker 504f of FIG. 5 is representative of a fiducial marker visual code or appearance.”, [0041]).
Regarding claim 5, and similarly claim 18, Jourdan in view of Schubert teaches The computing system of claim 4, and Jourdan further teaches wherein identifying the at least one mapped fiducial marker in the stored reference map comprises:
comparing an identity and orientation recorded for the at least one mapped fiducial marker to the respective identifying and orientation markings on the one or more fiducial markers in the cluster-portion observation data (“The fiducial map 300 stores information associated with each fiducial marker, which may be represented by the table of FIG. 3. The fiducial map 300 includes information for a population of fiducial markers, including for each individual fiducial marker. For each individual fiducial marker, the fiducial map includes a unique fiducial dataset or “line item” (e.g., line item 304a and 304b) storing a code associated with an image of the fiducial marker, a latitude, and longitude of that fiducial marker. In some embodiments, the fiducial map 300 may also include an altitude, a heading, a zip code, and/or other information associated with one or more individual fiducial markers.”, [0046], “At step 404, the controller 212 compares the image(s) captured by the camera 140 to unique fiducial datasets in the fiducial map 300 (for example, to codes associated with fiducial marker images), and if a match is made, positively identifies the imaged fiducial marker(s) with unique fiducial dataset(s) in the fiducial map 300.”, [0048]).
Regarding claim 7, Jourdan in view of Schubert teaches The computing system of claim 1, and Schubert further teaches wherein the location information further indicates that an orientation of the aerial vehicle is an orientation reckoned with respect to an orientation of the particular charging pad (“the delivery vehicle may navigate to the target drop-off spot to place the object at the target drop-off spot based on the determined spatial relationship between the delivery vehicle and the target drop-off location, as illustrated in FIG. 8B.”, [0153]).
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to further modify the invention of Jourdan with the teachings of Schubert such that the operations of Jourdan further comprised determining and outputting an orientation of the aerial vehicle, as suggested by Schubert, with a reasonable expectation of success. The motivation for doing so would be to place the aerial vehicle in a specified location [0153], as taught by Schubert, relative to the charging pad.
Regarding claim 8, Jourdan in view of Schubert teaches The computing system of claim 1, and Jourdan further teaches wherein the location information further indicates that the aerial vehicle initiated a flight from the particular charging pad (“Upon takeoff, UAV 100 images the first group of fiducial markers 782a-782d to determine its navigation solution.”, [0076], “The fiducial initialization module 256 utilizes the FNS 204 to verify the NFNS 208 before the UAV 100 exits a GPS-degraded area or a GPS-denied area. For example, upon takeoff from a charging pad in a fiducial navigation zone, the fiducial initialization module 256 causes the UAV camera 140 to image one or more fiducial markers, and then to determine a navigation solution utilizing the FNS 204 that includes latitude and longitude (and in some embodiments, a heading).”, [0087]).
Regarding claim 9, Jourdan in view of Schubert teaches The computing system of claim 1, and Jourdan further teaches wherein outputting the location information comprises one or more of:
sending, via electronic transmission, the location information to the aerial vehicle; or
causing the aerial vehicle to operated based on the location information (“At step 412, the UAV 100 determines its own navigation solution based at least in part upon the known position of the imaged fiducial marker, analysis of the image (e.g., analyzing the size and position of the fiducial marker in the image), and/or input from one or more instruments (e.g., pitch and roll of camera 140).”, [0048], “the FNS 204 can navigate the UAV 100 toward the charging pad 500”, [0056], Fig. 4).
Regarding claim 10, and similarly claim 19, Jourdan in view of Schubert teaches The computing system of claim 1, and Jourdan further teaches wherein determining the cluster-portion observation data comprises:
causing the aerial vehicle to hover above the particular charging pad within the portion of the ground-based cluster (“UAV 100a is illustrated executing a precision landing maneuver on charging pad 728c. As UAV 100a descends to a first altitude of about 6 meters, its camera 140 images the medium-sized fiducial markers 770a-770d positioned around the charging pad 728c.”, [0075], Fig. 9); and
causing an imaging system of the aerial vehicle to capture the aerial image of the portion of the ground-based cluster, including the particular charging pad, while the aerial vehicle is hovering above the particular charging pad (“a descending UAV 100 uses its camera 140 to image the charging pad 500. At a first altitude 520 (e.g., 5 meters), the camera field of view 524 encompasses all fiducial markers 504a-504h.”, [0056], see also “its camera 140 images the medium-sized fiducial markers 770a-770d positioned around the charging pad 728c.”, [0075] citation above).
Regarding claim 11, Jourdan in view of Schubert teaches The computing system of claim 1, and Jourdan further teaches wherein determining the cluster-portion observation data comprises:
receiving, from the aerial vehicle, the cluster-portion observation data (“The UAV 100 includes a camera 140, which forms part of a navigation system”, [0034], “a descending UAV 100 uses its camera 140 to image the charging pad 500. At a first altitude 520 (e.g., 5 meters), the camera field of view 524 encompasses all fiducial markers 504a-504h.”, [0056]).
Regarding claim 12, Jourdan in view of Schubert teaches The computing system of claim 1, and Jourdan further teaches comprising a server (“an optional controller system 744 (e.g., a “control tower”) that includes a network 748, a data store 752, a controller 756 (e.g., servers in a distributed system, local computer, a combination thereof, or the like)”, [0069]) configured to perform one or more of the determining of the cluster-portion observation data, the identifying of the at least one mapped fiducial marker, the identifying the mapped charging pad, the determining the geolocation and the orientation, or the outputting the location information (“The logic, algorithms, interactions, relationships, properties, and other factors utilized by the modules of FIG. 2 are stored on the data store 228. In the illustrated embodiment, all the modules identified in FIG. 2 are stored on-board the UAV on the controller 212. In some embodiments, any module may be stored in part or in whole on external storage resources. Likewise, the modules of FIG. 2 are associated with the processor 224 of controller 212. In some embodiments, any module may be executed in part or in whole on one or more processors that are external to the UAV 100 (such as the controller system 744 of FIG. 7)”, [0081]).
Regarding claim 13, Jourdan in view of Schubert teaches The computing system of claim 1, and Jourdan further teaches comprising the aerial vehicle (“UAV 100”, Fig. 2), wherein the aerial vehicle is configured to perform one or more of the determining of the cluster-portion observation data, the identifying of the at least one mapped fiducial marker, the identifying the mapped charging pad, the determining the geolocation and the orientation, or the outputting the location information (“The controller 212 includes a processor 224 (e.g., general processing units, graphical processing units, application specific integrated circuits); a data store 228 (a tangible machine readable storage medium); and modules that may be implemented as software logic (e.g., executable software code), firmware logic, hardware logic, or various combinations thereof.”, [0043], see also [0081]).
Claim(s) 2, 15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Jourdan et al. (US 20200301445 A1) in view of Schubert et al. (US 20220357753 A1), and further in view of Breut et al. (US 20250157340 A1).
Regarding claim 2, and similarly claim 15, Jourdan in view of Schubert teaches The computing system of claim 1, and Jourdan further teaches wherein the aerial image was captured by the aerial vehicle while hovering above the particular charging pad at an altitude above ground level (AGL) that meets particular criteria, (“As UAV 100a descends to a first altitude of about 6 meters, its camera 140 images the medium-sized fiducial markers 770a-770d positioned around the charging pad 728c.”, [0075], Fig. 9, see also “The fiducial navigation transition module 236 causes the UAV 100 to transition from the non-fiducial navigation mode in which the UAV 100 navigates without aid of the FNS 204, to the fiducial navigation mode in which the UAV 100 navigates at least partially based upon the FNS 204. A variety of triggers may initiate the fiducial navigation transition module 236, for example:…when the UAV 100 descends below a threshold altitude (e.g., 50 meters)”, [0082] and [0056-0058] and Figs. 5-6).
Jourdan in view of Schubert does not explicitly teach the particular criteria. However, Breut teaches
wherein the particular criteria comprises hovering below an AGL at which at least one aviation regulation applies (“The relevant operational condition can be imposed by regulation, by the control system 116, about a maximum speed or minimum/maximum altitude profile when accessing a landing area or pit stop area.”, [0302], “the at least one central server system processor 220 determines the landing position vector based at least in part on the external sensing system data. For example, the external sensing system data may indicate a clear, flat ground area (e.g. via LIDAR data or the image data confirming the absence of an object or human in the landing area) and the at least one central server system processor 220 may determine a landing position vector within the clear, flat ground area.”, [0307]).
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the invention of Jourdan in view of Schubert with the teachings of Breut such that the aerial image is captured below a maximum altitude imposed by regulation, as suggested by Breut, with a reasonable expectation of success. The motivation for doing so would be to improve safety by docking the aerial vehicle only when certain conditions are met, as suggested by Breut [0301-0302].
Claim(s) 6 is/are rejected under 35 U.S.C. 103 as being unpatentable over Jourdan et al. (US 20200301445 A1) in view of Schubert et al. (US 20220357753 A1), and further in view of Sharma et al. (US 20170017240 A1).
Regarding claim 6, Jourdan in view of Schubert teaches The computing system of claim 5, wherein identifying the at least one mapped fiducial marker in the stored reference map comprises:
Jourdan teaches “positively identifies the imaged fiducial marker(s) with unique fiducial dataset(s) in the fiducial map 300” [0048] but does not explicitly teach how this is accomplished.
However, Sharma teaches
reorienting at least part of the cluster-portion observation data to match an orientation of the stored reference map (“the 3D orientation of the at least one media sensor with respect to the centroid of the marker may be determined by utilizing vision based techniques. In an embodiment, the orientation of the at least one media sensor with respect to the centroid of the marker the system 200 may be caused to align and compare the stored marker image in the system 200 with the captured marker image to determine a correspondence between the captured marker image and the stored marker image.”, [0037]).
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the invention of Jourdan in view of Schubert with the teachings of Sharma such that comparing the identity and orientation of the mapped fiducial marker to the sensed fiducial markers comprises aligning the mapped and sensed fiducial markers, as suggested by Sharma, with a reasonable expectation of success. Aligning sensor data to stored data to determine a match is well known in the art, and this modification would require only routine skill. See KSR International Co. v. Teleflex Inc. (KSR), 550 U.S. 398, 82 USPQ2d 1385 (2007).
Conclusion
The prior art made of record and not relied upon is considered pertinent to Applicant's disclosure: See Notice of References Cited.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMELIA VORCE whose telephone number is (313) 446-4917. The examiner can normally be reached on Monday-Friday, 9AM-6PM, Central Time.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne Antonucci can be reached at (313) 446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AMELIA VORCE/ Primary Examiner, Art Unit 3666