DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
Claims 1-20 of US Application No. 18/655,032, filed on 03 May 2024, are currently pending and have been examined.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “flight system that provides a drone flight path” and “central management system configured to control flight operations” in claim 10.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 7 and 15 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claims 7 and 15 recite “wherein the flight plans include a series of drone positions and a timecode interface, and the flight controller provides a status interface to inform the central management system of a drone status”. It is not clear how a flight plan, which is a non-physical entity, can include a timecode interface, which is a physical entity. Accordingly, the claim is indefinite. For this Detailed Action, the Examiner interprets “a timecode interface” as simply ‘a timecode’, consistent with claim 16.
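Purely as an illustrative aid (not part of the record), the interpretation of a flight plan as a series of drone positions, each associated with a timecode, can be sketched as a simple data structure; all names below are hypothetical and are not drawn from the application:

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical names; illustrative only.
@dataclass
class Waypoint:
    position: Tuple[float, float, float]  # (x, y, z) drone position
    timecode: float                       # time at which the position should be reached

# Under the interpretation above, a flight plan is simply an ordered
# series of drone positions, each paired with a timecode (not an
# "interface" in any structural sense).
flight_plan: List[Waypoint] = [
    Waypoint((0.0, 0.0, 10.0), 0.0),
    Waypoint((5.0, 0.0, 12.0), 2.0),
    Waypoint((5.0, 5.0, 12.0), 4.0),
]
```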
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 16-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
In January 2019 (updated October 2019), the USPTO released new examination guidelines setting forth a two-step inquiry for determining whether a claim is directed to non-statutory subject matter. According to the guidelines, a claim is directed to non-statutory subject matter if:
STEP 1: the claim does not fall within one of the four statutory categories of invention (process, machine, manufacture or composition of matter), or
STEP 2: the claim recites a judicial exception, e.g. an abstract idea, without reciting additional elements that amount to significantly more than the judicial exception, as determined using the following analysis:
STEP 2A (PRONG 1): Does the claim recite an abstract idea, law of nature, or natural phenomenon?
STEP 2A (PRONG 2): Does the claim recite additional elements that integrate the judicial exception into a practical application?
STEP 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
Using the two-step inquiry, it is clear that claims 16-20 are directed toward non-statutory subject matter, as shown below:
STEP 1: Does claim 16 fall within one of the statutory categories? Yes. Independent claim 16 is directed toward a process, which falls within one of the statutory categories.
STEP 2A (PRONG 1): Is the claim directed to a law of nature, a natural phenomenon or an abstract idea? Yes, independent claim 16 is directed to an abstract idea.
With regard to STEP 2A (PRONG 1), a claim that recites an abstract idea, a law of nature, or a natural phenomenon is directed to a judicial exception. The guidelines provide three groupings of subject matter that are considered abstract ideas:
Mathematical concepts – mathematical relationships, mathematical formulas or equations, mathematical calculations;
Certain methods of organizing human activity – fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions); and
Mental processes – concepts that are practically performed in the human mind (including an observation, evaluation, judgment, opinion).
See the 2019 Revised Patent Subject Matter Eligibility Guidance. With respect to mental processes, the courts do not distinguish between mental processes that are performed entirely in the human mind and mental processes that require a human to use a physical aid (e.g., pen and paper or a slide rule) to perform the claim limitation. Nor do the courts distinguish between claims that recite mental processes performed by humans and claims that recite mental processes performed on a computer.
Independent claim 16 recites “determining, based at least in part on the drone flight path, the drone position information, and the first drone type, a control signal to be provided to a drone controller to provide a drone movement to a drone position of the series of drone positions”. These limitations may be performed in the human mind. For example, a person having a flight path, drone position, and drone type, may determine a control signal intended to move the drone along a desired path. Therefore, claim 16 recites an abstract idea.
STEP 2A (PRONG 2): Does the claim recite additional elements that integrate the judicial exception into a practical application? No, claim 16 does not recite additional elements that integrate the judicial exception into a practical application.
With regard to STEP 2A (prong 2), even when a judicial exception is recited in the claim, an additional claim element(s) that integrates the judicial exception into a practical application of that exception renders the claim eligible under §101. The guidelines provide the following exemplary considerations that are indicative that an additional element (or combination of elements) may have integrated the judicial exception into a practical application:
an additional element reflects an improvement in the functioning of a computer, or an improvement to other technology or technical field;
an additional element that applies or uses a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition;
an additional element implements a judicial exception with, or uses a judicial exception in conjunction with, a particular machine or manufacture that is integral to the claim;
an additional element effects a transformation or reduction of a particular article to a different state or thing; and
an additional element applies or uses the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception.
While the guidelines further state that the exemplary considerations are not an exhaustive list and that there may be other examples of integrating the exception into a practical application, the guidelines also list examples in which a judicial exception has not been integrated into a practical application:
an additional element merely recites the words “apply it” (or an equivalent) with the judicial exception, or merely includes instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea;
an additional element adds insignificant extra-solution activity to the judicial exception; and
an additional element does no more than generally link the use of a judicial exception to a particular technological environment or field of use.
In the instant application, claim 16 does not recite additional elements that integrate the judicial exception into a practical application of that exception. Claim 16 recites the additional elements “configuring the drone as a first drone type of the plurality of different types of drones”, “receiving a drone flight path from a central management system, the drone flight path including a series of drone positions and a timecode associated with each drone position of the series of drone positions”, and “receiving drone position information from one or more sensors”.
As noted above, adding insignificant extra-solution activity to the judicial exception is indicative that the judicial exception has not been integrated into a practical application. Insignificant extra-solution activity includes data gathering and outputting. See MPEP 2106.05(g). Receiving a drone flight path and drone position information is data gathering. Therefore, these additional elements just add insignificant extra-solution activity to the judicial exception.
Also as noted above, doing no more than generally linking the use of a judicial exception to a particular technological environment or field of use is indicative that the judicial exception has not been integrated into a practical application. Configuring the drones as different types merely links the judicial exception to the technological environment of drones.
Therefore, claim 16 does not recite additional elements that integrate the judicial exception into a practical application of that exception.
STEP 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? No, claim 16 does not recite additional elements that amount to significantly more than the judicial exception.
With regard to STEP 2B, whether the claim recites additional elements that provide significantly more than the recited judicial exception, the guidelines specify that the pre-guidance procedure is still in effect. Specifically, examiners should continue to consider whether an additional element or combination of elements:
adds a specific limitation or combination of limitations that are not well-understood, routine, conventional activity in the field, which is indicative that an inventive concept may be present; or
simply appends well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, which is indicative that an inventive concept may not be present.
Claim 16 does not recite any specific limitation or combination of limitations that are not well-understood, routine, conventional (WURC) activity in the field.
Using a generic computer to perform generic computing functions is WURC activity. Generic computing functions include 1) performing repetitive calculations, 2) receiving, processing, and storing data, 3) electronically scanning or extracting data from a physical document, 4) electronic recordkeeping, 5) automating mental tasks, and 6) receiving or transmitting data over a network, e.g., using the Internet to gather data. See MPEP 2106.05(d)(II). Receiving a drone flight path from a central management system, given its broadest reasonable interpretation, is receiving data over a network. Receiving data over a network is WURC activity in the field. Further, receiving position information from sensors is also WURC activity in the field. Garcia, cited below, is one example of receiving position information from sensors, as indicated in the rejections below. The additional elements, both individually and in combination, are well-understood, routine, conventional activity in the field.
CONCLUSION
Thus, since claim 16 (a) is directed toward an abstract idea, (b) does not recite additional elements that integrate the judicial exception into a practical application, and (c) does not recite additional elements that amount to significantly more than the judicial exception, it is clear that claim 16 is directed towards non-statutory subject matter.
The Examiner notes that amending claim 16 to positively recite a step of controlling movement of the drones along the flight path based on the control signal might, depending on the scope of the amended claim, be an additional element that applies or uses the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception.
Claim 17 recites “determining, based at least on the first drone type, one or more control commands to be provided to a modular payload that is coupled with the drone, the modular payload including one or more of a character body, a puppet, a mirror, a screen, a smoke generator, a dust generator, a laser, or any combinations thereof”, which may be performed mentally. Claim 17 does not recite any new additional elements. Therefore, claim 17 does not recite any additional elements that integrate the judicial exception into a practical application of that exception or amount to significantly more than the judicial exception for the same reasons as claim 16.
Claim 18 further defines a previously-identified abstract idea, i.e., determining one or more commands. However, even as further defined, the abstract idea may be performed mentally. Claim 18 does not recite any new additional elements. Therefore, claim 18 does not recite any additional elements that integrate the judicial exception into a practical application of that exception or amount to significantly more than the judicial exception for the same reasons as claim 16.
Claim 19 recites new additional elements “transmitting, to the central management system, a drone status that includes the drone position and current timecode of the drone” and “receiving, from the central management system, an update to the drone flight path that updates one or more subsequent drone positions associated with one or more subsequent timecodes”. Transmitting a drone status is data outputting and receiving an update to the drone flight path is data gathering. Further, transmitting the drone status to a central management system and receiving an update from the central management system are both examples of receiving or transmitting data over a network, i.e., WURC activity in the field. Therefore, claim 19 does not recite any additional elements that integrate the judicial exception into a practical application of that exception or amount to significantly more than the judicial exception.
Claim 20 further defines a previously-identified additional element, i.e., receiving drone position information. However, even as further defined, the additional element is still data gathering. Further, the additional element is still WURC activity in the field. Therefore, claim 20 does not recite any additional elements that integrate the judicial exception into a practical application of that exception or amount to significantly more than the judicial exception.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 16, 17, 19, and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Garcia Morchon et al. (US 2017/0221394 A1, “Garcia”).
Regarding claim 16, Garcia discloses a system and device for creating an aerial image and teaches:
configuring the drone as a first drone type of the plurality of different types of drones (multiple flying devices 107 FLYDEV – see at least Fig. 1 and ¶ [0038]; system SYS may comprise a mixture of drones FLYDEV where some drones have only a light unit, some drones have only a particle generator, and some drones have both a light unit and a particle generator – see at least ¶ [0046]);
receiving a drone flight path from a central management system, the drone flight path including a series of drone positions and a timecode associated with each drone position of the series of drone positions (computer program run on an external computer may generate a configuration file comprising flying paths, corresponding timing, and light control data for respective drones, where the configuration file may be uploaded to control unit 101 – see at least Fig. 1 and ¶ [0068]);
receiving drone position information from one or more sensors (camera and radar may be used to determine an actual position of drone FLYDEV – see at least ¶ [0049]; GPS to determine drone position – see at least ¶ [0061]);
determining, based at least in part on the drone flight path, the drone position information, and the first drone type, a control signal to be provided to a drone controller to provide a drone movement to a drone position of the series of drone positions (a difference between an intended position and the actual position may be used by control unit CTRLU to correct the actual position and maintain the drone at its intended position – see at least ¶ [0049]).
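As an illustrative aid only, the position-correction logic paraphrased from Garcia at ¶ [0049] (a difference between an intended position and the actual position is used to correct the actual position) resembles a simple proportional feedback step. The function name and gain value below are assumptions for illustration, not Garcia's disclosure:

```python
# Hypothetical sketch: the error between intended and actual (sensed)
# position is scaled into a per-axis correction command that moves the
# drone back toward its intended position. The gain is an assumed value.
def correction_signal(intended, actual, gain=0.5):
    """Return a per-axis velocity command proportional to position error."""
    return tuple(gain * (i - a) for i, a in zip(intended, actual))

# Example: drone drifted 2 m in +x from its intended position; the
# resulting command is negative in x, driving the drone back.
cmd = correction_signal(intended=(0.0, 0.0, 10.0), actual=(2.0, 0.0, 10.0))
```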
Regarding claim 17, Garcia further teaches:
determining, based at least on the first drone type, one or more control commands to be provided to a modular payload that is coupled with the drone (computer program run on an external computer may generate a configuration file comprising flying paths, corresponding timing, and light control data for respective drones, where the configuration file may be uploaded to control unit 101 – see at least Fig. 1 and ¶ [0068]), the modular payload including one or more of a character body, a puppet, a mirror, a screen, a smoke generator, a dust generator, a laser, or any combinations thereof (light unit may be a laser – see at least ¶ [0055]; drones may comprise a particle generator, which may generate smoke or water, to generate a particle cloud 411 – see at least Fig. 4 and ¶ [0046], [0073]).
Regarding claim 19, Garcia further teaches:
transmitting, to the central management system, a drone status that includes the drone position and current timecode of the drone (each drone may send its current position at regular time intervals to control unit 101 – see at least ¶ [0069]); and
receiving, from the central management system, an update to the drone flight path that updates one or more subsequent drone positions associated with one or more subsequent timecodes (a difference between the intended position and the actual position may be used by the control unit CTRLU to correct said actual position and maintain said drone at its intended position – see at least ¶ [0049]).
Regarding claim 20, Garcia further teaches:
wherein the one or more sensors comprise one or more of: a global navigation satellite system (GNSS) module; a ground based radio navigation system receiver module; a visual navigation module; an inertial navigation system module; an air data sensor that measures one or more of airspeed, altitude, and angle of attack; an angle of attack sensor; a magnetometer; a radio altimeter sensor; a proximity sensor; or any combinations thereof (GPS to determine drone position – see at least ¶ [0061]; camera and radar may be used to determine an actual position of drone FLYDEV – see at least ¶ [0049]; data, e.g., wind speed, collected from a variety of sensors – see at least ¶ [0049]; camera and radar to detect other objects – see at least ¶ [0049]).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-3, 5, 8, and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Moon et al. (US 2023/0152068 A1, “Moon”) in view of Zhou et al. (US 2020/0020236 A1, “Zhou”).
Regarding claim 1, Moon discloses an apparatus and method for integrated control of flight of the unmanned aerial vehicles and teaches:
a flight controller configured to execute drone flight movements based at least in part on a flight path received from a central management system (flight control signal generator 121 of flight controller 120 generates flight control signals for control on the basis of received flight information – see at least Figs. 1-3 and ¶ [0052], [0060]; flight information receiver 111 receives flight information from a manager terminal, the flight information including flight paths – see at least Fig. 1 and ¶ [0047]-[0048]);
one or more sensors coupled with the flight controller that provide drone position information (state information receiver 140 receives real time flight state information from the unmanned aerial vehicle, including current position – see at least Fig. 1 and ¶ [0055]-[0056]); and
an abstraction layer coupled with the flight controller that provides a drone hardware interface and is adapted to be coupled with a drone to provide control signals to the drone (flight control signal converter 122 converts generated flight control signals into a supported format corresponding to the unmanned aerial vehicles – see at least Fig. 3 and ¶ [0061]; flight control signals are signals for controlling position, path, speed and angle – see at least ¶ [0060]) [ ], and wherein the abstraction layer converts signals from the flight controller into the control signals [ ] (flight control signal converter 122 converts generated flight control signals into a supported format corresponding to the unmanned aerial vehicles – see at least Fig. 3 and ¶ [0061]).
Moon fails to teach the control signals for one or more of a drone motor controller, a drone battery controller, a drone charging controller, or any combinations thereof; based at least in part on a drone type of the drone that is selected from a plurality of different available drone types.
However, Zhou discloses methods and systems for supporting flight restriction of unmanned aerial vehicles and teaches:
the control signals for one or more of a drone motor controller, a drone battery controller, a drone charging controller, or any combinations thereof (terminal 1012 may provide control data to movable object 1000, such as instructions for controlling propulsion mechanisms 1006 – see at least Fig. 18 and ¶ [0318]);
converts signals based at least in part on a drone type of the drone that is selected from a plurality of different available drone types (In step 2506, the one or more commands can be converted into one or more flight instructions executable by the UAV. UAVs of various manufacturers and various models can have different operating system and/or different hardware configuration, therefore, it can be necessary to convert the received commands to executable flight instructions. The conversion can be performed by one or more processors onboard the UAV. For example, the commands received from the remote server can be converted into flight instructions compatible with the instruction set of the UAV operating system – see at least ¶ [0219]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the apparatus and method for integrated control of flight of the unmanned aerial vehicles of Moon to provide for converting signals based on drone type, as taught by Zhou, with a reasonable expectation of success, because converting signals based on drone type would accommodate drones of various models that have different operating systems or hardware configurations such that the commands are compatible with the UAV operating system (Zhou at ¶ [0219]).
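As an illustrative aid only, the type-dependent conversion Zhou describes at ¶ [0219] (received commands converted into flight instructions compatible with a particular UAV's operating system and hardware) can be sketched as follows; the drone type names and instruction formats are hypothetical, not Zhou's disclosure:

```python
# Hypothetical generic command, as might be received from a central
# management system before conversion.
GENERIC_COMMAND = {"action": "goto", "x": 5.0, "y": 0.0, "z": 12.0}

def convert_for_type(command, drone_type):
    """Convert a generic command into a type-specific instruction format,
    in the spirit of Zhou's conversion to a UAV-compatible instruction set."""
    if drone_type == "model_a":
        # Assumed: model A firmware expects a flat positional tuple.
        return ("GOTO", command["x"], command["y"], command["z"])
    if drone_type == "model_b":
        # Assumed: model B firmware expects a textual command string.
        return f"goto {command['x']} {command['y']} {command['z']}"
    raise ValueError(f"unsupported drone type: {drone_type}")
```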
Regarding claim 2, Moon further teaches:
wherein the abstraction layer includes one or more hardware components, one or more software modules, or any combinations thereof (flight control signal converter 122 converts generated flight control signals into a supported format corresponding to the unmanned aerial vehicles – see at least Fig. 3 and ¶ [0061]; processes may be performed by one or more programmable processors executing one or more computer programs to perform functions – see at least ¶ [0060]).
Regarding claim 3, Moon further teaches:
a modular payload adapted to be coupled with the drone, the modular payload including one or more of a character body, a puppet, a mirror, a screen, a smoke generator, a dust generator, a laser, an LED light, pyrotechnics, or any combinations thereof (fireworks product 300 may be a firework or an LED light used for fireworks – see at least Fig. 1 and ¶ [0047]).
Regarding claim 5, Moon further teaches:
wherein the abstraction layer is adapted to be coupled with at least two different types of drones of the plurality of different available drone types and provides a common interface to the flight controller for the at least two different types of drones, and wherein the common interface is compatible with flight plans received at the flight controller from the central management system (flight control signal converter 122 converts generated flight control signals into a supported format corresponding to the unmanned aerial vehicles – see at least Fig. 3 and ¶ [0061]).
Regarding claim 8, Moon further teaches:
wherein the flight controller comprises: a guidance and navigation module that directs movements of the drone (unmanned aerial vehicle flight controller 120 generates flight control signals – see at least Fig. 1 and ¶ [0058]) and a communication interface that provides a [ ] communications link with the central management system (flight information receiver 111 receives flight information from a manager terminal, the flight information including flight paths – see at least Fig. 1 and ¶ [0047]-[0048]).
Moon fails to teach the communication interface providing wireless communication. However, Zhou further teaches:
a communication interface that provides a wireless communications link with the central management system (flight controller 620 may communicate with mobile devices via wireless modules – see at least ¶ [0262]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combined apparatus and method for integrated control of flight of the unmanned aerial vehicles of Moon and Zhou to provide for wireless communication, as further taught by Zhou, with a reasonable expectation of success, because wireless communication would provide direct communication from the external device (Zhou at ¶ [0262]).
Regarding claim 9, Moon further teaches:
wherein the one or more sensors comprise one or more of: a global navigation satellite system (GNSS) module; a ground based radio navigation system receiver module; a visual navigation module; an inertial navigation system module; an air data sensor that measures one or more of airspeed, altitude, and angle of attack; an angle of attack sensor; a magnetometer; a radio altimeter sensor; a proximity sensor; or any combinations thereof (state information receiver 140 receives flight state information, which may include at least one selected from the group of unique numbers, speeds, wind speeds, and current positions to which position correction values according to the ignition of the fireworks products are applied).
Claims 4 and 7 are rejected under 35 U.S.C. 103 as being unpatentable over Moon in view of Zhou, as applied to claim 3 above, and further in view of Garcia.
Regarding claim 4, Moon and Zhou fail to teach but Garcia discloses a system and device for creating an aerial image and teaches:
wherein the modular payload includes the [particle] generator and the laser, and wherein the flight controller is further configured to release a cloud of [particles] and activate the laser to provide an output directed toward the cloud of [particles] thereby making a beam from the laser visible to create a floating volumetric screen (drone has a particle generator to create a particle cloud 411 and a light source for generating a light beam 412 directed to the particle cloud – see at least Fig. 4 and ¶ [0073]; drone 405 may include a light source and a particle generator – see at least Fig. 4 and ¶ [0076]; light source may emit a laser beam – see at least ¶ [0055]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combined apparatus and method for integrated control of flight of the unmanned aerial vehicles of Moon and Zhou to provide a particle generator and laser, as taught by Garcia, with a reasonable expectation of success, because the particle generator and laser may be used to generate images in a space (Garcia at ¶ [0001]).
Garcia does not teach the particles being dust. However, Garcia discloses the claimed limitation except that Garcia uses particles, such as water or smoke, instead of dust, to create the floating volumetric screen. It would have been an obvious matter of design choice to use water or smoke particles instead of dust particles to create the floating volumetric screen, since Applicant has not disclosed that dust particles solve any stated problem or are for any particular purpose, and it appears that the invention would perform equally as well with water or smoke particles.
Regarding claim 7, Moon and Zhou fail to teach but Garcia discloses a system and device for creating an aerial image and teaches:
wherein the flight plans include a series of drone positions and a timecode [ ] (configuration file may include a required position of each drone FLYDEV at a given moment – see at least ¶ [0068]), and the flight controller provides a status interface to inform the central management system of a drone status (each drone may send its current position at regular time intervals to control unit 101 – see at least ¶ [0069]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combined apparatus and method for integrated control of flight of the unmanned aerial vehicles of Moon and Zhou to provide flight plans with positions and timecodes, as taught by Garcia, with a reasonable expectation of success, because the positions and timecodes may create a flying image in a volume of interest (Garcia at ¶ [0068]).
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Moon in view of Zhou, as applied to claim 5 above, and further in view of Hovey et al. (US 10,723,454 B1, “Hovey”).
Regarding claim 6, Moon and Zhou fail to teach but Hovey discloses an aerial show system using unmanned aerial vehicles as creative elements in the show and teaches:
wherein a first type of drone of the two or more different types of drones is a character drone, and a second type of drone of the two or more different types of drones is a light drone (UAVs 310, 311 to animate a show effect device 330 – see at least Fig. 3 and 8:58-67; show effect device may include particles to form a projection surface and a projector for projecting light, e.g., laser light, onto the projection surface – see at least 2:61-67).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combined apparatus and method for integrated control of flight of the unmanned aerial vehicles of Moon and Zhou to provide different types of drones, such as a character drone and light drones, as taught by Hovey, with a reasonable expectation of success, because the different types of drones may be used to produce dynamic show effects as the UAVs fly through the show's airspace (Hovey at 1:10-15).
Claims 10-13 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Garcia in view of Moon and Zhou.
Regarding claim 10, Garcia discloses a system for creating an aerial image and teaches:
a first set of drones including a plurality of drones having a first drone type of two or more different types of drones (multiple flying devices 107 FLYDEV – see at least Fig. 1 and ¶ [0038]; system SYS may comprise a mixture of drones FLYDEV where some drones have only a light unit, some drones have only a particle generator, and some drones have both a light unit and a particle generator – see at least ¶ [0046]);
a second set of drones including one or more drones having a second drone type (multiple flying devices 107 FLYDEV – see at least Fig. 1 and ¶ [0038]; system SYS may comprise a mixture of drones FLYDEV where some drones have only a light unit, some drones have only a particle generator, and some drones have both a light unit and a particle generator – see at least ¶ [0046]);
a flight system that provides a drone flight path to each drone of the first set of drones and the second set of drones (control unit 101 VCTRL may determine control data for controlling the drones – see at least Fig. 1 and ¶ [0044]; drones FLYDEV may fly to respective positions according to the control data – see at least ¶ [0044]; FLYDEV may be a quad-copter, type of helicopter, or fixed wing aerial vehicle – see at least ¶ [0050]-[0051]); and
a central management system (control unit 101 + user interface 102 + remote content server 110 – see at least Fig. 1 and ¶ [0044]) configured to control flight operations for each of the first set of drones and the second set of drones based at least on the drone flight path of each drone of the first set of drones and the second set of drones (image to be represented may be retrieved from server 110 using user interface 102, where the control unit 101 determines control data so that the drones represent the image – see at least ¶ [0044]; control unit 101 VCTRL may determine control data for controlling the drones – see at least Fig. 1 and ¶ [0044]; drones FLYDEV may control their respective light units according to the control data – see at least ¶ [0044]; control data may comprise positions of respective drones FLYDEV and on-off status for each respective light source – see at least ¶ [0045]);
wherein each drone of each of the first set of drones and the second set of drones comprises:
a flight controller configured to execute drone flight movements based at least in part on a flight path received from a central management system (drone FLYDEV has a processor connected to the engine and arranged to control a position of the drone FLYDEV by sending an appropriate command to the engine – see at least ¶ [0039]); and
[ ].
Garcia fails to teach an abstraction layer coupled with the flight controller and a drone hardware interface and that provides control signals via the drone hardware interface for one or more of a drone motor controller, a drone battery controller, a drone charging controller, or any combinations thereof, and wherein the abstraction layer converts signals from the flight controller into the control signals based at least in part on whether the associated drone is the first drone type or the second drone type.
However, Moon discloses an apparatus and method for integrated control of flight of the unmanned aerial vehicles and teaches:
an abstraction layer coupled with the flight controller and a drone hardware interface and that provides control signals via the drone hardware interface [ ] (flight control signal converter 122 converts generated flight control signals into a supported format corresponding to the unmanned aerial vehicles – see at least Fig. 3 and ¶ [0061]), and wherein the abstraction layer converts signals from the flight controller into the control signals [ ] (flight control signal converter 122 converts generated flight control signals into a supported format corresponding to the unmanned aerial vehicles – see at least Fig. 3 and ¶ [0061]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system for creating an aerial image of Garcia to provide an abstraction layer, as taught by Moon, with a reasonable expectation of success, because the abstraction layer may convert flight control signals into a supported format for the drone (Moon at ¶ [0061]).
Moon fails to teach the control signals for one or more of a drone motor controller, a drone battery controller, a drone charging controller, or any combinations thereof; based at least in part on a drone type of the drone that is selected from a plurality of different available drone types.
However, Zhou discloses methods and systems for supporting flight restriction of unmanned aerial vehicles and teaches:
the control signals for one or more of a drone motor controller, a drone battery controller, a drone charging controller, or any combinations thereof (terminal 1012 may provide control data to movable object 1000, such as instructions for controlling propulsion mechanisms 1006 – see at least Fig. 18 and ¶ [0318]);
converts signals based at least in part on whether the associated drone is the first drone type or the second drone type (In step 2506, the one or more commands can be converted into one or more flight instructions executable by the UAV. UAVs of various manufacturers and various models can have different operating system and/or different hardware configuration, therefore, it can be necessary to convert the received commands to executable flight instructions. The conversion can be performed by one or more processors onboard the UAV. For example, the commands received from the remote server can be converted into flight instructions compatible with the instruction set of the UAV operating system – see at least ¶ [0219]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combined system for creating an aerial image of Garcia and Moon to provide for converting signals based on drone type, as taught by Zhou, with a reasonable expectation of success, because converting signals based on drone type would accommodate drones of various models that have different operating systems or hardware configurations such that the commands are compatible with the UAV operating system (Zhou at ¶ [0219]).
Regarding claim 11, Moon further teaches:
wherein the abstraction layer includes one or more hardware components, one or more software modules, or any combinations thereof (flight control signal converter 122 converts generated flight control signals into a supported format corresponding to the unmanned aerial vehicles – see at least Fig. 3 and ¶ [0061]; processes may be performed by one or more programmable processors executing one or more computer programs to perform functions – see at least ¶ [0060]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combined system for creating an aerial image of Garcia, Moon, and Zhou to provide the abstraction layer, as further taught by Moon, with a reasonable expectation of success, because the abstraction layer may convert flight control signals into a supported format for the drone (Moon at ¶ [0061]).
Regarding claim 12, Garcia further teaches:
a modular payload adapted to be coupled with the drone, the modular payload including one or more of a character body, a puppet, a mirror, a screen, a smoke generator, a dust generator, a laser, an LED light, pyrotechnics, or any combinations thereof (FLYDEV may comprise a light unit, such as a laser – see at least ¶ [0055]; drones may comprise a particle generator, which may generate smoke or water, to generate a particle cloud 411 – see at least Fig. 4 and ¶ [0046], [0073]).
Regarding claim 13, Zhou further teaches:
wherein the abstraction layer provides a common interface to the flight controller for at least the first drone type and the second drone type, and wherein the common interface is compatible with flight plans received at the flight controller from the central management system (In step 2506, the one or more commands can be converted into one or more flight instructions executable by the UAV. UAVs of various manufacturers and various models can have different operating system and/or different hardware configuration, therefore, it can be necessary to convert the received commands to executable flight instructions. The conversion can be performed by one or more processors onboard the UAV. For example, the commands received from the remote server can be converted into flight instructions compatible with the instruction set of the UAV operating system – see at least ¶ [0219]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combined system for creating an aerial image of Garcia, Moon, and Zhou to provide a common interface, as further taught by Zhou, with a reasonable expectation of success, because the interface would accommodate drones of various models that have different operating systems or hardware configurations such that the commands are compatible with the UAV operating system (Zhou at ¶ [0219]).
Regarding claim 15, Garcia further teaches:
wherein the flight plans include a series of drone positions and a timecode [ ] (configuration file may include a required position of each drone FLYDEV at a given moment – see at least ¶ [0068]), and the flight controller provides a status interface to inform the central management system of a drone status (each drone may send its current position at regular time intervals to control unit 101 – see at least ¶ [0069]).
Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Garcia in view of Moon and Zhou, as applied to claim 13 above, and further in view of Hovey.
Regarding claim 14, Moon and Zhou fail to teach but Hovey discloses an aerial show system using unmanned aerial vehicles as creative elements in the show and teaches:
wherein the first drone type is a light drone, and a second drone type is a character drone (UAVs 310, 311 to animate a show effect device 330 – see at least Fig. 3 and 8:58-67; show effect device may include particles to form a projection surface and a projector for projecting light, e.g., laser light, onto the projection surface – see at least 2:61-67).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combined system for creating an aerial image of Garcia, Moon, and Zhou to provide different types of drones, such as a character drone and light drones, as taught by Hovey, with a reasonable expectation of success, because the different types of drones may be used to produce dynamic show effects as the UAVs fly through the show's airspace (Hovey at 1:10-15).
Claim 18 is rejected under 35 U.S.C. 103 as being unpatentable over Garcia.
Regarding claim 18, Garcia further teaches:
wherein the modular payload includes the [particle] generator and the laser, and wherein the one or more control commands includes a command to release a cloud of [particles] and activate the laser to provide an output directed toward the cloud of [particles] thereby making a beam from the laser visible to create a floating volumetric screen (drone has a particle generator to create a particle cloud 411 and a light source for generating a light beam 412 directed to the particle cloud – see at least Fig. 4 and ¶ [0073]; drone 405 may include a light source and a particle generator – see at least Fig. 4 and ¶ [0076]; light source may emit a laser beam – see at least ¶ [0055]; control data for controlling the drone’s light unit and particle generator – see at least ¶ [0052]; control data may comprise timing data defining when the drone should be at a position and when the light source should be switched on or off – see at least ¶ [0042]).
Garcia does not teach the particles being dust. However, Garcia discloses the claimed limitation except that Garcia uses particles, such as water or smoke, instead of dust, to create the floating volumetric screen. It would have been an obvious matter of design choice to use water or smoke particles instead of dust particles to create the floating volumetric screen, since Applicant has not disclosed that dust particles solve any stated problem or are for any particular purpose, and it appears that the invention would perform equally as well with water or smoke particles.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AARON L TROOST whose telephone number is (571)270-5779. The examiner can normally be reached Mon-Fri 7:30am-4pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne Antonucci, can be reached at 313-446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AARON L TROOST/Primary Examiner, Art Unit 3666