DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Status of Claims
Claims 1-14 are pending and have been examined below.
Claim Rejections - 35 USC § 101
35 USC 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Regarding claim 9, under the broadest reasonable interpretation this claim is directed to a computer program only.
"Computer programs claimed as computer listings per se, i.e., the descriptions or expressions of the programs, are not physical 'things.' They are neither computer components nor statutory processes, as they are not 'acts' being performed." MPEP §2106.03 I. Because the claims recite only abstractions that are neither "things" nor "acts," the claims are not within one of the four statutory classes of invention. Because the claims are not within one of the four statutory classes of invention, the claims are rejected under 35 USC §101.
"Since a computer program is merely a set of instructions capable of being executed by a computer, the computer program itself is not a process and USPTO personnel should treat a claim for a computer program, without the computer-readable medium needed to realize the computer program's functionality, as nonstatutory functional descriptive material." MPEP §2106.03 I.
"A general purpose computer, or microprocessor, programmed to carry out an algorithm creates 'a new machine, because a general purpose computer in effect becomes a special purpose computer once it is programmed to perform particular functions pursuant to instructions from program software.'" WMS Gaming, Inc. v. International Game Tech., 184 F.3d 1339, 1348, 51 USPQ2d 1385, 1391 (Fed. Cir. 1999) citing In re Alappat, 33 F.3d 1526, 1545, 31 USPQ2d 1545, 1558 (Fed. Cir. 1994) (en banc).
In this case, claim 9 is a product claim directed to method steps. Because Applicant's specification does not lexicographically define the claimed elements as hardware, Examiner applies the broadest reasonable interpretation and interprets the recited steps as software. Thus, Examiner interprets claim 9 as directed to software alone.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 USC 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 2, 4 and 7-9 are rejected under 35 USC 103 as being unpatentable over US20190317529 (“Matus”) in view of US20230399124 (“Karydis”).
Claim 1
Matus discloses a method for operating a drone navigating within an area (0021 The navigational components may further include an altimeter, a barometer, wind speed sensors, cameras, or other components that are configured to assist in navigating the drone.), the navigation of the drone in the area being ruled by a navigation program setting navigation parameters of the drone to ensure the drone follows a calculated trajectory (0022 The motor controllers receive these control signals and control the various drone motors accordingly. In this manner, a drone can determine its current location, identify a flight path, and activate the motors accordingly.),
the method comprising a step of setting the navigation program to adapt the navigation parameters of the drone to an impact between the drone and an object placed in the area (0005 Disclosed embodiments include a computer system for receiving and responding to virtual forces enacted on a drone receives, at a first drone, a second virtual object location. The computer system then determines a first drone operating characteristic associated with the first drone. Further, the computer system calculates a virtual impact strength and virtual impact direction exerted on the first drone based upon the first drone operating. The computer system then communicates one or more control signals to the motors of the first drone to integrate the virtual impact strength and virtual impact direction into a movement of the drone., 0029),
the method being characterized in that the setting of the navigation parameters of the drone in the navigation program depends on the object impacting the drone (0029, 0039 In response to the virtual impact, the drone 100 automatically communicates motor control signals to the drone's motors that create the movement responsive to the virtual force enacted by the virtual projectile. The communicated motor control signals may take priority and override motor controls being provided by a user or by an autopilot operating on the drone 100. For example, the drone 100 may calculate that the virtual projectile imparted a particular momentum onto the drone 100. Using Newtonian physics, the drone 100 may calculate a responsive change in the momentum of the drone and communicate that change to the drone's motors., 0048 Further, FIG. 6 shows that the method 600 includes an act 650 of calculating a virtual impact strength and virtual impact direction exerted on the drone. Act 650 comprises calculating a virtual impact strength and virtual impact direction exerted on the first drone based upon the first drone operating characteristic and the second virtual object operating characteristic. For example, as depicted and described with respect to FIG. 3, the first drone 100 receives the operating characteristic from the second drone 200. The first drone 100 is then able to calculate, using simple Newtonian physics the resulting virtual impact strength and virtual impact direction enacted upon the first drone 100 by the collision with the second drone 200.),
the setting step comprising:
implementing a virtual impact setup in the navigation program for adjusting the navigation parameters of the drone to an impact between the drone and a virtual object (0029, 0039 In response to the virtual impact, the drone 100 automatically communicates motor control signals to the drone's motors that create the movement responsive to the virtual force enacted by the virtual projectile. The communicated motor control signals may take priority and override motor controls being provided by a user or by an autopilot operating on the drone 100. For example, the drone 100 may calculate that the virtual projectile imparted a particular momentum onto the drone 100. Using Newtonian physics, the drone 100 may calculate a responsive change in the momentum of the drone and communicate that change to the drone's motors.).
Matus fails to explicitly disclose implementing a real impact setup in the navigation program for adjusting the navigation parameters of the drone to an impact between the drone and a physical object; and wherein the area is an arena delimited by boundaries. However, Matus does disclose implementing an impact setup in the navigation program for adjusting the navigation parameters of the drone to an impact between the drone and various objects (0029, 0039) and the drone operating within an area (Fig. 3). Furthermore, Karydis teaches a system of operating a drone based on impacts with objects (abstract), including:
implementing a real impact setup in the navigation program for adjusting the navigation parameters of the drone to an impact between the drone and a physical object (0008 Consistent with the disclosed embodiments, the method comprises generating a collision signal from compression of a flexible member resulting from a mid-air collision of an aerial vehicle including the flexible member, and processing the collision signal and initiating execution of a recovery operation to sustain and resume flight of the aerial vehicle following the mid-air collision. In some embodiments, processing the collision signal and initiating execution of the recovery operation to sustain and resume flight of the aerial vehicle following the mid-air collision comprises processing the collision signal to identify an intensity of the mid-air collision. In some embodiments, the method further comprises maintaining a pre-collision thrust in the aerial vehicle during the collision and generating a new thrust to stabilize the aerial vehicle after the collision., 0056 The collision recovery control system 410 includes a single stage collision recovery control system. The collision recovery control system 410 enables the aerial vehicle 402 to sustain flight after a collision with a variety of objects including walls, poles and unstructured obstacles, or after a collision with a moving object while the aerial vehicle 402 is hovering as shown in FIG. 7.); and
wherein the area is an arena delimited by boundaries (0167, Fig. 29, 0191 Finally, we validate our proposed framework experimentally, and also test it against the collision avoidance strategy in Sec. VII-E, in a single corridor environment (FIG. 29).).
Matus and Karydis both disclose adjusted navigational control of a drone after a collision. Thus, it would have been obvious to one having ordinary skill in the art before the effective filing date of Applicant's invention to modify the system in Matus to include the teaching of Karydis, with a reasonable expectation of success, in order to provide enhanced safety for the drone by mitigating collisions with different object types.
Claim 2
Matus discloses wherein the virtual impact setup comprises the steps of:
determining the position of said virtual object placed in the arena (0006 The computer system receives, at a first drone, a second virtual object location.);
calculating a trajectory set (0022 The motor controllers receive these control signals and control the various drone motors accordingly. In this manner, a drone can determine its current location, identify a flight path, and activate the motors accordingly.); and
displacing the drone in the arena to follow any one of the trajectories of the trajectory set (0022 The motor controllers receive these control signals and control the various drone motors accordingly. In this manner, a drone can determine its current location, identify a flight path, and activate the motors accordingly.).
Matus fails to disclose the trajectory set comprising several trajectories of the drone within the arena based on the position of the drone relative to the position of an object. However, Matus does disclose the drone following a trajectory (0022). Furthermore, Karydis teaches:
the trajectory set comprising several trajectories of the drone within the arena based on the position of the drone relative to the position of an object (0038 FIGS. 24A-24D show experimental trajectories generated from DRR and collision avoidance trajectory generation strategy when a preplanned path intersects or does not intersect with an obstacle in accordance with some embodiments of the present disclosure;). Furthermore, while Karydis does not teach wherein the object is said virtual object, one of ordinary skill in the art would have recognized from the combination of Matus (which teaches the virtual object) and Karydis (which teaches multiple trajectories relative to objects) that the object could include said virtual object. Thus, the combination teaches wherein the trajectory set comprises several trajectories of the drone within the arena based on the position of the drone relative to the position of said virtual object.
See prior art rejection of claim 1 for obviousness and reasons to combine.
Claim 4
Matus fails to disclose wherein the real impact setup comprises the steps of: stabilizing the drone after the impact with the physical object; computing a new trajectory defined by the direction of an impact vector of the physical object; and adapting the navigation parameters of the drone to stabilize the drone into the new trajectory so that the drone is displaced along said new trajectory. However, Matus does disclose adapting the navigation parameters of the drone after impact with an object (0039). Furthermore, Karydis teaches: wherein the real impact setup comprises the steps of:
stabilizing the drone after the impact with the physical object (0008 Consistent with the disclosed embodiments, the method comprises generating a collision signal from compression of a flexible member resulting from a mid-air collision of an aerial vehicle including the flexible member, and processing the collision signal and initiating execution of a recovery operation to sustain and resume flight of the aerial vehicle following the mid-air collision. In some embodiments, processing the collision signal and initiating execution of the recovery operation to sustain and resume flight of the aerial vehicle following the mid-air collision comprises processing the collision signal to identify an intensity of the mid-air collision. In some embodiments, the method further comprises maintaining a pre-collision thrust in the aerial vehicle during the collision and generating a new thrust to stabilize the aerial vehicle after the collision., 0056 The collision recovery control system 410 includes a single stage collision recovery control system. The collision recovery control system 410 enables the aerial vehicle 402 to sustain flight after a collision with a variety of objects including walls, poles and unstructured obstacles, or after a collision with a moving object while the aerial vehicle 402 is hovering as shown in FIG. 7.);
computing a new trajectory defined by the direction of an impact vector of the physical object (0008 Consistent with the disclosed embodiments, the method comprises generating a collision signal from compression of a flexible member resulting from a mid-air collision of an aerial vehicle including the flexible member, and processing the collision signal and initiating execution of a recovery operation to sustain and resume flight of the aerial vehicle following the mid-air collision. In some embodiments, processing the collision signal and initiating execution of the recovery operation to sustain and resume flight of the aerial vehicle following the mid-air collision comprises processing the collision signal to identify an intensity of the mid-air collision. In some embodiments, the method further comprises maintaining a pre-collision thrust in the aerial vehicle during the collision and generating a new thrust to stabilize the aerial vehicle after the collision., 0056 The collision recovery control system 410 includes a single stage collision recovery control system. The collision recovery control system 410 enables the aerial vehicle 402 to sustain flight after a collision with a variety of objects including walls, poles and unstructured obstacles, or after a collision with a moving object while the aerial vehicle 402 is hovering as shown in FIG. 7.); and
adapting the navigation parameters of the drone to stabilize the drone into the new trajectory so that the drone is displaced along said new trajectory (0008 Consistent with the disclosed embodiments, the method comprises generating a collision signal from compression of a flexible member resulting from a mid-air collision of an aerial vehicle including the flexible member, and processing the collision signal and initiating execution of a recovery operation to sustain and resume flight of the aerial vehicle following the mid-air collision. In some embodiments, processing the collision signal and initiating execution of the recovery operation to sustain and resume flight of the aerial vehicle following the mid-air collision comprises processing the collision signal to identify an intensity of the mid-air collision. In some embodiments, the method further comprises maintaining a pre-collision thrust in the aerial vehicle during the collision and generating a new thrust to stabilize the aerial vehicle after the collision., 0056 The collision recovery control system 410 includes a single stage collision recovery control system. The collision recovery control system 410 enables the aerial vehicle 402 to sustain flight after a collision with a variety of objects including walls, poles and unstructured obstacles, or after a collision with a moving object while the aerial vehicle 402 is hovering as shown in FIG. 7.).
See prior art rejection of claim 1 for obviousness and reasons to combine.
Claim 7
Matus discloses:
wherein the virtual objects are chosen among virtual boundaries of the arena, virtual drones, virtual obstacles, and computer-generated virtual opponents (0039 In response to the virtual impact, the drone 100 automatically communicates motor control signals to the drone's motors that create the movement responsive to the virtual force enacted by the virtual projectile. The communicated motor control signals may take priority and override motor controls being provided by a user or by an autopilot operating on the drone 100. For example, the drone 100 may calculate that the virtual projectile imparted a particular momentum onto the drone 100. Using Newtonian physics, the drone 100 may calculate a responsive change in the momentum of the drone and communicate that change to the drone's motors.).
Claim 8
Matus fails to explicitly disclose wherein the physical objects are chosen among physical boundaries of the arena, the physical boundaries of the arena including one or more of a wall, drones, batting accessories configured for striking a drone, the players themselves, and physical obstacles placed in the arena. However, Matus does disclose another drone as an impacting object (Fig. 3). Furthermore, Karydis teaches:
wherein the physical objects are chosen among physical boundaries of the arena, the physical boundaries of the arena including one or more of a wall, drones, batting accessories configured for striking a drone, the players themselves, and physical obstacles placed in the arena (0167 Testing the deformation controller (Sec. VII-B) and DRR strategy (Sec. VII-C) experimentally takes place in a 2.0×2.5 m area with a rectangular pillar serving as a static polygon-shaped obstacle. For testing the overall method (Sec. VII-F) experimentally, we consider a 2.5×3.5 m area with a long rectangular pillar right in the middle to create a U-shaped single corridor environment.).
See prior art rejection of claim 1 for obviousness and reasons to combine.
Claim 9
Matus in view of Karydis teaches:
a computer program comprising instructions which, when the computer program is executed by a computer, cause the computer to carry out the method according to claim 1 (Matus: 0057).
See prior art rejection of claim 1 for obviousness and reasons to combine.
Allowable Subject Matter
Claims 10-14 are allowed. Claims 3, 5 and 6 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim(s) and any intervening claim(s). The closest prior art of record is US20190317529 and US20230399124, which generally disclose a drone impacting physical and virtual objects, and adjusting navigation parameters of the drone after impact and based on the type of object impacted. However, the aforementioned claims recite at least the following subject matter: a drone configured for navigating in an arena delimited by boundaries pursuant to navigation parameters of the drone configured to ensure the drone follows a calculated trajectory; a batting instrument configured for hitting the drone; and a computer system configured for setting the navigation parameters of the drone when an impact occurs between the drone and an object placed in the arena, the computer system being configured for setting the navigation parameters of the drone according to a determined setup depending on the object impacting the drone, the determined setup being chosen among a virtual impact setup to adjust the navigation parameters of the drone to an impact between the drone and a virtual object, and a real impact setup to adjust the navigation parameters of the drone to an impact between the drone and a physical object, said real impact setup comprising a batting instrument setup to adjust the navigation parameters of the drone to an impact between the drone and the batting instrument, among other features.
While relevant to the claims, the prior art does not provide sufficient disclosure, teaching or suggestion to adequately provide a basis for rejection of the claims under 35 USC 102 or 103: a rejection based on the prior art of record could only be made with impermissible hindsight, the prior art of record does not sufficiently teach or suggest the limitations as claimed, and any prior art references that recite the stated allowable subject matter could not properly be combined with the other prior art references; hence the allowability of the claims. Examiner notes that amendment to the claims resulting in a change of scope may require an updated search.
Contact Information
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See PTO-892. Specifically, the following prior art is considered relevant to Applicant's claims:
US20190104250 – COORDINATED CINEMATIC DRONE; and
US20210173391 - APPARATUS, METHODS AND SYSTEMS FOR REMOTE OR ONBOARD CONTROL OF FLIGHTS.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Examiner KRISHNAN RAMESH whose telephone number is (571)272-6407. The examiner can normally be reached Monday-Friday 8:30am-5:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Flynn, can be reached at (571)272-9855. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KRISHNAN RAMESH/
Primary Examiner, Art Unit 3663