DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s amendments and remarks filed on 12/23/2025 with respect to the previous claim rejections under 35 U.S.C. 103 have been fully considered and are persuasive.
With respect to the newly amended subject matter and Applicant’s arguments, the Examiner relies upon the newly cited reference Ohlarik et al. (US 2018/0181115 A1).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-10 and 18-21 are rejected under 35 U.S.C. 103 as being unpatentable over Zang (US 9,164,506 B1) in view of Ohlarik et al. (US 2018/0181115 A1) (hereinafter Ohlarik), and further in view of Liu et al. (US 2021/0247781 A1) (hereinafter Liu).
Regarding claim 1, Zang discloses a system to navigate a drone towards an area of interest (AOI), the system comprising: a processor in communication with a non-volatile memory comprising a processor-readable media having thereon a set of executable instructions, configured, when executed, to cause the processor to: (see Zang col 1; lines 52-62 and col 22; lines 33-53 “receiving, from a remote user, one or more navigation commands to move the UAV along a flight path; receiving, from the remote user, target information of a target to be tracked by an imaging device on the UAV; and tracking the target according to the target information by automatically adjusting at least one of the UAV or the imaging device while the UAV moves along the flight path according to the one or more navigation commands from the remote user”),
a. receive transmitted environmental data comprising a set of images of the area of interest (AOI) (see Zang col 1; lines 41-62 “The UAV may be configured to receive target information from a control terminal related to a target to be tracked by an imaging device coupled to the UAV. The target information may be used by the UAV to automatically track the target so as to maintain predetermined position and/or size of the target within one or more images captured by the imaging device. Any description of tracking may include visual tracking by the imaging device. The control terminal may be configured to display images from the imaging device as well as allowing user input related to the target information”),
b. transmit the set of images of the AOI to be displayed (see Zang col 1; lines 41-62 “The UAV may be configured to receive target information from a control terminal related to a target to be tracked by an imaging device coupled to the UAV. The target information may be used by the UAV to automatically track the target so as to maintain predetermined position and/or size of the target within one or more images captured by the imaging device. Any description of tracking may include visual tracking by the imaging device. The control terminal may be configured to display images from the imaging device as well as allowing user input related to the target information”),
c. receive a user command indicative of a selection of a target position within the AOI (see Zang col 1; lines 41-62 and col 4; lines 1-7 “receive a user selection of a target from within a displayed image; generate the target information of the target based on the user selection of the target; and transmit the target information to the UAV”),
f. determine an optimal target velocity to navigate the drone towards the area of interest, wherein the optimal target velocity is based at least in part on the approximate distance between the current position of the drone and the target position within the AOI (see Zang col 3; lines 63-65, col 29; lines 11-27 & 48-50, col 31; lines 1-15 and col 33; lines 15-26 “the remote control device is configured to receive user input from a touchscreen, joystick, keyboard, mouse, or stylus”, “the speed of rotation (e.g., absolute value of the angular velocity) around a given axis (e.g., the Y (yaw) axis) may depend on the distance between the expected and the actual position of the target along the axis” and “The process 800 includes receiving 802 user navigation commands and target information, for example, from a remote control terminal such as described herein. The navigation commands can be used for controlling navigational parameters of the movable object such as the position, speed, orientation, or attitude of the movable object”),
but Zang fails to explicitly disclose the drone, a remote controller with a trigger operable to control the drone; d. receive an indication of pressure exerted on the trigger of the remote controller, the pressure indicative of a user desired velocity of flight and wherein the optimal target velocity is based at least in part on the indication of pressure exerted on the trigger of the remote controller; g. generate a set of control signals, wherein the set of control signals comprise an optimal target velocity command based on the amount of pressure the user applies to the trigger of the remote controller, and a direction.
However, Ohlarik teaches the drone, a remote controller with a trigger operable to control the drone (see Ohlarik paras “0015” and “0046” “transmitter 110 may include a speed trigger 112, a steering wheel 114, and a thumb slider 116 as three primary controls for operation of UAV 120” and “an operator's index finger, pulling the trigger will provide a command to increase rotor speed”),
d) receive an indication of pressure exerted on the trigger of the remote controller, the pressure indicative of a user desired velocity of flight (see Ohlarik paras “0020”, “0026” and “0046” “Squeezing speed trigger 112 from the neutral position may indicate an increasing degree of speed”, “The mechanical motion caused by an operator… may provide corresponding electrical signals indicative of the position of speed trigger 112” and “The further the pull distance of speed trigger 112, the larger the throttle input and the faster the forward movement”),
wherein the optimal target velocity is based at least in part on the indication of pressure exerted on the trigger of the remote controller (see Ohlarik paras “0020”, “0041”, “0046” and “0063” “throttle signal translator 510 may generate rotor speed commands to control motors 410/rotors 122 consistent with operator throttle input”, “The further the pull distance of speed trigger 112, the larger the throttle input and the faster the forward movement” and “flight controller 140 may convert signals 130 into rotor speed commands that accomplish the forward/reverse motion”),
g. generate a set of control signals, wherein the set of control signals comprise an optimal target velocity command based on the amount of pressure the user applies to the trigger of the remote controller, and a direction (see Ohlarik paras “0014”, “0020-0021”, “0041”, “0046” and “0062-0063” “controls operation of a UAV… via signals”, “Speed trigger 112 may be moved (e.g., by an operator) back and forth in a direction as shown in FIG. 2 to provide forward and reverse speed commands”, “Steering wheel 114 may provide control signals to turn UAV 120 left and right”, “throttle signal translator 510 may generate rotor speed commands to control motors 410/rotors 122 consistent with operator throttle input”, “The further the pull distance of speed trigger 112, the larger the throttle input and the faster the forward movement”, “the control signals indicating the relative positions of the speed trigger, the steering wheel, and the altitude slider” and “flight controller 140 may convert signals 130 into rotor speed commands that accomplish the forward/reverse motion”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Zang for systems and methods for target tracking “to generate control signals to the UAV that include both velocity and direction” as taught by Ohlarik (paras [0020]-[0021] and [0062]-[0063]) in order to provide intuitive and proportional control of UAV movement consistent with operator intent.
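For illustration only, the proportional trigger-to-throttle relationship relied upon above can be sketched as follows. This is not Ohlarik's actual implementation: the normalized input ranges, the maximum speed, the yaw-rate gain, and the function name trigger_to_command are assumptions chosen to make the cited teaching (a larger pull yields a larger throttle input; the steering wheel supplies a direction input) concrete.

```python
# Illustrative sketch only -- not Ohlarik's implementation. It reflects the
# cited teaching that a larger trigger pull yields a larger throttle input
# and that the steering wheel supplies a direction input.

def trigger_to_command(trigger_pull: float, steering: float,
                       max_speed_mps: float = 10.0,
                       max_yaw_rate_rps: float = 1.0) -> tuple[float, float]:
    """Map normalized operator inputs to (speed, yaw-rate) commands.

    trigger_pull: trigger position/pressure in [-1.0, 1.0]; negative = reverse.
    steering:     steering-wheel position in [-1.0, 1.0]; negative = left.
    """
    trigger_pull = max(-1.0, min(1.0, trigger_pull))  # clamp to valid range
    steering = max(-1.0, min(1.0, steering))
    return trigger_pull * max_speed_mps, steering * max_yaw_rate_rps
```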
but modified Zang fails to explicitly disclose e. determine an approximate distance between a current position of the drone and the target position within the area of interest (AOI); and h. transmit the set of control signals to the drone.
However, Liu teaches e. determine an approximate distance between a current position of the drone and the target position within the area of interest (AOI) (see Liu abstract and paras “0005” and “0025” “determining a current position for the unmanned aerial vehicle; determining a distance of the current position from a target position; generating at least one velocity command from the determined distance, the velocity command reducing to zero as the distance reduces to zero; and controlling the unmanned aerial vehicle to maintain the velocity thereof at the velocity command, whereby as the unmanned aerial vehicle approaches the target destination”),
and h. transmit the set of control signals to the drone (see Liu at least abstract and paras “0005” and “0025”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Zang for systems and methods for target tracking “to control the unmanned aerial vehicle to maintain the velocity thereof at the velocity command” as taught by Liu (para [0025]) in order to control the UAV to perform the landing maneuver safely.
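The distance-to-velocity behavior Liu describes (a velocity command that reduces to zero as the remaining distance reduces to zero) can be sketched as below. The proportional gain, the speed cap, and the way the operator's trigger input is combined with the distance term are illustrative assumptions, not Liu's or Applicant's actual implementation.

```python
# Illustrative sketch only. Liu's cited teaching is a velocity command that
# shrinks with the remaining distance and reaches zero at the target; the
# gain, the cap, and the trigger combination below are assumed.

def velocity_from_distance(distance_m: float, gain: float = 0.5,
                           max_speed_mps: float = 5.0) -> float:
    """Speed command proportional to remaining distance, zero at the target."""
    return min(max_speed_mps, gain * max(distance_m, 0.0))

def target_velocity(trigger_pull: float, distance_m: float) -> float:
    """One plausible combination: the trigger sets the operator's desired
    speed, and the distance term caps it so the command decays to zero."""
    operator_speed = abs(max(-1.0, min(1.0, trigger_pull))) * 10.0  # m/s
    return min(operator_speed, velocity_from_distance(distance_m))
```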
Regarding claim 2, Zang discloses wherein the processor-readable media having thereon the set of executable instructions, configured, when executed, to further cause the processor to: a. load a menu of options to control a drone-attached motorized arm (see Zang col 37; lines 17-34 “the user selection of the target is achieved by a user selecting an area of the at least one of the one or more images being displayed on the display, the selected area corresponding to the target. For example, the user may select the target by directly touching a touchscreen using a finger or stylus. As another example, the user may select the target using a mouse, joystick, gesture, or voice command. In yet another embodiment, the user may select target via a wearable device such as a helmet, virtual reality goggle, and the like. Selection of the target area can involve any suitable motion including touching or tapping, swiping, circling, clicking, or the like associated with any suitable input methods such as described herein”),
b. receive a user command identifying an object of interest (see Zang col 1; lines 41-62 “The UAV may be configured to receive target information from a control terminal related to a target to be tracked by an imaging device coupled to the UAV. The target information may be used by the UAV to automatically track the target so as to maintain predetermined position and/or size of the target within one or more images captured by the imaging device. Any description of tracking may include visual tracking by the imaging device. The control terminal may be configured to display images from the imaging device as well as allowing user input related to the target information”),
c. receive a user-activated command to engage the identified object of interest, the user-activated command selected from the menu of options; and d. transmit a user-activated command to engage the object of interest based at least in part on the user-activated command selected from the menu of options (see Zang col 37; lines 17-34 “the user selection of the target is achieved by a user selecting an area of the at least one of the one or more images being displayed on the display, the selected area corresponding to the target. For example, the user may select the target by directly touching a touchscreen using a finger or stylus. As another example, the user may select the target using a mouse, joystick, gesture, or voice command. In yet another embodiment, the user may select target via a wearable device such as a helmet, virtual reality goggle, and the like. Selection of the target area can involve any suitable motion including touching or tapping, swiping, circling, clicking, or the like associated with any suitable input methods such as described herein”).
Regarding claim 3, Zang discloses wherein a user-activated command to engage the object of interest based at least in part on the user-activated command selected from the menu of options is at least one of an open command, a close command, an approach command, an activate command, a release command, and a grip command (see Zang col 37; lines 17-34 and col 43; lines 22-44 “the user selection of the target is achieved by a user selecting an area of the at least one of the one or more images being displayed on the display, the selected area corresponding to the target. For example, the user may select the target by directly touching a touchscreen using a finger or stylus. As another example, the user may select the target using a mouse, joystick, gesture, or voice command. In yet another embodiment, the user may select target via a wearable device such as a helmet, virtual reality goggle, and the like. Selection of the target area can involve any suitable motion including touching or tapping, swiping, circling, clicking, or the like associated with any suitable input methods such as described herein” and “the payload can be configured to interact with the environment or a target. For example, the payload can include a tool, instrument, or mechanism capable of manipulating objects, such as a robotic arm.”).
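For clarity, the menu of engagement commands recited in claim 3 (open, close, approach, activate, release, and grip) could be represented as in the following sketch. The enumeration, the message format, and the names ArmCommand and build_arm_message are hypothetical, offered only to make the claimed menu of options concrete.

```python
# Illustrative sketch only: a hypothetical command menu for a drone-attached
# motorized arm, covering the command types recited in claim 3.

from enum import Enum

class ArmCommand(Enum):
    OPEN = "open"
    CLOSE = "close"
    APPROACH = "approach"
    ACTIVATE = "activate"
    RELEASE = "release"
    GRIP = "grip"

def build_arm_message(command: ArmCommand, object_id: int) -> dict:
    """Package a user-selected menu command for transmission to the drone."""
    return {"type": "arm_command", "command": command.value, "target": object_id}
```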
Regarding claim 4, Zang discloses wherein a user-activated command to engage the object of interest based at least in part on the user-activated command selected from the menu of options is selected from a visual user interface (see Zang col 37; lines 17-34 and col 43; lines 22-44 “the process 1000 includes receiving 1004 a user selection of a target from within at least one of the images being displayed. The user may select the target via the same user interface that displays the images. For example, in some embodiments, the user selection of the target is achieved by a user selecting an area of the at least one of the one or more images being displayed on the display, the selected area corresponding to the target. For example, the user may select the target by directly touching a touchscreen using a finger or stylus. As another example, the user may select the target using a mouse, joystick, gesture, or voice command. In yet another embodiment, the user may select target via a wearable device such as a helmet, virtual reality goggle, and the like. Selection of the target area can involve any suitable motion including touching or tapping, swiping, circling, clicking, or the like associated with any suitable input methods such as described herein”).
Regarding claim 5, Zang discloses wherein a user-activated command to engage the object of interest based at least in part on the user-activated command selected from the menu of options is selected from a joystick (see Zang col 37; lines 17-34 and col 43; lines 22-44 “the process 1000 includes receiving 1004 a user selection of a target from within at least one of the images being displayed. The user may select the target via the same user interface that displays the images. For example, in some embodiments, the user selection of the target is achieved by a user selecting an area of the at least one of the one or more images being displayed on the display, the selected area corresponding to the target. For example, the user may select the target by directly touching a touchscreen using a finger or stylus. As another example, the user may select the target using a mouse, joystick, gesture, or voice command. In yet another embodiment, the user may select target via a wearable device such as a helmet, virtual reality goggle, and the like. Selection of the target area can involve any suitable motion including touching or tapping, swiping, circling, clicking, or the like associated with any suitable input methods such as described herein”).
Regarding claim 6, Zang discloses a method to navigate a drone towards an area of interest (AOI), the method comprising: (see Zang col 1; lines 52-62 and col 22; lines 33-53 “receiving, from a remote user, one or more navigation commands to move the UAV along a flight path; receiving, from the remote user, target information of a target to be tracked by an imaging device on the UAV; and tracking the target according to the target information by automatically adjusting at least one of the UAV or the imaging device while the UAV moves along the flight path according to the one or more navigation commands from the remote user”),
a. receiving transmitted environmental data comprising a set of images of the AOI (see Zang col 1; lines 41-62 “The UAV may be configured to receive target information from a control terminal related to a target to be tracked by an imaging device coupled to the UAV. The target information may be used by the UAV to automatically track the target so as to maintain predetermined position and/or size of the target within one or more images captured by the imaging device. Any description of tracking may include visual tracking by the imaging device. The control terminal may be configured to display images from the imaging device as well as allowing user input related to the target information”),
b. transmitting the set of images of the AOI to be displayed (see Zang col 1; lines 41-62 “The UAV may be configured to receive target information from a control terminal related to a target to be tracked by an imaging device coupled to the UAV. The target information may be used by the UAV to automatically track the target so as to maintain predetermined position and/or size of the target within one or more images captured by the imaging device. Any description of tracking may include visual tracking by the imaging device. The control terminal may be configured to display images from the imaging device as well as allowing user input related to the target information”),
c. receiving a user command indicative of a selection of a target position within the AOI (see Zang col 1; lines 41-62 and col 4; lines 1-7 “receive a user selection of a target from within a displayed image; generate the target information of the target based on the user selection of the target; and transmit the target information to the UAV”),
f. determining an optimal target velocity to navigate the drone towards the area of interest, wherein the optimal target velocity is based at least in part on the approximate distance between the current position of the drone and the target position within the AOI (see Zang col 3; lines 63-65, col 29; lines 11-27 & 48-50, col 31; lines 1-15 and col 33; lines 15-26 “the remote control device is configured to receive user input from a touchscreen, joystick, keyboard, mouse, or stylus”, “the speed of rotation (e.g., absolute value of the angular velocity) around a given axis (e.g., the Y (yaw) axis) may depend on the distance between the expected and the actual position of the target along the axis” and “The process 800 includes receiving 802 user navigation commands and target information, for example, from a remote control terminal such as described herein. The navigation commands can be used for controlling navigational parameters of the movable object such as the position, speed, orientation, or attitude of the movable object”),
but Zang fails to explicitly disclose implemented in a system comprising the drone, and a remote controller with a trigger operable to control the drone; d. receiving an indication of pressure exerted on the trigger of the remote controller indicative of a user desired velocity of flight; wherein the optimal target velocity is based at least in part on the indication of pressure exerted on the trigger of the remote controller; and g. generating a set of control signals, wherein the set of control signals comprise an optimal target velocity command based on the amount of pressure the user applies to the trigger of the remote controller and a direction.
However, Ohlarik teaches implemented in a system comprising the drone, and a remote controller with a trigger operable to control the drone (see Ohlarik paras “0015” and “0046” “transmitter 110 may include a speed trigger 112, a steering wheel 114, and a thumb slider 116 as three primary controls for operation of UAV 120” and “an operator's index finger, pulling the trigger will provide a command to increase rotor speed”),
d. receiving an indication of pressure exerted on the trigger of the remote controller indicative of a user desired velocity of flight (see Ohlarik paras “0020”, “0026” and “0046” “Squeezing speed trigger 112 from the neutral position may indicate an increasing degree of speed”, “The mechanical motion caused by an operator… may provide corresponding electrical signals indicative of the position of speed trigger 112” and “The further the pull distance of speed trigger 112, the larger the throttle input and the faster the forward movement”),
wherein the optimal target velocity is based at least in part on the indication of pressure exerted on the trigger of the remote controller (see Ohlarik paras “0020”, “0041”, “0046” and “0063” “throttle signal translator 510 may generate rotor speed commands to control motors 410/rotors 122 consistent with operator throttle input”, “The further the pull distance of speed trigger 112, the larger the throttle input and the faster the forward movement” and “flight controller 140 may convert signals 130 into rotor speed commands that accomplish the forward/reverse motion”),
g. generating a set of control signals, wherein the set of control signals comprise an optimal target velocity command based on the amount of pressure the user applies to the trigger of the remote controller and a direction (see Ohlarik paras “0014”, “0020-0021”, “0041”, “0046” and “0062-0063” “controls operation of a UAV… via signals”, “Speed trigger 112 may be moved (e.g., by an operator) back and forth in a direction as shown in FIG. 2 to provide forward and reverse speed commands”, “Steering wheel 114 may provide control signals to turn UAV 120 left and right”, “throttle signal translator 510 may generate rotor speed commands to control motors 410/rotors 122 consistent with operator throttle input”, “The further the pull distance of speed trigger 112, the larger the throttle input and the faster the forward movement”, “the control signals indicating the relative positions of the speed trigger, the steering wheel, and the altitude slider” and “flight controller 140 may convert signals 130 into rotor speed commands that accomplish the forward/reverse motion”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Zang for systems and methods for target tracking “to generate control signals to the UAV that include both velocity and direction” as taught by Ohlarik (paras [0020]-[0021] and [0062]-[0063]) in order to provide intuitive and proportional control of UAV movement consistent with operator intent.
but modified Zang fails to explicitly disclose e. determining an approximate distance between a current position of the drone and the target position within the AOI; and h. transmitting the set of control signals to the drone.
However, Liu teaches determining an approximate distance between a current position of the drone and the target position within the AOI (see Liu abstract and paras “0005” and “0025” “determining a current position for the unmanned aerial vehicle; determining a distance of the current position from a target position; generating at least one velocity command from the determined distance, the velocity command reducing to zero as the distance reduces to zero; and controlling the unmanned aerial vehicle to maintain the velocity thereof at the velocity command, whereby as the unmanned aerial vehicle approaches the target destination”),
and h. transmitting the set of control signals to the drone (see Liu abstract and paras “0005” and “0025” “determining a current position for the unmanned aerial vehicle; determining a distance of the current position from a target position; generating at least one velocity command from the determined distance, the velocity command reducing to zero as the distance reduces to zero; and controlling the unmanned aerial vehicle to maintain the velocity thereof at the velocity command, whereby as the unmanned aerial vehicle approaches the target destination”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Zang for systems and methods for target tracking “to control the unmanned aerial vehicle to maintain the velocity thereof at the velocity command” as taught by Liu (para [0025]) in order to control the UAV to perform the landing maneuver safely.
Regarding claim 7, Zang discloses a. loading a menu of options to control a drone-attached motorized arm (see Zang col 37; lines 17-34 “the user selection of the target is achieved by a user selecting an area of the at least one of the one or more images being displayed on the display, the selected area corresponding to the target. For example, the user may select the target by directly touching a touchscreen using a finger or stylus. As another example, the user may select the target using a mouse, joystick, gesture, or voice command. In yet another embodiment, the user may select target via a wearable device such as a helmet, virtual reality goggle, and the like. Selection of the target area can involve any suitable motion including touching or tapping, swiping, circling, clicking, or the like associated with any suitable input methods such as described herein”),
b. receiving a user command identifying an object of interest (see Zang col 1; lines 41-62 “The UAV may be configured to receive target information from a control terminal related to a target to be tracked by an imaging device coupled to the UAV. The target information may be used by the UAV to automatically track the target so as to maintain predetermined position and/or size of the target within one or more images captured by the imaging device. Any description of tracking may include visual tracking by the imaging device. The control terminal may be configured to display images from the imaging device as well as allowing user input related to the target information”),
c. receiving a user-activated command to engage the identified object of interest, the user-activated command selected from the menu of options; and d. transmitting a user-activated command to engage the object of interest based at least in part on the user-activated command selected from the menu of options (see Zang col 37; lines 17-34 “the user selection of the target is achieved by a user selecting an area of the at least one of the one or more images being displayed on the display, the selected area corresponding to the target. For example, the user may select the target by directly touching a touchscreen using a finger or stylus. As another example, the user may select the target using a mouse, joystick, gesture, or voice command. In yet another embodiment, the user may select target via a wearable device such as a helmet, virtual reality goggle, and the like. Selection of the target area can involve any suitable motion including touching or tapping, swiping, circling, clicking, or the like associated with any suitable input methods such as described herein”).
Regarding claim 8, Zang discloses wherein a user-activated command to engage the object of interest based at least in part on the user-activated command selected from the menu of options is at least one of an open command, a close command, an approach command, an activate command, a release command, and a grip command (see Zang col 37; lines 17-34 and col 43; lines 22-44 “the user selection of the target is achieved by a user selecting an area of the at least one of the one or more images being displayed on the display, the selected area corresponding to the target. For example, the user may select the target by directly touching a touchscreen using a finger or stylus. As another example, the user may select the target using a mouse, joystick, gesture, or voice command. In yet another embodiment, the user may select target via a wearable device such as a helmet, virtual reality goggle, and the like. Selection of the target area can involve any suitable motion including touching or tapping, swiping, circling, clicking, or the like associated with any suitable input methods such as described herein” and “the payload can be configured to interact with the environment or a target. For example, the payload can include a tool, instrument, or mechanism capable of manipulating objects, such as a robotic arm.”).
Regarding claim 9, Zang discloses wherein a user-activated command to engage the object of interest based at least in part on the user-activated command selected from the menu of options is selected from a visual user interface (see Zang col 37; lines 17-34 and col 43; lines 22-44 “the process 1000 includes receiving 1004 a user selection of a target from within at least one of the images being displayed. The user may select the target via the same user interface that displays the images. For example, in some embodiments, the user selection of the target is achieved by a user selecting an area of the at least one of the one or more images being displayed on the display, the selected area corresponding to the target. For example, the user may select the target by directly touching a touchscreen using a finger or stylus. As another example, the user may select the target using a mouse, joystick, gesture, or voice command. In yet another embodiment, the user may select target via a wearable device such as a helmet, virtual reality goggle, and the like. Selection of the target area can involve any suitable motion including touching or tapping, swiping, circling, clicking, or the like associated with any suitable input methods such as described herein”).
Regarding claim 10, Zang discloses wherein a user-activated command to engage the object of interest based at least in part on the user-activated command selected from the menu of options is selected from a joystick (see Zang col 37; lines 17-34 and col 43; lines 22-44 “the process 1000 includes receiving 1004 a user selection of a target from within at least one of the images being displayed. The user may select the target via the same user interface that displays the images. For example, in some embodiments, the user selection of the target is achieved by a user selecting an area of the at least one of the one or more images being displayed on the display, the selected area corresponding to the target. For example, the user may select the target by directly touching a touchscreen using a finger or stylus. As another example, the user may select the target using a mouse, joystick, gesture, or voice command. In yet another embodiment, the user may select target via a wearable device such as a helmet, virtual reality goggle, and the like. Selection of the target area can involve any suitable motion including touching or tapping, swiping, circling, clicking, or the like associated with any suitable input methods such as described herein”).
Regarding claim 18, Zang discloses a pilot-assisted system to navigate an unmanned drone towards an area of interest (AOI), the system comprising: an unmanned drone further comprising a processor in communication with a non-volatile memory comprising a processor-readable media having thereon a set of executable instructions, configured, when executed, to cause the processor to: (see Zang col 1; lines 52-62 and col 22; lines 33-53 “receiving, from a remote user, one or more navigation commands to move the UAV along a flight path; receiving, from the remote user, target information of a target to be tracked by an imaging device on the UAV; and tracking the target according to the target information by automatically adjusting at least one of the UAV or the imaging device while the UAV moves along the flight path according to the one or more navigation commands from the remote user”),
a. receiving transmitted environmental data comprising a set of images of the area of interest (AOI) (see Zang col 1; lines 41-62 “The UAV may be configured to receive target information from a control terminal related to a target to be tracked by an imaging device coupled to the UAV. The target information may be used by the UAV to automatically track the target so as to maintain predetermined position and/or size of the target within one or more images captured by the imaging device. Any description of tracking may include visual tracking by the imaging device. The control terminal may be configured to display images from the imaging device as well as allowing user input related to the target information”),
b. receive a user command indicative of a selection of a target position within the AOI (see Zang col 1; lines 41-62 and col 4; lines 1-7 “receive a user selection of a target from within a displayed image; generate the target information of the target based on the user selection of the target; and transmit the target information to the UAV”),
e. determine an optimal target velocity to navigate the drone towards the area of interest, wherein the optimal target velocity is based at least in part on the approximate distance between the current position of the drone and the target position within the AOI (see Zang col 3; lines 63-65, col 29; lines 11-27 & 48-50, col 31; lines 1-15 and col 33; lines 15-26 “the remote control device is configured to receive user input from a touchscreen, joystick, keyboard, mouse, or stylus”, “the speed of rotation (e.g., absolute value of the angular velocity) around a given axis (e.g., the Y (yaw) axis) may depend on the distance between the expected and the actual position of the target along the axis” and “The process 800 includes receiving 802 user navigation commands and target information, for example, from a remote control terminal such as described herein. The navigation commands can be used for controlling navigational parameters of the movable object such as the position, speed, orientation, or attitude of the movable object”),
but Zang fails to explicitly disclose the unmanned drone, a remote controller with a trigger operable to control the unmanned drone; c. receive an indication of pressure exerted on the trigger of the remote controller, the pressure indicative of a user desired velocity of flight; wherein the optimal target velocity is based at least in part on the indication of pressure exerted on the trigger of the remote controller; and f. generate a set of control signals, wherein the set of control signals comprise an optimal target velocity command based on the amount of pressure the user applies to the trigger of the remote controller and a direction in three dimensional space.
However, Ohlarik teaches the unmanned drone, a remote controller with a trigger operable to control the unmanned drone (see Ohlarik paras “0015” and “0046” “transmitter 110 may include a speed trigger 112, a steering wheel 114, and a thumb slider 116 as three primary controls for operation of UAV 120” and “an operator's index finger, pulling the trigger will provide a command to increase rotor speed”),
c. receive an indication of pressure exerted on the trigger of the remote controller, the pressure indicative of a user desired velocity of flight (see Ohlarik paras “0020”, “0026” and “0046” “Squeezing speed trigger 112 from the neutral position may indicate an increasing degree of speed”, “The mechanical motion caused by an operator… may provide corresponding electrical signals indicative of the position of speed trigger 112” and “The further the pull distance of speed trigger 112, the larger the throttle input and the faster the forward movement”),
wherein the optimal target velocity is based at least in part on the indication of pressure exerted on the trigger of the remote controller (see Ohlarik paras “0020”, “0041”, “0046” and “0063” “throttle signal translator 510 may generate rotor speed commands to control motors 410/rotors 122 consistent with operator throttle input”, “The further the pull distance of speed trigger 112, the larger the throttle input and the faster the forward movement” and “flight controller 140 may convert signals 130 into rotor speed commands that accomplish the forward/reverse motion”),
f. generate a set of control signals, wherein the set of control signals comprise an optimal target velocity command based on the amount of pressure the user applies to the trigger of the remote controller and a direction in three dimensional space (see Ohlarik paras “0014”, “0020-0021”, “0035”, “0041”, “0046” and “0062-0063” “controls operation of a UAV… via signals”, “Speed trigger 112 may be moved (e.g., by an operator) back and forth in a direction as shown in FIG. 2 to provide forward and reverse speed commands”, “Steering wheel 114 may provide control signals to turn UAV 120 left and right”, “throttle signal translator 510 may generate rotor speed commands to control motors 410/rotors 122 consistent with operator throttle input”, “The further the pull distance of speed trigger 112, the larger the throttle input and the faster the forward movement”, “the control signals indicating the relative positions of the speed trigger, the steering wheel, and the altitude slider” and “flight controller 140 may convert signals 130 into rotor speed commands that accomplish the forward/reverse motion”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Zang for systems and methods for target tracking “to generate control signals to the UAV that include both velocity and direction” as taught by Ohlarik (paras [0020]-[0021] and [0062]-[0063]) in order to provide intuitive and proportional control of UAV movement consistent with operator intent.
but modified Zang fails to explicitly disclose d. determine an approximate distance between a current position of the drone and the target position within the AOI; and g. implement the set of control signals.
However, Liu teaches d. determine an approximate distance between a current position of the drone and the target position within the AOI (see Liu abstract and paras “0005” and “0025” “determining a current position for the unmanned aerial vehicle; determining a distance of the current position from a target position; generating at least one velocity command from the determined distance, the velocity command reducing to zero as the distance reduces to zero; and controlling the unmanned aerial vehicle to maintain the velocity thereof at the velocity command, whereby as the unmanned aerial vehicle approaches the target destination”),
and g. implement the set of control signals (see Liu abstract and paras “0005”, “0008” and “0025” “determining a current position for the unmanned aerial vehicle; determining a distance of the current position from a target position; generating at least one velocity command from the determined distance, the velocity command reducing to zero as the distance reduces to zero; and controlling the unmanned aerial vehicle to maintain the velocity thereof at the velocity command, whereby as the unmanned aerial vehicle approaches the target destination”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Zang for systems and methods for target tracking “to control the unmanned aerial vehicle to maintain the velocity thereof at the velocity command” as taught by Liu (para [0025]) in order to control the UAV to perform the landing maneuver safely.
Regarding claim 19, Zang discloses wherein the processor-readable media having thereon the set of executable instructions, configured, when executed, to further cause the processor to: (see Zang col 1; lines 52-62 and col 22; lines 33-53 “receiving, from a remote user, one or more navigation commands to move the UAV along a flight path; receiving, from the remote user, target information of a target to be tracked by an imaging device on the UAV; and tracking the target according to the target information by automatically adjusting at least one of the UAV or the imaging device while the UAV moves along the flight path according to the one or more navigation commands from the remote user”),
a. activate a peripheral device (see Zang col 1; lines 41-62 “The UAV may be configured to receive target information from a control terminal related to a target to be tracked by an imaging device coupled to the UAV. The target information may be used by the UAV to automatically track the target so as to maintain predetermined position and/or size of the target within one or more images captured by the imaging device. Any description of tracking may include visual tracking by the imaging device. The control terminal may be configured to display images from the imaging device as well as allowing user input related to the target information”),
b. transmit at least one image containing an object of interest (see Zang col 1; lines 41-62 “The UAV may be configured to receive target information from a control terminal related to a target to be tracked by an imaging device coupled to the UAV. The target information may be used by the UAV to automatically track the target so as to maintain predetermined position and/or size of the target within one or more images captured by the imaging device. Any description of tracking may include visual tracking by the imaging device. The control terminal may be configured to display images from the imaging device as well as allowing user input related to the target information”),
c. receive at least one user-defined pixel indicative of the object of interest within the at least one transmitted image (see Zang col 1; lines 41-62, col 4; lines 1-7, col 20; lines 44-49 and col 22; lines 63-67 “receive a user selection of a target from within a displayed image; generate the target information of the target based on the user selection of the target; and transmit the target information to the UAV”),
d. receive a user-activated command to engage the identified object of interest with the activated peripheral device (see Zang col 3; lines 63-65 and col 33; lines 15-26 “the remote control device is configured to receive user input from a touchscreen, joystick, keyboard, mouse, or stylus” and “The process 800 includes receiving 802 user navigation commands and target information, for example, from a remote control terminal such as described herein. The navigation commands can be used for controlling navigational parameters of the movable object such as the position, speed, orientation, or attitude of the movable object”),
f. generate a set of approach control signals to traverse at least a portion of the distance, wherein the set of approach control signals comprise an optimal target velocity command and a direction in two-dimensional space towards the object of interest (see Zang col 3; lines 63-65, col 29; lines 11-27 & 48-50, col 31; lines 1-15 and col 33; lines 15-26 “the remote control device is configured to receive user input from a touchscreen, joystick, keyboard, mouse, or stylus”, “the speed of rotation (e.g., absolute value of the angular velocity) around a given axis (e.g., the Y (yaw) axis) may depend on the distance between the expected and the actual position of the target along the axis” and “The process 800 includes receiving 802 user navigation commands and target information, for example, from a remote control terminal such as described herein. The navigation commands can be used for controlling navigational parameters of the movable object such as the position, speed, orientation, or attitude of the movable object”),
but Zang fails to explicitly disclose e. determine a distance between the current unmanned drone position and the object of interest based at least in part on the at least one user-defined pixel indicative of the object of interest within the at least one transmitted image; g. execute the set of approach control signals to traverse at least a portion of the distance; and h. execute the user-activated command to engage the identified object of interest with the activated peripheral device.
However, Liu teaches e. determine a distance between the current unmanned drone position and the object of interest based at least in part on the at least one user-defined pixel indicative of the object of interest within the at least one transmitted image (see Liu abstract and paras “0005” and “0025” “determining a current position for the unmanned aerial vehicle; determining a distance of the current position from a target position; generating at least one velocity command from the determined distance, the velocity command reducing to zero as the distance reduces to zero; and controlling the unmanned aerial vehicle to maintain the velocity thereof at the velocity command, whereby as the unmanned aerial vehicle approaches the target destination”),
g. execute the set of approach control signals to traverse at least a portion of the distance; and h. execute the user-activated command to engage the identified object of interest with the activated peripheral device (see Liu abstract and paras “0005” and “0025” “determining a current position for the unmanned aerial vehicle; determining a distance of the current position from a target position; generating at least one velocity command from the determined distance, the velocity command reducing to zero as the distance reduces to zero; and controlling the unmanned aerial vehicle to maintain the velocity thereof at the velocity command, whereby as the unmanned aerial vehicle approaches the target destination”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Zang for systems and methods for target tracking “to control the unmanned aerial vehicle to maintain the velocity thereof at the velocity command” as taught by Liu (para [0025]) in order to control the UAV to perform the landing maneuver safely.
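The claim-19 limitation of determining a distance from a user-selected pixel can be made concrete with the following sketch. The pinhole-camera and flat-ground geometry, the field of view, the tilt parameter, and the function name distance_to_pixel are illustrative assumptions and are not drawn from Zang or Liu.

```python
# Illustrative sketch only: estimating the ground distance to the point under
# a user-selected pixel, assuming a downward-tilted pinhole camera over flat
# ground. All parameters are hypothetical.

import math

def distance_to_pixel(pixel_row: int, image_height_px: int, altitude_m: float,
                      camera_tilt_rad: float,
                      vertical_fov_rad: float = math.radians(60.0)) -> float:
    """Approximate horizontal distance to the ground point under a pixel."""
    # Vertical offset of the pixel from the image center, in [-1, 1]
    # (rows increase downward, so positive offsets look further down).
    offset = (pixel_row - image_height_px / 2.0) / (image_height_px / 2.0)
    ray_angle = camera_tilt_rad + offset * (vertical_fov_rad / 2.0)
    if ray_angle <= 0.0:
        raise ValueError("selected pixel is at or above the horizon")
    return altitude_m / math.tan(ray_angle)
```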
Regarding claim 20, Zang discloses wherein a user-activated command to engage the identified object of interest with the activated peripheral device is at least one of an open command, a close command, an approach command, an activate command, a release command, and a grip command (see Zang col 37; lines 17-34 and col 43; lines 22-44 “the user selection of the target is achieved by a user selecting an area of the at least one of the one or more images being displayed on the display, the selected area corresponding to the target. For example, the user may select the target by directly touching a touchscreen using a finger or stylus. As another example, the user may select the target using a mouse, joystick, gesture, or voice command. In yet another embodiment, the user may select target via a wearable device such as a helmet, virtual reality goggle, and the like. Selection of the target area can involve any suitable motion including touching or tapping, swiping, circling, clicking, or the like associated with any suitable input methods such as described herein” and “the payload can be configured to interact with the environment or a target. For example, the payload can include a tool, instrument, or mechanism capable of manipulating objects, such as a robotic arm.”).
Regarding claim 21, Zang discloses wherein a user-activated command to engage the identified object of interest with the activated peripheral device based at least in part on the user-activated command selected from the menu of options is selected from a visual user interface (see Zang col 37; lines 17-34 and col 43; lines 22-44 “the process 1000 includes receiving 1004 a user selection of a target from within at least one of the images being displayed. The user may select the target via the same user interface that displays the images. For example, in some embodiments, the user selection of the target is achieved by a user selecting an area of the at least one of the one or more images being displayed on the display, the selected area corresponding to the target. For example, the user may select the target by directly touching a touchscreen using a finger or stylus. As another example, the user may select the target using a mouse, joystick, gesture, or voice command. In yet another embodiment, the user may select target via a wearable device such as a helmet, virtual reality goggle, and the like. Selection of the target area can involve any suitable motion including touching or tapping, swiping, circling, clicking, or the like associated with any suitable input methods such as described herein”).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HOSSAM M ABDELLATIF whose telephone number is (571)272-5869. The examiner can normally be reached on M-F 8 am-5 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Rachid Bendidi can be reached on (571) 272-4896. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HOSSAM M ABD EL LATIF/Examiner, Art Unit 3664