Prosecution Insights
Last updated: April 19, 2026
Application No. 18/773,769

METHOD AND APPARATUS FOR CONTROLLING FLIGHT ASSEMBLY, TERMINAL, AND READABLE STORAGE MEDIUM

Status: Non-Final Office Action (§103)
Filed: Jul 16, 2024
Examiner: SEOL, DAVIN
Art Unit: 3662
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Vivo Mobile Communication Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 65% (Favorable)
OA Rounds: 1-2
To Grant: 3y 1m
Grant Probability with Interview: 79%

Examiner Intelligence

Career Allowance Rate: 65% (102 granted / 157 resolved), +13.0% vs TC avg — above average
Interview Lift: +14.4% (moderate) — resolved cases with interview vs. without
Avg Prosecution: 3y 1m typical timeline; 29 applications currently pending
Career History: 186 total applications across all art units

Statute-Specific Performance

§101: 18.5% (-21.5% vs TC avg)
§103: 44.9% (+4.9% vs TC avg)
§102: 10.3% (-29.7% vs TC avg)
§112: 22.8% (-17.2% vs TC avg)
Tech Center averages are estimates • Based on career data from 157 resolved cases
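For readers who want to sanity-check the headline figures, the arithmetic is simple. Below is a minimal Python sketch, assuming the dashboard derives the allowance rate as grants divided by resolved cases and obtains the with-interview probability by adding the reported lift to the base probability; that derivation is an assumption for illustration, not the page's documented methodology.

```python
# Illustrative arithmetic behind the examiner statistics shown above.
# The input numbers come from this page; how the dashboard actually
# combines them is an assumption.

granted = 102
resolved = 157

allow_rate = granted / resolved                       # career allowance rate
print(f"Career allowance rate: {allow_rate:.1%}")     # ~65.0%

tc_delta = 0.130                                      # "+13.0% vs TC avg"
tc_avg_estimate = allow_rate - tc_delta
print(f"Implied TC average: {tc_avg_estimate:.1%}")   # ~52.0%

# Interview lift: allowance rate of resolved cases with an examiner
# interview minus the rate without one ("+14.4%" above).
interview_lift = 0.144
base_grant_probability = 0.65                         # predicted for this application
with_interview = base_grant_probability + interview_lift
print(f"Grant probability with interview: {with_interview:.0%}")  # ~79%
```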

Office Action

§103
DETAILED ACTION

This is a first action on the merits. Claims 1-20 are pending. The claims as filed on 07/16/2024 are being examined.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 07/16/2024 has been received. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Objections

Each of the dependent claims is objected to because of the following informalities: the preamble should account for previous recitations of the limitation. For example, “a flight action” should be introduced as “the flight action”. Each dependent claim preamble contains similar informalities. Appropriate correction is required.

Claims 8 and 17 are objected to because of the following informality: the preamble recites “wherein the touch object is a stylus”. This limitation is already present in claims 6 and 15, respectively. Deletion is suggested to avoid repeating the same limitation. Appropriate correction is required.

Claim Rejections - 35 U.S.C. § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 4, 10-11, 13, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US-20180046176-A1) in view of Li et al. (US-20180237140-A1), hereinafter referred to as Kim and Li, respectively.

Regarding claim 1, Kim teaches a method for controlling a flight assembly, wherein the flight assembly is movably disposed inside a terminal (FIG. 3 unmanned air vehicle 100 is movably disposed inside terminal part 200), the terminal is provided with an outlet for the flight assembly to move out of the terminal (FIG. 3 hangar part 210; FIG. 11 hangar cover 231 for unmanned air vehicle to slidably exit), and the method comprises ([0027] As shown in FIGS.
1 to 9, a mobile communication terminal provided for an embodiment of the present invention includes: an unmanned air vehicle 100; and a mobile communication terminal part 200 in which the unmanned air vehicle 100 is kept therein): displaying a control on the terminal in response to a first input performed by a touch object on the terminal (FIG. 1 displaying a control interface in response to user turning on and/or waking up the terminal 200; [0013] …the manipulation part of the communication terminal part may include a display device displaying a touch screen for controlling the unmanned air vehicle when the unmanned air vehicle control application program is executed); […] the flight assembly to move towards the outlet ([0029] a user may take out the unmanned air vehicle 100 from the hangar part 210 of the mobile communication terminal part 200; FIG. 11 unmanned air vehicle to slidably exit from hangar cover 231 via sliding groove 232 from user upward force) and controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal (FIG. 1 controlling the flight action of the unmanned air vehicle according to touch key inputs; [0044] As shown in FIG. 1, the unmanned air vehicle control application program 262 may be executed to display a touch screen on which various touch keys are arranged so that the user may control an operation of the unmanned air vehicle 100 through the display device 271 provided in a front part of the mobile communication terminal part 200). Kim does not explicitly teach moving the flight assembly towards the outlet by controlling, in response to a second input performed by the touch object on the control, because in Kim, the unmanned air vehicle is taken out manually by the user’s upward force (FIG. 11). However, Li teaches controlling, in response to a second input performed by the touch object on the control, the flight assembly ([0181] That is, referring to FIG. 17, when it is detected that the moving direction of the single touch point is moving up, an operation command for controlling an aircraft to move up is generated). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify how the flight assembly exits the terminal as taught in Kim to incorporate the teachings of Li to include doing so “in response to a second input performed by the touch object on the control”, with a reasonable expectation of success since having another way to fly the camera out of the terminal with less manual work increases the convenience of user control of the camera, and is more convenient in that “this manner of controlling the aircraft is similar to the manner of operating the movement of the aircraft in the real world” (Li [0183]). Regarding claim 2, Kim, as modified, teaches the method according to claim 1. Kim does not explicitly teach: wherein before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the method further comprises: controlling the flight assembly to fly to a preset altitude to hover. However, Li teaches wherein before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the method further comprises: controlling the flight assembly to fly to a preset altitude to hover (FIG. 
4 acquire a photographing start command 402, send the combined operation command to a UAV 406; FIG. 5 start control 502 and fixed-point photographing 503; [0073] In some embodiments, the UAV flight action includes at least one of a UAV flight-speed adjustment action, a UAV direction adjustment action, a UAV height adjustment action, a UAV hover action, a UAV roll action, and a UAV yaw action). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim to incorporate the teachings of Li to include wherein before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the method further comprises: controlling the flight assembly to fly to a preset altitude to hover, with a reasonable expectation of success since “in this way, a user does not need to learn complex techniques of operating a UAV and performing photographing to photograph a landscape image at a specific position by using the UAV, so that operations are convenient” (Li [0102]). Regarding claim 4, Kim, as modified, teaches the method according to claim 1. Kim, as modified, does not explicitly teach wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a translation direction and a translation distance of the touch object, and determining a target translation direction and a target translation distance of the flight assembly according to the translation direction and the translation distance of the touch object; and controlling the flight assembly to translate according to the target translation direction and the target translation distance. However, Li teaches wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a translation direction and a translation distance of the touch object, and determining a target translation direction and a target translation distance of the flight assembly according to the translation direction and the translation distance of the touch object; and controlling the flight assembly to translate according to the target translation direction and the target translation distance ([0181] That is, referring to FIG. 17, when it is detected that the moving direction of the single touch point is moving up, an operation command for controlling an aircraft to move up is generated). 
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim, as modified, to incorporate the teachings of Li to include wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a translation direction and a translation distance of the touch object, and determining a target translation direction and a target translation distance of the flight assembly according to the translation direction and the translation distance of the touch object; and controlling the flight assembly to translate according to the target translation direction and the target translation distance, with a reasonable expectation of success since having another way to fly the flight assembly increases the convenience of user control of the camera, and is more convenient in that “this manner of controlling the aircraft is similar to the manner of operating the movement of the aircraft in the real world” (Li [0183]). Regarding claim 10, Kim teaches a terminal, comprising (FIG. 10 mobile communication terminal part 200): a memory storing computer-readable instructions; and a processor coupled to the memory and configured to execute the computer-readable instructions, wherein the computer-readable instructions, when executed by the processor, cause the processor to perform operations comprising (FIG. 10 main controller 261 and UAV control application program 262): displaying a control on the terminal in response to a first input performed by a touch object on the terminal (FIG. 1 displaying a control interface in response to user turning on and/or waking up the terminal 200; [0013] …the manipulation part of the communication terminal part may include a display device displaying a touch screen for controlling the unmanned air vehicle when the unmanned air vehicle control application program is executed); […] the flight assembly to move towards the outlet ([0029] a user may take out the unmanned air vehicle 100 from the hangar part 210 of the mobile communication terminal part 200; FIG. 11 unmanned air vehicle to slidably exit from hangar cover 231 via sliding groove 232 from user upward force) and controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal (FIG. 1 controlling the flight action of the unmanned air vehicle according to touch key inputs; [0044] As shown in FIG. 1, the unmanned air vehicle control application program 262 may be executed to display a touch screen on which various touch keys are arranged so that the user may control an operation of the unmanned air vehicle 100 through the display device 271 provided in a front part of the mobile communication terminal part 200). Kim does not explicitly teach moving the flight assembly towards the outlet by controlling, in response to a second input performed by the touch object on the control, because in Kim, the unmanned air vehicle is taken out manually by the user’s upward force (FIG. 11). However, Li teaches controlling, in response to a second input performed by the touch object on the control, the flight assembly ([0181] That is, referring to FIG. 17, when it is detected that the moving direction of the single touch point is moving up, an operation command for controlling an aircraft to move up is generated). 
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify how the flight assembly exits the terminal as taught in Kim to incorporate the teachings of Li to include doing so “in response to a second input performed by the touch object on the control”, with a reasonable expectation of success since having another way to fly the camera out of the terminal with less manual work increases the convenience of user control of the camera, and is more convenient in that “this manner of controlling the aircraft is similar to the manner of operating the movement of the aircraft in the real world” (Li [0183]). Regarding claim 11, Kim, as modified, teaches the terminal according to claim 10. Kim does not explicitly teach: wherein before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the operations further comprise: controlling the flight assembly to fly to a preset altitude to hover. However, Li teaches wherein before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the operations further comprise: controlling the flight assembly to fly to a preset altitude to hover (FIG. 4 acquire a photographing start command 402, send the combined operation command to a UAV 406; FIG. 5 start control 502 and fixed-point photographing 503; [0073] In some embodiments, the UAV flight action includes at least one of a UAV flight-speed adjustment action, a UAV direction adjustment action, a UAV height adjustment action, a UAV hover action, a UAV roll action, and a UAV yaw action). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim to incorporate the teachings of Li to include wherein before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the operations further comprise: controlling the flight assembly to fly to a preset altitude to hover, with a reasonable expectation of success since “in this way, a user does not need to learn complex techniques of operating a UAV and performing photographing to photograph a landscape image at a specific position by using the UAV, so that operations are convenient” (Li [0102]). Regarding claim 13, Kim, as modified, teaches the terminal according to claim 10. Kim, as modified, does not explicitly teach wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a translation direction and a translation distance of the touch object, and determining a target translation direction and a target translation distance of the flight assembly according to the translation direction and the translation distance of the touch object; and controlling the flight assembly to translate according to the target translation direction and the target translation distance. 
However, Li teaches wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a translation direction and a translation distance of the touch object, and determining a target translation direction and a target translation distance of the flight assembly according to the translation direction and the translation distance of the touch object; and controlling the flight assembly to translate according to the target translation direction and the target translation distance ([0181] That is, referring to FIG. 17, when it is detected that the moving direction of the single touch point is moving up, an operation command for controlling an aircraft to move up is generated). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim, as modified, to incorporate the teachings of Li to include wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a translation direction and a translation distance of the touch object, and determining a target translation direction and a target translation distance of the flight assembly according to the translation direction and the translation distance of the touch object; and controlling the flight assembly to translate according to the target translation direction and the target translation distance, with a reasonable expectation of success since having another way to fly the flight assembly increases the convenience of user control of the camera, and is more convenient in that “this manner of controlling the aircraft is similar to the manner of operating the movement of the aircraft in the real world” (Li [0183]). Regarding claim 19, Kim teaches a non-transitory computer-readable medium storing instructions that, when executed by a processor of a terminal, cause the processor to perform operations comprising (FIG. 10 main controller 261 and UAV control application program 262): displaying a control on the terminal in response to a first input performed by a touch object on the terminal (FIG. 1 displaying a control interface in response to user turning on and/or waking up the terminal 200; [0013] …the manipulation part of the communication terminal part may include a display device displaying a touch screen for controlling the unmanned air vehicle when the unmanned air vehicle control application program is executed); […] the flight assembly to move towards the outlet ([0029] a user may take out the unmanned air vehicle 100 from the hangar part 210 of the mobile communication terminal part 200; FIG. 11 unmanned air vehicle to slidably exit from hangar cover 231 via sliding groove 232 from user upward force) and controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal (FIG. 1 controlling the flight action of the unmanned air vehicle according to touch key inputs; [0044] As shown in FIG. 1, the unmanned air vehicle control application program 262 may be executed to display a touch screen on which various touch keys are arranged so that the user may control an operation of the unmanned air vehicle 100 through the display device 271 provided in a front part of the mobile communication terminal part 200). 
Kim does not explicitly teach moving the flight assembly towards the outlet by controlling, in response to a second input performed by the touch object on the control, because in Kim, the unmanned air vehicle is taken out manually by the user’s upward force (FIG. 11).

However, Li teaches controlling, in response to a second input performed by the touch object on the control, the flight assembly ([0181] That is, referring to FIG. 17, when it is detected that the moving direction of the single touch point is moving up, an operation command for controlling an aircraft to move up is generated).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify how the flight assembly exits the terminal as taught in Kim to incorporate the teachings of Li to include doing so “in response to a second input performed by the touch object on the control”, with a reasonable expectation of success since having another way to fly the camera out of the terminal with less manual work increases the convenience of user control of the camera, and is more convenient in that “this manner of controlling the aircraft is similar to the manner of operating the movement of the aircraft in the real world” (Li [0183]).

Regarding claim 20, Kim, as modified, teaches the non-transitory computer-readable medium according to claim 19. Kim does not explicitly teach: wherein before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the operations further comprise: controlling the flight assembly to fly to a preset altitude to hover.

However, Li teaches wherein before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the operations further comprise: controlling the flight assembly to fly to a preset altitude to hover (FIG. 4 acquire a photographing start command 402, send the combined operation command to a UAV 406; FIG. 5 start control 502 and fixed-point photographing 503; [0073] In some embodiments, the UAV flight action includes at least one of a UAV flight-speed adjustment action, a UAV direction adjustment action, a UAV height adjustment action, a UAV hover action, a UAV roll action, and a UAV yaw action).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim to incorporate the teachings of Li to include wherein before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the operations further comprise: controlling the flight assembly to fly to a preset altitude to hover, with a reasonable expectation of success since “in this way, a user does not need to learn complex techniques of operating a UAV and performing photographing to photograph a landscape image at a specific position by using the UAV, so that operations are convenient” (Li [0102]).

Claims 3, 5, 12, and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Kim, in view of Li, and in further view of Liu et al. (US-20200166926-A1), hereinafter referred to as Liu.

Regarding claim 3, Kim, as modified, teaches the method according to claim 1.
Kim also teaches wherein the touch object is a user ([0044] …the user may control an operation of the unmanned air vehicle 100 through the display device 271 provided in a front part of the mobile communication terminal part 200). Kim does not explicitly teach: and before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the method further comprises: obtaining a sensing signal of a target sensor that is worn on a wearable device of the user; and determining the action of the touch object according to the sensing signal. However, Liu teaches and before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the method further comprises: obtaining a sensing signal of a target sensor that is worn on a wearable device of the user; and ([0005] The wearable electronic device includes a motion detector configured to acquire a motion state of a body part of a user) determining the action of the touch object according to the sensing signal ([0117] the aircraft control system 100 can include the wearable electronic device 20 configured to directly control the rotorcraft 10. As such, the rotorcraft 10 can be directly controlled using motion information of the wearable electronic device 20 (e.g., the motion state of the body part of the user)). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim, as modified, to incorporate the teachings of Liu to include and before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the method further comprises: obtaining a sensing signal of a target sensor that is worn on a wearable device of the user; and determining the action of the touch object according to the sensing signal, with a reasonable expectation of success since having another way to fly the flight assembly increases the convenience of user control of the camera, and by utilizing the wearable device, “the operation can be simple and the user's hands can be freed” (Liu [0037]). Regarding claim 5, Kim, as modified, teaches the method according to claim 1. Kim, as modified, does not explicitly teach wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a rotation direction and a rotation angle of the touch object, and determining a target rotation direction and a target rotation angle of the flight assembly according to the rotation direction and the rotation angle of the touch object; and controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis. 
However, Liu teaches wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a rotation direction and a rotation angle of the touch object, and determining a target rotation direction and a target rotation angle of the flight assembly according to the rotation direction and the rotation angle of the touch object; and controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis ([0005] The wearable electronic device includes a motion detector configured to acquire a motion state of a body part of a user; [0048] As shown in FIG. 3, the process S4 includes, when the motion state is the leftward rotation, controlling the rotor motor 12 to cause the rotorcraft 10 to yaw toward left (S41); [0052] An extent to which the rotorcraft 10 yaws and the rotation angle of the gimbal 14 can be determined by a leftward rotation angle of the motion state) It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim, as modified, to incorporate the teachings of Liu to include wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a rotation direction and a rotation angle of the touch object, and determining a target rotation direction and a target rotation angle of the flight assembly according to the rotation direction and the rotation angle of the touch object; and controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis, with a reasonable expectation of success since having another way to fly the flight assembly increases the convenience of user control of the camera, and by utilizing the wearable device, “the operation can be simple and the user's hands can be freed” (Liu [0037]). Regarding claim 12, Kim, as modified, teaches the terminal according to claim 10. Kim also teaches wherein the touch object is a user ([0044] …the user may control an operation of the unmanned air vehicle 100 through the display device 271 provided in a front part of the mobile communication terminal part 200). Kim does not explicitly teach: and before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the method further comprises: obtaining a sensing signal of a target sensor that is worn on a wearable device of the user; and determining the action of the touch object according to the sensing signal. However, Liu teaches and before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the operations further comprise: obtaining a sensing signal of a target sensor that is worn on a wearable device of the user; and ([0005] The wearable electronic device includes a motion detector configured to acquire a motion state of a body part of a user) determining the action of the touch object according to the sensing signal ([0117] the aircraft control system 100 can include the wearable electronic device 20 configured to directly control the rotorcraft 10. As such, the rotorcraft 10 can be directly controlled using motion information of the wearable electronic device 20 (e.g., the motion state of the body part of the user)). 
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim, as modified, to incorporate the teachings of Liu to include and before the controlling a flight action of the flight assembly according to an action of the touch object after the flight assembly moves out of the terminal, the operations further comprise: obtaining a sensing signal of a target sensor that is worn on a wearable device of the user; and determining the action of the touch object according to the sensing signal, with a reasonable expectation of success since having another way to fly the flight assembly increases the convenience of user control of the camera, and by utilizing the wearable device, “the operation can be simple and the user's hands can be freed” (Liu [0037]). Regarding claim 14, Kim, as modified, teaches the terminal according to claim 10. Kim, as modified, does not explicitly teach wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a rotation direction and a rotation angle of the touch object, and determining a target rotation direction and a target rotation angle of the flight assembly according to the rotation direction and the rotation angle of the touch object; and controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis. However, Liu teaches wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a rotation direction and a rotation angle of the touch object, and determining a target rotation direction and a target rotation angle of the flight assembly according to the rotation direction and the rotation angle of the touch object; and controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis ([0005] The wearable electronic device includes a motion detector configured to acquire a motion state of a body part of a user; [0048] As shown in FIG. 3, the process S4 includes, when the motion state is the leftward rotation, controlling the rotor motor 12 to cause the rotorcraft 10 to yaw toward left (S41); [0052] An extent to which the rotorcraft 10 yaws and the rotation angle of the gimbal 14 can be determined by a leftward rotation angle of the motion state) It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim, as modified, to incorporate the teachings of Liu to include wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a rotation direction and a rotation angle of the touch object, and determining a target rotation direction and a target rotation angle of the flight assembly according to the rotation direction and the rotation angle of the touch object; and controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis, with a reasonable expectation of success since having another way to fly the flight assembly increases the convenience of user control of the camera, and by utilizing the wearable device, “the operation can be simple and the user's hands can be freed” (Liu [0037]). 
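The translation and rotation mappings applied in the claim 4, 5, 13, and 14 rejections above all reduce to converting a detected motion of the touch object into a proportional flight command. The Python sketch below is purely illustrative of that kind of mapping; the scale factors, function names, and command structure are assumptions for readability and do not reproduce the applicant's claimed method or the implementations in Kim, Li, or Liu.

```python
# Illustrative mapping of touch-object motion to flight commands, in the
# spirit of the claim 4/13 (translation) and claim 5/14 (rotation)
# limitations discussed above. All constants and names are hypothetical.
from dataclasses import dataclass
import math


@dataclass
class FlightCommand:
    kind: str          # "translate" or "rotate"
    direction: tuple   # unit vector for translation, (axis, sense) for rotation
    magnitude: float   # metres for translation, degrees for rotation


def translation_command(dx_px: float, dy_px: float,
                        metres_per_px: float = 0.01) -> FlightCommand:
    """Map a touch drag (pixels) to a target translation direction and distance."""
    dist_px = math.hypot(dx_px, dy_px)
    if dist_px == 0:
        return FlightCommand("translate", (0.0, 0.0), 0.0)
    direction = (dx_px / dist_px, dy_px / dist_px)   # target translation direction
    return FlightCommand("translate", direction, dist_px * metres_per_px)


def rotation_command(rotation_deg: float, axis: str = "yaw",
                     gain: float = 1.0) -> FlightCommand:
    """Map a detected rotation of the touch object to a target rotation."""
    sense = "ccw" if rotation_deg >= 0 else "cw"
    return FlightCommand("rotate", (axis, sense), abs(rotation_deg) * gain)


if __name__ == "__main__":
    print(translation_command(120.0, -40.0))   # drag right/up -> translate command
    print(rotation_command(-30.0))             # rotation of touch object -> yaw command
```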
Claims 6, 8, 15, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Kim, in view of Li, in view of Liu, and in further view of Mou et al. (CN-113238649-A), hereinafter referred to as Mou.

Regarding claim 6, Kim, as modified, teaches the method according to claim 1. Kim, as modified, does not explicitly teach wherein the touch object is a stylus, and the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a rotation angle of the stylus with a longitudinal direction of a stylus body as a rotation axis, and determining a target rotation angle of the flight assembly according to the rotation angle; detecting a rotation direction of the stylus with the longitudinal direction of the stylus body as a rotation axis, and determining a target rotation direction of the flight assembly according to the rotation direction; and controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis.

However, Liu teaches the touch object is a body part ([0043] …any body part that is convenient for the user to move, such as a head, a hand, or a foot), and the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a rotation angle of the body part with a longitudinal direction of a body as a rotation axis, and determining a target rotation angle of the flight assembly according to the rotation angle; detecting a rotation direction of the body part with the longitudinal direction of the body as a rotation axis, and determining a target rotation direction of the flight assembly according to the rotation direction; and ([0046] The body part of the user, the rotorcraft 10, and the gimbal 14 can have three attitude angles, i.e., a yaw angle, a pitch angle, and a roll angle. A negative yaw angle can correspond to a leftward rotation, and a positive yaw angle can correspond to a rightward rotation. A negative pitch angle can correspond to a downward rotation, and a positive pitch angle can correspond to an upward rotation. A negative roll angle can correspond to a leftward deflection of the gimbal 14 or a leftward roll of the rotorcraft 10, and a positive roll angle can correspond to a rightward deflection of the gimbal 14 or a rightward roll of the rotorcraft 10) controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis ([0035] After the rotorcraft 10 receives the motion state of the body part of the user, the processor 30 of the rotorcraft 10 can process the motion state of the body part of the user to generate a control signal to control the rotor motor 12 and the gimbal 14).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim, as modified, to incorporate the teachings of Liu to include the touch object is a body part, and the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a rotation angle of the body part with a longitudinal direction of a body as a rotation axis, and determining a target rotation angle of the flight assembly according to the rotation angle; detecting a rotation direction of the body part with the longitudinal direction of the body as a rotation axis, and determining a target rotation direction of the flight assembly according to the rotation direction; and controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis, with a reasonable expectation of success since having another way to fly the flight assembly increases the convenience of user control of the camera, and by utilizing the wearable device, “the operation can be simple and the user's hands can be freed” (Liu [0037]). Kim, as modified, does not explicitly teach that the touch object is “a stylus”. However, Mou teaches gesture control using “a stylus” (page 4 of translation: the method comprises the steps that the stylus monitors acceleration change and/or rotation posture change of the stylus by a sensor […] In use, a user can hold the stylus to execute a gesture operation, the stylus can obtain gesture information (or referred to as a gesture characteristic value) corresponding to the gesture operation of the user, the stylus sends the gesture information corresponding to the gesture operation of the user to the electronic device, and the electronic device can match the gesture information with the association relation to obtain a function associated with the gesture operation and realize the associated function in the electronic device). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify the body part as taught in Kim, as modified, to substitute using a stylus as taught in Mou because it has been held that the substitution of one known element for another would have been obvious if the substitution yielded predictable results to one of ordinary skill in the art at the time of the invention. In this case, the substitution of a stylus for a body part i.e., a finger would have had the predictable result of controlling the flight apparatus corresponding to a gesture operation of the user. Furthermore, using a stylus to control the flight apparatus instead of a body part, i.e., a finger would have been an obvious design choice yielding predictable results to control the flight apparatus. It appears the invention would perform equally well with both choices, as Applicant’s specification [0033] discloses the interchangeability of using either a stylus or a user (i.e., finger) as the touch object. Regarding claim 8, Kim, as modified, teaches the method according to claim 6. 
Kim, as modified, also teaches wherein the touch object is a stylus, and before the controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis, the method further comprises: obtaining attitude data of the stylus and attitude data of the flight assembly (see rejection of claim 6 cited to Wang receiving attitude of body part and attitude of the rotorcraft; see rejection of claim 6 cited to Mou teaching the touch object as a stylus instead of the body part); determining a first coordinate axis of the flight assembly according to the attitude data of the stylus and the attitude data of the flight assembly (see rejection of claim 6 cited to Wang roll, pitch, yaw coordinate system of the body part and rotorcraft; see rejection of claim 6 cited to Mou teaching the touch object as a stylus instead of the body part), wherein the first coordinate axis of the flight assembly is a coordinate axis that is in a body coordinate system of the flight assembly (see rejection of claim 6 cited to Wang roll, pitch, yaw coordinate system of the body part and rotorcraft, where the yaw axis of the rotorcraft corresponds to the claimed “first coordinate axis”) and that forms an acute angle of less than 45 degrees with a longitudinal direction of a stylus body of the stylus, and the body coordinate system is a three-dimensional rectangular coordinate system; and (see rejection of claim 6 cited to Wang where a yaw axis of the body part corresponds to a yaw axis turn of the rotorcraft, which therefore the angle is less than 45 deg as both axes are approximately parallel in space; see rejection of claim 6 cited to Mou teaching the touch object as a stylus instead of the body part) determining the first coordinate axis of the flight assembly as the target axis (see rejection of claim 6 cited to Wang where a yaw axis turning of the body part corresponds to a yaw axis turning of the rotorcraft, where the yaw axis corresponds to the claimed “target axis”). Regarding claim 15, Kim, as modified, teaches the terminal according to claim 10. Kim, as modified, does not explicitly teach wherein the touch object is a stylus, and the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a rotation angle of the stylus with a longitudinal direction of a stylus body as a rotation axis, and determining a target rotation angle of the flight assembly according to the rotation angle; detecting a rotation direction of the stylus with the longitudinal direction of the stylus body as a rotation axis, and determining a target rotation direction of the flight assembly according to the rotation direction; and controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis. 
However, Liu teaches the touch object is a body part ([0043] …any body part that is convenient for the user to move, such as a head, a hand, or a foot), and the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a rotation angle of the body part with a longitudinal direction of a body as a rotation axis, and determining a target rotation angle of the flight assembly according to the rotation angle; detecting a rotation direction of the body part with the longitudinal direction of the stylus body as a rotation axis, and determining a target rotation direction of the flight assembly according to the rotation direction; and ([0046] The body part of the user, the rotorcraft 10, and the gimbal 14 can have three attitude angles, i.e., a yaw angle, a pitch angle, and a roll angle. A negative yaw angle can correspond to a leftward rotation, and a positive yaw angle can correspond to a rightward rotation. A negative pitch angle can correspond to a downward rotation, and a positive pitch angle can correspond to an upward rotation. A negative roll angle can correspond to a leftward deflection of the gimbal 14 or a leftward roll of the rotorcraft 10, and a positive roll angle can correspond to a rightward deflection of the gimbal 14 or a rightward roll of the rotorcraft 10) controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis ([0035] After the rotorcraft 10 receives the motion state of the body part of the user, the processor 30 of the rotorcraft 10 can process the motion state of the body part of the user to generate a control signal to control the rotor motor 12 and the gimbal 14). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim, as modified, to incorporate the teachings of Liu to include the touch object is a body part, and the controlling a flight action of the flight assembly according to an action of the touch object comprises: detecting a rotation angle of the body part with a longitudinal direction of a body as a rotation axis, and determining a target rotation angle of the flight assembly according to the rotation angle; detecting a rotation direction of the body part with the longitudinal direction of the body as a rotation axis, and determining a target rotation direction of the flight assembly according to the rotation direction; and controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis, with a reasonable expectation of success since having another way to fly the flight assembly increases the convenience of user control of the camera, and by utilizing the wearable device, “the operation can be simple and the user's hands can be freed” (Liu [0037]). Kim, as modified, does not explicitly teach that the touch object is “a stylus”. 
However, Mou teaches gesture control using “a stylus” (page 4 of translation: the method comprises the steps that the stylus monitors acceleration change and/or rotation posture change of the stylus by a sensor […] In use, a user can hold the stylus to execute a gesture operation, the stylus can obtain gesture information (or referred to as a gesture characteristic value) corresponding to the gesture operation of the user, the stylus sends the gesture information corresponding to the gesture operation of the user to the electronic device, and the electronic device can match the gesture information with the association relation to obtain a function associated with the gesture operation and realize the associated function in the electronic device). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify the body part as taught in Kim, as modified, to substitute using a stylus as taught in Mou because it has been held that the substitution of one known element for another would have been obvious if the substitution yielded predictable results to one of ordinary skill in the art at the time of the invention. In this case, the substitution of a stylus for a body part i.e., a finger would have had the predictable result of controlling the flight apparatus corresponding to a gesture operation of the user. Furthermore, using a stylus to control the flight apparatus instead of a body part, i.e., a finger would have been an obvious design choice yielding predictable results to control the flight apparatus. It appears the invention would perform equally well with both choices, as Applicant’s specification [0033] discloses the interchangeability of using either a stylus or a user (i.e., finger) as the touch object. Regarding claim 17, Kim, as modified, teaches the terminal according to claim 15. 
Kim, as modified, also teaches wherein the touch object is a stylus, and before the controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis, the operations further comprise: obtaining attitude data of the stylus and attitude data of the flight assembly (see rejection of claim 6 cited to Wang receiving attitude of body part and attitude of the rotorcraft; see rejection of claim 6 cited to Mou teaching the touch object as a stylus instead of the body part); determining a first coordinate axis of the flight assembly according to the attitude data of the stylus and the attitude data of the flight assembly (see rejection of claim 6 cited to Wang roll, pitch, yaw coordinate system of the body part and rotorcraft; see rejection of claim 6 cited to Mou teaching the touch object as a stylus instead of the body part), wherein the first coordinate axis of the flight assembly is a coordinate axis that is in a body coordinate system of the flight assembly (see rejection of claim 6 cited to Wang roll, pitch, yaw coordinate system of the body part and rotorcraft, where the yaw axis of the rotorcraft corresponds to the claimed “first coordinate axis”) and that forms an acute angle of less than 45 degrees with a longitudinal direction of a stylus body of the stylus, and the body coordinate system is a three-dimensional rectangular coordinate system; and (see rejection of claim 6 cited to Wang where a yaw axis of the body part corresponds to a yaw axis turn of the rotorcraft, and therefore the angle is less than 45 degrees as both axes are approximately parallel in space; see rejection of claim 6 cited to Mou teaching the touch object as a stylus instead of the body part) determining the first coordinate axis of the flight assembly as the target axis (see rejection of claim 6 cited to Wang where a yaw axis turning of the body part corresponds to a yaw axis turning of the rotorcraft, where the yaw axis corresponds to the claimed “target axis”).

Claims 7 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Kim, in view of Li, and in further view of Böckem et al. (US-20210397202-A1), hereinafter referred to as Böckem.

Regarding claim 7, Kim, as modified, teaches the method according to claim 5. Kim, as modified, does not explicitly teach wherein before the controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis, the method further comprises: obtaining attitude data of the flight assembly; displaying an attitude image of the flight assembly on the terminal according to the attitude data of the flight assembly, wherein the attitude image comprises a body coordinate system of the flight assembly, and the body coordinate system is a three-dimensional rectangular coordinate system; and selecting one coordinate axis of the flight assembly as the target axis in response to a third input performed by the touch object on the attitude image.
However, Böckem teaches wherein before the controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis, the method further comprises: obtaining attitude data of the flight assembly ([0060] The as-tracked location/position and orientation of the UAV symbol can directly be provided to the motion generation system of the UAV in order to instruct the UAV to move in its physical environment in accordance to the movement of the UAV symbol in the 3D-view); displaying an attitude image of the flight assembly on the terminal according to the attitude data of the flight assembly, wherein the attitude image comprises a body coordinate system of the flight assembly, and the body coordinate system is a three-dimensional rectangular coordinate system; and ([0060] A UAV symbol is displayed in/overlaid to the displayed 3D-view of the UAV's environment. The UAV symbol by its location and orientation, in particular and by its shape and appearance, in the displayed 3D-view represents the UAV with its position and orientation in the physical environment; FIG. 1 roll, pitch, yaw axes) selecting one coordinate axis of the flight assembly as the target axis in response to a third input performed by the touch object on the attitude image ([0060] For example, based on a touch input the UAV symbol's orientation can be changed by rotating the UAV symbol only around its yaw axis). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim, as modified, to incorporate the teachings of Böckem to include wherein before the controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis, the method further comprises: obtaining attitude data of the flight assembly; displaying an attitude image of the flight assembly on the terminal according to the attitude data of the flight assembly, wherein the attitude image comprises a body coordinate system of the flight assembly, and the body coordinate system is a three-dimensional rectangular coordinate system; and selecting one coordinate axis of the flight assembly as the target axis in response to a third input performed by the touch object on the attitude image, with a reasonable expectation of success since doing so would have achieved the benefit of “assisting the user in safely navigating the UAV in its three-dimensional environment” (Böckem [0063]). Regarding claim 16, Kim, as modified, teaches the terminal according to claim 14. Kim, as modified, does not explicitly teach wherein before the controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis, the operations further comprise: obtaining attitude data of the flight assembly; displaying an attitude image of the flight assembly on the terminal according to the attitude data of the flight assembly, wherein the attitude image comprises a body coordinate system of the flight assembly, and the body coordinate system is a three-dimensional rectangular coordinate system; and selecting one coordinate axis of the flight assembly as the target axis in response to a third input performed by the touch object on the attitude image. 
However, Böckem teaches wherein before the controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis, the operations further comprise: obtaining attitude data of the flight assembly ([0060] The as-tracked location/position and orientation of the UAV symbol can directly be provided to the motion generation system of the UAV in order to instruct the UAV to move in its physical environment in accordance to the movement of the UAV symbol in the 3D-view); displaying an attitude image of the flight assembly on the terminal according to the attitude data of the flight assembly, wherein the attitude image comprises a body coordinate system of the flight assembly, and the body coordinate system is a three-dimensional rectangular coordinate system; and ([0060] A UAV symbol is displayed in/overlaid to the displayed 3D-view of the UAV's environment. The UAV symbol by its location and orientation, in particular and by its shape and appearance, in the displayed 3D-view represents the UAV with its position and orientation in the physical environment; FIG. 1 roll, pitch, yaw axes) selecting one coordinate axis of the flight assembly as the target axis in response to a third input performed by the touch object on the attitude image ([0060] For example, based on a touch input the UAV symbol's orientation can be changed by rotating the UAV symbol only around its yaw axis).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim, as modified, to incorporate the teachings of Böckem to include wherein before the controlling the flight assembly to rotate according to the target rotation direction and the target rotation angle by using a target axis as a rotation axis, the operations further comprise: obtaining attitude data of the flight assembly; displaying an attitude image of the flight assembly on the terminal according to the attitude data of the flight assembly, wherein the attitude image comprises a body coordinate system of the flight assembly, and the body coordinate system is a three-dimensional rectangular coordinate system; and selecting one coordinate axis of the flight assembly as the target axis in response to a third input performed by the touch object on the attitude image, with a reasonable expectation of success since doing so would have achieved the benefit of “assisting the user in safely navigating the UAV in its three-dimensional environment” (Böckem [0063]).

Claims 9 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Kim, in view of Li, and in further view of Qian et al. (US-20200346753-A1), hereinafter referred to as Qian.

Regarding claim 9, Kim, as modified, teaches the method according to claim 4. Kim also teaches wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: generating a flight instruction according to the action of the touch object (FIG. 1 controlling the flight action of the unmanned air vehicle according to touch key inputs; [0044] As shown in FIG. 1, the unmanned air vehicle control application program 262 may be executed to display a touch screen on which various touch keys are arranged so that the user may control an operation of the unmanned air vehicle 100 through the display device 271 provided in a front part of the mobile communication terminal part 200).
Kim, as modified, does not explicitly teach obtaining an average moving speed of the touch object within a preset time before the flight instruction is generated; and when the average moving speed is less than or equal to a preset speed threshold, sending the flight instruction to the flight assembly, so that the flight assembly flies according to the flight instruction.

However, Qian teaches obtaining a moving speed of the touch object within a preset time before the flight instruction is generated; and when the moving speed is less than or equal to a preset speed threshold, sending the flight instruction to the flight assembly, so that the flight assembly flies according to the flight instruction ([0126] The specific process is: when the gesture of the target object's hand is a descending gesture and the flying speed of the UAV is less than or equal to a preset speed threshold, control the UAV to descend. If the UAV's flying speed is greater than the preset speed threshold, in order to prevent the UAV from landing at a flying speed which may cause damage to the UAV, the descending gesture is ignored when the UAV's flight speed is greater than the preset speed threshold, that is, the UAV is not controlled to descend).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim, as modified, to incorporate the teachings of Qian to include obtaining a moving speed of the touch object within a preset time before the flight instruction is generated; and when the moving speed is less than or equal to a preset speed threshold, sending the flight instruction to the flight assembly, so that the flight assembly flies according to the flight instruction, with a reasonable expectation of success since doing so would have achieved the benefit of ensuring the UAV is not damaged from excessively high speed (Qian [0126]).

Although Kim, as modified, does not explicitly teach an “average” moving speed, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify the moving speed taught in Kim, as modified, to utilize an “average” moving speed, with a reasonable expectation of success since averaging speed over time is a well-known and routine practice for characterizing motion and improving measurement stability. Applying an average speed would have been an obvious design choice yielding predictable results, i.e., denying control inputs that cause excessive speed.

Regarding claim 18, Kim, as modified, teaches the terminal according to claim 13. Kim also teaches wherein the controlling a flight action of the flight assembly according to an action of the touch object comprises: generating a flight instruction according to the action of the touch object (FIG. 1 controlling the flight action of the unmanned air vehicle according to touch key inputs; [0044] As shown in FIG. 1, the unmanned air vehicle control application program 262 may be executed to display a touch screen on which various touch keys are arranged so that the user may control an operation of the unmanned air vehicle 100 through the display device 271 provided in a front part of the mobile communication terminal part 200).
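The averaged-speed gate at issue in claims 9 and 18 (and restated for claim 18 below) can be illustrated with a minimal sketch, assuming pixel-per-second units and arbitrary window and threshold values, none of which appear in the application or Qian: touch samples within a preset time window are averaged into a single speed, and the generated flight instruction is forwarded only when that average is at or below the preset threshold.

```python
# Hypothetical sketch of gating a flight instruction on average touch speed (not from the record).
import math
import time

PRESET_WINDOW_S = 0.5            # "preset time" before the flight instruction is generated (assumed)
PRESET_SPEED_THRESHOLD = 300.0   # preset speed threshold, in pixels per second (assumed units)


def average_speed(samples):
    """samples: list of (timestamp_s, x, y) touch points; returns average speed over their span."""
    if len(samples) < 2:
        return 0.0
    dist = sum(math.hypot(x1 - x0, y1 - y0)
               for (_, x0, y0), (_, x1, y1) in zip(samples, samples[1:]))
    duration = samples[-1][0] - samples[0][0]
    if duration <= 0:
        return float("inf") if dist > 0 else 0.0
    return dist / duration


def maybe_send(instruction, samples, send, now=None):
    """Send the generated flight instruction only if the touch object's average
    moving speed within the preset window is at or below the preset threshold."""
    now = time.monotonic() if now is None else now
    window = [s for s in samples if now - s[0] <= PRESET_WINDOW_S]
    if average_speed(window) <= PRESET_SPEED_THRESHOLD:
        send(instruction)          # flight assembly flies according to the instruction
        return True
    return False                   # touch moved too fast: instruction is withheld


# Example: a slow drag over 0.4 s (about 100 px/s) passes the gate and the instruction is sent.
samples = [(0.0, 0.0, 0.0), (0.2, 20.0, 0.0), (0.4, 40.0, 0.0)]
sent = maybe_send({"action": "ascend"}, samples, send=print, now=0.4)
```

Averaging over the window, rather than acting on a single instantaneous sample, is the routine smoothing step that the rejection characterizes as an obvious design choice.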
Kim, as modified, does not explicitly teach obtaining an average moving speed of the touch object within a preset time before the flight instruction is generated; and when the average moving speed is less than or equal to a preset speed threshold, sending the flight instruction to the flight assembly, so that the flight assembly flies according to the flight instruction.

However, Qian teaches obtaining a moving speed of the touch object within a preset time before the flight instruction is generated; and when the moving speed is less than or equal to a preset speed threshold, sending the flight instruction to the flight assembly, so that the flight assembly flies according to the flight instruction ([0126] The specific process is: when the gesture of the target object's hand is a descending gesture and the flying speed of the UAV is less than or equal to a preset speed threshold, control the UAV to descend. If the UAV's flying speed is greater than the preset speed threshold, in order to prevent the UAV from landing at a flying speed which may cause damage to the UAV, the descending gesture is ignored when the UAV's flight speed is greater than the preset speed threshold, that is, the UAV is not controlled to descend).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present claimed invention to modify Kim, as modified, to incorporate the teachings of Qian to include obtaining a moving speed of the touch object within a preset time before the flight instruction is generated; and when the moving speed is less than or equal to a preset speed threshold, sending the flight instruction to the flight assembly, so that the flight assembly flies according to the flight instruction, with a reasonable expectation of success since doing so would have achieved the benefit of ensuring the UAV is not damaged from excessively high speed (Qian [0126]).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Non-Patent Literature from IP.com: Drone Embedded Phone Camera, https://ip.com/IPCOM/000265404
US-20220221857-A1: Nakai teaches a display of altitude and posture of aircraft (FIG. 4).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVIN SEOL whose telephone number is (571) 272-6488. The examiner can normally be reached on Monday-Friday, 9:00 a.m. to 5:00 p.m.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jelani Smith, can be reached at (571) 270-3969. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DAVIN SEOL/
Examiner, Art Unit 3662

Prosecution Timeline

Jul 16, 2024
Application Filed
Jan 28, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596370
Fitness And Sports Applications For An Autonomous Unmanned Aerial Vehicle
2y 5m to grant • Granted Apr 07, 2026
Patent 12583442
RELAY DEVICE, DATA RELAY METHOD, AND STORAGE MEDIUM STORING DATA RELAY PROGRAM
2y 5m to grant • Granted Mar 24, 2026
Patent 12583357
MANAGEMENT APPARATUS, MANAGEMENT METHOD AND COATING PROCESSING FACILITY
2y 5m to grant • Granted Mar 24, 2026
Patent 12572145
Fitness And Sports Applications For An Autonomous Unmanned Aerial Vehicle
2y 5m to grant • Granted Mar 10, 2026
Patent 12572158
RECEPTION DEVICE
2y 5m to grant • Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

1-2
Expected OA Rounds
65%
Grant Probability
79%
With Interview (+14.4%)
3y 1m
Median Time to Grant
Low
PTA Risk
Based on 157 resolved cases by this examiner. Grant probability derived from career allow rate.
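The with-interview figure is consistent with applying the interview lift as additive percentage points rather than a relative multiplier: 65% base grant probability + 14.4 points ≈ 79% (rounded). This reading is an inference from the displayed values, not a stated formula.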
