Prosecution Insights
Last updated: April 19, 2026
Application No. 18/959,823

AIRCRAFT CONTROL APPARATUS, AIRCRAFT CONTROL METHOD, AND NON-TRANSITORY COMPUTER-READABLE MEDIUM

Non-Final OA: §101, §103, §112
Filed: Nov 26, 2024
Examiner: BEDEWI, RAMI NABIH
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: NEC Corporation
OA Round: 1 (Non-Final)

Grant Probability: 68% (Favorable)
Expected OA Rounds: 1-2
Expected Time to Grant: 3y 2m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 68% (74 granted / 108 resolved), +16.5% vs TC avg (above average)
Interview Lift: +33.8% (resolved cases with interview vs. without), a strong lift
Typical Timeline: 3y 2m avg prosecution; 31 applications currently pending
Career History: 139 total applications across all art units

Statute-Specific Performance

§101: 7.9%  (-32.1% vs TC avg)
§103: 42.4% (+2.4% vs TC avg)
§102: 24.9% (-15.1% vs TC avg)
§112: 24.1% (-15.9% vs TC avg)

Tech Center average estimate shown for comparison (the black line in the original chart). Based on career data from 108 resolved cases.
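The headline figures above are simple derived statistics. As a rough illustration only (a minimal sketch of how such dashboard metrics are typically computed, not the analytics vendor's actual methodology), the career allow rate and the implied Tech Center baseline can be reproduced from the raw counts shown:

```python
# Illustrative recomputation of the examiner statistics shown above.
# The formulas are assumptions about how dashboards like this one
# typically derive their figures, not the vendor's methodology.

granted = 74    # from "74 granted / 108 resolved"
resolved = 108

# Career allow rate: grants as a share of resolved applications.
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # ~68.5%, displayed as 68%

# The "+16.5% vs TC avg" delta implies a Tech Center 3600 baseline of
# roughly allow_rate - 0.165 (an inference from the displayed delta).
delta_vs_tc = 0.165
tc_avg = allow_rate - delta_vs_tc
print(f"Implied TC 3600 average: {tc_avg:.1%}")  # ~52.0%
```

Note that the underlying with-interview and without-interview allow rates behind the "+33.8% interview lift" are not shown on the page, so only the lift itself is recoverable.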

Office Action

Rejections at issue: §101, §103, §112
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Examiner's Note

The Examiner has cited particular paragraphs/columns and line numbers or figures in the references as applied to the claims below for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. The applicant is respectfully requested, in preparing responses, to fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner. Applicant is reminded that the Examiner is entitled to give the broadest reasonable interpretation to the language of the claims, and the Examiner is not limited to the Applicant's definitions that are not specifically set forth in the claims.

Information Disclosure Statements

The Information Disclosure Statements (IDS) filed on 11/26/2024 and 07/11/2025 have been acknowledged.

Specification

The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant's cooperation is requested in correcting any errors of which applicant may become aware in the specification.

Status of Application

Claims 1-15 are pending in this application.
In the claim set filed 11/26/2024, claims 1, 6, and 11 are the independent claims.

Objection to the Abstract

The abstract of the disclosure is objected to because it exceeds the 150-word limit. A corrected abstract of the disclosure is required and must be presented on a separate sheet, apart from any other text. See MPEP § 608.01(b).

Claim Rejections - 35 U.S.C. § 112(b)

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-15 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
With respect to claims 1, 6, and 11: these claims recite the following indefinite claim construction: "receiving plural input instructions to specify an object in the image; generating a flight route to capture images of plurality of specified objects." In particular, the combination of limitations reciting receiving a plurality of instructions to specify a (singular) object and subsequently generating a flight route for a plurality of specified objects presents an inconsistency that renders the claims indefinite, because it is unclear which of the following scenarios the claim language is meant to capture:

1. Receiving a plurality of input instructions specifying a plurality of objects (for example, one input instruction per specified object), then generating a flight route for the plurality of specified objects; or
2. Receiving a plurality of input instructions specifying a single object, then generating a flight route for that single specified object.

In view of at least Fig. 6 (in particular S20) and accompanying paragraphs 0070 and 0071 of the Applicant's specification, which describe an inspector providing a plurality of points (a plurality of input instructions) to select an inspection target (an object), the Examiner has interpreted the cited claim limitation as directed to the second interpretation above, pertaining to a single object, for the sake of compact prosecution.

Claims 2-5, 7-10, and 12-15 are further rejected due to their dependency on rejected claims 1, 6, and 11 and for failing to cure the deficiencies outlined above.

Claim Rejections - 35 U.S.C. § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-15 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. Claims 1, 6, and 11 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claims recite using "an aircraft control apparatus comprising: at least one memory storing instructions; and at least one processor" (in the case of claim 1), "a computer" (in the case of claim 6), or "a non-transitory computer-readable medium storing a program" (in the case of claim 11), all of which comprise generic computing devices and computer software, to perform the following method:

1) receiving at least one image captured by an imaging device on the aircraft;
2) displaying the image on a display;
3) receiving plural input instructions to specify an object in the image;
4) generating a flight route to capture images of a plurality of specified objects; and
5) sending instruction information based on the flight route to the aircraft.

Using the recited generic components to perform steps 1)-5) above, as drafted, constitutes a process that, under its broadest reasonable interpretation, covers performance of the limitations in the mind but for the recitation of generic hardware components.
That is, other than reciting "an aircraft control apparatus comprising: at least one memory storing instructions; and at least one processor," "a computer," or "a non-transitory computer-readable medium storing a program," nothing in the claim elements precludes the steps from practically being performed in the mind. For example, but for that language, the claim in context encompasses a user manually receiving an image from an aircraft via a display, manually placing a plurality of points on the displayed image, connecting a flight route between the plurality of points, and then sending the flight route to the aircraft.

Furthermore, steps 1) and 2) above merely recite retrieval and display of previously gathered data. As explained in MPEP § 2106.05(g), mere data gathering is an example of pre-solution activity (insignificant extra-solution activity) and, therefore, is not sufficient to make the claims patent eligible. Likewise, step 5) above merely recites transmission (a form of data output) of manually generated data (a manually generated flight route based on a plurality of manually input points). As explained in MPEP § 2106.05(g), mere outputting of data is an example of post-solution activity (insignificant extra-solution activity) and, therefore, is not sufficient to make the claims patent eligible.

If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components and insignificant extra-solution activity, then it falls within the "Mental Processes" grouping of abstract ideas. Accordingly, the claims recite an abstract idea.
This judicial exception is not integrated into a practical application. In particular, the claims recite only the following additional elements: "an aircraft control apparatus comprising: at least one memory storing instructions; and at least one processor," "a computer," or "a non-transitory computer-readable medium storing a program," used to perform steps 1)-5) above. These elements are recited at a high level of generality (i.e., as generic processors performing the generic computer functions of receiving and transferring data), such that they amount to no more than mere instructions to apply the exception using a generic computer component. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claims are directed to an abstract idea and do not include additional elements sufficient to amount to significantly more than the judicial exception.
As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of "an aircraft control apparatus comprising: at least one memory storing instructions; and at least one processor," "a computer," or "a non-transitory computer-readable medium storing a program," used to perform steps 1)-5) above, amount to no more than mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. The claims are not patent eligible.

Dependent claims 2-5, 7-10, and 12-15, when analyzed as a whole, are held to be patent ineligible under 35 U.S.C. 101 because the additional recited limitations fail to establish that the claims are not directed to an abstract idea. The additional elements, if any, in the dependent claims are not sufficient to amount to significantly more than the judicial exception, for the same reasons as with claims 1, 6, and 11.

Examiner's Note: To overcome this rejection, the Office suggests further defining the limitations of the independent claims, for example by linking the claimed subject matter to a non-generic device and controlling a positively recited vehicle or apparatus in a specific way based on the data analysis performed, or by showing that the claimed subject matter is an improvement to a technical field. Limitations such as those suggested above would bring the claimed subject matter out of the realm of an abstract idea and into the realm of a statutory category.
Claim Rejections - 35 U.S.C. § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims under pre-AIA 35 U.S.C. 103(a), the examiner presumes that the subject matter of the various claims was commonly owned at the time any inventions covered therein were made absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and invention dates of each claim that was not commonly owned at the time a later invention was made in order for the examiner to consider the applicability of pre-AIA 35 U.S.C. 103(c) and potential pre-AIA 35 U.S.C. 102(e), (f) or (g) prior art under pre-AIA 35 U.S.C. 103(a).

Claims 1-15 are rejected under 35 U.S.C. 103 as being unpatentable over Tanaka et al.
(United States Patent Publication 2017/0010615 A1) in view of Zang (United States Patent Publication 2017/0322551 A1), referenced as Tanaka and Zang, respectively, moving forward.

With respect to claim 1, while Tanaka discloses:

"An aircraft control apparatus comprising: at least one memory storing instructions; and at least one processor configured to execute the instructions to perform operations comprising:" [Tanaka; In at least the paragraphs and figures cited, Tanaka discloses a "control device, imaging device, control method, imaging method, and computer program that make it possible to make more efficient the inspection performed by a flying body capable of performing imaging;" ¶: 0012; Tanaka further discloses: "In addition, a computer program for causing hardware such as a CPU, a ROM, and a RAM installed in each device to exhibit the equivalent functions to those of each of the devices described above can also be created. In addition, a storage medium in which such a computer program is stored can also be provided. In addition, by configuring each of the functional blocks shown in the functional block diagram to be hardware or a hardware circuit, a series of processes can also be realized using hardware or a hardware circuit;" ¶: 0180];

"receiving at least one image captured by an imaging device on the aircraft" [Tanaka; "The hovering camera 100 is an exemplary imaging device of the present disclosure and serves as the flying body equipped with the imaging device described above. The hovering camera 100 is a flying body configured to be able to perform an automatic flight based on a designated flight path and capture a still image at a designated imaging position through the imaging device;" ¶: 0054; See also: ¶: 0048, 0050];

and "generating a flight route to capture images of plurality of specified objects; and sending instruction information based on the flight route to the aircraft" [Tanaka; In at least the paragraphs and figures cited, Tanaka discloses using a control terminal and display unit to enable a UAV operator to first select an "inspection region" by dragging on the touch panel of the display unit. In response to the user-selected "inspection region," Tanaka discloses generating a plurality of flight path options for the user to select from, such as the flight paths R1 and R2 denoted in Fig. 7, for example. The user is then able to select one of the two (in the cited example) flight paths to determine which flight path to transmit to the UAV to perform the inspection process; See also: Fig. 5-7; ¶: 0105, 0119, 0120];

Tanaka does not specifically state: "displaying the image on a display; receiving plural input instructions to specify an object in the image."

Zang, which is in the same field of invention of systems/methods for controlling UAVs to perform object tracking, teaches:

"displaying the image on a display; receiving plural input instructions to specify an object in the image" [Zang; Zang discloses: "In some embodiments, the remote control device is configured to: receive one or more images captured by the imaging device from the UAV; display the one or more images; receive a user selection of a target from within a displayed image; generate the target information of the target based on the user selection of the target; and transmit the target information to the UAV;" ¶: 0047; and further: "the user-specified target information including a predetermined position or a predetermined size of the target within an image captured by the imaging device;" ¶: 0050. The user performing a first input to select a target for tracking and a second input of maintaining a predetermined position or a predetermined size has been interpreted as patentably indistinct from the Applicant's broadly recited "receiving plural input instructions to specify an object in the image;" See also: ¶: 0051, 0052].

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system/method for specifying a flight path to control a UAV to gather information on a particular structure, as disclosed by Tanaka, to incorporate the teachings regarding allowing a user to input a plurality of user-specified parameters related to object tracking for controlling a UAV to automatically track an object, as taught by Zang, with a reasonable expectation of success.
By combining these inventions, the outcome is a system/method for specifying a flight path to control a UAV to gather information on a particular structure that is more robust in its ability to "facilitate the automation of the low-level control portion of the tracking process so as to reduce the efforts required and the errors resulting from manual tracking. At the same time, the tracking methods and system described herein still allows users to maintain, if desired, high-level control of the tracking process (e.g., by specifying the type of target to track)" [Zang; ¶: 0155].

With respect to claim 2, Tanaka discloses:

"wherein the operations further comprise displaying, on the display, at least one of map data around the object specified by the plural input instructions and the flight route" [Tanaka; "When the flight information of the hovering camera 100 is generated, the control terminal 200 reads the information related to the overview of the bridge 1 to be inspected, for example, an overview diagram of the bridge 1 to be inspected, and causes the read information to be displayed on a screen. Points on the overview diagram of the bridge 1 are associated with points on map data including more detailed GPS information. The associating is preferably performed by at least two sets of points. The overview diagram of the bridge 1 is associated with points on the map data including detailed GPS information in advance, and thus the flight path of the hovering camera 100 is defined as GPS values. Then, the control terminal 200 generates the flight path of the hovering camera 100 based on the overview diagram of the bridge 1. The flight path of the hovering camera 100 is displayed on the overview diagram in a superimposed manner so that it is easily understood by the user (structure inspection worker);" ¶: 0057; See also: ¶: 0097, 0098].

With respect to claim 3, Tanaka discloses:

"wherein the instruction information includes height information instructing height of the aircraft flying along the flight route or control information to control the imaging device for capturing the object" [Tanaka; "The flight path set for the hovering camera 100 may be set using all of a latitude, a longitude, and an altitude as GPS position information or may be set using only a latitude and a longitude as the GPS position information, and, for example, a relative height from the base station 600 which will be described below may be set as an altitude;" ¶: 0055].

With respect to claim 4, Tanaka discloses:

"wherein the operations further comprise: displaying a screen to check the flight route on the display; and in response to receiving input to finalize the flight route on the screen, sending the instruction information to the aircraft" [Tanaka; In at least the paragraphs and figures cited, Tanaka discloses prompting the user to select one of flight paths R1 or R2 as indicated on the display presented in Fig. 7, and in response to the user's selection switching to a second display, as seen in Fig. 8, prompting the user to trigger takeoff, which has been interpreted as patentably indistinct from the Applicant's broadly recited "sending the instruction information to the aircraft" limitation; See also: Fig. 5-7; ¶: 0105, 0119-0121].
With respect to claim 5, Tanaka does not specifically state: "wherein the instruction information includes information for controlling height of the aircraft or information for controlling at least one of a photographing direction and magnification of the imaging device, in such a way that the object comes to a center in the image."

Zang teaches:

"wherein the instruction information includes information for controlling height of the aircraft or information for controlling at least one of a photographing direction and magnification of the imaging device, in such a way that the object comes to a center in the image" [Zang; Zang discloses: "For instance, the attitude, position, velocity, zoom, and other aspects of the UAV and/or the imaging device can be automatically adjusted to ensure that the user maintains a designated position and/or size within the images captured by the imaging device;" ¶: 0154; and further: "For example, the movable object may be configured, by default, to keep a target at substantially the center of an image, or at around particular coordinates of the image;" ¶: 0207; wherein control of the UAV's attitude in order to maintain a target in the center of an image has been interpreted as patentably indistinct from the Applicant's broadly recited "controlling height of the aircraft;" See also: ¶: 0267, 0270].

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system/method for specifying a flight path to control a UAV to gather information on a particular structure, as disclosed by Tanaka, to incorporate the teachings regarding allowing a user to input a plurality of user-specified parameters related to object tracking for controlling a UAV to automatically track an object, as taught by Zang, with a reasonable expectation of success.

By combining these inventions, the outcome is a system/method for specifying a flight path to control a UAV to gather information on a particular structure that is more robust in its ability to "facilitate the automation of the low-level control portion of the tracking process so as to reduce the efforts required and the errors resulting from manual tracking. At the same time, the tracking methods and system described herein still allows users to maintain, if desired, high-level control of the tracking process (e.g., by specifying the type of target to track)" [Zang; ¶: 0155].

With respect to claim 6, while Tanaka discloses:

"An aircraft control method performed by a computer and comprising:" [Tanaka; In at least the paragraphs and figures cited, Tanaka discloses a "control device, imaging device, control method, imaging method, and computer program that make it possible to make more efficient the inspection performed by a flying body capable of performing imaging;" ¶: 0012; Tanaka further discloses: "In addition, a computer program for causing hardware such as a CPU, a ROM, and a RAM installed in each device to exhibit the equivalent functions to those of each of the devices described above can also be created. In addition, a storage medium in which such a computer program is stored can also be provided. In addition, by configuring each of the functional blocks shown in the functional block diagram to be hardware or a hardware circuit, a series of processes can also be realized using hardware or a hardware circuit;" ¶: 0180];

"receiving at least one image captured by an imaging device on the aircraft" [Tanaka; "The hovering camera 100 is an exemplary imaging device of the present disclosure and serves as the flying body equipped with the imaging device described above. The hovering camera 100 is a flying body configured to be able to perform an automatic flight based on a designated flight path and capture a still image at a designated imaging position through the imaging device;" ¶: 0054; See also: ¶: 0048, 0050];

and "generating a flight route to capture images of plurality of specified objects; and sending instruction information based on the flight route to the aircraft" [Tanaka; In at least the paragraphs and figures cited, Tanaka discloses using a control terminal and display unit to enable a UAV operator to first select an "inspection region" by dragging on the touch panel of the display unit. In response to the user-selected "inspection region," Tanaka discloses generating a plurality of flight path options for the user to select from, such as the flight paths R1 and R2 denoted in Fig. 7, for example. The user is then able to select one of the two (in the cited example) flight paths to determine which flight path to transmit to the UAV to perform the inspection process; See also: Fig. 5-7; ¶: 0105, 0119, 0120];

Tanaka does not specifically state: "displaying the image on a display; receiving plural input instructions to specify an object in the image."

Zang teaches:

"displaying the image on a display; receiving plural input instructions to specify an object in the image" [Zang; Zang discloses: "In some embodiments, the remote control device is configured to: receive one or more images captured by the imaging device from the UAV; display the one or more images; receive a user selection of a target from within a displayed image; generate the target information of the target based on the user selection of the target; and transmit the target information to the UAV;" ¶: 0047; and further: "the user-specified target information including a predetermined position or a predetermined size of the target within an image captured by the imaging device;" ¶: 0050.
The user performing a first input to select a target for tracking and a second input of maintaining a predetermined position or a predetermined size, has been interpreted as patentably indistinct from the Applicant's broadly recited "receiving plural input instructions to specify an object in the image;" See also: ¶: 0051, 0052]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system/method for specifying a flight path to control a UAV to gather information on a particular structure as disclosed by Tanaka to incorporate the teachings regarding allowing a user to input a plurality of user-specified parameters related to object tracking for controlling a UAV to automatically track an object as taught by Zang with a reasonable expectation of success. By combining these inventions, the outcome is a system/method for specifying a flight path to control a UAV to gather information on a particular structure that is more robust in its ability to “facilitate the automation of the low-level control portion of the tracking process so as to reduce the efforts required and the errors resulting from manual tracking. At the same time, the tracking methods and system described herein still allows users to maintain, if desired, high-level control of the tracking process (e.g., by specifying the type of target to track)” [Zang; ¶: 0155]. With respect to claim 7, Tanaka discloses: “wherein the method further comprises displaying, on the display, at least one of map data around the object specified by the plural input instructions and the flight route” [Tanaka; "When the flight information of the hovering camera 100 is generated, the control terminal 200 reads the information related to the overview of the bridge 1 to be inspected, for example, an overview diagram of the bridge 1 to be inspected, and causes the read information to be displayed on a screen. 
Points on the overview diagram of the bridge 1 are associated with points on map data including more detailed GPS information. The associating is preferably performed by at least two sets of points. The overview diagram of the bridge 1 is associated with points on the map data including detailed GPS information in advance, and thus the flight path of the hovering camera 100 is defined as GPS values. Then, the control terminal 200 generates the flight path of the hovering camera 100 based on the overview diagram of the bridge 1. The flight path of the hovering camera 100 is displayed on the overview diagram in a superimposed manner so that it is easily understood by the user (structure inspection worker);" ¶: 0057; See also:¶: 0097, 0098]. With respect to claim 8, Tanaka discloses: “wherein the instruction information includes height information instructing height of the aircraft flying along the flight route or control information to control the imaging device for capturing the object” [Tanaka; "The flight path set for the hovering camera 100 may be set using all of a latitude, a longitude, and an altitude as GPS position information or may be set using only a latitude and a longitude as the GPS position information, and, for example, a relative height from the base station 600 which will be described below may be set as an altitude;" ¶: 0055]. With respect to claim 9, Tanaka discloses: “wherein the method further comprises: displaying a screen to check the flight route on the display; and in response to receiving input to finalize the flight route on the screen, sending the instruction information to the aircraft” [Tanaka; In at least the paragraphs and figures cited, Tanaka discloses prompting the user to select one of flight paths R1 or R2 as indicated on the display presented in Fig. 7, and in response to the user's selection switching to a second display, as seen in Fig. 
8, prompting the user to trigger takeoff, which has been interpreted as patentably indistinct from the Applicant's broadly recited "sending the instruction information to the aircraft" limitation; See also: Fig. 5-7; ¶: 0105, 0119-0121].

With respect to claim 10, Tanaka does not specifically state: “wherein the instruction information includes information for controlling height of the aircraft or information for controlling at least one of a photographing direction and magnification of the imaging device, in such a way that the object comes to a center in the image.” Zang teaches: “wherein the instruction information includes information for controlling height of the aircraft or information for controlling at least one of a photographing direction and magnification of the imaging device, in such a way that the object comes to a center in the image” [Zang; Zang discloses: "For instance, the attitude, position, velocity, zoom, and other aspects of the UAV and/or the imaging device can be automatically adjusted to ensure that the user maintains a designated position and/or size within the images captured by the imaging device;" ¶: 0154; and further: "For example, the movable object may be configured, by default, to keep a target at substantially the center of an image, or at around particular coordinates of the image;" ¶: 0207; wherein control of the UAV's attitude in order to maintain a target in a center of an image has been interpreted as patentably indistinct from the Applicant's broadly recited: "controlling height of the aircraft;" See also: ¶: 0267, 0270].
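The centering behavior Zang describes (¶: 0154, 0207) amounts to closed-loop control of camera pointing driven by the target's pixel offset from the image center. The sketch below is an illustrative reconstruction only, not taken from either reference; the function name, gains, and field-of-view values are assumptions:

```python
# Illustrative sketch of keeping a tracked target centered in the frame,
# in the manner Zang describes (e.g. ¶ 0154, 0207). All names, gains, and
# field-of-view values here are assumptions for illustration, not taken
# from the reference.

def centering_command(target_px, frame_size, hfov_deg=80.0, vfov_deg=60.0, gain=0.8):
    """Return (pan_rate, tilt_rate) in deg/s that steers the camera so the
    target drifts toward the image center.

    target_px  -- (x, y) pixel position of the tracked target
    frame_size -- (width, height) of the image in pixels
    """
    w, h = frame_size
    x, y = target_px
    # Normalized offset of the target from the image center, in [-0.5, 0.5]
    ex = (x - w / 2) / w
    ey = (y - h / 2) / h
    # Proportional control: command an angular rate proportional to the
    # angular offset implied by the pixel error and the field of view.
    pan_rate = gain * ex * hfov_deg
    tilt_rate = gain * ey * vfov_deg
    return pan_rate, tilt_rate

# A centered target yields no correction; a target right of center yields
# a positive (rightward) pan command and zero tilt.
print(centering_command((640, 360), (1280, 720)))
print(centering_command((800, 360), (1280, 720)))
```

In a full system the same error signal could instead drive UAV attitude, position, or zoom, which is the interchangeability the examiner relies on when mapping Zang's attitude control onto the claimed "controlling height of the aircraft."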
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system/method for specifying a flight path to control a UAV to gather information on a particular structure as disclosed by Tanaka to incorporate the teachings regarding allowing a user to input a plurality of user-specified parameters related to object tracking for controlling a UAV to automatically track an object as taught by Zang with a reasonable expectation of success. By combining these inventions, the outcome is a system/method for specifying a flight path to control a UAV to gather information on a particular structure that is more robust in its ability to “facilitate the automation of the low-level control portion of the tracking process so as to reduce the efforts required and the errors resulting from manual tracking. At the same time, the tracking methods and system described herein still allows users to maintain, if desired, high-level control of the tracking process (e.g., by specifying the type of target to track)” [Zang; ¶: 0155].

With respect to claim 11, while Tanaka discloses: “A non-transitory computer-readable medium storing a program for causing a computer to perform operations comprising:” [Tanaka; In at least the paragraphs and figures cited, Tanaka discloses a "control device, imaging device, control method, imaging method, and computer program that make it possible to make more efficient the inspection performed by a flying body capable of performing imaging;" ¶: 0012; Tanaka further discloses: "In addition, a computer program for causing hardware such as a CPU, a ROM, and a RAM installed in each device to exhibit the equivalent functions to those of each of the devices described above can also be created. In addition, a storage medium in which such a computer program is stored can also be provided.
In addition, by configuring each of the functional blocks shown in the functional block diagram to be hardware or a hardware circuit, a series of processes can also be realized using hardware or a hardware circuit;" ¶: 0180]; “receiving at least one image captured by an imaging device on the aircraft” [Tanaka; "The hovering camera 100 is an exemplary imaging device of the present disclosure and serves as the flying body equipped with the imaging device described above. The hovering camera 100 is a flying body configured to be able to perform an automatic flight based on a designated flight path and capture a still image at a designated imaging position through the imaging device;" ¶: 0054; See also: ¶: 0048, 0050]; and “generating a flight route to capture images of plurality of specified objects; and sending instruction information based on the flight route to the aircraft” [Tanaka; In at least the paragraphs and figures cited, Tanaka discloses using a control terminal and display unit to enable a UAV operator to first select an "inspection region" by dragging on the touch panel of the display unit. In response to the user-selected "inspection region," Tanaka discloses generating a plurality of flight path options for the user to select from, such as the flight paths R1 and R2, denoted in Fig. 7, for example. The user is then able to select one of the two (in the cited example) flight paths to determine which flight path to transmit to the UAV to perform the inspection process; See also: Fig.
5-7; ¶: 0105, 0119, 0120]; Tanaka does not specifically state: “displaying the image on a display; receiving plural input instructions to specify an object in the image.” Zang teaches: “displaying the image on a display; receiving plural input instructions to specify an object in the image” [Zang; Zang discloses: "In some embodiments, the remote control device is configured to: receive one or more images captured by the imaging device from the UAV; display the one or more images; receive a user selection of a target from within a displayed image; generate the target information of the target based on the user selection of the target; and transmit the target information to the UAV;" ¶: 0047; and further: "the user-specified target information including a predetermined position or a predetermined size of the target within an image captured by the imaging device;" ¶: 0050. The user performing a first input to select a target for tracking and a second input of maintaining a predetermined position or a predetermined size has been interpreted as patentably indistinct from the Applicant's broadly recited "receiving plural input instructions to specify an object in the image;" See also: ¶: 0051, 0052]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system/method for specifying a flight path to control a UAV to gather information on a particular structure as disclosed by Tanaka to incorporate the teachings regarding allowing a user to input a plurality of user-specified parameters related to object tracking for controlling a UAV to automatically track an object as taught by Zang with a reasonable expectation of success.
By combining these inventions, the outcome is a system/method for specifying a flight path to control a UAV to gather information on a particular structure that is more robust in its ability to “facilitate the automation of the low-level control portion of the tracking process so as to reduce the efforts required and the errors resulting from manual tracking. At the same time, the tracking methods and system described herein still allows users to maintain, if desired, high-level control of the tracking process (e.g., by specifying the type of target to track)” [Zang; ¶: 0155].

With respect to claim 12, Tanaka discloses: “wherein the operations further comprise displaying, on the display, at least one of map data around the object specified by the plural input instructions and the flight route” [Tanaka; "When the flight information of the hovering camera 100 is generated, the control terminal 200 reads the information related to the overview of the bridge 1 to be inspected, for example, an overview diagram of the bridge 1 to be inspected, and causes the read information to be displayed on a screen. Points on the overview diagram of the bridge 1 are associated with points on map data including more detailed GPS information. The associating is preferably performed by at least two sets of points. The overview diagram of the bridge 1 is associated with points on the map data including detailed GPS information in advance, and thus the flight path of the hovering camera 100 is defined as GPS values. Then, the control terminal 200 generates the flight path of the hovering camera 100 based on the overview diagram of the bridge 1. The flight path of the hovering camera 100 is displayed on the overview diagram in a superimposed manner so that it is easily understood by the user (structure inspection worker);" ¶: 0057; See also: ¶: 0097, 0098].
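The diagram-to-map association Tanaka describes (¶: 0057) — pinning at least two sets of corresponding points so that any point on the overview diagram resolves to GPS values — is, geometrically, a two-point fit of a 2-D similarity transform (scale, rotation, translation). A minimal sketch of that idea, with all names assumed for illustration and not taken from the reference:

```python
# Illustrative sketch of Tanaka's diagram-to-GPS association (¶ 0057):
# two corresponding point pairs fix a 2-D similarity transform (scale,
# rotation, translation), after which any diagram point maps to map/GPS
# coordinates. Treating 2-D points as complex numbers keeps the algebra
# short. All names here are assumptions for illustration, not from the
# reference.

def fit_similarity(d1, d2, m1, m2):
    """Given diagram points d1, d2 and their map-coordinate counterparts
    m1, m2 (all complex: x + y*1j), return a function mapping any diagram
    point z to map coordinates via z -> a*z + b."""
    a = (m2 - m1) / (d2 - d1)   # combined scale and rotation
    b = m1 - a * d1             # translation
    return lambda z: a * z + b

# Diagram pixels (0,0) and (100,0) pinned to map coordinates (10,20) and
# (30,20); the midpoint of the two anchors then lands at (20,20).
to_map = fit_similarity(0 + 0j, 100 + 0j, 10 + 20j, 30 + 20j)
print(to_map(50 + 0j))  # (20+20j)
```

With the transform fixed in advance, a flight path drawn on the overview diagram is immediately expressible as the GPS waypoint values Tanaka's control terminal sends to the hovering camera.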
With respect to claim 13, Tanaka discloses: “wherein the instruction information includes height information instructing height of the aircraft flying along the flight route or control information to control the imaging device for capturing the object” [Tanaka; "The flight path set for the hovering camera 100 may be set using all of a latitude, a longitude, and an altitude as GPS position information or may be set using only a latitude and a longitude as the GPS position information, and, for example, a relative height from the base station 600 which will be described below may be set as an altitude;" ¶: 0055].

With respect to claim 14, Tanaka discloses: “wherein the operations further comprise: displaying a screen to check the flight route on the display; and in response to receiving input to finalize the flight route on the screen, sending the instruction information to the aircraft” [Tanaka; In at least the paragraphs and figures cited, Tanaka discloses prompting the user to select one of flight paths R1 or R2 as indicated on the display presented in Fig. 7, and in response to the user's selection switching to a second display, as seen in Fig. 8, prompting the user to trigger takeoff, which has been interpreted as patentably indistinct from the Applicant's broadly recited "sending the instruction information to the aircraft" limitation; See also: Fig. 5-7; ¶: 0105, 0119-0121].
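The flight-path encoding Tanaka describes for claim 13 (¶: 0055) stores each waypoint as latitude/longitude, with altitude optionally expressed as a height relative to the base station 600 rather than an absolute GPS altitude. A minimal sketch of that data layout, with field and function names assumed for illustration:

```python
# Illustrative sketch of Tanaka's flight-path encoding (¶ 0055): waypoints
# carry latitude/longitude as GPS position information, and altitude may be
# stored as a height relative to the base station rather than an absolute
# value. Field and function names are assumptions for illustration, not
# taken from the reference.

from dataclasses import dataclass

@dataclass
class Waypoint:
    lat: float          # latitude, degrees
    lon: float          # longitude, degrees
    rel_alt_m: float    # height above the base station, metres

def absolute_altitude(wp: Waypoint, base_alt_m: float) -> float:
    """Resolve a waypoint's relative altitude against the base station's
    elevation to get an absolute altitude."""
    return base_alt_m + wp.rel_alt_m

route = [Waypoint(35.68, 139.76, 10.0), Waypoint(35.69, 139.77, 25.0)]
print([absolute_altitude(wp, base_alt_m=40.0) for wp in route])  # [50.0, 65.0]
```

Keeping altitude relative lets the same route be reflown from a base station at a different elevation without re-planning, which is consistent with Tanaka's choice to make the altitude term optional in the GPS position information.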
With respect to claim 15, Tanaka does not specifically state: “wherein the instruction information includes information for controlling height of the aircraft or information for controlling at least one of a photographing direction and magnification of the imaging device, in such a way that the object comes to a center in the image.” Zang teaches: “wherein the instruction information includes information for controlling height of the aircraft or information for controlling at least one of a photographing direction and magnification of the imaging device, in such a way that the object comes to a center in the image” [Zang; Zang discloses: "For instance, the attitude, position, velocity, zoom, and other aspects of the UAV and/or the imaging device can be automatically adjusted to ensure that the user maintains a designated position and/or size within the images captured by the imaging device;" ¶: 0154; and further: "For example, the movable object may be configured, by default, to keep a target at substantially the center of an image, or at around particular coordinates of the image;" ¶: 0207; wherein control of the UAV's attitude in order to maintain a target in a center of an image has been interpreted as patentably indistinct from the Applicant's broadly recited: "controlling height of the aircraft;" See also: ¶: 0267, 0270]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system/method for specifying a flight path to control a UAV to gather information on a particular structure as disclosed by Tanaka to incorporate the teachings regarding allowing a user to input a plurality of user-specified parameters related to object tracking for controlling a UAV to automatically track an object as taught by Zang with a reasonable expectation of success.
By combining these inventions, the outcome is a system/method for specifying a flight path to control a UAV to gather information on a particular structure that is more robust in its ability to “facilitate the automation of the low-level control portion of the tracking process so as to reduce the efforts required and the errors resulting from manual tracking. At the same time, the tracking methods and system described herein still allows users to maintain, if desired, high-level control of the tracking process (e.g., by specifying the type of target to track)” [Zang; ¶: 0155].

Prior Art (Not relied upon)

The prior art made of record and not relied upon, which is considered pertinent to applicant's disclosure, can be found on the attached Form 892.

GARIEPY et al. (United States Patent Publication 2010/0084513 A1) discloses: A method of remotely controlling an aerial vehicle within an environment, including providing a control station in communication with the aerial vehicle, providing a map of the environment, receiving target world coordinates for the aerial vehicle within the environment, determining a desired velocity vector to direct the aerial vehicle to the target world coordinates at a speed proportional to the distance between the aerial vehicle and the target world coordinates, and directing the aerial vehicle along the desired velocity vector until the aerial vehicle reaches the target world coordinates.

GNOTH (United States Patent Publication 2020/0169666 A1) discloses: The present disclosure provides a target observation method, a related device and a system.
The method includes: displaying, on a display screen of a remote control device, a target tracked by an unmanned aerial vehicle (UAV); determining, when a visual angle adjustment operation for the target that is input by a user into the remote control device is received, a visual angle adjustment parameter according to the visual angle adjustment operation; and adjusting an observation perspective for the target according to the visual angle adjustment parameter. This can simplify operations of the user during the adjustment of the observation perspective for the UAV and improve efficiency in the adjustment of the observation perspective for the target.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RAMI N BEDEWI whose telephone number is (571) 272-5753. The examiner can normally be reached Monday through Thursday, 6:00 am - 11:00 am and 12:00 pm - 5:00 pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Scott A. Browne, can be reached at (571) 270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /R.N.B./Examiner, Art Unit 3666C /SCOTT A BROWNE/Supervisory Patent Examiner, Art Unit 3666

Prosecution Timeline

Nov 26, 2024
Application Filed
Feb 05, 2026
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596382
MOVING BODY CONFIGURED TO LEAD A VEHICLE BASED ON SENSING A REGION AROUND THE VEHICLE.
2y 5m to grant Granted Apr 07, 2026
Patent 12589772
Combined Acceleration Sensor for High and Low Crash Detection for an Autonomous Vehicle
2y 5m to grant Granted Mar 31, 2026
Patent 12579897
MOBILE DEVICE AND METHODS FOR TRAVELLING TOWARDS A DESTINATION USING A COMMUNICATION NETWORK
2y 5m to grant Granted Mar 17, 2026
Patent 12545293
CONTINUOUS LEARNING MACHINE USING CLOSED COURSE SCENARIOS FOR AUTONOMOUS VEHICLES
2y 5m to grant Granted Feb 10, 2026
Patent 12547180
CONTROLLER, AUTOMATED GUIDED VEHICLES AND METHOD OF GUIDING A PLATOON OF AUTOMATED GUIDED VEHICLES
2y 5m to grant Granted Feb 10, 2026
Based on this examiner's 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
68%
Grant Probability
99%
With Interview (+33.8%)
3y 2m
Median Time to Grant
Low
PTA Risk
Based on 108 resolved cases by this examiner. Grant probability derived from career allow rate.
