Prosecution Insights
Last updated: April 19, 2026
Application No. 17/918,234

SYSTEMS AND METHODS FOR A CONTROL STATION

Final Rejection — §103, §112
Filed: Oct 11, 2022
Examiner: YANG, WENYUAN
Art Unit: 3667
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Oqab Dietrich Induction Inc.
OA Round: 4 (Final)
Grant Probability: 68% (Favorable)
Expected OA Rounds: 5-6
Time to Grant: 3y 0m
With Interview: 85%

Examiner Intelligence

Career Allow Rate: 68% (above average), 90 granted / 133 resolved (+15.7% vs TC avg)
Interview Lift: strong, +17.7% higher allow rate on resolved cases with an interview
Typical Timeline: 3y 0m average prosecution (33 applications currently pending)
Career History: 166 total applications across all art units
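
The headline figures above reduce to simple ratios over the examiner's resolved cases. Below is a minimal Python sketch of how they could be recomputed; the ResolvedCase record and its fields are hypothetical stand-ins for the tool's actual data, and only the displayed totals (90 granted of 133 resolved) come from this page.

from dataclasses import dataclass

@dataclass
class ResolvedCase:          # hypothetical record, not the tool's actual schema
    granted: bool            # did the application issue as a patent?
    had_interview: bool      # was an examiner interview held?

def allow_rate(cases: list[ResolvedCase]) -> float:
    """Share of resolved cases that ended in a grant."""
    return sum(c.granted for c in cases) / len(cases)

def interview_lift(cases: list[ResolvedCase]) -> float:
    """Allow-rate gap between cases resolved with vs. without an interview."""
    with_iv = [c for c in cases if c.had_interview]
    without_iv = [c for c in cases if not c.had_interview]
    return allow_rate(with_iv) - allow_rate(without_iv)

# Career allow rate from the displayed totals: 90 granted / 133 resolved.
print(f"career allow rate: {90 / 133:.1%}")  # -> 67.7%, shown rounded as 68%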

Statute-Specific Performance

§101: 14.2% (-25.8% vs TC avg)
§103: 54.3% (+14.3% vs TC avg)
§102: 18.3% (-21.7% vs TC avg)
§112: 10.7% (-29.3% vs TC avg)
Deltas are measured against an estimated Tech Center average. Based on career data from 133 resolved cases.
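
One way to read the figures above: each delta is the examiner's statute-specific rate minus an estimated Tech Center average. A small Python sketch, assuming the deltas were computed that way (back-deriving the TC average from rate minus delta yields 40% for every statute, consistent with a single average line):

examiner = {"§101": 0.142, "§103": 0.543, "§102": 0.183, "§112": 0.107}
delta_vs_tc = {"§101": -0.258, "§103": +0.143, "§102": -0.217, "§112": -0.293}

for statute, rate in examiner.items():
    tc_avg = rate - delta_vs_tc[statute]   # back-derive the TC average estimate
    print(f"{statute}: examiner {rate:.1%}, TC avg {tc_avg:.1%}, delta {delta_vs_tc[statute]:+.1%}")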

Office Action

§103 §112
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This Office Action is in response to Applicant's Amendment and Remarks filed on 1/5/2026. This Action is made FINAL.

Claims 4, 10-11, 17, 21-22, 24-32, 34, and 36 were canceled. Claims 1-3, 5-9, 12-16, 18-20, 23, 33, 35, and 37 are pending for examination.

Response to Arguments

(A) Applicant's arguments filed “Claims 1 and 23 have been rejected for reciting the limitations of "a beam-riding highway for transmitting power and the command and control signals wirelessly" as not supported by the specification. Applicant respectfully submits that the limitations "a beam-riding highway for transmitting power and the command and control signals wirelessly" is supported by the specification as filed at para. 257 and Figs. 26A, 26B, and 26C, which provide "wide-beam area riding highways including point-to-point power transmission, point-to-point transportation including orbit raising and descending". The point-to-point transportation achieved by the beam-riding highways requires command and control signals to be provided to enable command and control of mobile computer devices such that the mobile computer devices can be transported point-to-point by the beam-riding highway. Applicant respectfully submits that the impugned claim language is accordingly reasonably inferable from the foregoing” on 1/5/2026 have been fully considered but they are not persuasive.

As to point (A), the examiner respectfully disagrees. The examiner further notes that the specification only disclosed "wide-beam area riding highways including point-to-point power transmission, point-to-point transportation including orbit raising and descending" and failed to disclose the claimed limitation of “a beam-riding highway for transmitting … the command and control signals wirelessly”.

(B) Applicant's arguments, see page 8, filed “Claims 1, 3, 5, 8, 13-14, 16, 19, 23, 33, 37 have been rejected for reciting the limitations of "mobile computer device" as not supported by the specification. Claim 2 has been rejected for reciting the limitations of "training computer module" as not supported by the specification. Applicant respectfully submits that the limitations "mobile computer device" and "training computer module" are supported by the specification as filed at, for example, paragraph 85.” on 1/5/2026, with respect to the Claim Rejections under 35 USC § 112(a), have been fully considered but they are not persuasive.

As to point (B), the examiner respectfully disagrees. The examiner further notes that Para 85 disclosed “One or more systems described herein may be implemented in computer programs executing on programmable computers, each comprising at least one processor, a data storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. For example, and without limitation, the programmable computer may be a programmable logic unit, a mainframe computer, server, and personal computer, cloud-based program or system, laptop, personal data assistance, cellular telephone, smartphone, or tablet device”, which does not disclose any information on the "mobile computer device" and "training computer module". The examiner further notes that the claim amendment “the power transmitted is by the mobile computer device” filed on 1/5/2026, with the support of Para 249 and Figs. 21A-C, suggested that the mobile computer device is a beam-riding drone or airship.

(C) Applicant's arguments filed “Ma thus teaches a movable object, such as a UAV, configured for travel. The movable object includes propulsion devices connected to a power source for sustaining controlled flight. The movable object is configured to send data and information to a terminal. Accordingly, Ma fails to disclose a movable object with any power transmission capacity. Thus Ma discloses no functionality or components that would be capable of interacting with the claimed "beam-riding highway for transmitting power and the command and control signals wirelessly, wherein the power transmitted is by the mobile computer device", even if such feature were present in Ma's environment. It would not be obvious to modify Ma to include the claimed functionality because there is no motivation to include the claimed functionality within Ma. Specifically, because the movable object of Ma does not include any power transmission capacity, the movable object of Ma is unable to transmit power, and thus the movable objects of Ma include no functionality for carrying out the claimed feature "wherein the power transmitted is by the mobile computer device". Further, because Ma provides no motivation to include the claimed feature "wherein the power transmitted is by the mobile computer device", there is no motivation to combine Yoshichika with Ma to arrive at the present claims as amended even if Yoshichika can be taken to teach a beam-riding highway, to which point Applicant does not acquiesce.” on 1/5/2026 have been fully considered but they are not persuasive.

As to point (C), the examiner respectfully disagrees. The examiner further notes that Ma disclosed all the limitations besides "beam-riding highway for transmitting power and the command and control signals wirelessly, wherein the power transmitted is by the mobile computer device" and that it would be obvious to modify Ma to include the claimed functionality for the motivation of “realizing stable power supply for a long period of time to a power consuming device mounted on the aircraft while continuing the flight of the aircraft.” disclosed by Yoshichika.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-3, 5-9, 12-16, 18-20, 23, 33, 35, and 37 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement.
The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

Independent claims 1 and 23 recited the limitations of “a beam-riding highway for transmitting power and the command and control signals wirelessly, wherein the power transmitted is by the mobile computer device” and “transmitting power and the primary and secondary command and control signals wirelessly through a beam-riding highway, wherein the power transmitted is by the mobile computer device”, which are not supported by the specification. The specification disclosed a beam-riding highway transferring power; however, the specification does not disclose “a beam-riding highway for transmitting … the command and control signals wirelessly”; “the power transmitted is by the mobile computer device”; and “transmitting … the primary and secondary command and control signals wirelessly through a beam-riding highway”.

Claims 1, 3, 5, 8, 13-14, 16, 19, 23, 33, and 37 recited the limitations of “mobile computer device”, which is not supported by the specification. The specification disclosed “The mobile device 108 may include a computer-readable storage medium storing and processing computer vision method algorithms for detecting obstacles and objects”. The specification failed to provide support indicating that the mobile device 108 is a mobile computer device, because a computer-readable storage medium is not the equivalent of a computer.

Dependent claim 2 recited “the primary receiver comprises a training computer module for training using actual fight data and simulated flight data fed through the primary receive”; however, the specification does not disclose “a training computer module”.

Dependent claims 3, 5-9, 12-16, 18-20, 33, 35, and 37 are rejected because they depend on claims 1 and 23, which are rejected for the reasons above.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 3, 7, 8, 12, 18, 23, and 37 are rejected under 35 U.S.C. 103 as being unpatentable over Ma (US20200068516) in view of YOSHICHIKA (WO2019151055A1).

In regards to claim 1, Ma teaches a system for remote control of a mobile computer device, the system comprising: a primary receiver for providing primary command and control of the mobile computer device (Ma: Fig. 4 Element 62; Para 40 “at least one of the first control device and second control device is configured to control the flight of the UAV, and the other device is configured to control an imaging device of the UAV. The first control device or the second control device includes at least one of a remote controller, a smart eyeglass, a smart phone, a tablet, a watch, a virtual reality (VR) headset, or a goggle”); a secondary receiver for providing secondary command and control of the mobile computer device (Ma: Fig. 4 Element 64; Para 40 “at least one of the first control device and second control device is configured to control the flight of the UAV, and the other device is configured to control an imaging device of the UAV. The first control device or the second control device includes at least one of a remote controller, a smart eyeglass, a smart phone, a tablet, a watch, a virtual reality (VR) headset, or a goggle”); the mobile device configured to respond to command and control signals sent by any of the primary receiver and the secondary receiver (Ma: Para 69 “The control device may be capable of receiving data transmitted by the movable object via the datalink, the control device may also be capable of transmitting data to the movable object via the datalink. For example, the control device may receive image data captured by the movable object via the datalink, the control device may transmit controlling command to the movable object via the datalink to control the flight of the movable object or control an imaging device (e.g., a camera) of the movable object”); and a relay platform for relaying the command and control signals throughout the system (Ma: Para 55 “Any suitable means of communication can be used to transfer data and information to or from control device 22, such as wired communication or wireless communication. For example, communication system 20 can utilize one or more of local area networks (LAN), wide area networks (WAN), infrared, radio, Wi-Fi, point-to-point (P2P) networks, telecommunication networks, cloud communication, and the like. Optionally, relay stations, such as towers, satellites, or mobile stations, can be used”).

Yet Ma does not explicitly teach a beam-riding highway for transmitting power and the command and control signals wirelessly. However, in the same field of endeavor, YOSHICHIKA teaches a beam-riding highway for transmitting power and the command and control signals wirelessly, wherein the power transmitted is by the mobile computer device (Yoshichika: Fig. 2; Fig. 10 Element 750; Fig. 18; Para 31 “The microwave power transmitting device is provided, for example, in a microwave power supply station 75 which is a ground or sea facility, or in a power supply airship 25 which is an aircraft, and transmits power supply microwave beams 750, 250 toward the HASPS10. Furthermore, the microwave power transmitting device may be installed in a moving object such as a vehicle, such as an automobile, that moves on land or sea, or a ship. The configuration and operation of the microwave power transmitting device will be described in detail later”; Para 13 “FIG. 18 is an explanatory diagram illustrating an example in which the HAPS control information from the remote control device is received by the microwave power transmission device via the mobile communication network in the second modification”; Para 20 “The wireless relay stations of the HAPS 10 and 20 are connected to the core network of the mobile communication network 80 via a feeder station (gateway) 70 that is a relay station installed on the ground or the sea. Communication between the HAPS 10, 20 and the feeder station 70 may be performed by wireless communication using radio waves such as microwaves, or may be performed by optical communication using laser light or the like”; i.e., Fig. 18 indicated that the HAPS control information and power are being transferred to the HAPS wirelessly).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system for remote control of a mobile device of Ma with the feature of a beam-riding highway for transmitting power and the command and control signals wirelessly, wherein the power transmitted is by the mobile computer device, disclosed by YOSHICHIKA. One would be motivated to do so for the benefit of “realizing stable power supply for a long period of time to a power consuming device mounted on the aircraft while continuing the flight of the aircraft.” (YOSHICHIKA: Para 4).

In regards to claim 3, the combination of Ma and YOSHICHIKA teaches the system of claim 1, and Ma further teaches wherein the mobile computer device is any one of a vehicle, a robot, a land vehicle, a car, a truck, a water vehicle, a boat, a submarine, an air vehicle, an airplane, an airship, a helicopter, a drone, a space vehicle, a satellite, and a rocket (Ma: Para 29 “Movable object 10 may be any suitable object, device, mechanism, system, or machine configured to travel on or within a suitable medium (e.g., a surface, air, water, rails, space, underground, etc.). For example, movable object 10 may be an unmanned aerial vehicle (UAV). Although movable object 10 is shown and described herein as a UAV for exemplary purposes of this description, it is understood that other types of movable object (e.g., wheeled objects, nautical objects, locomotive objects, other aerial objects, etc.) may also or alternatively be used in embodiments consistent with this disclosure. As used herein, the term UAV may refer to an aerial device configured to be operated and/or controlled automatically (e.g., via an electronic control system) and/or manually by off-board personnel”).

In regards to claim 7, the combination of Ma and YOSHICHIKA teaches the system of claim 1, and Ma further teaches the primary receiver comprises an extended reality headset (Ma: Para 40 “at least one of the first control device and second control device is configured to control the flight of the UAV, and the other device is configured to control an imaging device of the UAV. The first control device or the second control device includes at least one of a remote controller, a smart eyeglass, a smart phone, a tablet, a watch, a virtual reality (VR) headset, or a goggle”).

In regards to claim 8, the combination of Ma and YOSHICHIKA teaches the system of claim 7, and Ma further teaches the mobile computer device is configured to provide data and camera feed to the extended reality headset (Ma: Para 51 “sensory device 19 may be an imaging system 19. In this disclosed embodiment, imaging system 19 may include imaging devices configured to gather data that may be used to generate images for surveying, tracking, and capturing images or video of targets (e.g., objects, landscapes, subjects of photo or video shoots, etc.). For example, imaging devices may include photographic cameras, video cameras, infrared imaging devices, ultraviolet imaging devices, x-ray devices, ultrasonic imaging devices, radar devices, etc. In this exemplary embodiment, the imaging device may be configured to generate optical data of the target for identifying and tracking the target. For example, the imaging device may be an optical device, such as a camera or video camera. The imaging device may be configured to generate imaging data indicative of one or more features of the target. The imaging system 19 may further be configured to communicate data (e.g., image frames) and information with control device 22 via a wired or wireless connection (e.g., RFID, Bluetooth, Wi-Fi, radio, cellular, etc.). Data and information generated by imaging system 19 and communicated to control device 22 may be used by control device 22 for further processing”).

In regards to claim 12, the combination of Ma and YOSHICHIKA teaches the system of claim 1, and Ma further teaches the relay platform is a high-altitude relay platform stationed above Earth (Ma: Para 55 “Any suitable means of communication can be used to transfer data and information to or from control device 22, such as wired communication or wireless communication. For example, communication system 20 can utilize one or more of local area networks (LAN), wide area networks (WAN), infrared, radio, Wi-Fi, point-to-point (P2P) networks, telecommunication networks, cloud communication, and the like. Optionally, relay stations, such as towers, satellites, or mobile stations, can be used”).

In regards to claim 18, the combination of Ma and YOSHICHIKA teaches the system of claim 1, and Ma further teaches a sensor-based subsystem for incorporating any of high-resolution cameras, pressure sensors, lidar, mechanical, and thermal systems (Ma: Para 33 “Sensory devices 19 may include devices for collecting or generating data or information, such as surveying, tracking, and capturing images or video of targets (e.g., objects, landscapes, subjects of photo or video shoots, etc.). Sensory devices 19 may include imaging devices configured to gather data that may be used to generate images. For example, imaging devices may include photographic cameras, video cameras, infrared imaging devices, ultraviolet imaging devices, x-ray devices, ultrasonic imaging devices, radar devices, etc. Sensory devices 19 may also or alternatively include devices for capturing audio data, such as microphones or ultrasound detectors. Sensory devices 19 may also or alternatively include other suitable sensors for capturing visual, audio, and/or electromagnetic signals”; Para 38 “Sensing system 18 may include one or more sensors associated with one or more components or other systems of movable device 10. For instance, sensing system may include sensors for determining positional information, velocity information, and acceleration information relating to movable object 10 and/or targets. In some embodiments, sensing system may also include carrier sensors 30. Components of sensing system 18 may be configured to generate data and information that may be used (e.g., processed by control device 22 or another device) to determine additional information about movable object 10, its components, or its targets. Sensing system 18 may include one or more sensors for sensing one or more aspects of movement of movable object 10. For example, sensing system 18 may include sensory devices associated with payload 14 as discussed above and/or additional sensory devices, such as a positioning sensor for a positioning system (e.g., GPS, GLONASS, Galileo, Beidou, GAGAN, etc.), motion sensors, inertial sensors (e.g., IMU sensors), proximity sensors, TOF sensors, image sensors, etc. Sensing system 18 may also include sensors or be configured to provide data or information relating to the surrounding environment, such as weather information (e.g., temperature, pressure, humidity, etc.), lighting conditions (e.g., light-source frequencies), air constituents, or nearby obstacles (e.g., objects, structures, people, other vehicles, etc.)”).

As per claim 23, it recites a method for remote control of a mobile computer device having limitations similar to those of claim 1 and therefore is rejected on the same basis.

In regards to claim 37, the combination of Ma and YOSHICHIKA teaches the method of claim 23, and Ma further teaches collecting and transmitting the data by the mobile computer device (Ma: Para 33 “Sensory devices 19 may include devices for collecting or generating data or information, such as surveying, tracking, and capturing images or video of targets (e.g., objects, landscapes, subjects of photo or video shoots, etc.). Sensory devices 19 may include imaging devices configured to gather data that may be used to generate images. For example, imaging devices may include photographic cameras, video cameras, infrared imaging devices, ultraviolet imaging devices, x-ray devices, ultrasonic imaging devices, radar devices, etc. Sensory devices 19 may also or alternatively include devices for capturing audio data, such as microphones or ultrasound detectors. Sensory devices 19 may also or alternatively include other suitable sensors for capturing visual, audio, and/or electromagnetic signals”; Para 55 “The communication system 20 can transmit and/or receive one or more of sensing data from the sensing system 18, processing results produced by the processor 54, predetermined control data, user commands from terminal 32 or a remote controller, and the like”).

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Ma (US20200068516) in view of YOSHICHIKA (WO2019151055A1), further in view of Hadfield (US20210043104).

In regards to claim 2, the combination of Ma and YOSHICHIKA teaches the system of claim 1. Yet the combination of Ma and YOSHICHIKA does not explicitly teach the primary receiver comprises a training computer module for training using actual fight data and simulated flight data fed through the primary receiver.
However, in the same field of endeavor, Hadfield teaches the primary receiver comprises a training computer module for training using actual fight data and simulated flight data fed through the primary receiver (Abstract: simulating a flight scenario during a live flight of an aircraft; (i) generating (60) images comprising scenes relevant to the simulated flight scenario at a simulated altitude; (ii) calculating, using live flight data received for the aircraft and with reference to a predetermined flight model (65), simulated flight data for the simulated flight scenario at the simulated altitude; and (iii) displaying, on a display system (35) of the aircraft, the calculated simulated flight data while controlling the display of said generated scene images to simulate movement of the aircraft through the displayed scene at a rate and in a direction corresponding to the displayed simulated flight data; optionally alter the response of the aircraft to control actions (70) by a pilot to simulate the response expected of the aircraft having the simulated flight characteristics; [0026] flight simulating method and apparatus arranged to simulate flight scenarios in an aircraft, for example for training purposes, during live flight of the aircraft. In particular, a pilot of the aircraft may fly the aircraft according to a simulated flight scenario while flying at a safe altitude. The simulated flight scenarios may comprise training flight scenarios including low-level flying or landing scenarios. A simulated landing scenario may be for example a simulated aircraft carrier flight deck or other simulated landing type in a variety of conditions or locations; [0037] In order to provide an immersive and realistic simulation, the simulation control functionality controls the scenery generator 60 to update the displayed scene images based upon the simulated flight data generated by the flight model 65. The aim is to provide to the pilot a true impression of flying through the simulated scene. If provided for training purposes, for the training to be worthwhile, the muscle memory that the pilot develops during the training needs to correctly reflect the training scenario).

Therefore, it would be obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the UAV target tracking system of the combination of Ma and YOSHICHIKA with the more specific flight simulation features of Hadfield, thus providing a design for safely practicing simulated flight scenarios.

Examiner Note: The instant application discloses the “training module” at paragraphs [0008] and [0030]. The Figures do not appear to show the training module. The instant application states that the training module is “for training using actual fight data and simulated flight data.” For examination purposes, the “training module” is being interpreted as software for simulating an operational environment of a vehicle.

Claims 5, 6, and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Ma (US20200068516) in view of YOSHICHIKA (WO2019151055A1), further in view of US20210116907 (“Altman”).

In regards to claim 5, the combination of Ma and YOSHICHIKA teaches the system of claim 1. Yet the combination of Ma and YOSHICHIKA does not explicitly teach the mobile computer device comprises a computer-readable storage medium storing and processing computer vision method algorithms for detecting obstacles and objects.

However, in the same field of endeavor, Altman teaches the mobile computer device comprises a computer-readable storage medium storing and processing computer vision method algorithms for detecting obstacles and objects ([0093] vehicular computer or vehicular processor or the vehicular AI module 119 may utilize AI and machine learning not only for purposes of identifying objects and route-related items (e.g., other cars, traffic lights, traffic signs, lane borders, pedestrians, road blocks, road obstacles, or the like); but also in order to learn which types of data to transmit or upload to remote AI modules, and/or which compression or encoding or data dilution or sparse representation process to utilize, and/or to which extent to dilute or to encode or to compress or to skip data or data-items, and/or to determine which data originators (e.g., which sensors or cameras or LIDARs) to include and/or to exclude from such data transmissions or data uploading, and/or which amount of data and type(s) of data are sufficient to upload or to transmit in order to enable a remote AI module to reach decisions at a pre-defined or de-facto level of certainty or probability or assurance, and/or which one or more communication link(s) and one or more modems or transceivers to utilize for such uploading or transmitting (e.g., from one or more available modems or transceivers, including for example Wi-Fi, satellite, cellular, 3G, 4G, 4G-LTE, 5G, cellular, V2X, or the like), and/or whether or not to transmit or to upload a particular data-item or data-stream over two (or more) different communication links and/or to two (or more) different recipients, and/or whether to divide and in which particular manner to divide packets that are intended for uploading across two or more modems or transceivers or communication connections which then upload or transmit them in concert or in parallel at different rates and/or at different performance characteristics (error rate, latency, delays, bandwidth, goodput, throughput, QoS parameters, or the like), and/or other communication-related decisions or determinations, that can be deep-learned over time and/or trained by an AI module, while also checking and learning whether a particular set of parameters and/or decision is sufficient to enable sufficient AI determinations beyond a pre-defined threshold level of certainty or probability or assurance. Such AI process may utilize actual data, predicted data, estimated data, raw data, processed data, diluted data, historic data, currently-measured or currently-sensed data, recently-measured or recently-sensed data, and/or a suitable combination of such data types; [0094] a local in-vehicle AI module 119, and/or a remote (external to vehicle) AI module (179, 169, 159), may continuously learn and machine-learn and improve its identification of objects and improve its decision making, by continuously analyzing the incoming data-streams that are sensed and/or transmitted or uploaded by the vehicle or from the vehicle, and then comparing the decision results or the identification results to newly-incoming or fresh data that is received in a continuous manner; [0096] utilize a variety of sensors and detectors in order to sense, detect and/or identify their surroundings and/or other data (e.g., traffic lights, traffic signs, lanes, road blocks, road obstacles, navigation paths, pedestrians); for example, cameras, image camera, video cameras, acoustic microphones, audio capturing devices, image and video capturing devices, radar sensors or devices, LIDAR sensors or devices, laser-based sensors or devices, magnetic or magnet-based sensors and devices, ultrasonic sensors, night-vision modules, Global Positioning System (GPS) units, Inertial Measurement Unit (IMU) modules, stereoscopic vision or stereo vision modules, object recognition modules, deep learning modules, machine learning modules, accelerometers, gyroscopes, compass units, odometry, computer vision modules, machine vision modules, Artificial Intelligence (AI) modules, vehicular processor, vehicular computer, dashboard processor, dashboard computer, communication devices or the like. In some embodiments, an autonomous vehicle may utilize a Bayesian simultaneous localization and mapping (SLAM) module or algorithm to fuse and process data from multiple sensors and online/offline maps; and/or detection and tracking of other moving objects (DATMO) modules; and/or real-time locating system (RTLS) beacon modules, high resolution real time maps (“HD maps”) and/or other suitable modules).

Therefore, it would be obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the UAV target tracking system of the combination of Ma and YOSHICHIKA with the more specific computer vision, artificial intelligence (AI), and/or machine learning features of Altman, thus providing a design to increase the safety of autonomous driving and/or self-driving systems and/or remote driving systems.

In regards to claim 6, the combination of Ma, YOSHICHIKA, and Altman teaches the system of claim 5, and Altman further teaches the computer vision method algorithms comprise machine learning and artificial intelligence techniques ([0093]-[0094] and [0096]). The Examiner supplies the same rationale for the combination of references Ma, YOSHICHIKA, and Altman as in claim 5 above.

In regards to claim 9, the combination of Ma and YOSHICHIKA teaches the system of claim 1, and Altman further teaches the secondary receiver comprises haptic controls ([0012] HMI that utilizes other components, such as joystick, mouse, trackball, touch-screen, touch-pad, screen, pedals, steering wheel, wearable gear or head-gear or goggles or glasses or helmet, tactile elements, haptic elements, or the like).
The Examiner supplies the same rationale for the combination of references Ma, YOSHICHIKA, and Altman as in claim 5 above.

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Ma (US20200068516) in view of YOSHICHIKA (WO2019151055A1), further in view of JP2018008320A (“Onishi”).

In regards to claim 13, the combination of Ma and YOSHICHIKA teaches the system of claim 1. Yet the combination of Ma and YOSHICHIKA does not explicitly teach the mobile computer device comprises a robotic arm suitable for grasping, manipulating, and moving objects. However, in the same field of endeavor, Onishi teaches and/or suggests the limitation (Title: Multi-jointed robot arm and UAV; FIG. 8 shows an example of the UAV 1 with a robot arm). Therefore, it would be obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the UAV target tracking system of the combination of Ma and YOSHICHIKA with the more specific robotic arm of Onishi, thus providing a design to perform work with a robot arm or to deliver a conveyed product.

Claims 14-16, 19, and 33 are rejected under 35 U.S.C. 103 as being unpatentable over Ma (US20200068516) in view of YOSHICHIKA (WO2019151055A1), further in view of US20190130783 (“Nissen”).

In regards to claim 14, the combination of Ma and YOSHICHIKA teaches the system of claim 1. Yet the combination of Ma and YOSHICHIKA does not explicitly teach a fleet tracking architecture component for determining where the mobile computer device is in relation to other mobile computer devices. However, in the same field of endeavor, Nissen teaches and/or suggests the limitation ([0014] head-mounted display 108 may show a virtual representation of a state of a remote fleet of vehicles; [0025] multiple flight emulators 100 are connected through network 152, playback may control multiple flight emulators 100 simultaneously, allowing several users 110 to experience feedback from a single simulated vehicle or a fleet of various simulated vehicles; [0026] allows a user 110 of the master aircraft 202, which may be a pilot, co-pilot, or other crew member of the master aircraft 202, to virtually teleport into and assume control of a fleet of remote aircraft 204 connected through the network 208). Therefore, it would be obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the UAV target tracking system of the combination of Ma and YOSHICHIKA with the more specific virtual reality emulator features of Nissen, thus providing a design to ensure safe and effective operation of respective vehicles during training.

In regards to claim 15, the combination of Ma, YOSHICHIKA, and Nissen teaches the system of claim 14, and Nissen further teaches and/or suggests the system comprises an autonomous virtual air traffic control and management system through the fleet tracking architecture component ([0018] flight emulator system 150 comprises a plurality of flight emulators 100 selectively connected in communication through network 152; master flight emulator 154 comprises a flight emulator 100 selectively connected in communication with each of the plurality of slave flight emulators 100a.sub.1-n through network 152; [0019] connecting multiple flight emulators 100 through network 152, the flight emulators 100 may be flown in the same simulated environment; monitor each of the flight emulators 100 operated by a user 110 (e.g., student or training participant); [0025]). The Examiner supplies the same rationale for the combination of references Ma, YOSHICHIKA, and Nissen as in claim 14 above.

In regards to claim 16, the combination of Ma and YOSHICHIKA teaches the system of claim 1, and Nissen further teaches and/or suggests a second mobile computer device, wherein the mobile computer device and the second mobile computer device are in communication with each other, and wherein the mobile computer device and the second mobile computer device are each in communication with the relay platform and the primary and secondary receivers ([0014]; [0018]; [0019]; [0025]; [0026]). See also prior art of record JP 6652620 B2 (“Cui”). The Examiner supplies the same rationale for the combination of references Ma, YOSHICHIKA, and Nissen as in claim 14 above.

In regards to claim 19, the combination of Ma, YOSHICHIKA, and Nissen teaches the system of claim 16, and Nissen further teaches and/or suggests the primary and secondary receivers are each configured to switch from providing command and control of the mobile computer device to providing command and control of the second mobile computer device ([0014]; [0018]; [0019]; [0025]; [0026]). The Examiner supplies the same rationale for the combination of references Ma, YOSHICHIKA, and Nissen as in claim 14 above.

As per claim 33, it recites a method having limitations similar to those of claim 16 and therefore is rejected on the same basis.

Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Ma (US20200068516) in view of YOSHICHIKA (WO2019151055A1), further in view of US20200115066 (“DeMunck”).

In regards to claim 20, the combination of Ma and YOSHICHIKA teaches the system of claim 1. Yet the combination of Ma and YOSHICHIKA does not explicitly teach a data collection subsystem for collecting data, a distributed data processing pipeline for analyzing the data, and a visualization subsystem for data management and pilot training. However, in the same field of endeavor, DeMunck teaches and/or suggests the limitation ([0010] performing necessary and sufficient validations, in particular by using the (same) calculation engines used by pilots, validating (and visualizing) the outcome; [0028] visualization operational envelope also can help in creating human interpretable warnings; [0033] Aviation or avionics “safety” designates the state of an avionics system (or organization) in which risks associated with aviation activities, related to, or in direct support of the operation of aircraft, are reduced and controlled to an acceptable level. “Safety” encompasses the theory, practice, investigation, and categorization of flight failures, and the prevention of such failures through regulation, education, and training; [0041] Machine learning can advantageously be performed on “big data”, i.e. using as much data as possible (stability, convergence, weak signals). New data is continuously added and training can be reiterated; [0044] “Semi-supervised learning” designates a situation wherein the computer is given only an incomplete training signal: training set with some (often many) of the target outputs missing; [0045] “Active learning” designates a situation wherein the computer can only obtain training labels for a limited set of instances (based on a budget), and also has to optimize its choice of objects to acquire labels for. When used interactively, these can be presented to the user for labeling; [0046] “Reinforcement learning” designates a situation wherein training data (for example in form of rewards and punishments) is given only as a feedback to the program's actions in a dynamic environment, such as driving a vehicle or playing a game against an opponent; [0054] Large collections of data are on or of or associated to recorded (commercial, real) flights; [0055] validating said input aircraft model and/or operational data; [0056] generating a superset of data, by performing one or more of the steps comprising adding, deleting, merging, splitting, or combining data of said large collections of data; [0057] Real validated dataset A for an Airbus A320 can be merged into real validated dataset B for an Airbus A330, thus creating artificial but potentially useful data. The superset of data can thus comprise an amount of data which can be way superior to the initial data having being collected; [0058] superset of data is obtained by one or more operations performed on the large collection of data of recorded flights, said operations comprising one or more of extrapolation, interpolation, ponderation, regression, approximation or simulation; [0065] large collection of data comprises aircraft or flight data stemming for a plurality of airlines about a plurality of aircraft, engines, flight contexts, aircraft configurations and meteorological data; [0067] “Semi-supervised” anomaly detection techniques construct a model representing normal behavior from a given normal training data set, and then testing the likelihood of a test instance to be generated by the learnt model; [0068] semi-supervised anomaly detection comprises using one or more of the methods or techniques comprising statistical techniques, density-based techniques such as k-nearest neighbor, local outlier factor or isolation forests, subspace and correlation-based outlier detection for high-dimensional data, Support Vector Machines, replicator neural networks, Bayesian Networks, Hidden Markov models, Cluster analysis-based outlier detection, or Fuzzy logic-based outlier detection; [0069] displaying one or more determined comparisons or anomalies, and triggering one or more visual and/or vibratile and/or audio alerts depending on the application of predefined thresholds on said one or more comparisons or anomalies; [0105] validation test can indicate proximity to the operational envelope (safe boundaries), or otherwise provide feedback to the submitter, such as multi-point visualizations or recommendations (e.g. impossible values, recommended values, borderline values, acceptable values, etc.); [0115] system comprising deep neural networks and/or deep belief networks and/or recurrent neural networks adapted to carry out one or more of the steps of the method. Software Neural network simulators can be used (software applications simulating the behavior of artificial or biological neural networks)). Therefore, it would be obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the UAV target tracking system of the combination of Ma and YOSHICHIKA with the more specific big data machine learning features of DeMunck, thus providing a design to define safe boundaries for one or more aircraft models and/or operations data.

Claim 35 is rejected under 35 U.S.C. 103 as being unpatentable over Ma (US20200068516) in view of YOSHICHIKA (WO2019151055A1), further in view of US20200044730 (“Takamori”).

In regards to claim 35, the combination of Ma and YOSHICHIKA teaches the method of claim 23. Yet the combination of Ma and YOSHICHIKA does not explicitly teach the relay platform operates at an altitude from 3 kilometres to 22 kilometres. However, in the same field of endeavor, Takamori teaches and/or suggests the limitation ([0045] embodiment described above uses an unmanned aerial vehicle as the relay, a manned airplane can also be used. Alternatively, other flying objects such as an airship, balloon, and helicopter may also be used as the relay. While the flight altitude of the relay is specified to the stratosphere (in particular, 20 km to 25 km) in the embodiment described above, the flight altitude of the relay is not limited to this range. The relay can fly at any height at least between a height which a drone can fly at and the height of the communication satellite, such as for example anywhere in a range of about several hundred meters to several kilometers). Therefore, it would be obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify, with a reasonable expectation of success, the UAV target tracking system of Ma with the more specific system for collecting flight information features of Takamori, thus providing a design for safe and efficient operation of drones.

Examiner Note: Preferred altitudes for a relay platform are a mere design choice and afforded little to no patentable weight.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Jones (US20200036232A1) disclosed that a charging corral may be used, and navigation instructions to the charging corral may be provided to a drone; re-orientable antennas may be used to direct RF charging power in a selected direction, for example, to track the location of a drone.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

/W.Y./ Examiner, Art Unit 3667
/Hitesh Patel/ Supervisory Patent Examiner, Art Unit 3667
1/22/26

Prosecution Timeline

Oct 11, 2022
Application Filed
Aug 20, 2024
Non-Final Rejection — §103, §112
Nov 26, 2024
Response Filed
Feb 05, 2025
Final Rejection — §103, §112
Jul 10, 2025
Request for Continued Examination
Jul 15, 2025
Response after Non-Final Action
Sep 03, 2025
Non-Final Rejection — §103, §112
Jan 05, 2026
Response Filed
Jan 22, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600239: DRIVE APPARATUS AND ELECTRIC VEHICLE
Granted Apr 14, 2026 (2y 5m to grant)

Patent 12592106: Systems and Methods for Vehicle Tuning and Calibration
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12576728: METHOD TO CONTROL AN ELECTRIC DRIVE VEHICLE
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12570157: VEHICLE SYSTEM
Granted Mar 10, 2026 (2y 5m to grant)

Patent 12548382: METHOD AND COMPUTER PROGRAM FOR RECEIVING, MANAGING AND OUTPUTTING USER-RELATED DATA FILES OF DIFFERENT DATA TYPES ON A USER-INTERFACE OF A DEVICE AND A DEVICE FOR STORAGE AND OPERATION OF THE COMPUTER PROGRAM
Granted Feb 10, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 68%
With Interview: 85% (+17.7%)
Median Time to Grant: 3y 0m
PTA Risk: High
Based on 133 resolved cases by this examiner. Grant probability derived from career allow rate.
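
The footnote suggests the projection arithmetic is straightforward. A back-of-envelope check in Python, assuming the grant probability is simply the career allow rate and the interview figure adds the observed lift (the tool's exact model is not disclosed here):

base = 90 / 133   # career allow rate from the examiner profile, ~0.677
lift = 0.177      # interview lift from the examiner profile
print(f"grant probability: {base:.0%}")         # -> 68%
print(f"with interview:    {base + lift:.0%}")  # -> 85%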
