Prosecution Insights
Last updated: April 19, 2026
Application No. 18/364,816

SYSTEMS AND METHODS FOR VEHICLE CONTROL USING AUTONOMOUS AND REMOTE OPERATION

Final Rejection §103
Filed: Aug 03, 2023
Examiner: LEVY, MERRITT E
Art Unit: 3663
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Kodiak Robotics Inc.
OA Round: 4 (Final)
Grant Probability: 33% (At Risk)
Estimated OA Rounds: 5-6
Estimated Time to Grant: 3y 7m
Grant Probability With Interview: 70%

Examiner Intelligence

Career Allow Rate: 33% (26 granted / 78 resolved; -18.7% vs TC avg)
Interview Lift: +36.6% among resolved cases with an interview
Avg Prosecution: 3y 7m (typical timeline)
Career History: 134 total applications across all art units, 56 currently pending
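The headline figures above reduce to simple arithmetic. A quick sketch (assuming, as the card layout suggests, that the interview lift is an additive percentage-point difference rather than a ratio):

```python
# Reproduce the examiner's headline figures from the raw counts shown above.
granted, resolved = 26, 78
career_allow_rate = granted / resolved           # 26/78, about 33.3%

# Assumption: "+36.6% interview lift" is additive percentage points.
with_interview = career_allow_rate + 0.366       # about 70%, matching the card

print(f"career allow rate: {career_allow_rate:.1%}")
print(f"with interview:    {with_interview:.1%}")
```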

Statute-Specific Performance

§101: 9.3% (-30.7% vs TC avg)
§102: 16.3% (-23.7% vs TC avg)
§103: 54.0% (+14.0% vs TC avg)
§112: 20.0% (-20.0% vs TC avg)

Tech Center averages are estimates. Based on career data from 78 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on October 24, 2025, has been entered.

Status of Claims

This Office action is in response to the amendments filed on October 24, 2025. Claims 1-7, 9-18, and 20 are currently pending, with Claims 1-3, 6, 11-12, and 15-16 being amended.

Response to Amendments

In response to Applicant’s amendments, filed October 24, 2025, the Examiner maintains the previous 35 U.S.C. 112(f) claim interpretations, and withdraws the previous 35 U.S.C. 102 and 103 rejections.

Response to Arguments

Applicant’s arguments, filed October 24, 2025, with respect to the rejections of Claims 1-7, 9-18, and 20 under Tiwari, in view of Winter, have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of Tiwari, in view of Polansky, Takeda, and Winter.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and

(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitations are:

“a remote station system configured to …” in Claims 1 and 11;
“one or more actuation controls configured to enable …” in Claims 1, 16, and 18-20;
“one or more actuator commands configured to …” in Claims 1 and 11;
“a control module configured to …” in Claims 4, 14, and 18;
“one or more remote actuation controls configured to generate …” in Claims 8, 14, and 19; and
“a display configured to display …” in Claims 9 and 20.

Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
Regarding the limitation of “a remote station system configured to …”, the instant specification at Paragraphs [0115], [0120], at least states that “the remote station system control may be performed using a computing device, a processor, and/or other suitable components …”. The structure for the remote station system is a computing device, processor comprising a memory, or its equivalent, plus corresponding hardware or software capable of sending signals to a vehicle.

Regarding the limitation of “one or more actuation controls configured to enable …”, the instant specification at Paragraphs [0078], [0083], [0118], at least states that “the computing device 130 may function as a controller for controlling one or more functions of the vehicle 105 …” and “the one or more actuation controls may comprise a brake pedal, an acceleration pedal, a gear shift control, a steering wheel, and/or one or more other suitable actuation controls …”. The structure for the actuation controls is a component capable of changing vehicle operating characteristics, such as a brake pedal, acceleration pedal, or steering wheel.

Regarding the limitation of “one or more actuator commands configured to …”, the instant specification at Paragraphs [0077]-[0078], [0134], at least states that “the computing device 130 may comprise a processor 135 and/or memory 140 …” and “the computing device 130 may function as a controller for causing one or more functions of the vehicle to perform …”. The structure for the actuator commands is software or hardware components capable of receiving signals from a processor comprising a memory, or its computer equivalent, for executing vehicle functions.
Regarding the limitation of “a control module configured to …”, the instant Specification at Paragraphs [0080], [0092], at least states that “the autonomous driving system 200 for a vehicle … may comprise a sensor module 202, a perception module 220, a planning module 250, a control module 270 …” and “the control module 270 may be configured to generate control signals for the vehicle …”. The structure for the control module is software components in communication with a processor comprising a memory, or its computer equivalent, capable of receiving and sending signals.

Regarding the limitation of “one or more actuation controls configured to generate …”, the instant specification at Paragraph [0107], at least states that “the one or more actuation controls may comprise a brake pedal, an acceleration pedal, a gear shift control, a steering wheel, and/or one or more other suitable actuation controls …”. The structure for the actuation controls is a component capable of changing vehicle operating characteristics, such as a brake pedal, acceleration pedal, or steering wheel.

Regarding the limitation of “a display configured to display …”, the instant Drawings at Figure 3C, and the instant Specification at Paragraphs [0094] and [0098], show a monitor for displaying road information. The structure for the display is a monitor or its equivalents.

If applicant does not intend to have this limitation interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary.
Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-4, 7, 10-12, and 14-16 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Publication No. 2018/0154899 A1, to Tiwari et al. (hereinafter referred to as Tiwari; previously of record), in view of U.S. Patent Publication No. 2015/0045989 A1, to Polansky et al. (hereinafter referred to as Polansky; newly of record).

As per Claim 1, Tiwari discloses the features of a system for controlling a vehicle (e.g. Paragraphs [0014], [0053]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode, where the teleoperation mode includes a remote operator transmitting control instructions to the vehicle system), comprising: a vehicle (e.g. Paragraphs [0014], [0075]; where the system (100) functions to control a vehicle, to be operable between various driving modes); one or more sensors, coupled to the vehicle (e.g. Paragraph [0036]; where the sensor subsystem in the perception module (110) functions to collect localization data and mapping data from vehicle surroundings, and where the sensor system includes at least one mapping sensor and at least one monitoring sensor, and where the sensors can be on-board the vehicle), configured to generate one or more data points pertaining to one or more of: an environment of the vehicle; and one or more system component measurements of the vehicle (e.g.
Paragraphs [0053], [0082], [0090]-[0091]; where the system (100), when in the teleoperation mode, includes a remote operator transmitting control instructions to the vehicle system; and where operator inputs are received from the operator and can include a transmission from a teleoperator to actuate a portion of the actuation subsystem; and where the communication module (170) functions to communicatively couple the vehicle control system to a remote computing system and can receive transmissions sent to the vehicle and data to be transmitted away from the vehicle (e.g. sensor data)); one or more actuation controls configured to enable the vehicle to perform one or more driving actions (e.g. Paragraphs [0026], [0071]; where the system (100) includes a behavior planning module (130) which generates control commands, and the control module (150) receives control commands and controls the actuation subsystem (153)); a remote station system (e.g. Paragraphs [0014], [0053]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode, where the teleoperation mode includes a remote operator transmitting control instructions to the vehicle system) comprising one or more remote actuation controls (e.g. Paragraphs [0026], [0054], [0059], [0071]; where the behavior planning module (130) can be implemented on a local computing system (e.g., residing at the vehicle), and at least in part at a remote computing system (e.g., a remote operation system or teleoperation system communicatively linked to the vehicle); and where the operator inputs can include inputs by a teleoperator to control the actuation subsystem); and a computing device, coupled to the remote station system (e.g.
Paragraphs [0054], [0090]; where the behavior planning module (130) can be implemented on a local computing system (e.g., residing at the vehicle), and at least in part at a remote computing system (e.g., a remote operation system or teleoperation system communicatively linked to the vehicle)), the computing device coupled to the remote station system comprising a processor and a memory, wherein the memory is configured to store programming instructions (e.g. Paragraphs [0116]; where the system can be implemented as a machine configured to receive a computer-readable medium storing instructions that are executed by a controller) that, when executed by the processor, are configured to cause the processor to receive the one or more data points generated by the one or more sensors (e.g. Paragraphs [0014], [0053]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode, where the teleoperation mode includes a remote operator transmitting control instructions to the vehicle system); generate the one or more remote driving actions using the one or more remote actuation controls (e.g. Paragraphs [0054], [0059]; where the behavior planning module (130) can be implemented on a local computing system (e.g., residing at the vehicle), and at least in part at a remote computing system (e.g., a remote operation system or teleoperation system communicatively linked to the vehicle); and where the operator inputs can include inputs by a teleoperator to control the actuation subsystem); and generate a remote trajectory command (e.g. Paragraphs [0014], [0050], [0053]; where the teleoperation mode includes a remote operator transmitting control instructions (e.g., behavior guidelines, directives, etc.)
to the system (e.g., to a decision making block of the system) and controlling the vehicle based on the control instructions; and where the mission planning module (120) plans vehicle behavior based on instructions received from a remote operator and the mission planning module outputs a route plan (1201), which includes a sequence of driving behaviors (i.e. trajectories) required to navigate the vehicle), wherein: the remote trajectory command comprises trajectory instructions which comprise one or more trajectory plot points which are based on the one or more remote driving actions (e.g. Paragraphs [0014], [0050], [0053]; where the mission planning module (120) plans vehicle behavior based on instructions received from a remote operator and the mission planning module outputs a route plan (1201), which includes a sequence of driving behaviors (i.e. trajectories) required to navigate the vehicle), and ‘…’ a computing device, coupled to the vehicle (e.g. Paragraphs [0014], [0038], [0054]; where the vehicle comprises an onboard computing system, which receives inputs from the sensor system and executes instructions at an actuation subsystem to control the vehicle; and where computing modules can be stored locally at the vehicle), the computing device coupled to the vehicle comprising a processor and a memory, wherein the memory is configured to store programming instructions that (e.g. Paragraphs [0116]; where the system can be implemented as a machine configured to receive a computer-readable medium storing instructions that are executed by a controller), when executed by the processor, are configured to cause the processor to: receive the remote trajectory command (e.g. 
Paragraphs [0014], [0053]; where the remote operator transmits control instructions to the vehicle; and where the behavior planning module (130) can receive the route generated by the mission planning module or from the remote operator); generate the one or more driving actions based on the trajectory instructions (e.g. Paragraphs [0026], [0071]; where the system (100) includes a behavior planning module (130) which generates control commands, and the control module (150) receives control commands and controls the actuation subsystem (153)), wherein the one or more driving actions correlate to one or more actuator commands configured to cause the vehicle to perform the one or more driving actions (e.g. Paragraphs [0071], [0075]; where the control outputs can be used to control the actuation subsystem, where the task block can generate control instructions/signals for control of the actuation subsystem to directly control the control elements of the vehicle (e.g., throttle, steering)); and transmit the one or more driving actions to the vehicle (e.g. Paragraphs [0057], [0090], [0110]; where the teleoperator can select vehicle actions based on the processed data and send it to the vehicle to be performed by the onboard vehicle system; and where the selected vehicle action can be transmitted to the vehicle, where the teleoperator performs the action (e.g. using the actuation subsystem of the vehicle)); and a switch configured to switch command of the vehicle between automatic trajectory control and remote station system control (e.g. Paragraphs [0014], [0053], [0083]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode, where the teleoperation mode includes a remote operator transmitting control instructions to the vehicle system).
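As an illustration only (a sketch with invented names and units, not the claimed implementation or any reference's code): the trajectory instructions mapped above can be pictured as timestamped plot points, and the required-time-of-arrival idea attributed to Polansky amounts to choosing a speed from the distance and the time remaining:

```python
from dataclasses import dataclass
import math

@dataclass
class TrajectoryPoint:
    # Illustrative shape for a "position coordinates at a specific time" point.
    x: float  # metres east (hypothetical planar frame)
    y: float  # metres north
    t: float  # seconds from plan start

def required_speed(a: TrajectoryPoint, b: TrajectoryPoint) -> float:
    """Speed needed to leave point a and reach point b at its target time,
    in the spirit of the required-time-of-arrival computation."""
    distance = math.hypot(b.x - a.x, b.y - a.y)
    return distance / (b.t - a.t)

plan = [TrajectoryPoint(0.0, 0.0, 0.0), TrajectoryPoint(300.0, 400.0, 25.0)]
print(required_speed(plan[0], plan[1]))  # 500 m in 25 s -> 20.0 m/s
```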
Tiwari fails to disclose that each trajectory plot point, of the one or more trajectory plot points, comprises position coordinates for the vehicle to be at a specific time.

However, Polansky, in a similar field of endeavor, teaches a method for providing an indication of a required time of arrival, where the system computes the movement of the aircraft in four dimensions (latitude, longitude, altitude, and time), where the system determines the speed at which the aircraft should travel in order to arrive at a predetermined location at a predetermined time (i.e. plots trajectories for arriving at positional coordinates at a specific time) (e.g. Paragraphs [0003], [0019], [0029], [0035]).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the Applicant’s invention, with a reasonable expectation for success, to modify the vehicle control system of Tiwari, with the feature of plotting trajectory points so that a vehicle arrives at a specific time in the system of Polansky, in order to control speed transitions in multi-segment route plans and achieve the desired arrival time (see at least Paragraphs [0004], [0006] of Polansky).

As per Claim 11, Tiwari discloses the features of a system for controlling a vehicle (e.g. Paragraphs [0014], [0053]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode, where the teleoperation mode includes a remote operator transmitting control instructions to the vehicle system), comprising: a vehicle (e.g. Paragraphs [0014], [0075]; where the system (100) functions to control a vehicle, to be operable between various driving modes); one or more sensors, coupled to the vehicle (e.g.
Paragraph [0036]; where the sensor subsystem in the perception module (110) functions to collect localization data and mapping data from vehicle surroundings, and where the sensor system includes at least one mapping sensor and at least one monitoring sensor, and where the sensors can be on-board the vehicle), configured to generate one or more data points pertaining to one or more of: an environment of the vehicle; and one or more system component measurements of the vehicle (e.g. Paragraphs [0053], [0082], [0090]-[0091]; where the system (100), when in the teleoperation mode, includes a remote operator transmitting control instructions to the vehicle system; and where operator inputs are received from the operator and can include a transmission from a teleoperator to actuate a portion of the actuation subsystem; and where the communication module (170) functions to communicatively couple the vehicle control system to a remote computing system and can receive transmissions sent to the vehicle and data to be transmitted away from the vehicle (e.g. sensor data)); one or more actuation controls (e.g. Paragraphs [0026], [0071]; where the system (100) includes a behavior planning module (130) which generates control commands, and the control module (150) receives control commands and controls the actuation subsystem (153)); a remote station system (e.g. Paragraphs [0014], [0053]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode, where the teleoperation mode includes a remote operator transmitting control instructions to the vehicle system) comprising: one or more remote actuation controls (e.g.
Paragraphs [0026], [0054], [0059], [0071]; where the behavior planning module (130) can be implemented on a local computing system (e.g., residing at the vehicle), and at least in part at a remote computing system (e.g., a remote operation system or teleoperation system communicatively linked to the vehicle); and where the operator inputs can include inputs by a teleoperator to control the actuation subsystem); and a computing device, coupled to the remote station system (e.g. Paragraphs [0054], [0090]; where the behavior planning module (130) can be implemented on a local computing system (e.g., residing at the vehicle), and at least in part at a remote computing system (e.g., a remote operation system or teleoperation system communicatively linked to the vehicle)), the computing device coupled to the remote station system comprising a processor and a memory, wherein the memory is configured to store programming instructions (e.g. Paragraphs [0116]; where the system can be implemented as a machine configured to receive a computer-readable medium storing instructions that are executed by a controller) that, when executed by the processor, are configured to cause the processor to enable the vehicle to perform one or more driving actions via the one or more actuation controls (e.g. Paragraphs [0054], [0057], [0090], [0110]; where the teleoperator can select vehicle actions based on the processed data and send it to the vehicle to be performed by the onboard vehicle system; and where the selected vehicle action can be transmitted to the vehicle, where the teleoperator performs the action (e.g. 
using the actuation subsystem of the vehicle; where the behavior planning module (130) can be implemented on a local computing system (e.g., residing at the vehicle), and at least in part at a remote computing system (e.g., a remote operation system or teleoperation system communicatively linked to the vehicle), and the operator inputs can include inputs by a teleoperator to control the actuation subsystem); receive the one or more data points generated by the one or more sensors (e.g. Paragraphs [0014], [0053]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode, where the teleoperation mode includes a remote operator transmitting control instructions to the vehicle system); generate a remote trajectory command (e.g. Paragraphs [0014], [0050], [0053]; where the teleoperation mode includes a remote operator transmitting control instructions (e.g., behavior guidelines, directives, etc.) to the system (e.g., to a decision making block of the system) and controlling the vehicle based on the control instructions; and where the mission planning module (120) plans vehicle behavior based on instructions received from a remote operator and the mission planning module outputs a route plan (1201), which includes a sequence of driving behaviors (i.e. trajectories) required to navigate the vehicle), wherein: the remote trajectory command comprises trajectory instructions which comprise one or more trajectory plot points which are based on the one or more remote driving actions (e.g. Paragraphs [0014], [0050], [0053]; where the mission planning module (120) plans vehicle behavior based on instructions received from a remote operator and the mission planning module outputs a route plan (1201), which includes a sequence of driving behaviors (i.e. 
trajectories) required to navigate the vehicle), and ‘…’ a computing device, coupled to the vehicle (e.g. Paragraphs [0014], [0038], [0054]; where the vehicle comprises an onboard computing system, which receives inputs from the sensor system and executes instructions at an actuation subsystem to control the vehicle; and where computing modules can be stored locally at the vehicle), the computing device coupled to the vehicle comprising a processor and a memory, wherein the memory is configured to store programming instructions that (e.g. Paragraphs [0116]; where the system can be implemented as a machine configured to receive a computer-readable medium storing instructions that are executed by a controller), when executed by the processor, are configured to cause the processor to: generate the one or more driving actions based on the trajectory instructions (e.g. Paragraphs [0026], [0071]; where the system (100) includes a behavior planning module (130) which generates control commands, and the control module (150) receives control commands and controls the actuation subsystem (153)), wherein the one or more driving actions correlate to one or more actuator commands configured to cause the vehicle to perform the one or more driving actions (e.g. Paragraphs [0071], [0075]; where the control outputs can be used to control the actuation subsystem, where the task block can generate control instructions/signals for control of the actuation subsystem to directly control the control elements of the vehicle (e.g., throttle, steering)); and a switch configured to switch command of the vehicle between automatic trajectory control and remote station system control (e.g.
Paragraphs [0014], [0053], [0083]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode, where the teleoperation mode includes a remote operator transmitting control instructions to the vehicle system).

Tiwari fails to disclose that each trajectory plot point, of the one or more trajectory plot points, comprises position coordinates for the vehicle to be at a specific time.

However, Polansky, in a similar field of endeavor, teaches a method for providing an indication of a required time of arrival, where the system computes the movement of the aircraft in four dimensions (latitude, longitude, altitude, and time), where the system determines the speed at which the aircraft should travel in order to arrive at a predetermined location at a predetermined time (i.e. plots trajectories for arriving at positional coordinates at a specific time) (e.g. Paragraphs [0003], [0019], [0029], [0035]).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the Applicant’s invention, with a reasonable expectation for success, to modify the vehicle control system of Tiwari, with the feature of plotting trajectory points so that a vehicle arrives at a specific time in the system of Polansky, in order to control speed transitions in multi-segment route plans and achieve the desired arrival time (see at least Paragraphs [0004], [0006] of Polansky).

As per Claim 15, Tiwari discloses the features of a method for controlling a vehicle (e.g.
Paragraphs [0014], [0053]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode, where the teleoperation mode includes a remote operator transmitting control instructions to the vehicle system), comprising: generating one or more data points from one or more sensors coupled to a vehicle, wherein the one or more data points pertain to one or more of: an environment of the vehicle; and one or more system component measurements of the vehicle (e.g. Paragraphs [0053], [0082], [0090]-[0091]; where the system (100), when in the teleoperation mode, includes a remote operator transmitting control instructions to the vehicle system; and where operator inputs are received from the operator and can include a transmission from a teleoperator to actuate a portion of the actuation subsystem; and where the communication module (170) functions to communicatively couple the vehicle control system to a remote computing system and can receive transmissions sent to the vehicle and data to be transmitted away from the vehicle (e.g. sensor data)); one or more actuation controls configured to enable the vehicle to perform one or more driving actions (e.g. Paragraphs [0026], [0071]; where the system (100) includes a behavior planning module (130) which generates control commands, and the control module (150) receives control commands and controls the actuation subsystem (153)); switching, using a switch, command of the vehicle between automatic trajectory control and remote station system control (e.g.
Paragraphs [0014], [0053], [0083]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode, where the teleoperation mode includes a remote operator transmitting control instructions to the vehicle system); when command of the vehicle is switched to automatic trajectory control, performing, using a processor, the automatic trajectory control (e.g. Paragraphs [0014], [0053], [0100], [0114]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode; and where the teleoperator can transmit a directive to the vehicle to enter fully-autonomous (i.e. automatic trajectory control) mode after reaching a certain point); when command of the vehicle is switched to remote station system control, performing, using the processor, via a remote station system, the remote station system control (e.g. Paragraphs [0014], [0053], [0083]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode, where the teleoperation mode includes a remote operator transmitting control instructions to the vehicle system); using the remote station system: receiving the one or more data points generated by the one or more sensors (e.g. Paragraphs [0053], [0082]; where the system (100), when in the teleoperation mode, includes a remote operator transmitting control instructions to the vehicle system; and where operator inputs are received from the operator and can include a transmission from a teleoperator to actuate a portion of the actuation subsystem); and generating the one or more remote driving actions using the one or more remote actuation controls (e.g.
Paragraphs [0054], [0059]; where the behavior planning module (130) can be implemented on a local computing system (e.g., residing at the vehicle), and at least in part at a remote computing system (e.g., a remote operation system or teleoperation system communicatively linked to the vehicle); and where the operator inputs can include inputs by a teleoperator to control the actuation subsystem); and generating a remote trajectory command (e.g. Paragraphs [0014], [0050], [0053]; where the teleoperation mode includes a remote operator transmitting control instructions (e.g., behavior guidelines, directives, etc.) to the system (e.g., to a decision making block of the system) and controlling the vehicle based on the control instructions; and where the mission planning module (120) plans vehicle behavior based on instructions received from a remote operator and the mission planning module outputs a route plan (1201), which includes a sequence of driving behaviors (i.e. trajectories) required to navigate the vehicle), wherein: the remote trajectory command comprises trajectory instructions which comprise one or more trajectory plot points which are based on the one or more remote driving actions (e.g. Paragraphs [0014], [0050], [0053]; where the mission planning module (120) plans vehicle behavior based on instructions received from a remote operator and the mission planning module outputs a route plan (1201), which includes a sequence of driving behaviors (i.e. trajectories) required to navigate the vehicle), and ‘…’ receiving the remote trajectory command (e.g. Paragraphs [0014], [0053]; where the remote operator transmits control instructions to the vehicle; and where the behavior planning module (130) can receive the route generated by the mission planning module or from the remote operator); generating the one or more driving actions based on the trajectory instructions (e.g.
Paragraphs [0026], [0071]; where the system (100) includes a behavior planning module (130) which generates control commands, and the control module (150) receives control commands and controls the actuation subsystem (153)), wherein the one or more driving actions correlate to one or more actuator commands configured to cause the vehicle to perform the one or more driving actions (e.g. Paragraphs [0071], [0075]; where the control outputs can be used to control the actuation subsystem, where the task block can generate control instructions/signals for control of the actuation subsystem to directly control the control elements of the vehicle (e.g., throttle, steering)). Tiwari fails to disclose that each trajectory plot point, of the one or more trajectory plot points, comprises position coordinates for the vehicle to be at a specific time. However, Polansky, in a similar field of endeavor, teaches a method for providing an indication of a required time of arrival, where the system computes the movement of the aircraft in four dimensions (latitude, longitude, altitude, and time), where the system determines the speed at which the aircraft should travel in order to arrive at a predetermined location at a predetermined time (i.e. plots trajectories for arriving at positional coordinates at a specific time) (e.g. Paragraphs [0003], [0019], [0029], [0035]). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the Applicant’s invention, with a reasonable expectation of success, to modify the vehicle control system of Tiwari with the feature of plotting trajectory points so that a vehicle arrives at a specific time, as in the system of Polansky, in order to control speed transitions in multi-segment route plans and achieve the desired arrival time (see at least Paragraphs [0004], [0006] of Polansky).
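The limitation the examiner maps onto Polansky (each trajectory plot point carrying position coordinates for the vehicle to be at at a specific time, with speed derived so the vehicle arrives on schedule) can be sketched as follows. This is an illustrative sketch only, not the claimed or cited implementation; all names are hypothetical, and flat 2D coordinates stand in for Polansky's latitude/longitude/altitude.

```python
from dataclasses import dataclass
import math

@dataclass
class TrajectoryPlotPoint:
    """One plot point: position coordinates paired with a specific time."""
    x: float  # hypothetical easting, meters
    y: float  # hypothetical northing, meters
    t: float  # seconds since start of the route plan

def required_speed(current: TrajectoryPlotPoint,
                   target: TrajectoryPlotPoint) -> float:
    """Speed (m/s) needed to reach `target` exactly at its timestamp,
    in the manner of a required-time-of-arrival computation."""
    distance = math.hypot(target.x - current.x, target.y - current.y)
    dt = target.t - current.t
    if dt <= 0:
        raise ValueError("target time must be after current time")
    return distance / dt

# A vehicle 300 m short of the next plot point, with 20 s allotted,
# must hold 15 m/s to arrive at the coordinates at the specified time.
now = TrajectoryPlotPoint(0.0, 0.0, 0.0)
nxt = TrajectoryPlotPoint(300.0, 0.0, 20.0)
print(required_speed(now, nxt))  # 15.0
```

A sequence of such points is one plausible reading of "trajectory instructions" in the rejection: each point fixes where the vehicle should be and when, and speed falls out of consecutive points.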
As per Claim 2, Tiwari, in view of Polansky, teaches the features of Claim 1, and Tiwari further discloses the features of wherein the programming instructions, when executed by the processor of the computing device coupled to the vehicle, are further configured to cause the processor to switch control of the vehicle, via the switch, between the automatic trajectory control and the remote station system control (e.g. Paragraphs [0014], [0053], [0107], [0116]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode, where the planning authority can be transferred to an operator (e.g., local operator residing in the vehicle, teleoperator, etc.) in response to a button being activated by the local operator (e.g., receiving a directive from a local operator)). As per Claim 3, and similarly for Claims 12 and 16, Tiwari, in view of Polansky, teaches the features of Claims 1, 11, and 15, respectively, and Tiwari further discloses the features of wherein the programming instructions, when executed by the processor of the computing device coupled to the vehicle, are further configured to cause the processor to perform the automatic trajectory control (e.g. Paragraphs [0014], [0053], [0085], [0114]; where the system (100) functions to control a vehicle during operation and is operable between several operating modes including an autonomous, semi-autonomous, and a teleoperation mode; where the control module (150) receives information from the local planning module (140) to actuate a vehicle system, and the output is received at the vehicle to transition operation modes to an autonomous mode), wherein the performing the automatic trajectory control comprises: automatically generating an automatic trajectory command based on the one or more data points generated from the one or more sensors (e.g.
Paragraphs [0014], [0059], [0061], [0101]; where the autonomous operation mode includes an onboard computing system receiving inputs from the sensor subsystem, implementing a decision-making block at the computing system to select a task block based on the inputs, and executing instructions generated by the selected task block at an actuation subsystem to control the vehicle; and where the associated task is automatically performed by the system, and the task blocks can include speed, acceleration, and deceleration parameters, lane change and lane keeping functions), wherein: the automatic trajectory command comprises automatic trajectory instructions which comprise one or more automatic trajectory plot points (e.g. Paragraphs [0026], [0054], [0064]; where the system can include a trajectory generator (132); and can be implemented in a local computing system (e.g., residing at the vehicle) and at a remote computing system (e.g., a remote operation system or teleoperation system communicatively linked to the vehicle)), ‘…’ generating, based on the one or more automatic trajectory plot points, one or more driving actions (e.g. Paragraphs [0014], [0061], [0075]; where the autonomous operation mode includes an onboard computing system that executes instructions generated by the selected task block to control the vehicle (i.e. driving actions)), wherein the one or more driving actions correlate to one or more actuator commands (e.g. Paragraphs [0075], [0081]; where the control module (150) functions to directly control the elements of the vehicle based on commands received from the local planning module (140); and the control module (150) includes an actuation subsystem (153) for controlling speed, steering, transmission, and any other suitable control elements) configured to cause the vehicle to be positioned in accordance with the one or more automatic trajectory plot points (e.g.
Paragraphs [0075], [0081]; where the control module (150) functions to directly control the elements of the vehicle based on commands received from the local planning module (140); and the control module (150) includes an actuation subsystem (153) for controlling speed, steering, transmission, and any other suitable control elements); and causing the vehicle, via the one or more actuation controls, to perform the one or more driving actions in accordance with the automatic trajectory command (e.g. Paragraphs [0014], [0051]-[0052], [0061], [0075], [0099]; where the autonomous operation mode includes an onboard computing system that executes instructions generated by the selected task block to control the vehicle (i.e. driving actions), such as changing lanes, exiting a highway, braking, accelerating, etc., and the mission planning module generates, updates, and modifies the route plan for the vehicle to reach the destination). Tiwari fails to disclose that each automatic trajectory plot point, of the one or more automatic trajectory plot points, comprises position coordinates for the vehicle to be at a specific time. However, Polansky, in a similar field of endeavor, teaches a method for providing an indication of a required time of arrival, where the system computes the movement of the aircraft in four dimensions (latitude, longitude, altitude, and time), where the system determines the speed at which the aircraft should travel in order to arrive at a predetermined location at a predetermined time (i.e. plots trajectories for arriving at positional coordinates at a specific time) (e.g. Paragraphs [0003], [0019], [0029], [0035]).
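The operating-mode scheme the rejection attributes to Tiwari (a switch handing command between automatic trajectory control and remote station system control, with a teleoperator directive able to return the vehicle to fully-autonomous mode) might be dispatched along these lines. This is purely illustrative; the enum, class, and function names are invented, not drawn from Tiwari or the claims.

```python
from enum import Enum, auto

class ControlMode(Enum):
    AUTOMATIC_TRAJECTORY = auto()  # onboard planner generates the commands
    REMOTE_STATION = auto()        # teleoperator generates the commands

def plan_from_sensors(sensor_points):
    # Placeholder for an onboard behavior/trajectory planner.
    return {"source": "onboard", "points": list(sensor_points)}

class VehicleController:
    def __init__(self) -> None:
        self.mode = ControlMode.AUTOMATIC_TRAJECTORY

    def switch(self, mode: ControlMode) -> None:
        # Could be triggered by an in-cab button press or a
        # teleoperator directive, per the rejection's characterization.
        self.mode = mode

    def next_command(self, sensor_points, remote_command):
        if self.mode is ControlMode.AUTOMATIC_TRAJECTORY:
            return plan_from_sensors(sensor_points)  # onboard decision-making
        return remote_command  # pass the teleoperation command through

ctrl = VehicleController()
ctrl.switch(ControlMode.REMOTE_STATION)
print(ctrl.next_command([], {"source": "teleop"}))  # {'source': 'teleop'}
```

The design point being illustrated is that the switch selects the *source* of trajectory commands; the downstream actuation path stays the same in either mode.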
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the Applicant’s invention, with a reasonable expectation of success, to modify the vehicle control system of Tiwari with the feature of plotting trajectory points so that a vehicle arrives at a specific time, as in the system of Polansky, in order to control speed transitions in multi-segment route plans and achieve the desired arrival time (see at least Paragraphs [0004], [0006] of Polansky). As per Claim 4, Tiwari, in view of Polansky, teaches the features of Claim 1, and Tiwari further discloses the features of wherein: the programming instructions, when executed by the processor of the computing device coupled to the remote station, are further configured to: transmit the trajectory command to the vehicle, wherein the generating the one or more driving actions is based on the one or more trajectory plot points (e.g. Paragraphs [0014], [0091]; where the teleoperation mode includes a remote operator transmitting control instructions (e.g., behavior guidelines, directives, etc.) to the system (e.g., to a decision making block of the system) and controlling the vehicle based on the control instructions; and where the sensor data is transmitted to the remote teleoperation interface (171) and the system can receive driver input from the remote vehicle operator), the vehicle comprises a control module configured to receive the remote trajectory command (e.g. Paragraphs [0014], [0100]; where the teleoperation mode includes a remote operator transmitting control instructions (e.g., behavior guidelines, directives, etc.) to the system (e.g., to a decision making block of the system) and controlling the vehicle based on the control instructions; and where the vehicle receives a directive from the teleoperator with instructions for operation), and perform the remote station system control (e.g.
Paragraph [0014]; where the teleoperation mode includes a remote operator transmitting control instructions (e.g., behavior guidelines, directives, etc.) to the system (e.g., to a decision making block of the system) and controlling the vehicle based on the control instructions), wherein the performing the remote station system control comprises: receiving, via the control module, the remote trajectory command (e.g. Paragraph [0014]; where the teleoperation mode includes a remote operator transmitting control instructions (e.g., behavior guidelines, directives, etc.) to the system (e.g., to a decision making block of the system) and controlling the vehicle based on the control instructions); and causing the vehicle, via the one or more actuation controls, to perform the one or more driving actions in accordance with the remote trajectory command (e.g. Paragraph [0014]; where the teleoperation mode includes a remote operator transmitting control instructions (e.g., behavior guidelines, directives, etc.) to the system (e.g., to a decision making block of the system) and controlling the vehicle based on the control instructions). As per Claim 7, Tiwari, in view of Polansky, teaches the features of Claim 1, and Tiwari further discloses the features of wherein: the one or more sensors comprise: a Light Detection and Ranging (LiDAR) sensor (e.g. Paragraphs [0036]-[0037], [0039]; where mapping sensors of the sensor system gather image data and range data (e.g. LIDAR, radar, TOF, etc.), and can include radar, LIDAR, cameras, navigation sensors, etc.); and a camera (e.g. Paragraphs [0036], [0039]; where mapping sensors of the sensor system gather image data and range data (e.g. LIDAR, radar, TOF, etc.), and can include radar, LIDAR, cameras, navigation sensors, etc.), and the one or more data points comprise: a LiDAR point cloud generated by the LiDAR sensor; and an image captured by the camera (e.g.
Paragraphs [0036], [0113]; where the sensor subsystem gathers the sensor data, such as range data (e.g. LIDAR data, point cloud data); and where the sensor data includes image data collected from one or more cameras, as well as point cloud data from one or more rangefinding sensors of the vehicle). As per Claim 10, Tiwari, in view of Polansky, teaches the features of Claim 1, and Tiwari further discloses the features of wherein the one or more actuation controls comprise one or more of: a brake pedal; an acceleration pedal; a gear shift control; and a steering wheel (e.g. Paragraphs [0081]-[0082]; where the actuation subsystem (153) functions to actuate the control interfaces of the vehicle, including a throttle actuation interface, a brake actuation interface, a steering actuation assembly, and a pedal actuation mechanism). As per Claim 14, Tiwari, in view of Polansky, teaches the features of Claim 11, and Tiwari further discloses the features of wherein: the programming instructions, when executed by the processor of the computing device coupled to the remote station system, are further configured to cause the processor to: receive the one or more driving actions (e.g. Paragraphs [0059], [0071], [0075]; where the control outputs can be used to control the actuation subsystem, where the task block can generate control instructions/signals for control of the actuation subsystem to directly control the control elements of the vehicle (e.g., throttle, steering); and where the teleoperator receives a subset of data related to the actuation of the vehicle subsystem (i.e. receives driving action data from the operator)); and cause the vehicle, via the one or more actuation controls, to perform the one or more driving actions (e.g.
Paragraphs [0071], [0075]; where the control outputs can be used to control the actuation subsystem, where the task block can generate control instructions/signals for control of the actuation subsystem to directly control the control elements of the vehicle (e.g., throttle, steering)). Claims 5-6, 13, and 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over Tiwari, in view of Polansky, as applied to Claim 4 above, and further in view of U.S. Patent Publication No. 2019/0031202 A1, to Takeda (hereinafter referred to as Takeda; newly of record). As per Claim 5, Tiwari discloses the features of Claim 4, and Tiwari further discloses t
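The actuation mapping recited across Claims 10 and 14 (driving actions correlating to actuator commands for the throttle, brake, steering, and transmission interfaces of the actuation subsystem) can be sketched as a simple translation step. This is an assumption-laden illustration; the function, keys, and value ranges are all hypothetical, not taken from Tiwari or the application.

```python
def to_actuator_commands(action: dict) -> dict:
    """Translate one high-level driving action into per-interface
    actuator setpoints (throttle/brake in [0, 1], steering in degrees)."""
    commands = {"throttle": 0.0, "brake": 0.0, "steering_deg": 0.0, "gear": "D"}
    kind = action["kind"]
    if kind == "accelerate":
        commands["throttle"] = min(1.0, action.get("intensity", 0.5))
    elif kind == "brake":
        commands["brake"] = min(1.0, action.get("intensity", 0.5))
    elif kind == "steer":
        commands["steering_deg"] = action.get("angle_deg", 0.0)
    elif kind == "shift":
        commands["gear"] = action.get("gear", "D")
    else:
        raise ValueError(f"unknown driving action: {kind}")
    return commands

# A steer action maps onto the steering interface; the other
# interfaces are left at their neutral setpoints.
print(to_actuator_commands({"kind": "steer", "angle_deg": -12.5}))
```

The point of the sketch is that one driving action fans out to a full set of interface setpoints, which matches the rejection's reading that actions "correlate to" actuator commands rather than being the commands themselves.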

Prosecution Timeline

Aug 03, 2023: Application Filed
Feb 26, 2025: Non-Final Rejection — §103
Jul 03, 2025: Response Filed
Jul 14, 2025: Final Rejection — §103
Oct 22, 2025: Examiner Interview Summary
Oct 24, 2025: Request for Continued Examination
Nov 03, 2025: Response after Non-Final Action
Dec 01, 2025: Non-Final Rejection — §103
Mar 16, 2026: Response Filed
Apr 06, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601596: Estimation of Target Location and Sensor Misalignment Angles (2y 5m to grant; granted Apr 14, 2026)
Patent 12603005: DRIVER ASSISTANCE MODULE FOR A MOTOR VEHICLE (2y 5m to grant; granted Apr 14, 2026)
Patent 12594944: METHOD AND SYSTEM FOR VEHICLE DRIVE MODE SELECTION (2y 5m to grant; granted Apr 07, 2026)
Patent 12594960: NAVIGATIONAL CONSTRAINT CONTROL SYSTEM (2y 5m to grant; granted Apr 07, 2026)
Patent 12583382: SYNCHRONIZED LIGHTING FOR ELECTRIC VEHICLES (2y 5m to grant; granted Mar 24, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 33%
With Interview (+36.6%): 70%
Median Time to Grant: 3y 7m
PTA Risk: High
Based on 78 resolved cases by this examiner. Grant probability derived from career allow rate.
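The "with interview" projection appears to be the examiner's career allow rate plus the interview lift, taken as additive percentage points. That additive model is an assumption on my part, but it is consistent with the figures shown (26 granted of 78 resolved, +36.6 points, roughly 70%):

```python
# Reproduce the dashboard's projection figures under an assumed
# additive model: base allow rate + interview lift (percentage points).
career_allow_rate = 26 / 78   # 26 granted of 78 resolved ≈ 33.3%
interview_lift = 0.366        # +36.6 percentage points with an interview

with_interview = career_allow_rate + interview_lift
print(f"{career_allow_rate:.1%} base -> {with_interview:.1%} with interview")
# 33.3% base -> 69.9% with interview (dashboard rounds to 70%)
```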
