Prosecution Insights
Last updated: April 19, 2026
Application No. 17/892,571

SYSTEMS AND METHODS FOR CONTROLLING A VEHICLE BY TELEOPERATION BASED ON MAP CREATION

Status: Non-Final OA (§103)
Filed: Aug 22, 2022
Examiner: HUBER, MELANIE GRACE
Art Unit: 3668
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Kodiak Robotics Inc.
OA Round: 5 (Non-Final)

Grant Probability: 72% (Favorable)
Expected OA Rounds: 5-6
Time to Grant: 3y 1m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 72% (above average; 33 granted / 46 resolved; +19.7% vs TC avg)
Interview Lift: +29.6% (strong), among resolved cases with interview
Typical Timeline: 3y 1m avg prosecution (28 applications currently pending)
Career History: 74 total applications across all art units
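The headline figures above are simple ratios over the examiner's career data. A minimal sketch of how they can be reproduced (only the 33/46 counts and the reported rates come from this page; the per-interview case split below is a hypothetical placeholder, included just to show the lift formula):

```python
# Career allow rate: granted / resolved cases (counts from the report above).
granted, resolved = 33, 46
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.0%}")  # Career allow rate: 72%

# Interview lift: allow rate among resolved cases that had an examiner
# interview, minus the rate among those that did not. These counts are
# HYPOTHETICAL placeholders; only the formula itself is illustrated.
with_interview = {"granted": 9, "resolved": 10}
without_interview = {"granted": 24, "resolved": 36}
lift = (with_interview["granted"] / with_interview["resolved"]
        - without_interview["granted"] / without_interview["resolved"])
print(f"Interview lift: {lift:+.1%}")  # Interview lift: +23.3%
```

With the real per-interview split for this examiner, the same formula would yield the reported +29.6% lift.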

Statute-Specific Performance

§101: 10.1% (-29.9% vs TC avg)
§103: 55.6% (+15.6% vs TC avg)
§102: 22.5% (-17.5% vs TC avg)
§112: 10.3% (-29.7% vs TC avg)

Tech Center averages are estimates • Based on career data from 46 resolved cases

Office Action (§103)
DETAILED ACTION

Status of Claims

Claims 1-4, 7-8, 11-14, and 17-18 are currently pending and have been examined in this application. This action is NON-FINAL. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 10/27/2025 has been entered.

Response to Arguments

Applicant's arguments filed 10/27/2025 have been fully considered but they are not persuasive. Applicant argues:

(a) Regarding the 35 USC 103 rejection, “For the sake of argument and completeness, Lockwood's drive line 146 is revised by vehicle 102 based on its selected revised trajectory (which was generated by vehicle 102 based on the altered size of the overlaid shape 140). However, Lockwood's drive line 146 is not revised based on a teleoperation input as defined in claim 1, i.e., a teleoperation input is formed by combining and synthesizing inputs entered into the teleoperation system by a teleoperator via input devices facilitating mimicking direct control of a vehicle by a driver, wherein the inputs comprise a steering input to alter a steering angle of the vehicle, a throttle input to change a throttle position for the vehicle, and/or a brake input to change a brake position for the vehicle” (Remarks, pg. 8)

(b) Regarding the 35 USC 103 rejection, “On page 9 of the Office Action, the Examiner states that the combination of Lockwood and Pierfelice do not teach the teleoperation input of claim 1.
Kobayashi was cited as curing this deficiency of the Lockwood/Pierfelice combination. The Examiner cites paragraph [0047] of Kobayashi as teaching that a teleoperator may provide an input to the user interface via a steering wheel 351, a brake pedal 353, and/or an acceleration pedal 352. However, Applicant asserts that there is no motivation to modify Lockwood's solution by replacing its teleoperation input for expanding the size of the overlaid shape 140 to comprise Kobayashi's teleoperation input since such modification would unnecessarily add complexity and computational intensity to (as well as require an extensive redesign of) Lockwood's solution. Applicant asserts that a conclusion otherwise uses impermissible hindsight reasoning. Applicant believes that the Examiner is picking and choosing from the teachings regarding multiple disparate concepts taught by Lockwood and Kobayashi, and then selectively combining the teachings regarding the disparate concepts so as to form an impermissible hindsight reconstruction of Applicant's invention.1 Accordingly, it is respectfully submitted the Examiner's interpretations of the Lockwood and the Lockwood/Kobayashi combination are improper.” (Remarks, pg. 8-9) Examiner respectfully disagrees. Regarding point (a), Kobayashi teaches combining teleoperator input from a steering wheel, accelerator pedal, and brake pedal to determine a remote operation for the vehicle to take (Kobayashi, para. [0047] “Operation amount obtainment circuit 314 obtains the amount of operation applied to operation accepter 35 by a remote operator. For example, the steering angle of steering wheel 351, the position of accelerator pedal 352, and the position of brake pedal 353 are obtained. Operation amount correction circuit 315 corrects the obtained amount of operation with reference to vehicle characteristics table 321t. 
Control command generation circuit 316 generates a control command including the corrected amount of operation, and transmits the generated control command to autonomous driving control apparatus 10 via the network.”). Kobayashi goes into further detail about how both the inputs from the steering wheel and accelerator pedal are used to determine controls for an autonomous vehicle (Kobayashi, para. [0059] “This curve radius is the curve radius of autonomous vehicle 1 to be remotely operated, and is calculated on the basis of the steering angle of steering wheel 351 of remote-operation apparatus 30, the value of the wheelbase of the subject vehicle, and the tire angle/steering angle coefficient of the subject vehicle. Note that assistance line al may be dynamically changed in length according to the vehicle speed or the accelerator position obtained at the time of displaying.”). In other words, Kobayashi teaches combining the teleoperator steering, braking, and acceleration inputs from controls that mimic direct control of a vehicle from a steering wheel, brake pedal, and accelerator pedal in order to generate an assistance line and determine commands for the movement of the vehicle based on the inputs. Kobayashi teaches forming a teleoperation input by combining and synthesizing two or more of the steering input, throttle input, and the brake input. Regarding point (b), Lockwood and Kobayashi are both in the same field of using inputs from a teleoperator to alter the trajectory of an autonomous vehicle and altering the mode of input from selection on a computer to controls that mimic manual driving controls would not change Lockwood’s function of using teleoperator input to control the autonomous vehicle. 
It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify the combination of Lockwood and Pierfelice with Kobayashi to include a steering wheel, accelerator pedal, and brake pedal that mimic driver input and control the vehicle in order to provide a technique for enabling remote operation without considering a difference between different autonomous vehicles (Kobayashi para. [0019]).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-4, 7-8, 11-14, and 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over Lockwood et al. (US 20190011912 A1), in view of Pierfelice (US 20140005925 A1), and in further view of Kobayashi et al. (US 20190317491 A1).
Regarding claim 1, Lockwood teaches: A method of controlling an autonomous vehicle by a teleoperation system, comprising: receiving, by the teleoperation system, sensor data from one or more sensors on the autonomous vehicle through a communication link, wherein the sensor data comprises environmental data associated with a physical environment of the autonomous vehicle and operation data comprising an existing trajectory of the autonomous vehicle; (Lockwood – [0070] “For example, a vehicle 102 may send communication signals via the network interface 234, which are received by the teleoperations receiver 304. In some examples, the communication signals may include, for example, sensor data from sensor signals generated by one or more sensors associated with the vehicle 102, and/or road network data from a road network data store. In some examples, the sensor data may include raw sensor data or processed sensor data, and the road network data may include data related to a global or local map of an area associated with operation of the vehicle 102. In some examples, the communication signals may include data associated with the current status of the vehicle 102 and its systems, such as, for example, its current position, current speed, current path and/or trajectory, current occupancy, the level of charge of one or more of its batteries, and/or the operational status of its sensors and drive systems.”) transforming, by the teleoperation system, the sensor data into visualization data configured to represent the physical environment of the autonomous vehicle on a visualization interface; (Lockwood – [0047] “The teleoperator interface 154 may include one or more displays 156 configured to provide the teleoperator 150 with data related to operation of the vehicle 102, a subset of the fleet of vehicles, and/or the fleet of vehicles. 
For example, the display(s) 156 may be configured to show data related to sensor signals received from the vehicles 102, data related to the road network 104, and/or additional data or information to facilitate providing assistance to the vehicles 102.”) performing user-software interactions using input devices configured to facilitate (Lockwood – [0110] “In some examples, the teleoperations system 148 may send teleoperations signals via the teleoperations transmitter 306, to the vehicles 102A-102C to provide guidance to the respective vehicle controllers 228 of the vehicles 102A-102C to switch from the first operating mode to the second operating mode while operating in the respective second geographic areas. In some examples, the second operating parameters may include one or more of altered performance parameters (e.g., speed, acceleration, braking rates, and steering input rates), altered vehicle operation policies (e.g., safety-related guidelines for controlling the vehicle), altered vehicle operation laws, or vehicle operation regulations.”) generating, at the teleoperation system, a map of an environment based on (i) the teleoperation input, (ii) a current position and a current orientation of the autonomous vehicle, and (iii) the existing trajectory of the autonomous vehicle, wherein the map provides a new path for the autonomous vehicle to traverse for avoiding an event or road condition; and (Lockwood – [0070] “In some examples, the sensor data may include raw sensor data or processed sensor data, and the road network data may include data related to a global or local map of an area associated with operation of the vehicle 102. 
In some examples, the communication signals may include data associated with the current status of the vehicle 102 and its systems, such as, for example, its current position, current speed, current path and/or trajectory, current occupancy, the level of charge of one or more of its batteries, and/or the operational status of its sensors and drive systems.” [0083] “Based at least in part on the communications signals, the active view zone 522 provides a real-time perspective view of the vehicle 102 and the relevant environment. In some examples, the active view zone 522 may display any permutation of sensor data, operation state data, and/or teleoperations data.” Examiner note: The data displayed in the active view zone 522, as shown in fig. 5B corresponds to the map.) transmitting, from the teleoperation system to the autonomous vehicle the map data, for use by a planner to determine a modified trajectory or a new trajectory for the autonomous vehicle by incorporating the map into the existing trajectory of the autonomous vehicle; (Lockwood – [0092] “Based on the teleoperator's 150 inputs, the teleoperations system 148 may transmit teleoperations signals to the vehicle 102 via the teleoperations transmitter 306 (FIG. 3). In the example shown, the vehicle 102 may expand the boundaries 140 of its driving corridor 138 in a manner consistent with the teleoperations signals, for example, as shown in FIG. 5B… Based at least in part on the selected revised trajectory, the vehicle 102 may determine a revised drive line 146 for use in maneuvering around the object 524. Thereafter, the vehicle controller 228 (FIG. 2) may be configured to operate the vehicle 102 according to the revised drive line 146, for example, as shown in FIG. 5B, and maneuver around the object 524.” Examiner note: The data displayed in the active view zone 522, as shown in fig. 
5B corresponds to the map, and updates when the teleoperator modifies the driving corridor 138, which corresponds to the modified trajectory.) wherein the teleoperation input further comprises a modification to a classification of an object or obstacle in an environment of the autonomous vehicle or guidance to cause the autonomous vehicle to ignore or avoid the object or obstacle, (Lockwood – [0032] “In some examples, the teleoperations signals may provide guidance including one or more of causing the driverless vehicle to at least one of ignore the event, increase or decrease probabilities of classes of objects (e.g., in a school zone, increase a probability that a small object is a child), alter virtual boundaries of a driving corridor within which the vehicle operates, and operate the driverless vehicle according to a travel speed constraint (e.g., reducing a maximum travel speed).”) wherein the map comprises a speed limit determined (Lockwood – [0035] “The second operating parameters may include one or more of reducing energy expenditure of the driverless vehicles, setting a maximum operating speed, preventing the driverless vehicles from operating bidirectionally, changing a threshold confidence level required for autonomous operation, changing a threshold confidence level required for autonomous operation in a second geographic area, altering at least one of an object classification model or an object prediction model used by the driverless vehicles, or relaxing vehicle operation policies associated with complying with traffic laws and regulations.”) Lockwood does not explicitly teach the following limitation, however, Pierfelice teaches: wherein the (Pierfelice – [0032] “The known speed limits can be associated with the map data and stored in one of the memory devices. 
In alternative embodiments, the average speed limit can be determined by averaging the current speed of other vehicles traveling along the route 122, which can be detected by speed sensing systems and provided in real time via the network interface hardware 118 (FIG. 1).”) Lockwood and Pierfelice are both considered to be analogous to the claimed invention because they are both in the same field of providing route guidance to a vehicle. It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify Lockwood to include calculating speed limit based on throttle input as taught by Pierfelice in order to determine an optimal route based on factors such as traffic, road conditions, distance, and speed limits (Pierfelice para. [0030]). The combination of Lockwood and Pierfelice does not explicitly teach the following, however, Kobayashi teaches: performing user-software interactions using input devices configured to facilitate mimicking direct control of a vehicle by a driver, wherein the user-software interactions enter inputs into the teleoperation system that comprise a steering input to alter a steering angle of the vehicle, a throttle input to change a throttle position for the vehicle, and/or a brake input to change a brake position for the vehicle; (Kobayashi – [0047] “Operation amount obtainment circuit 314 obtains the amount of operation applied to operation accepter 35 by a remote operator. For example, the steering angle of steering wheel 351, the position of accelerator pedal 352, and the position of brake pedal 353 are obtained. Operation amount correction circuit 315 corrects the obtained amount of operation with reference to vehicle characteristics table 321t. 
Control command generation circuit 316 generates a control command including the corrected amount of operation, and transmits the generated control command to autonomous driving control apparatus 10 via the network.”) forming, at the teleoperation system, a teleoperation input by combining and synthesizing two or more of the steering input, the throttle input and the brake input; (Kobayashi – [0059] “FIG. 9A and FIG. 9B each illustrate an example of the image for remote operation with the operation assistance line superimposed thereon. Image 34a illustrated in FIG. 9A is obtained by superimposing, in a central area at the lower end of the screen, assistance line al modeling the curve radius obtained at the time of displaying. This curve radius is the curve radius of autonomous vehicle 1 to be remotely operated, and is calculated on the basis of the steering angle of steering wheel 351 of remote-operation apparatus 30, the value of the wheelbase of the subject vehicle, and the tire angle/steering angle coefficient of the subject vehicle. Note that assistance line al may be dynamically changed in length according to the vehicle speed or the accelerator position obtained at the time of displaying.”) Kobayashi is considered to be analogous to the claimed invention because it is in the same field of remotely controlling an autonomous vehicle. It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify the combination of Lockwood and Pierfelice with Kobayashi to include a steering wheel, accelerator pedal, and brake pedal that mimic driver input and control the vehicle in order to provide a technique for enabling remote operation without considering a difference between different autonomous vehicles (Kobayashi para. [0019]).

Regarding claim 2, The combination of Lockwood, Pierfelice, and Kobayashi teaches the limitations of claim 1.
Lockwood further teaches: further comprising determining, by the teleoperation system or by the autonomous vehicle, an event or a condition associated with at least a portion of the existing trajectory. (Lockwood – [0030] “For example, an event may include one or more of an activity associated with a portion of the path, an object along the path at least partially within the driving corridor as the vehicle approaches the object (e.g., people, animals, vehicles, or other static or dynamic objects) along the path at least partially within the driving corridor or moving with a trajectory toward the driving corridor as the vehicle approaches the object.” [0089] “FIG. 5A shows an example vehicle 102 in a first example event scenario in which the example static object 524 is in the road 106. In some examples, as the vehicle 102 approaches the object 524, the sensors 204 (FIG. 2) associated with the vehicle 102 may detect the object 524. Once detected, one or more of the planner 214, the object data calculator 216, the object classifier 218, the collision predictor system 220, and the kinematics calculator 222 (FIG. 2) may be used to determine the location of the object 524, classify the object 524, determine whether the object 524 is static or dynamic, and if the object is dynamic, predict a possible trajectory of the object 524.”)

Regarding claim 3, The combination of Lockwood, Pierfelice, and Kobayashi teaches the limitations of claim 2. Lockwood further teaches: wherein the entering the teleoperation input is responsive to the event or condition associated with the at least the portion of the existing trajectory. (Lockwood – [0026] “In some examples, the teleoperations system may provide guidance and information to a driverless vehicle when the driverless vehicle encounters an event, so that the driverless vehicle will be able to avoid, maneuver around, and/or pass through the area associated with the event.
The driverless vehicle may be configured to send communication signals to the remotely located teleoperations system, and based at least in part on the communication signals, the teleoperations system may provide the driverless vehicle with guidance, including instructions, proposed actions or maneuvers for the evaluation and/or execution by the driverless vehicle, and/or information to assist the driverless vehicle past the area associated with the event.”) Regarding claim 4, The combination of Lockwood, Pierfelice, and Kobayashi teaches the limitations of claim 2. Lockwood further teaches: wherein the event or condition comprises a road construction, a weather condition, a stop sign, or a school zone. (Lockwood – [0111] “Although the example events described with respect to FIG. 10 include accident, school, and construction zones, other geographic location-related zones are contemplated. For example, other events may be associated with flood zones, parade zones, special event zones, and/or zones associated with slow traffic, such as areas where vehicles are being driven into bright sunlight or areas where weather conditions such as rain or snow are affecting traffic rates.”) Regarding claim 7, The combination of Lockwood, Pierfelice, and Kobayashi teaches the limitations of claim 1. Lockwood further teaches: wherein the teleoperation input is based on real time information of the autonomous vehicle. (Lockwood – [0083] “For example, as shown in FIG. 5A, the view selector icon 518B has been selected, and the UI 500A includes an active view zone 522 providing a real-time simulated (or animated) perspective view of the vehicle 102 selected via the selector 504A. In the example shown, the active view zone 522 shows an animation depicting the vehicle 102 encountering an object 524 in the road 106. 
The teleoperator 150 may use the active view zone 522 to monitor the operation of, and the teleoperator's interaction with, the selected vehicle 102 (i.e., AV 001 in this example) before, during, and/or after the teleoperator 150 interacts with the vehicle 102 via the teleoperator interface 154. For example, the vehicle 102 may send communications signals to the teleoperator system 148 including sensor signals from one or more sensors associated with the vehicle 102 and/or a request for guidance and/or information from the teleoperations system 148. Based at least in part on the communications signals, the active view zone 522 provides a real-time perspective view of the vehicle 102 and the relevant environment. In some examples, the active view zone 522 may display any permutation of sensor data, operation state data, and/or teleoperations data.”) Regarding claim 8, The combination of Lockwood, Pierfelice, and Kobayashi teaches the limitations of claim 7. Lockwood further teaches: further comprising presenting the visualization data that comprises the real time information of the autonomous vehicle on a display to enable the teleoperator to determine whether to intervene in operation of a planner of the autonomous vehicle. (Lockwood – [0085] “The example UI 500A shown in FIG. 5A also includes a video view zone 528. In some examples, the video view zone 528 may provide a real-time video view from a video camera associated with the vehicle 102. In some examples, any data discussed herein as being “real-time” may additionally or alternatively include real-time data and/or historical data. This may assist the teleoperator 150 with quickly understanding the situation encountered by the vehicle 102.” [0089] “Once detected, one or more of the planner 214, the object data calculator 216, the object classifier 218, the collision predictor system 220, and the kinematics calculator 222 (FIG. 
2) may be used to determine the location of the object 524, classify the object 524, determine whether the object 524 is static or dynamic, and if the object is dynamic, predict a possible trajectory of the object 524. As the vehicle 102 approaches the object 524, one or more of these systems may be used to calculate a confidence level associated with a probability that the vehicle 102 will be able to successfully maneuver past the object 524, for example, without assistance from the teleoperations system 148. As the confidence level drops below a threshold minimum confidence level, the vehicle 102 may slow its speed or stop, and use its network interface 234 (FIG. 2) to send communication signals to the teleoperations system 148 providing sensor data and a request for guidance from the teleoperations system 148.”)

Regarding claim 11, Claim 11 recites a system comprising substantially the same limitation as claim 1 above; therefore, it is rejected for the same reasons.

Regarding claim 12, Claim 12 recites a system comprising substantially the same limitation as claim 2 above; therefore, it is rejected for the same reasons.

Regarding claim 13, Claim 13 recites a system comprising substantially the same limitation as claim 3 above; therefore, it is rejected for the same reasons.

Regarding claim 14, Claim 14 recites a system comprising substantially the same limitation as claim 4 above; therefore, it is rejected for the same reasons.

Regarding claim 17, Claim 17 recites a system comprising substantially the same limitation as claim 7 above; therefore, it is rejected for the same reasons.

Regarding claim 18, Claim 18 recites a system comprising substantially the same limitation as claim 8 above; therefore, it is rejected for the same reasons.

Conclusion

The prior art made of record and not relied upon, which is considered pertinent to applicant's disclosure or directed to the state of the art, is listed on the enclosed PTO-892.
The following is a brief description of relevant prior art that was cited but not applied:

Maeda et al. (US 20200341470 A1) discloses that remote control information transmitted to the host vehicle may be configured to be generated as an operator operates the steering wheel, the accelerator pedal, and the brake pedal during display of a front image acquired from the in-vehicle sensor on a display device. The display device can acquire a state ahead of the host vehicle, in particular, a stationary obstacle and a road situation ahead of the host vehicle as an image, and the operator operates the steering wheel, the accelerator, and the brake at his/her own determination while viewing the image, and remotely controls the host vehicle. The display device illustrates a remote operation guide that is a future course of the host vehicle based on a current turning angle of the steering wheel.

Biehler et al. (US 20200004240 A1) discloses that a remote control may, for example, detect actuation of a steering wheel of the operating element (in particular by an operator of the remote control) of the remote control, and transmit corresponding steering data to the driver assistance system of the vehicle. The driver assistance system may then, for example, control a steering system of the vehicle on the basis of the steering data received from the remote control, in such a way that the vehicle is steered in accordance with actuation of the steering wheel of the remote control. The same is true in particular of actuation of a gas pedal, a clutch pedal and a brake pedal.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MELANIE HUBER whose telephone number is (703) 756-1765. The examiner can normally be reached M-F 7:30am-4pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, JAMES LEE, can be reached at (571) 270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/M.G.H./
Examiner, Art Unit 3668

/JAMES J LEE/
Supervisory Patent Examiner, Art Unit 3668

Prosecution Timeline

Aug 22, 2022
Application Filed
Jul 17, 2024
Non-Final Rejection — §103
Dec 12, 2024
Response Filed
Jan 10, 2025
Final Rejection — §103
Apr 03, 2025
Request for Continued Examination
Apr 08, 2025
Response after Non-Final Action
Apr 24, 2025
Non-Final Rejection — §103
Jul 03, 2025
Response Filed
Aug 22, 2025
Final Rejection — §103
Oct 27, 2025
Response after Non-Final Action
Dec 10, 2025
Request for Continued Examination
Dec 20, 2025
Response after Non-Final Action
Feb 05, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594856: VEHICLE AND METHOD FOR CONTROLLING POWER SUPPLY TO EXTERNAL LOAD BASED ON SOC
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12576856: Inferring Operator Characteristics from Device Motion Data
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12570158: METHOD FOR SHIFT USING SHIFT ENTRY PREDICTION AND VEHICLE THEREFOR
Granted Mar 10, 2026 (2y 5m to grant)

Patent 12558957: METHOD FOR OPERATING A DISPLAY UNIT OF A VEHICLE, AND DISPLAY UNIT
Granted Feb 24, 2026 (2y 5m to grant)

Patent 12553741: SYSTEM AND METHOD TO ADDRESS LOCALIZATION ERRORS
Granted Feb 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 72%
With Interview: 99% (+29.6%)
Median Time to Grant: 3y 1m
PTA Risk: High

Based on 46 resolved cases by this examiner. Grant probability derived from career allow rate.
