Prosecution Insights
Last updated: April 19, 2026
Application No. 18/633,038

SYSTEMS AND METHODS FOR COMPARING DRIVING PERFORMANCE FOR SIMULATED DRIVING

Status: Non-Final OA (§103, Double Patenting)
Filed: Apr 11, 2024
Examiner: YIP, JACK
Art Unit: 3715
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: Smartdrive Systems Inc.
OA Round: 3 (Non-Final)

Grant Probability: 33% (At Risk)
Estimated OA Rounds: 3-4
Estimated Time to Grant: 4y 1m
Grant Probability with Interview: 70%

Examiner Intelligence

Career Allow Rate: 33% (229 granted / 702 resolved; -37.4% vs TC avg)
Interview Lift: +37.6% (resolved cases with interview vs. without)
Typical Timeline: 4y 1m average prosecution; 51 applications currently pending
Career History: 753 total applications across all art units
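As a quick arithmetic check on the examiner card above: the 33% career allow rate is simply 229 granted out of 702 resolved, and the 70% "with Interview" figure is consistent with adding the reported +37.6 percentage-point interview lift to that base rate. A minimal sketch (the function names are illustrative, not from any dashboard API):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def with_interview(base_rate: float, lift_points: float) -> float:
    """Grant probability after applying the interview lift (percentage points)."""
    return base_rate + lift_points

base = allow_rate(229, 702)            # ~32.6%, displayed as 33% on the card
boosted = with_interview(base, 37.6)   # ~70.2%, matching the 70% "with Interview" figure
print(round(base), round(boosted))     # prints: 33 70
```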

Statute-Specific Performance

§101: 22.8% allow rate (-17.2% vs TC avg)
§103: 42.4% allow rate (+2.4% vs TC avg)
§102: 15.0% allow rate (-25.0% vs TC avg)
§112: 12.4% allow rate (-27.6% vs TC avg)

Note: the black line in the original chart marks the Tech Center average estimate. Based on career data from 702 resolved cases.
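The per-statute deltas above are internally consistent: assuming each "vs TC avg" figure is a percentage-point difference, every row implies the same Tech Center average baseline (the black line in the original chart). A small consistency check, using only the values printed in the table:

```python
# Allow rates and deltas as printed in the Statute-Specific Performance table.
examiner = {"101": 22.8, "103": 42.4, "102": 15.0, "112": 12.4}
delta    = {"101": -17.2, "103": +2.4, "102": -25.0, "112": -27.6}

# Implied Tech Center average per statute: examiner rate minus its delta.
tc_avg = {s: round(examiner[s] - delta[s], 1) for s in examiner}
print(tc_avg)  # every statute implies the same 40.0% TC-average baseline
```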

Office Action

Rejection grounds: §103, Double Patenting (DP)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/20/2025 has been entered. Claims 1-20 are pending.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 5-7, 9-12, 15-17 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Kentley et al. (US 2017/0123422 A1) in view of Walther et al. (US 10,599,546 B1).

Re claims 1, 11: Kentley teaches 1. A system configured to determine driving performance by a vehicle operator for simulated driving of a simulated vehicle in a simulation engine, wherein the simulated driving is based on real-world operation of individual real-world vehicles (Kentley, Abstract, “one or more probabilistic models associated with the one or more courses of action may also be determined”; [0081], “Simulator 740 is configured to simulate operation of one or more autonomous vehicles”), the system comprising: electronic storage configured to electronically store information (Kentley, [0053], “storage medium”); and one or more processors configured via machine-readable instructions (Kentley, [0134]) to: obtain output signals from at least two different sensors carried by the individual real-world vehicles during the real-world operation of the individual real-world vehicles (Kentley, fig.
3A, “SENSOR(S)”); determine dynamic vehicle parameters of the individual vehicles from the real-world vehicles, wherein the dynamic vehicle parameters are determined based on the output signals, wherein the dynamic vehicle parameters are determined multiple times in an ongoing manner during the real-world operation of the individual real-world vehicles, wherein the dynamic vehicle parameters include vehicle speed and direction of travel of multiple ones of the individual real-world vehicles (Kentley, [0067], “Perception engine 366 may be configured to determine locations of external objects based on sensor data and other data … Perception engine 366 may be able to detect and classify external objects as pedestrians, bicyclists, dogs, other vehicles, etc. … Examples of external objects likely to be labeled as dynamic include bicyclists, pedestrians, animals, other vehicles, etc. If the external object is labeled as dynamic, and further data about the external object may indicate a typical level of activity and velocity …”; fig. 10; [0086], “a number of trajectories generated by a planner Also displayed are other vehicles 1011 and dynamic objects 1013, such as pedestrians, that may cause sufficient confusion at the planner, thereby requiring teleoperation support. User interface 1010 also presents to teleoperator 1008 a current velocity 1022, a speed limit 1024”), and wherein the individual real-world vehicles include a first vehicle (Kentley, [0065], “other position-sensing data (e.g., wheel-related data, such as steering angles, angular velocity, etc.)”; fig. 
3E, “Direction of Travel”; [0066], “data describing a local pose may include one or more of an x-coordinate, a y-coordinate, a z-coordinate (or any coordinate of any coordinate system, including polar or cylindrical coordinate systems, or the like), a yaw value, a roll value, a pitch value (e.g., an angle value), a rate (e.g., velocity), altitude, and the like”; [0072], “an actual environment in which the autonomous vehicle is positioned”; [0124], “captured over a duration of time”; [0168], “continuously captured data from the sensors”); detect vehicle events that have occurred in the real world at particular times during the real-world operation of the individual real-world vehicles, wherein detection of the vehicle events is based on the dynamic vehicle parameters, wherein the vehicle events include a first vehicle event that has occurred around a particular time during operation of the first vehicle (Kentley, [0065]; fig. 3E; [0066]; [0072], “an actual environment in which the autonomous vehicle is positioned”; [0124], “captured over a duration of time”; [0168], “continuously captured data from the sensors”; [0091], “This determination may optionally take into consideration other factors, including the time of day”); obtain, from the electronic storage, a set of vehicle event scenarios that correspond to the vehicle events that have been detected, wherein individual vehicle events are associated with physical surroundings of the individual real-world vehicles around the particular times the individual vehicle events occurred (Kentley, [0118], “creating a more realistic simulation of actual dynamic environments that exist in the real world.”; [0122]; [0061], ” An event may be a condition or situation affecting operation, or potentially affecting operation, of an autonomous vehicle… An event may include traffic conditions or congestion, as well as unexpected or unusual numbers or types of external objects (or tracks) that are perceived by a perception engine”; 
[0073], “generating trajectories by identifying objects of interest in a surrounding environment in which autonomous vehicle 430 is transiting”); create a set of simulation scenarios that are suitable for use by the simulation engine (Kentley, [0118], “a simulator configured to simulate an autonomous vehicle in a synthetic environment… a simulator 2840 that is configured to generate a simulated environment 2803. As shown, simulator 2840 is configured to use reference data … to generate simulated geometries… Simulated surfaces 2892a and 2892b may simulate walls … ”; [0061], ” An event may be a condition or situation affecting operation, or potentially affecting operation, of an autonomous vehicle… An event may include traffic conditions or congestion, as well as unexpected or unusual numbers or types of external objects ( or tracks) that are perceived by a perception engine”; [0073], “generating trajectories by identifying objects of interest in a surrounding environment in which autonomous vehicle 430 is transiting”); establish a communication link between the vehicle operator and the simulation engine, wherein the vehicle operator is an autonomous driving algorithm that controls operations of the simulated vehicle autonomously through the communication link (Kentley, [0122], “Simulator 2840 also includes a simulator controller 2856 configured to control the simulation to adapt the functionalities of any synthetically-generated element of simulated environment … simulator evaluator 2858 may analyze simulated vehicle commands 2880 (e.g., simulated steering angles and simulated velocities) … simulator 2840 may be used to explore the space of applicable controls and resulting trajectories so as to effect learning by self-simulation”); run the set of simulation scenarios in the simulation engine, wherein the simulated vehicle interacts with the vehicle operator and is operated by the vehicle operator based on input received from the autonomous driving algorithm 
(Kentley, [0118], “a simulator configured to simulate an autonomous vehicle in a synthetic environment… a simulator 2840 that is configured to generate a simulated environment 2803. As shown, simulator 2840 is configured to use reference data … to generate simulated geometries… Simulated surfaces 2892a and 2892b may simulate walls … ”; [0061], ” An event may be a condition or situation affecting operation, or potentially affecting operation, of an autonomous vehicle… An event may include traffic conditions or congestion, as well as unexpected or unusual numbers or types of external objects ( or tracks) that are perceived by a perception engine”; [0073], “generating trajectories by identifying objects of interest in a surrounding environment in which autonomous vehicle 430 is transiting”; [0122] – [0123]); detect, by the simulation engine during the running of the set of simulation scenarios, one or more simulated vehicle events that have occurred to the simulated vehicle as the autonomous driving algorithm controlled the operations of the simulated vehicle (Kentley, [0118]; [0122] – [0123]); determine one or more metrics that quantify a performance of the vehicle operator in running the set of simulation scenarios, wherein the determination of the one or more metrics is based on the one or more simulated vehicle events as detected (Kentley, Abstract, “confidence levels may also be determined to form a subset of the one or more courses of action”; [0062], “data representing a subset of candidate trajectories may be received from an autonomous vehicle responsive to the detection of the event”; [0149], “generate an operational efficiency metric based on the AV systems 3602 that have been deployed as part of the AV service 3660. 
The operational efficiency metric may be computed based on a number of factors”; [0096], “Simulator 1440 can access map data and other data necessary for performing a simulation on the set of candidate trajectories, whereby simulator 1440 need not exhaustively reiterate simulations to confirm sufficiency”); and generate a report based on the one or more metrics and transfer the report to one or more users (Kentley, [0062], “responsive to input from a teleoperator (e.g., a teleoperator may select at least one candidate trajectory as a guided trajectory from a group of differently-ranked candidate trajectories)”; a teleoperator (a user) receives a report related to a group of candidate trajectories).

11. A method for determining driving performance by a vehicle operator for simulated driving of a simulated vehicle in a simulation engine, wherein the simulated driving is based on real-world operation of individual real-world vehicles (Kentley, Abstract, “one or more probabilistic models associated with the one or more courses of action may also be determined”; [0081], “Simulator 740 is configured to simulate operation of one or more autonomous vehicles”), the method comprising: obtaining output signals from at least two different sensors carried by the individual real-world vehicles during the real-world operation of the individual real-world vehicles (Kentley, fig.
3A, “SENSOR(S)”); determining dynamic vehicle parameters of the individual vehicles, wherein the dynamic vehicle parameters are determined based on the output signals, wherein the dynamic vehicle parameters are determined multiple times in an ongoing manner during the real-world operation of the individual real-world vehicles, wherein the dynamic vehicle parameters include vehicle speed and direction of travel of multiple ones of the individual real-world vehicles (Kentley, [0067], “Perception engine 366 may be configured to determine locations of external objects based on sensor data and other data … Perception engine 366 may be able to detect and classify external objects as pedestrians, bicyclists, dogs, other vehicles, etc. … Examples of external objects likely to be labeled as dynamic include bicyclists, pedestrians, animals, other vehicles, etc. If the external object is labeled as dynamic, and further data about the external object may indicate a typical level of activity and velocity …”; fig. 10; [0086], “a number of trajectories generated by a planner Also displayed are other vehicles 1011 and dynamic objects 1013, such as pedestrians, that may cause sufficient confusion at the planner, thereby requiring teleoperation support. User interface 1010 also presents to teleoperator 1008 a current velocity 1022, a speed limit 1024”), and wherein the individual real-world vehicles include a first vehicle (Kentley, [0065], “other position-sensing data (e.g., wheel-related data, such as steering angles, angular velocity, etc.)”; fig. 
3E, “Direction of Travel”; [0066], “data describing a local pose may include one or more of an x-coordinate, a y-coordinate, a z-coordinate (or any coordinate of any coordinate system, including polar or cylindrical coordinate systems, or the like), a yaw value, a roll value, a pitch value (e.g., an angle value), a rate (e.g., velocity), altitude, and the like”; [0072], “an actual environment in which the autonomous vehicle is positioned”; [0124], “captured over a duration of time”; [0168], “continuously captured data from the sensors”); detecting vehicle events that have occurred in the real world at particular times during the real-world operation of the individual real-world vehicles, wherein detection of the vehicle events is based on the dynamic vehicle parameters, wherein the vehicle events include a first vehicle event that has occurred around a particular time during operation of the first vehicle (Kentley, [0065]; fig. 3E; [0066]; [0072], “an actual environment in which the autonomous vehicle is positioned”; [0124], “captured over a duration of time”; [0168], “continuously captured data from the sensors”; [0091], “This determination may optionally take into consideration other factors, including the time of day”); obtaining a set of vehicle event scenarios that correspond to the vehicle events that have been detected, wherein individual vehicle events are associated with physical surroundings of the individual real-world vehicles around the particular times the individual vehicle events occurred (Kentley, [0118], “creating a more realistic simulation of actual dynamic environments that exist in the real world.”; [0122]; [0061], ” An event may be a condition or situation affecting operation, or potentially affecting operation, of an autonomous vehicle… An event may include traffic conditions or congestion, as well as unexpected or unusual numbers or types of external objects (or tracks) that are perceived by a perception engine”; [0073], “generating 
trajectories by identifying objects of interest in a surrounding environment in which autonomous vehicle 430 is transiting”); creating a set of simulation scenarios that are suitable for use by the simulation engine (Kentley, [0118], “a simulator configured to simulate an autonomous vehicle in a synthetic environment… a simulator 2840 that is configured to generate a simulated environment 2803. As shown, simulator 2840 is configured to use reference data … to generate simulated geometries… Simulated surfaces 2892a and 2892b may simulate walls … ”; [0061], ” An event may be a condition or situation affecting operation, or potentially affecting operation, of an autonomous vehicle… An event may include traffic conditions or congestion, as well as unexpected or unusual numbers or types of external objects ( or tracks) that are perceived by a perception engine”; [0073], “generating trajectories by identifying objects of interest in a surrounding environment in which autonomous vehicle 430 is transiting”); establishing a communication link between the vehicle operator and the simulation engine, wherein the vehicle operator is an autonomous driving algorithm that controls operations of the simulated vehicle autonomously through the communication link (Kentley, [0122], “Simulator 2840 also includes a simulator controller 2856 configured to control the simulation to adapt the functionalities of any synthetically-generated element of simulated environment … simulator evaluator 2858 may analyze simulated vehicle commands 2880 (e.g., simulated steering angles and simulated velocities) … simulator 2840 may be used to explore the space of applicable controls and resulting trajectories so as to effect learning by self-simulation”); running the set of simulation scenarios in the simulation engine, wherein the simulated vehicle interacts with the vehicle operator and is operated by the vehicle operator based on input received from the autonomous driving algorithm (Kentley, [0118], 
“a simulator configured to simulate an autonomous vehicle in a synthetic environment… a simulator 2840 that is configured to generate a simulated environment 2803. As shown, simulator 2840 is configured to use reference data … to generate simulated geometries… Simulated surfaces 2892a and 2892b may simulate walls … ”; [0061], ” An event may be a condition or situation affecting operation, or potentially affecting operation, of an autonomous vehicle… An event may include traffic conditions or congestion, as well as unexpected or unusual numbers or types of external objects ( or tracks) that are perceived by a perception engine”; [0073], “generating trajectories by identifying objects of interest in a surrounding environment in which autonomous vehicle 430 is transiting”; [0122] – [0123]); detecting, by the simulation engine during the running of the set of simulation scenarios, one or more simulated vehicle events that have occurred to the simulated vehicle as the autonomous driving algorithm controlled the operations of the simulated vehicle (Kentley, [0118]; [0122] – [0123]); determining one or more metrics that quantify a performance of the vehicle operator in running the set of simulation scenarios, wherein the determination of the one or more metrics is based on the one or more simulated vehicle events as detected (Kentley, Abstract, “confidence levels may also be determined to form a subset of the one or more courses of action”; [0062], “data representing a subset of candidate trajectories may be received from an autonomous vehicle responsive to the detection of the event”; [0149], “generate an operational efficiency metric based on the AV systems 3602 that have been deployed as part of the AV service 3660. 
The operational efficiency metric may be computed based on a number of factors”; [0096], “Simulator 1440 can access map data and other data necessary for performing a simulation on the set of candidate trajectories, whereby simulator 1440 need not exhaustively reiterate simulations to confirm sufficiency”); and generating a report based on the one or more metrics and transferring the report to one or more users (Kentley, [0062], “responsive to input from a teleoperator (e.g., a teleoperator may select at least one candidate trajectory as a guided trajectory from a group of differently-ranked candidate trajectories)”; a teleoperator (a user) receives a report related to a group of candidate trajectories).

[Figure: media_image1.png, greyscale] Fig. 10 in Kentley shows that multiple (other) individual real-world vehicles include dynamic parameters such as trajectory (direction) and velocity (Kentley, [0067]; fig. 10; [0086]).

Kentley does not explicitly disclose obtain, from the electronic storage, a set of vehicle event scenarios that correspond to the vehicle events that have been detected, wherein individual vehicle events are associated with physical surroundings of the individual real-world vehicles around the particular times the individual vehicle events occurred, wherein a first vehicle event scenario is associated with a first set of real-world circumstances that is based on a first set of physical surroundings of the first vehicle around the particular time the first vehicle event occurred, and wherein the first vehicle event scenario has a scenario time period that begins prior to an occurrence of a potential vehicle event; create a set of simulation scenarios that are suitable for use by the simulation engine, wherein individual ones of the set of simulation scenarios correspond to individual ones of the set of vehicle event scenarios, wherein the individual ones of the set of simulation scenarios mimic real-world circumstances associated with
a corresponding vehicle event scenario, such that a first simulation scenario mimics the first set of real-world circumstances associated with the first vehicle event scenario, wherein the simulated vehicle is based on the first vehicle. Walther et al. (US 10,599,546 B1) teaches systems and methods for autonomous vehicle testing and a computer-implemented method includes obtaining, by a computing system, data indicative of a test of an autonomous vehicle computing system. Walther further teaches obtain, from the electronic storage, a set of vehicle event scenarios that correspond to the vehicle events that have been detected, wherein individual vehicle events are associated with physical surroundings of the individual real-world vehicles around the particular times the individual vehicle events occurred, wherein a first vehicle event scenario is associated with a first set of real-world circumstances that is based on a first set of physical surroundings of the first vehicle around the particular time the first vehicle event occurred, and wherein the first vehicle event scenario has a scenario time period that begins prior to an occurrence of a potential vehicle event (Walther, Abstract, “The method includes determining, by the computing system, a testing scenario that corresponds to the test. The testing scenario can generated at least in part using real-world data”; col. 12, lines 22 – 49, “that is based at least in part on driving log data (e.g., captured in the logs of autonomous vehicle operation in the real world)”; col. 4, lines 34 – 67, “the simulated environment (e.g., initial position, heading, speed, etc.); a type of each simulated object (e.g., vehicle, 55 bicycle, pedestrian, etc.); a geometry of each simulated object (e.g., shape, size etc.); the motion of the simulated object(s)”; figs. 
4 – 6 includes different vehicle event scenarios in real world wherein the Autonomous vehicle can test); create a set of simulation scenarios that are suitable for use by the simulation engine, wherein individual ones of the set of simulation scenarios correspond to individual ones of the set of vehicle event scenarios, wherein individual ones of the set of simulation scenarios mimic real-world circumstances associated with a corresponding vehicle event scenario, such that a first simulation scenario mimics the first set of real-world circumstances associated with the first vehicle event scenario, wherein the simulated vehicle is based on the first vehicle (Walther, Abstract, “The testing scenario can generated at least in part using real-world data”; figs. 4 – 6 includes different vehicle event scenarios in real world wherein the Autonomous vehicle can test; col. 4, lines 34 – 67, “the testing system can create a simulated environment in which a simulated autonomous vehicle operates”; col. 3, lines 23 – 64, “The testing scenario can indicate testing parameters such as the type of geographic area represented in the testing environment, object(s) in the testing environment (e.g., vehicles, bicycles, pedestrians, etc.), weather condition(s), etc. These testing parameters can be determined based on real-world data (e.g., driving log data, sensor data, etc.)”).

Therefore, in view of Walther, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the method/system described in Kentley, by recreating simulations based on real-world scenarios as taught by Walther, in order to test the feature, function, and behavior of an autonomous vehicle under various real-world situations as listed in column 17, lines 19 – 50 in Walther.

Re claims 2, 12: 2. The system of claim 1, wherein the potential vehicle event corresponds to the first vehicle event. 12.
The method of claim 11, wherein the potential vehicle event corresponds to the first vehicle event (Kentley, [0065]; fig. 3E; [0066]; [0072], “an actual environment in which the autonomous vehicle is positioned”; [0124], “captured over a duration of time”; [0168], “continuously captured data from the sensors”; [0091], “This determination may optionally take into consideration other factors, including the time of day”; [0092], “an event at a geographic region may occur”).

Re claims 5, 15: 5. The system of claim 1, wherein the dynamic vehicle parameters include a distance to an object in or near a current travelling lane of the individual real-world vehicles. 15. The method of claim 11, wherein the dynamic vehicle parameters include a distance to an object in or near a current travelling lane of the individual real-world vehicles (Kentley, [0089], “distances to external objects”).

Re claims 6, 16: 6. The system of claim 1, wherein the first set of physical surroundings of the first vehicle is based on traffic conditions around the particular time the first vehicle event occurred. 16. The method of claim 11, wherein the first set of physical surroundings of the first vehicle is based on traffic conditions around the particular time the first vehicle event occurred (Kentley, [0061]; [0091]).

Re claims 7, 17: 7. The system of claim 1, wherein the communication link provides the vehicle operator with control over the operations of the simulated vehicle. 17. The method of claim 11, wherein the communication link provides the vehicle operator with control over the operations of the simulated vehicle (Kentley, [0062], “responsive to input from a teleoperator (e.g., a teleoperator may select at least one candidate trajectory as a guided trajectory from a group of differently-ranked candidate trajectories)”; a teleoperator (a user) receives a report related to a group of candidate trajectories).

Re claims 9, 19: 9.
The system of claim 1, wherein one of the one or more metrics is reduced responsive to an individual one of the set of simulation scenarios resulting in a simulated accident. 19. The method of claim 11, wherein one of the one or more metrics is reduced responsive to an individual one of the set of simulation scenarios resulting in a simulated accident (Kentley, [0091], “This determination may optionally take into consideration other factors, including the time of day … traffic or accident data derived from a variety of sources”; Walther, col. 23, lines 1 – 29, “The performance metric can indicate whether the autonomous vehicle computing system 104 passed, failed, did not complete, etc. the test 106 and/or testing scenario 124”; col. 17, lines 44 – 50, “If tests associated with speed limits and directions fail, then failures to tests associated with stop-and-go traffic may be less meaningful”). Re claims 10, 20: 10. The system of claim 1, wherein modification of the one or more metrics due to an individual one of the set of simulation scenarios that resulted in a simulated accident is varied based on a difficulty level of the individual one of the set of simulation scenarios. 20. The method of claim 11, wherein modification of the one or more metrics due to an individual one of the set of simulation scenarios that resulted in a simulated accident is varied based on a difficulty level of the individual one of the set of simulation scenarios (Walther, fig. 5, “Test Variations”; col. 6, lines 29 – 52, “The testing scenario can indicate one or more testing parameters of a test such as, for example: the type of geographic area represented in the testing environment ( e.g., intersection, highway, cul-de-sac, dead end, etc.), features of the geographic area (e.g., train tracks, obstructions, etc.); one or more objects within the testing environment (e.g., vehicles, bicycles, pedestrians, etc.); weather condition(s), and/or other parameters. 
An individual test can be a variation of the testing scenario that evaluates the one or more autonomous vehicle capabilities”; col. 7, lines 36 – 51, “The testing system can obtain data indicative of user input that identifies a testing scenario as corresponding to the test. Additionally or alternatively, the user interface can present an interactive element (e.g., text entry box, etc.) via which the user can define a new testing scenario. For example, the user can provide user input (e.g., via the user interface) indicating one or more testing parameters (e.g., four way intersection, two objects) and the testing system can generate a new testing scenario based at least in part on such user input”; changing the test parameters also changes difficulty and complexity). Claims 3 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Kentley and Walther as applied to claims 1 and 11 above, and further in view of Switkes et al. (US 2013/0041576 A1). Re claims 3, 13: Kentley teaches 3. The system of claim 1, wherein the at least two different sensors include a lidar. 13. The method of claim 11, wherein the at least two different sensors include a lidar (Kentley, fig. 4, 472; fig. 3A). However, Kentley does not explicitly disclose a following distance vehicle parameter. Switkes et al. (US 20130041576 A1) teaches systems and methods for facilitating participants of vehicular convoys to closely follow one another through partial automation (Switkes, Abstract). Switkes further teaches wherein the dynamic vehicle parameters include a following distance to a particular real-world vehicle that is ahead of one of the individual real-world vehicles (Switkes, [0043], “A forward-looking RADAR or LIDAR unit 1130, which senses distance and relative speed of the vehicle in front 410”; [0039], “following distance behind the front vehicle”). 
Therefore, in view of Switkes, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the system/method described in Kentley, by providing the following distance as taught by Switkes, in order to provide systems and methods for semi-autonomous vehicular convoying that include the ability to follow closely together in a safe, efficient, convenient manner (Switkes, [0072]).

Claims 4 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Kentley and Walther as applied to claims 1 and 11 above, and further in view of Rajvanshi et al. (US 2017/0341647 A1).

Re claims 4, 14: Kentley does not explicitly disclose a distance to a lane marking. Rajvanshi teaches the methods and system described herein may be used to assist an automated driving system of a host vehicle (Rajvanshi, Abstract). Rajvanshi further teaches 4. The system of claim 1, wherein the dynamic vehicle parameters include a distance of another real-world vehicle to a lane marking near the first vehicle (Rajvanshi, [0023]; [0024]; [0033] – [0035]; fig. 1; Rajvanshi teaches a distance of another real-world vehicle (target vehicle) to a lane marking near the first vehicle (host vehicle); see fig. 1), and wherein the first vehicle event scenario includes a second simulated vehicle crossing the lane marking near the first vehicle (Rajvanshi, fig. 2A – 2D; [0016], “means to initiate or at least partially begin a lane change or lane departure from the subject vehicle's current lane”; [0044]; [0057], “when the leading target vehicle is cutting out or switching lanes (e.g., FIG. 2D), the method determines whether the host vehicle's current lane is clear”). 14. The method of claim 11, wherein the dynamic vehicle parameters include a distance of another real-world vehicle to a lane marking near the first vehicle (Rajvanshi, [0023]; [0024]; [0033] – [0035]; fig.
1; Rajvanshi teaches a distance of another real-world vehicle (target vehicle) to a lane marking near the first vehicle (host vehicle); see fig. 1), and wherein the first vehicle event scenario includes a second simulated vehicle crossing the lane marking near the first vehicle (Rajvanshi, fig. 2A – 2D; [0016], “means to initiate or at least partially begin a lane change or lane departure from the subject vehicle's current lane”; [0044]; [0057], “when the leading target vehicle is cutting out or switching lanes (e.g., FIG. 2D), the method determines whether the host vehicle's current lane is clear”). Therefore, in view of Rajvanshi, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the system/method described in Kentley, by providing the distance to lane marking of another vehicle as taught by Rajvanshi, since one or more sensors may be used to track another real-world vehicle that is in front of the first vehicle and to determine the relative position of the another real-world vehicle with respect to the first vehicle. While this relative positional information may be useful in terms of maintaining a safe following distance, it may not be enough by itself to determine whether the first vehicle, another real-world vehicle, or both vehicles are switching lanes and how to control the first vehicle in response thereto. With sufficient information to determine which vehicle is switching lanes or "cutting out", vehicle autonomous or semi-autonomous systems, such as ACC systems, may be able to operate more favorably, thereby creating a better passenger and/or operator experience (Rajvanshi, [0003]). Claims 8 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Kentley and Walther as applied to claims 1 and 11 above, and further in view of Takahashi (US 2005/0137756 A1). 
Re claims 8, 18: Kentley does not explicitly disclose that the simulation engine is run at faster-than-real-time.

Takahashi teaches a traffic flow simulator that simulates a virtual traffic flow based on the detected traffic flow, and a simulation result evaluating device that predicts an approaching traffic flow around the vehicle based on the simulation result (Takahashi, Abstract). Takahashi teaches:

8. The system of claim 1, wherein the simulation engine is computer-implemented and configured by a set of machine-readable instructions, and wherein the set of simulation scenarios in the simulation engine is run at faster-than-real-time.

18. The method of claim 11, wherein the set of simulation scenarios in the simulation engine is run at faster-than-real-time (Takahashi, [0039]).

Therefore, in view of Takahashi, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the system/method described in Kentley by providing a faster simulation time as taught by Takahashi, since the traffic flow simulation proceeds faster than real time, such that an actual traffic flow realized in a minute can be calculated in about one-tenth of that time, saving time and allowing more simulated scenarios to be analyzed (Takahashi, [0039]).

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used.
A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1 – 2, 5 – 9, 11 – 12, 15 – 19 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1 – 2, 5, 7, 9, 10, 13 – 15 of U.S. Patent No. 12,008,922 (‘922) in view of Kentley et al. (US 2017/0123422 A1).

Re claims 1, 11: Claims 1 and 9 of ‘922 teach all the limitations of claims 1 and 11, except wherein the vehicle parameters include vehicle speed and direction of travel, and wherein the individual vehicles include a first vehicle.

Kentley teaches determining vehicle parameters of individual vehicles from the real-world vehicles, wherein the vehicle parameters are determined based on the output signals, wherein the vehicle parameters are determined multiple times in an ongoing manner during the real-world operation of the real-world vehicles, wherein the vehicle parameters include vehicle speed and direction of travel, and wherein the individual vehicles include a first vehicle (Kentley, [0065], “other position-sensing data (e.g., wheel-related data, such as steering angles, angular velocity, etc.)”; fig. 3E, “Direction of Travel”; [0066], “data describing a local pose may include one or more of an x-coordinate, a y-coordinate, a z-coordinate (or any coordinate of any coordinate system, including polar or cylindrical coordinate systems, or the like), a yaw value, a roll value, a pitch value (e.g., an angle value), a rate (e.g., velocity), altitude, and the like”; [0072], “an actual environment in which the autonomous vehicle is positioned”; [0124], “captured over a duration of time”; [0168], “continuously captured data from the sensors”).

Therefore, in view of Kentley, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the method and system of ‘922 by providing speed and direction vehicle parameters as taught by Kentley, since it was known in the art that speed and steering angle are essential parameters for controlling a vehicle.

Re claims 2, 12: See claims 2 and 10 of ‘922.

Re claims 7, 17: See claims 5 and 13 of ‘922.

Re claims 8, 18: See claim 14 of ‘922.

Re claims 9, 19: See claims 7 and 15 of ‘922.

Re claims 5 – 6, 15 – 16: ‘922 does not explicitly disclose a distance to an object in or near a current travelling lane of the individual vehicles; nor does it disclose that the first set of physical surroundings of the first vehicle is based on traffic conditions around the particular time the first vehicle event occurred.

Kentley teaches:

5. The system of claim 1, wherein the vehicle parameters include a distance to an object in or near a current travelling lane of the individual vehicles.

15. The method of claim 11, wherein the vehicle parameters include a distance to an object in or near a current travelling lane of the individual vehicles (Kentley, [0089], “distances to external objects”).

6. The system of claim 1, wherein the first set of physical surroundings of the first vehicle is based on traffic conditions around the particular time the first vehicle event occurred.

16. The method of claim 11, wherein the first set of physical surroundings of the first vehicle is based on traffic conditions around the particular time the first vehicle event occurred (Kentley, [0061]; [0091]).
Therefore, in view of Kentley, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the method and system of ‘922 by providing the distance to the object and the traffic condition as taught by Kentley, since an obstacle may impact path planning at the planner (Kentley, [0073]) and an ETA may be determined based on information provided to a planner of the AV system, including traffic conditions, known route distance, and expected travel velocity (Kentley, [0162]).

Claims 10 and 20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1 – 2, 5, 7, 9, 10, 13 – 15 of U.S. Patent No. 12,008,922 (‘922) in view of Kentley et al. (US 2017/0123422 A1), further in view of Walther et al. (US 10,599,546 B1).

Re claims 10, 20: ‘922 does not explicitly disclose that modification of the one or more metrics due to an individual one of the set of simulation scenarios that resulted in a simulated accident is varied based on a difficulty level of the individual one of the set of simulation scenarios.

Walther teaches:

10. The system of claim 1, wherein modification of the one or more metrics due to an individual one of the set of simulation scenarios that resulted in a simulated accident is varied based on a difficulty level of the individual one of the set of simulation scenarios.

20. The method of claim 11, wherein modification of the one or more metrics due to an individual one of the set of simulation scenarios that resulted in a simulated accident is varied based on a difficulty level of the individual one of the set of simulation scenarios (Walther, fig. 5, “Test Variations”; col. 6, lines 29 – 52, “The testing scenario can indicate one or more testing parameters of a test such as, for example: the type of geographic area represented in the testing environment (e.g., intersection, highway, cul-de-sac, dead end, etc.), features of the geographic area (e.g., train tracks, obstructions, etc.); one or more objects within the testing environment (e.g., vehicles, bicycles, pedestrians, etc.); weather condition(s), and/or other parameters. An individual test can be a variation of the testing scenario that evaluates the one or more autonomous vehicle capabilities”; col. 7, lines 36 – 51, “The testing system can obtain data indicative of user input that identifies a testing scenario as corresponding to the test. Additionally or alternatively, the user interface can present an interactive element (e.g., text entry box, etc.) via which the user can define a new testing scenario. For example, the user can provide user input (e.g., via the user interface) indicating one or more testing parameters (e.g., four way intersection, two objects) and the testing system can generate a new testing scenario based at least in part on such user input”; changing the test parameters also changes difficulty and complexity).

Therefore, in view of Walther, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the system/method described in ‘922 by customizing the one or more metrics as taught by Walther, so that the user interface can present an interactive element (e.g., text entry box, etc.) via which the user can define a new testing scenario. For example, the user can provide user input (e.g., via the user interface) indicating one or more testing parameters (e.g., four way intersection, two objects), and the testing system can generate a new testing scenario based at least in part on such user input.
Claims 3 and 13 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1 – 2, 5, 7, 9, 10, 13 – 15 of U.S. Patent No. 12,008,922 (‘922) in view of Kentley et al. (US 2017/0123422 A1), further in view of Switkes et al. (US 2013/0041576 A1).

Re claims 3, 13: ‘922 does not explicitly disclose a lidar; nor does it disclose a following distance vehicle parameter.

Switkes et al. (US 2013/0041576 A1) teaches systems and methods for facilitating participants of vehicular convoys to closely follow one another through partial automation (Switkes, Abstract). Switkes further teaches the lidar and the following distance vehicle parameter (Switkes, [0043], “A forward-looking RADAR or LIDAR unit 1130, which senses distance and relative speed of the vehicle in front 410”; [0039], “following distance behind the front vehicle”).

Therefore, in view of Switkes, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the system/method described in ‘922 by providing the following distance as taught by Switkes, in order to provide systems and methods for semi-autonomous vehicular convoying that include the ability to follow closely together in a safe, efficient, convenient manner (Switkes, [0072]).

Claims 4 and 14 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1 – 2, 5, 7, 9, 10, 13 – 15 of U.S. Patent No. 12,008,922 (‘922) in view of Kentley et al. (US 2017/0123422 A1), further in view of Rajvanshi et al. (US 2017/0341647 A1).

Re claims 4, 14: ‘922 does not explicitly disclose a distance to a lane marking.

Rajvanshi teaches that the methods and system described therein may be used to assist an automated driving system of a host vehicle (Rajvanshi, Abstract). Rajvanshi further teaches:

4. The system of claim 1, wherein the dynamic vehicle parameters include a distance of another real-world vehicle to a lane marking near the first vehicle (Rajvanshi, [0023]; [0024]; [0033] – [0035]; fig. 1; Rajvanshi teaches a distance of another real-world vehicle (target vehicle) to a lane marking near the first vehicle (host vehicle); see fig. 1), and wherein the first vehicle event scenario includes a second simulated vehicle crossing the lane marking near the first vehicle (Rajvanshi, fig. 2A – 2D; [0016], “means to initiate or at least partially begin a lane change or lane departure from the subject vehicle's current lane”; [0044]; [0057], “when the leading target vehicle is cutting out or switching lanes (e.g., FIG. 2D), the method determines whether the host vehicle's current lane is clear”).

14. The method of claim 11, wherein the dynamic vehicle parameters include a distance of another real-world vehicle to a lane marking near the first vehicle (Rajvanshi, [0023]; [0024]; [0033] – [0035]; fig. 1; Rajvanshi teaches a distance of another real-world vehicle (target vehicle) to a lane marking near the first vehicle (host vehicle); see fig. 1), and wherein the first vehicle event scenario includes a second simulated vehicle crossing the lane marking near the first vehicle (Rajvanshi, fig. 2A – 2D; [0016], “means to initiate or at least partially begin a lane change or lane departure from the subject vehicle's current lane”; [0044]; [0057], “when the leading target vehicle is cutting out or switching lanes (e.g., FIG. 2D), the method determines whether the host vehicle's current lane is clear”).

Therefore, in view of Rajvanshi, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the system/method described in ‘922 by providing the distance to a lane marking of another vehicle as taught by Rajvanshi, since one or more sensors may be used to track another real-world vehicle that is in front of the first vehicle and to determine the relative position of that vehicle with respect to the first vehicle. While this relative positional information may be useful for maintaining a safe following distance, it may not be enough by itself to determine whether the first vehicle, the other real-world vehicle, or both vehicles are switching lanes, and how to control the first vehicle in response. With sufficient information to determine which vehicle is switching lanes or "cutting out", autonomous or semi-autonomous vehicle systems, such as ACC systems, may be able to operate more favorably, thereby creating a better passenger and/or operator experience (Rajvanshi, [0003]).

Response to Arguments

Applicant's arguments filed 11/20/2025 have been fully considered but they are not persuasive.

Applicant argues: However, par. [0118] refers to a simulator simulating an autonomous vehicle in a synthetic environment. Even if some of the simulated surfaces correspond to, e.g., real-world walls, this is not the same as or similar to the claimed concept. Likewise, even if some of the simulated agents are dynamic, this is not the same as or similar to the claimed concept, as these simulated agents are still synthetic in Kentley. Kentley's simulation scenarios do not mimic real-world circumstances associated with a corresponding vehicle event scenario.
For example, the simulated cyclist in paragraph [0118] is not related or connected to any of the identified objects of interest in paragraph [0073]; paragraphs [0061] and [0073] merely discuss operations of a real-world autonomous vehicle monitoring its surroundings while driving.

The examiner submits that Walther teaches the newly added limitation: “wherein a first vehicle event scenario is associated with a first set of real-world circumstances that is based on a first set of physical surroundings of the first vehicle around the particular time the first vehicle event occurred, and wherein the first vehicle event scenario has a scenario time period that begins prior to an occurrence of a potential vehicle event … wherein the individual ones of the set of simulation scenarios mimic real-world circumstances associated with a corresponding vehicle event scenario, such that a first simulation scenario mimics the first set of real-world circumstances associated with the first vehicle event scenario …”

Walther teaches a simulation method that includes determining, by the computing system, a testing scenario that corresponds to the test, where the testing scenario is generated at least in part using real-world data (Walther, col. 1, lines 33 – 51). Walther further states “The testing scenario can be generated at least in part using real-world data (e.g., driving log data, LIDAR data, RADAR data, image data, etc.)” (Walther, col. 10, lines 23 – 41); “a test can be based on previously collected driving logs that were acquired from one or more autonomous vehicles deployed in the real-world. For such log-based testing, the testing system and/or a user can select driving log data (and/or a section thereof) associated with an event that occurred in the real-world” (Walther, col. 14, lines 15 – 44); and “the autonomous vehicle can generate driving log data that is indicative of the parameters of the vehicle (e.g., location, speed, etc.) as well as the parameters of the real-world scenario (e.g., geographic area, objects, etc.)” (Walther, col. 19, lines 18 – 37). Walther clearly teaches wherein a first vehicle event scenario is associated with a first set of real-world circumstances that is based on a first set of physical surroundings of the first vehicle …

Applicant argues: Regarding (2), the Office Action relies on paragraphs [0062], [0149], and [0096] of Kentley to teach or describe determining one or more metrics that quantify a performance of the vehicle operator in running the set of simulation scenarios. [Office Action, p. 6.] However, the confidence levels discussed in par. [0062] of Kentley pertain to trajectories or routes, and not to the performance as claimed.

The examiner submits that the confidence level is explained in Kentley as follows: when trajectory evaluator 465 has insufficient information to ensure a confidence level high enough to provide collision-free, optimized travel, planner 464 can generate a request to teleoperator 404 for teleoperator support (Kentley, [0076]). The confidence level of a trajectory is a performance metric of a path selected by the autonomous vehicle (Kentley, [0058]). Furthermore, Kentley teaches one or more performance metrics generated for the autonomous vehicle (AV) (Kentley, [0149], “generate an operational efficiency metric based on the AV systems 3602 that have been deployed as part of the AV service 3660. The operational efficiency metric may be computed based on a number of factors”; [0096], “Simulator 1440 can access map data and other data necessary for performing a simulation on the set of candidate trajectories, whereby simulator 1440 need not exhaustively reiterate simulations to confirm sufficiency”).

According to MPEP 2111 [R-5], during patent examination, the pending claims must be “given their broadest reasonable interpretation consistent with the specification.” The Federal Circuit’s en banc decision in Phillips v. AWH Corp., 415 F.3d 1303, 75 USPQ2d 1321 (Fed. Cir. 2005) expressly recognized that the USPTO employs the “broadest reasonable interpretation” standard. The operational efficiency metric of the autonomous vehicle (AV) system is a performance metric.

Applicant argues: the Office Action relies on Rajvanshi to disclose or describe that the vehicle parameters include a distance to a lane marking near one of the individual vehicles, and wherein the first vehicle event scenario includes a second simulated vehicle crossing the lane marking near the first vehicle. [Office Action, pp. 17-18.] However, Rajvanshi merely describes measuring or determining real-world distances, and not the use of such a distance in any simulation. More specifically, Kentley and Rajvanshi fail to teach or describe that the first vehicle event scenario includes a second simulated vehicle crossing the lane marking near the first vehicle, as in amended claims 4 and 14.

The examiner submits that Rajvanshi teaches wherein the first vehicle event scenario includes a second simulated vehicle crossing the lane marking near the first vehicle (Rajvanshi, fig. 2A – 2D; [0016], “means to initiate or at least partially begin a lane change or lane departure from the subject vehicle's current lane”; [0044]; [0057], “when the leading target vehicle is cutting out or switching lanes (e.g., FIG. 2D), the method determines whether the host vehicle's current lane is clear”). Rajvanshi teaches a lead vehicle (near the host vehicle) performing a lane change in a first vehicle event (Rajvanshi, fig. 2D).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JACK YIP, whose telephone number is (571) 270-5048. The examiner can normally be reached Monday through Friday, 9:00 AM – 5:00 PM EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, XUAN THAI, can be reached at (571) 272-7147. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JACK YIP/
Primary Examiner, Art Unit 3715

Prosecution Timeline

Apr 11, 2024: Application Filed
May 22, 2025: Non-Final Rejection — §103, §DP
Sep 09, 2025: Response Filed
Sep 17, 2025: Final Rejection — §103, §DP
Nov 20, 2025: Request for Continued Examination
Dec 04, 2025: Response after Non-Final Action
Mar 06, 2026: Non-Final Rejection — §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology:

- Patent 12588859: SYSTEM AND METHOD FOR INTERACTING WITH HUMAN BRAIN ACTIVITIES USING EEG-FNIRS NEUROFEEDBACK (2y 5m to grant; granted Mar 31, 2026)
- Patent 12592160: System and Method for Virtual Learning Environment (2y 5m to grant; granted Mar 31, 2026)
- Patent 12558290: BLOOD PRESSURE LOWERING TRAINING DEVICE (2y 5m to grant; granted Feb 24, 2026)
- Patent 12525140: SYSTEMS AND METHODS FOR PROGRAM TRANSMISSION (2y 5m to grant; granted Jan 13, 2026)
- Patent 12512012: SYSTEM FOR EVALUATING RADAR VECTORING APTITUDE (2y 5m to grant; granted Dec 30, 2025)

Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 33%
With Interview: 70% (+37.6%)
Median Time to Grant: 4y 1m
PTA Risk: High

Based on 702 resolved cases by this examiner. Grant probability derived from career allow rate.
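For reference, the headline projections above are consistent with simple arithmetic on the examiner's career data, assuming (our assumption, not stated by the dashboard) that the "+37.6%" interview lift is additive in percentage points:

```python
# Sketch: recomputing the headline figures from the career data shown above.
# Assumption: the interview lift combines additively in percentage points.

granted, resolved = 229, 702            # career grants / resolved cases
allow_rate = granted / resolved * 100   # ~32.6%, displayed as 33%

interview_lift = 37.6                   # percentage points
with_interview = allow_rate + interview_lift  # ~70.2%, displayed as 70%

print(round(allow_rate), round(with_interview))  # → 33 70
```

The additive-lift reading is only one plausible interpretation; the tool may instead compute the with-interview rate directly from the subset of resolved cases that had an interview.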
