Prosecution Insights
Last updated: April 19, 2026
Application No. 18/308,627

SYSTEM AND METHOD OF TRANSITIONING VEHICLE CONTROL

Non-Final OA §103
Filed: Apr 27, 2023
Examiner: MCANDREWS, TAWRI MATSUSHIGE
Art Unit: 3668
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Nissan North America, Inc.
OA Round: 3 (Non-Final)

Grant Probability: 67% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 0m
Grant Probability With Interview: 93%

Examiner Intelligence

Career Allow Rate: 67% (69 granted / 103 resolved) — above average, +15.0% vs TC avg
Interview Lift: +26.1% — strong (resolved cases with vs. without interview)
Typical Timeline: 3y 0m avg prosecution; 21 applications currently pending
Career History: 124 total applications across all art units
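The headline percentages above are simple arithmetic over the examiner's resolved cases. A minimal sketch of how the figures appear to reconcile (only the raw counts and the 93% with-interview figure come from the dashboard; the assumption that lift is a plain difference of rates is ours):

```python
# Career allow rate, as reported: granted cases over resolved cases.
granted, resolved = 69, 103
allow_rate = granted / resolved       # ~0.67, shown as 67%

# Interview lift: with-interview grant rate minus the career baseline.
with_interview = 0.93                 # dashboard's "With Interview" figure
lift = with_interview - allow_rate    # ~+0.26; the dashboard shows +26.1%,
                                      # so it likely subtracts unrounded inputs

print(f"allow rate {allow_rate:.0%}, interview lift {lift:+.1%}")
```

The small gap between the computed ~26.0% and the displayed +26.1% is consistent with the tool rounding 93% for display while computing the lift from unrounded rates.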

Statute-Specific Performance

§101: 10.9% (-29.1% vs TC avg)
§103: 50.8% (+10.8% vs TC avg)
§102: 11.3% (-28.7% vs TC avg)
§112: 23.7% (-16.3% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 103 resolved cases
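Each per-statute delta reconciles against the same 40.0% baseline (e.g., 50.8% - 10.8% = 40.0% and 10.9% + 29.1% = 40.0%), which suggests the black line is a single Tech Center estimate applied to every statute rather than a per-statute average. A quick sanity check using only the numbers shown above:

```python
# (examiner rate, delta vs TC avg) per statute, from the chart above.
stats = {
    "101": (0.109, -0.291),
    "103": (0.508, +0.108),
    "102": (0.113, -0.287),
    "112": (0.237, -0.163),
}

# Implied baseline for each statute: examiner rate minus reported delta.
baselines = {s: round(rate - delta, 3) for s, (rate, delta) in stats.items()}
print(baselines)  # every statute implies the same 0.4 baseline
```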

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 1/27/2026 has been entered.

Response to Arguments

This Office Action is in response to the applicant's amendments and remarks filed on 1/27/2026. Claims 5 and 18 are cancelled. Claims 1-4, 6-17, and 19-22 are pending for examination.

Regarding the rejection of claims 1-4, 6-17, and 19-22 under 35 U.S.C. §103, applicant's arguments have been considered but are deemed moot in view of the new grounds of rejection necessitated by applicant's amendment, outlined below. Further, the new prior art used in rejection of the amended claims, Oba (US 20210286357 A1), Oba ‘357, was previously cited as relevant prior art not used in rejection of the claims. Applicant states "In the Office Action, additional prior art references are made of record. Applicant believes that these references do not render the claimed invention obvious." Oba ‘357 teaches the amended limitations, as outlined below.

Specification

The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant's cooperation is requested in correcting any errors of which applicant may become aware in the specification.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-4, 6-11, 13-17, and 19-22 are rejected under 35 U.S.C. 103 as being obvious over Oba (US 20200231182 A1) in view of Oba (US 20210286357 A1), henceforth known as Oba ‘182 and Oba ‘357, respectively.
Regarding claim 1, the claim recites a method having limitations similar to those of claim 14 and is therefore rejected on the same basis, as outlined below.

Regarding claim 14, Oba ‘182 discloses:

A vehicle control system to transition control of a vehicle from an autonomous control state to a manual control state, the control system comprising: (Oba ‘182, FIG. 1; FIG. 3; ¶[0054]-¶[0055]; ¶[0127])

an on-board satellite navigation system in communication with a global positioning system; (Oba ‘182, FIG. 1; ¶[0055]; ¶[0059])

an on-board sensor network configured to monitor conditions internally and externally of the vehicle; (Oba ‘182, FIG. 1; FIG. 2; ¶[0055]; ¶[0056]-¶[0058]; ¶[0064]; ¶[0083]-¶[0086])

a display device; and (Oba ‘182, FIG. 1; ¶[0054]; ¶[0055]; ¶[0073]; ¶[0076])

a processor configured to: (Oba ‘182, FIG. 1; FIG. 23; ¶[0563]-¶[0565])

determine a transition point at which control of the vehicle changes from the autonomous control state to the manual control state; (Oba ‘182, FIG. 8; FIG. 9; FIG. 14: (Q11); ¶[0213]; ¶[0248]; ¶[0391])

present at least one task to a driver in advance of the transition point through the display device based on information obtained by the on-board satellite navigation system and the on-board sensor network, the at least one task presented to the driver being based on a current […] environment external of the vehicle detected by the on-board sensor network; (Oba ‘182, FIG. 3; FIG. 7; ¶[0124]-¶[0126]; ¶[0128]; ¶[0211]-¶[0213]; ¶[0221]-¶[0222]; ¶[0056]; ¶[0059]; ¶[0084]-¶[0086]; Where the vehicle control system alerts the driver, through the display, that the vehicle is switching to the manual driving mode, requiring the driver to perform the pointing and checking gesture (present at least one task to a driver in advance of the transition point through the display device; the at least one task presented to the driver), wherein the pointing and checking gesture must be made in the advancing direction of the vehicle and to the sides of the vehicle, which is determined by the GNSS position and surroundings information sensors (based on information obtained by the on-board satellite navigation system and the on-board sensor network; being based on a current… environment external of the vehicle detected by the on-board sensor network); see ¶[0221], wherein the pointing action uses the road to determine the vertical plane)

determine whether the response of the driver to the at least one task indicates preparedness of the driver; (Oba ‘182, FIG. 3; FIG. 7; FIG. 17; ¶[0185]; ¶[0190]; ¶[0201]-¶[0203]; Where the vehicle control system determines whether the driver performs the appropriate point and check gesture in the advancing direction of the vehicle (determine whether the response of the driver to the at least one task), indicating the driver is ready for manual driving (indicates preparedness of the driver))

transition control of the vehicle from the autonomous control state to the manual control state when the driver is determined to be prepared; and (Oba ‘182, FIG. 9; FIG. 12; FIG. 21; ¶[0467]-¶[0470]; ¶[0529]-¶[0532]; Where the vehicle control system transitions from the automatic driving mode to the manual driving mode (transition control of the vehicle from the autonomous control state to the manual control state) when the driver performs the appropriate point and check gesture in the advancing direction of the vehicle and/or meets the other requirements for transitioning to manual control (when the driver is determined to be prepared))

prevent the transition from the autonomous control state to the manual control state when the driver is determined to not be prepared. (Oba ‘182, FIG. 9; FIG. 12; FIG. 21; ¶[0467]-¶[0468]; ¶[0471]-¶[0480]; ¶[0251]; Where the vehicle control system maintains automatic driving control and performs an emergency evacuation mode (prevent the transition from the autonomous control state to the manual control state) when the driver fails to perform the gesture and/or other readiness determining steps (when the driver is determined to not be prepared))

Oba ‘182 is silent on the following limitations, bolded for emphasis. However, in the same field of endeavor, Oba ‘357 teaches:

…the at least one task presented to the driver being based on a current traffic environment external of the vehicle detected by the on-board sensor network… (Oba ‘357, FIG. 1; FIG. 2; ¶[0058]-¶[0063]; ¶[0175]-¶[0176]: automatic driving ending, prompts driver with instructions; ¶[0146]: monitors driver’s line of sight; ¶[0152]-¶[0153]: driver’s line of sight follows car driving ahead; ¶[0051]; ¶[0155]; ¶[0161]; ¶[0165]: on-board sensor system determines obstacles, vehicle ahead of the autonomous vehicle; Where, after informing the driver of the upcoming end of autonomous driving, the monitor unit tracks the driver’s line of sight to a car driving ahead (the at least one task presented to the driver being based on a current traffic environment external of the vehicle) detected by on-board sensors (detected by the on-board sensor network))

It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Oba ‘182 with the features taught by Oba ‘357 because “…in order to more safely perform automatic driving, the driver needs to have a driving recovery ability when the driver plans to run in a manual driving section before the start of automatic cruising associated with driving of automatic driving, and it is necessary to have a mechanism to allow dedicated automatic driving cruising upon determining whether the driver has the driving recovery ability or not.” (Oba ‘357, ¶[0009]).

Regarding claim 2, Oba ‘182 and Oba ‘357 teach the method of transitioning control according to claim 1. Oba ‘182 further discloses: wherein the manual control state is a fully manual control state. (Oba ‘182, FIG. 8; FIG. 9; ¶[0119]; ¶[0227]).

Regarding claim 3, Oba ‘182 and Oba ‘357 teach the method of transitioning control according to claim 1. Oba ‘182 further discloses: wherein the manual control state is a partially manual control state. (Oba ‘182, FIG. 8; FIG. 9; ¶[0119]; ¶[0228]-¶[0230]).

Regarding claim 4, Oba ‘182 and Oba ‘357 teach the method of transitioning control according to claim 1.
Oba ‘182 further discloses: wherein the transition point is determined based on the autonomous control state not being available. (Oba ‘182, ¶[0111]-¶[0112]; ¶[0124]; ¶[0281]-¶[0289]).

Regarding claim 6, Oba ‘182 and Oba ‘357 teach the method of transitioning control according to claim 1. Oba ‘182 further discloses: wherein the at least one task presented to the driver is a query requiring a visual inspection by the driver, and eye movement of the driver responsive to the query is tracked to determine preparedness of the driver. (Oba ‘182, FIG. 3; FIG. 4; FIG. 7; ¶[0013]-¶[0014]; ¶[0124]-¶[0126]; ¶[0128]; ¶[0211]-¶[0213]; ¶[0221]-¶[0222]; ¶[0056]; ¶[0059]; ¶[0084]-¶[0086]; ¶[0089]; ¶[0155]; Where the pointing and checking gesture requires the driver’s eye to align with their pointing gesture on a vertical plane (wherein the at least one task presented to the driver is a query requiring a visual inspection by the driver), and where the driver’s eye is tracked in order to determine the pointing and checking gesture is complete (and eye movement of the driver responsive to the query is tracked to determine preparedness of the driver); additionally, the driver’s eyes are tracked to detect wakefulness; see FIG. 3; ¶[0164]-¶[0166]).

Regarding claim 7, Oba ‘182 and Oba ‘357 teach the method of transitioning control according to claim 1. Oba ‘182 further discloses: wherein the at least one task presented to the driver is a query; and (Oba ‘182, ¶[0312]; ¶[0334]; ¶[0514]; Where the vehicle control system presents instructions to the driver) the query is responded to through a display device of the vehicle. (Oba ‘182, ¶[0312]; ¶[0334]; ¶[0514]; Where the driver responds to the instructions by double touching the display or entering a check mark to indicate that the driver knows about an event at a predetermined position during advancement of the vehicle).

Regarding claim 8, Oba ‘182 and Oba ‘357 teach the method of transitioning control according to claim 7. Oba ‘182 further discloses: further comprising responding to the query with an audible response. (Oba ‘182, FIG. 3; ¶[0167]-¶[0170]; ¶[0312]; ¶[0334]; ¶[0514]; Where the driver responds to the instructions by loudly reading a word or pronouncing a calculation result).

Regarding claim 9, Oba ‘182 and Oba ‘357 teach the method of transitioning control according to claim 1. Oba ‘182 further discloses: wherein the at least one task presented to the driver is a task requiring operation of the vehicle by the driver; and (Oba ‘182, FIG. 3; ¶[0139]; ¶[0172]-¶[0175]; Where the vehicle control system requires the driver to operate the steering wheel) the operation of the vehicle by the driver responsive to the task is compared to an autonomous control for the task under an autonomous control state to determine preparedness of the driver. (Oba ‘182, FIG. 3; ¶[0139]; ¶[0172]-¶[0175]; Where the vehicle control system requires the driver to operate the steering wheel and counteract the noise-applied force in order to correct the steering wheel angle, i.e., compares the driver’s steering input to overcome the added steering torque against the vehicle control system’s reference of the correct steering angle).

Regarding claim 10, the claim recites limitations similar to those of claim 15 and is therefore rejected on the same basis, as outlined below.

Regarding claim 15, Oba ‘182 and Oba ‘357 teach the vehicle control system according to claim 14. Oba ‘182 further discloses: wherein the processor is further configured to determine an amount of time for the vehicle to reach the transition point. (Oba ‘182, ¶[0170]; Where the vehicle determines the time until the takeover point).

Regarding claim 11, the claim recites limitations similar to those of claim 16 and is therefore rejected on the same basis, as outlined below.

Regarding claim 16, Oba ‘182 and Oba ‘357 teach the vehicle control system according to claim 15. Oba ‘182 further discloses: wherein the at least one task presented to the driver is based on the amount of time determined to reach the transition point. (Oba ‘182, ¶[0169]-¶[0170]; Where the vehicle determines whether the question to the driver can be retried, i.e., the number of questions, based on the time until the takeover point).

Regarding claim 13, Oba ‘182 and Oba ‘357 teach the method of transitioning control according to claim 1. Oba ‘182 further discloses: further comprising, upon presenting the at least one task to the driver, transmitting response data input by the driver responsive to the at least one task to a remote server, (Oba ‘182, FIG. 2; FIG. 10; ¶[0081]; ¶[0090]; ¶[0102]; ¶[0337]-¶[0344]; Where the vehicle control system transmits the learning results to a remote server, which includes the driver’s response to the request to point and check, requiring the task be presented to the driver in order to capture the driver’s response) transmitting sensor data captured by a vehicle sensor associated with the at least one task to the remote server, and (Oba ‘182, FIG. 2; FIG. 10; ¶[0081]; ¶[0090]; ¶[0102]; ¶[0337]-¶[0344]; Where the vehicle control system transmits other sensor data captured of the driver, such as passive monitoring data, as part of the learning result to the remote server) updating the analysis of the sensor data captured by the vehicle sensor based on the response data input by the driver. (Oba ‘182, FIG. 2; FIG. 10; ¶[0081]; ¶[0090]; ¶[0102]; ¶[0337]-¶[0344]; Where the learning unit 126 uses the learned driver-specific traits in order to more accurately determine the driver’s driving ability based on the active monitoring, i.e., sensor data captured of the driver and the actions performed by the driver).

Regarding claim 17, Oba ‘182 and Oba ‘357 teach the vehicle control system according to claim 14.
Oba ‘182 further discloses: wherein the on-board sensor network includes at least one internal camera positioned to detect behavior of the driver responsive to the at least one task. (Oba ‘182, FIG. 1; FIG. 2; ¶[0085]-¶[0086]; ¶[0155]-¶[0156]).

Regarding claim 19, Oba ‘182 and Oba ‘357 teach the vehicle control system according to claim 14. Oba ‘182 further discloses: wherein the display device includes at least one of a display screen and a speaker configured to present the at least one task to the driver. (Oba ‘182, FIG. 3; FIG. 7; ¶[0077]; ¶[0124]-¶[0126]; ¶[0128]; ¶[0211]-¶[0213]; ¶[0221]-¶[0222]; ¶[0056]; ¶[0059]; ¶[0084]-¶[0086]; ¶[0167]-¶[0170]; ¶[0312]; ¶[0334]; ¶[0514]; Where the vehicle control system alerts the driver of the required task via the display or through sound).

Regarding claim 20, Oba ‘182 and Oba ‘357 teach the vehicle control system according to claim 14. Oba ‘182 further discloses: wherein the at least one task presented to the driver is based on information input by the driver through the display device. (Oba ‘182, ¶[0312]; ¶[0334]; ¶[0514]; Where the driver responds to the instructions by double touching the display or entering a check mark to indicate that the driver knows about an event at a predetermined position during advancement of the vehicle).

Regarding claim 21, the claim recites limitations similar to those of claim 22 and is therefore rejected on the same basis, as outlined below.

Regarding claim 22, Oba ‘182 and Oba ‘357 teach the vehicle control system according to claim 14. Oba ‘182 further discloses: wherein the task based on the current […] environment external of the vehicle presents a query to the driver requiring a visual inspection of the current […] environment external of the vehicle. (Oba ‘182, FIG. 3; FIG. 7; ¶[0124]-¶[0126]; ¶[0128]; ¶[0211]-¶[0213]; ¶[0221]-¶[0222]; ¶[0056]; ¶[0059]; ¶[0084]-¶[0086]; ¶[0162]; Where the vehicle control system alerts the driver that the vehicle is switching to the manual driving mode, requiring the driver to perform the pointing and checking gesture (wherein the task based on the current environment external of the vehicle), through the display, requesting the predetermined gesture as specified in ¶[0162] (presents a query to the driver), wherein the pointing and checking gesture must be made in the advancing direction of the vehicle and to the sides of the vehicle, which is determined by the GNSS position and surroundings information sensors, and where the pointing action uses the road to determine the vertical plane as specified in ¶[0221] (requiring a visual inspection of the current environment external of the vehicle)).

Additionally, Oba ‘357 further teaches: wherein the task based on the current traffic environment external of the vehicle presents a query to the driver requiring a visual inspection of the current traffic environment external of the vehicle. (Oba ‘357, FIG. 1; FIG. 2; ¶[0058]-¶[0063]; ¶[0175]-¶[0176]: automatic driving ending, prompts driver with instructions; ¶[0146]: monitors driver’s line of sight; ¶[0152]-¶[0153]: driver’s line of sight follows car driving ahead; ¶[0051]; ¶[0155]; ¶[0161]; ¶[0165]: on-board sensor system determines obstacles, vehicle ahead of the autonomous vehicle; Where, after informing the driver of the upcoming end of autonomous driving (wherein the task based on the current traffic environment external of the vehicle presents a query to the driver), the monitor unit tracks the driver’s line of sight to a car driving ahead (requiring a visual inspection of the current traffic environment external of the vehicle) detected by on-board sensors).

It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Oba ‘182 with the features taught by Oba ‘357 for at least the same reasons outlined with respect to claim 14, above.

Claim 12 is rejected under 35 U.S.C. 103 as being obvious over Oba ‘182 and Oba ‘357 as applied to claim 1, above, and further in view of Sugano (US 20210309262 A1), henceforth known as Sugano.

Regarding claim 12, Oba ‘182 and Oba ‘357 teach the method of transitioning control according to claim 1. Oba ‘182 and Oba ‘357 fail to teach the limitations of claim 12 as a whole. However, in the same field of endeavor, Sugano teaches: wherein the at least one task is based on whether the vehicle is transitioning to a fully manual control state or to a partially manual control state. (Sugano, FIG. 1; FIG. 2; ¶[0035]; ¶[0046]-¶[0047]; ¶[0048]; ¶[0033]; Where the driver is required to perform different tasks (wherein the at least one task), i.e., holding the steering wheel and/or operating the brake/acceleration pedals, based on whether the transition is to manual driving or to the second automated driving mode, which is partially manual (is based on whether the vehicle is transitioning to a fully manual control state or to a partially manual control state); the driver must operate both the steering wheel and a pedal to switch to manual driving, whereas the driver need only operate the steering wheel to switch to the second automated driving mode, i.e., the partially manual mode).

It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Oba ‘182 and Oba ‘357 with the features taught by Sugano because “…in the case where the automated driving is switched to the manual driving too easily, the automated driving is switched to the manual driving before the driver is ready therefor. As a result, a safety problem possibly occurs” (Sugano, ¶[0006]).
That is, the features taught by Sugano prepare the driver for the different levels of manual driving, providing a safer transition between driving modes.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Okada et al. (US 20240043031 A1) discloses an HCU, which controls information presentation to a driver of a vehicle with an autonomous driving function, configured to: determine a switch between a monitoring unnecessary state and a monitoring necessary state, the monitoring unnecessary state being a state in which stoppage of periphery monitoring by the driver during execution of the autonomous driving is permitted, the monitoring necessary state being a state in which the stoppage of the periphery monitoring by the driver during execution of the autonomous driving is prohibited; determine whether stoppage of steering wheel grip by the driver is permittable in the monitoring necessary state; and permit, when the monitoring unnecessary state is switched to the monitoring necessary state in which the stoppage of steering wheel grip is permittable, the stoppage of steering wheel grip in the monitoring necessary state after executing a grip request, which requests the driver to grip the steering wheel.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Tawri M McAndrews, whose telephone number is (571) 272-3715. The examiner can normally be reached M-W (0800-1000). Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, James Lee, can be reached at (571) 270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/TAWRI M MCANDREWS/
Examiner, Art Unit 3668

Prosecution Timeline

Apr 27, 2023 — Application Filed
Jun 17, 2025 — Non-Final Rejection (§103)
Sep 17, 2025 — Response Filed
Nov 24, 2025 — Final Rejection (§103)
Jan 14, 2026 — Interview Requested
Jan 26, 2026 — Examiner Interview Summary
Jan 26, 2026 — Examiner Interview (Telephonic)
Jan 27, 2026 — Request for Continued Examination
Feb 20, 2026 — Response after Non-Final Action
Mar 04, 2026 — Non-Final Rejection (§103) (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597299 — SYSTEM AND METHOD FOR REPOSITIONING VEHICLES IN A GEOGRAPHIC AREA BASED ON UTILIZATION METRIC (granted Apr 07, 2026; 2y 5m to grant)

Patent 12594969 — VEHICLE CONTROLLER, METHOD, AND PROGRAM FOR STEERING REACTION DURING MANUAL DRIVING FOR RETURNING TO A PRESET ROUTE (granted Apr 07, 2026; 2y 5m to grant)

Patent 12572809 — Generating Labeled Training Instances for Autonomous Vehicles Using Temporally Correlated Timestamps (granted Mar 10, 2026; 2y 5m to grant)

Patent 12573091 — SYSTEM AND METHOD OF CALIBRATING AN OPTICAL SENSOR MOUNTED ON BOARD OF A VEHICLE USING A GRADUATED MOUNTING BAR (granted Mar 10, 2026; 2y 5m to grant)

Patent 12540455 — WORKING MACHINE CONTROL METHOD USING TARGET POSITION CURVE AND REWARD MODEL, WORKING MACHINE CONTROL DEVICE AND WORKING MACHINE (granted Feb 03, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 67% (93% with interview, +26.1%)
Median Time to Grant: 3y 0m
PTA Risk: High

Based on 103 resolved cases by this examiner. Grant probability derived from career allow rate.
