Prosecution Insights
Last updated: April 19, 2026
Application No. 17/289,128

DEVICE AND METHODS FOR TRANSRECTAL ULTRASOUND-GUIDED PROSTATE BIOPSY

Final Rejection — §103, §112
Filed
Apr 27, 2021
Examiner
FEDORKY, MEGAN TAYLOR
Art Unit
3796
Tech Center
3700 — Mechanical Engineering & Manufacturing
Assignee
The Johns Hopkins University
OA Round
2 (Final)
Grant Probability: 32% (At Risk)
Expected OA Rounds: 3-4
To Grant: 4y 2m
With Interview: 74%

Examiner Intelligence

Grants only 32% of cases
Career Allow Rate: 32% (10 granted / 31 resolved; -37.7% vs TC avg)
Interview Lift: +41.9% grant rate for resolved cases with an interview
Typical timeline: 4y 2m avg prosecution; 51 currently pending
Career history: 82 total applications across all art units
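The figures in this panel are simple arithmetic on the examiner's case counts. A minimal Python sketch of how they combine (the helper names are illustrative, not part of any real API):

```python
# Sketch of how the examiner stats above are derived from raw counts.
# allow_rate and with_interview are hypothetical helper names.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def with_interview(base_pct: float, lift_points: float) -> float:
    """Apply the interview lift as additive percentage points."""
    return base_pct + lift_points

base = allow_rate(10, 31)                    # 10 granted / 31 resolved -> ~32.3%
boosted = with_interview(round(base), 41.9)  # 32 + 41.9 -> ~74%
```

Note that the lift is additive percentage points (32% + 41.9 points ≈ 74% with interview), not a multiplicative factor.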

Statute-Specific Performance

§101: 17.9% (-22.1% vs TC avg)
§103: 39.3% (-0.7% vs TC avg)
§102: 19.5% (-20.5% vs TC avg)
§112: 20.9% (-19.1% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 31 resolved cases
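Each delta above is the examiner's per-statute allow rate minus the Tech Center average. A quick consistency check (values copied from this chart) recovers the same ~40% baseline for every statute:

```python
# Back out the implied Tech Center average from each statute's allow
# rate and its stated delta; all figures are taken from the chart above.
rates  = {"101": 17.9, "103": 39.3, "102": 19.5, "112": 20.9}
deltas = {"101": -22.1, "103": -0.7, "102": -20.5, "112": -19.1}

# implied TC average = rate - delta, per statute
implied = {s: round(rates[s] - deltas[s], 1) for s in rates}
# All four statutes imply a single 40.0% baseline, consistent with
# one TC-wide average estimate (the black line in the chart).
```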

Office Action

§103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

The amendments and remarks filed on 23JUN2025 have been entered and considered. Claims 1-6, 8-10, 13-14, & 16 have been amended. Claims 15-22 were previously withdrawn. No claims have been added or cancelled. New matter has been added to claim 1. Claims 1-14 are under examination.

Response to Arguments

Applicant's amendments filed 23JUN2025 regarding the claim objections have been fully considered and have been found to obviate the objections. Therefore, the claim objections have been withdrawn.

Applicant's amendments filed 23JUN2025 regarding the interpretations under 35 USC 112(f) have been fully considered and have been found to obviate the interpretations. Therefore, the 112(f) interpretations have been withdrawn.

Applicant's amendments filed 23JUN2025 regarding the rejections under 35 USC 112(b) have been fully considered and have been found to obviate the rejections. Therefore, the 112(b) rejections have been withdrawn.

Applicant's amendments filed 23JUN2025 regarding the rejections under 35 USC 102(a)(1) have been fully considered and have been found to obviate the rejections of claims 1-8, 11-12, & 14. Therefore, the 102(a)(1) rejections have been withdrawn.

Applicant's amendments filed 23JUN2025 regarding the rejections under 35 USC 103(a) have been fully considered and have been found to obviate the rejections of claims 9-10 & 13. Therefore, the 103(a) rejections have been withdrawn.

Claim Objections

Claims 1-2 are objected to because of the following informalities:

Regarding Claim 1: The claim recites “to perform operation comprising” in lines 12-13. The examiner believes that this should be “to perform operations comprising” to be grammatically correct. Appropriate correction is required.
The claim recites “recording coordinates for the actuated, robotic manipulation arm” in line 16. The examiner believes that this is missing words and should be “recording position coordinates for the actuated, robotic manipulation arm” to be grammatically correct. Appropriate correction is required.

Regarding Claim 2: The claim has been amended to now depend from claim 4 rather than claim 1. Claim 2 cannot depend from a claim that has not been introduced prior to it and therefore cannot depend from claim 4. This appears to be an error since claim 2 appears to depend from independent claim 1. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-14 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement.
The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

Regarding Claim 1: The limitation “a processing device configured with processor executable instructions to perform operation”, found in lines 12-13 of the claim, is not found in the specification. The closest item found is a computing device, which can be found in ¶0101 describing the software and database storage. Therefore, the claims lack written description since this term is not supported by the originally filed disclosure. For the purposes of examination, the examiner is interpreting this as being the same as a processor executing software instructions. Clarification or correction is required.

The limitation “receiving data from and transmitting data to the robot controller”, found in line 14 of the claim, is not found in the specification. The closest item to “receiving data from and transmitting data to the robot controller” is seen in Figure 2 of the Drawings. Therefore, the claims lack written description since this term is not supported by the originally filed disclosure. For the purposes of examination, the examiner is interpreting this as being the same as the robot controller having a feedback loop. Clarification or correction is required.

Claims 2-14 are further rejected under 112(a) for depending upon the rejected independent claim 1.

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Regarding Claim 1: The claim recites “with respect to image data from the same time” in lines 17-18 of the claim. It is unclear what time frame is being referenced. For the purposes of examination, the examiner is interpreting this as meaning in real time. Clarification or correction is required.

The claim recites “receiving data from and transmitting data to the robot controller” in line 14 of the claim. It is unclear what data is being collected from the robot controller, and what is being transmitted to it. For the purposes of examination, the examiner is interpreting this as receiving positional data regarding the robot controller from the sensors on the controller. Clarification or correction is required.

Claims 2-14 are further rejected under 112(b) for depending upon the rejected independent claim 1.

Regarding Claim 2: The metes and bounds of the claim are unclear as the claim as amended depends on itself rather than an independent claim. Thus, the metes and bounds cannot be determined for claims 2-5. Claims 3-5 are further rejected under 112(b) for depending upon the rejected claim 2.

Regarding Claim 7: Claim 7 recites the limitation "manipulation arm" in line 2. There is insufficient antecedent basis for this limitation in the claim. It is unclear if this is the same as the actuated robotic manipulation arm as previously claimed or if this is a new aspect of the invention.
For the purposes of examination, the examiner is interpreting this as the actuated robotic manipulation arm. Clarification or correction is required.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-9, 11-12, & 14 are rejected under 35 U.S.C. 103 as being unpatentable over Kumar et al.
(US Publication No. 20130116548; Previously Cited) in view of Kamen et al. (US Publication No. 20150366546).

Regarding claim 1, Kumar discloses a system for prostate biopsy (Kumar Abstract “The invention presents tools to improve a 3-D image aided biopsy or treatment procedure”; ¶0040) comprising: an ultrasound probe (Kumar ¶0040 “The probe handle is held by a robotic arm”); a robot controller (Kumar ¶0042 “The software is installed onto the computer system hard drive and/or electronic memory, and is accessed and controlled by the computer's operating system… The user interacts with the software via keyboard, mouse, voice recognition, and other user-interface devices (e.g., user I/O devices) connected to the computer system”), wherein the robot controller is configured to communicate with and control the actuated, robotic manipulation arm and the ultrasound probe in a manner that minimizes prostate deflection (Kumar ¶0060 “automatically load a system generated plan customized to the prostate. The custom plan may be computed such that the user-specified regions such as urethra and neighboring organs and nerve bundles may be avoided during plan generation. This helps avoid accidental placement of needle for either biopsy”; ¶0063); an ultrasound machine configured for receiving and processing signals from the ultrasound probe and configured for transmitting image data (Kumar ¶0041 “The ultrasound probe 10 sends signal to the ultrasound system 30, which may be connected to the same computer (e.g., via a video image grabber) as the output of the position sensors 14. In the present embodiment, this computer is integrated into the imaging system 30. The computer 20 therefore has real-time 2D and/or 3D images of the scanning area in memory 22. The image coordinate system and the robotic arm coordinate system are unified by a transformation. Using the acquired 2D images, a prostate surface 50 (e.g., 3D model of the organ) and biopsy needle 52 are simulated and displayed on a display screen 40 with their coordinates displayed in real-time as best shown in FIG. 2. A biopsy needle may also be modeled on the display, which has a coordinate system so the doctor has the knowledge of the exact locations of the needle and the prostate.”); recording coordinates for the actuated, robotic manipulation arm (Kumar ¶0047 “The user then clicks on a number of points along the urethra and the information is stored in 3-D world coordinate systems.”).

Kumar does not clearly disclose an actuated, robotic manipulation arm; wherein the actuated, robotic manipulation arm is configured to control movement of the ultrasound probe; a processing device configured with processor executable instructions to perform operation comprising: receiving data from and transmitting data to the robot controller; receiving image data from the ultrasound machine; calculating a position of the actuated, robotic manipulation arm with respect to image data from the same time.

Kamen, in a similar field of endeavor of ultrasound-led robotic prostate biopsies, teaches an actuated, robotic manipulation arm; wherein the actuated, robotic manipulation arm is configured to control movement of the ultrasound probe (Kamen ¶0022 “The robotic arm 101 is configured, via the computer 111, to operate in a plurality of operating modes to control the movement of the joints 103 and the ultrasound probe 107.”); a processing device configured with processor executable instructions to perform operation comprising: receiving data from and transmitting data to the robot controller; receiving image data from the ultrasound machine; calculating a position of the actuated, robotic manipulation arm with respect to image data from the same time (Kamen ¶0019 “The computer 111, or a processor therein, is further configured to receive one or more real-time ultrasound images captured by the ultrasound probe 107 operably connected therewith. Using the real-time ultrasound image, the computer 111, or a processor therein, is configured to generate an updated three-dimensional model”).

Before the effective filing date of the claimed invention, it would have been obvious to a person of skill in the art to modify the system of Kumar with an actuated, robotic manipulation arm; wherein the actuated, robotic manipulation arm is configured to control movement of the ultrasound probe; a processing device configured with processor executable instructions to perform operation comprising: receiving data from and transmitting data to the robot controller; receiving image data from the ultrasound machine; calculating a position of the actuated, robotic manipulation arm with respect to image data from the same time, as taught by Kamen, for the purposes of facilitating active or passive control over the robotic arm and the ultrasound probe (Kamen ¶0021).
Regarding claim 2, Kumar further discloses the processing device being programmed with the prostate coordinate system (Kumar ¶0041 “Using the acquired 2D images, a prostate surface 50 (e.g., 3D model of the organ) and biopsy needle 52 are simulated and displayed on a display screen 40 with their coordinates displayed in real-time as best shown in FIG. 2.” Showing a prostate coordinate system being generated by the computer program; ¶0040 “Hence, the computer 20 has real-time information of the location and orientation of the probe 10 in reference to a unified Cartesian (x, y, z) coordinate system.” Showing an initial coordinate system already built into the computer calibration.).

Regarding claim 3, Kumar further discloses wherein the prostate coordinate system comprises a program for determining the prostate coordinate system based on anatomical landmarks of a prostate (Kumar ¶0015 “automatic or semi-automatic segmentation of prostate. Using this information, the base and apex of the prostate can be identified as intersection of the urethra with the prostate surface. After identification of base and apex, the system can divide the prostate into a number of zones”; ¶0025; ¶0046 “The zone identification subsystem in presented inventions not only computes the zones based on a urethra delineation or segmentation, but also can compute and report the zones for sampled biopsy cores”; ¶0047).

Regarding claim 4, Kumar further discloses where the anatomical landmarks are an apex (A) and a base (B) of the prostate (Kumar ¶0015 “After identification of base and apex, the system can divide the prostate into a number of zones”; ¶0046; ¶0049); and the program for determining the prostate coordinate system further includes using A and B to determine a prostate coordinate system (PCS) for the prostate (Kumar ¶0015; ¶0046); and determining the direction of the PCS based on a Left-Posterior-Superior (LPS) system (Kumar ¶0048, full paragraph describing the alignment of points based on the anatomical structures; Figure 4 showing the sagittal view of the prostate; Figure 10 showing the S axis relationship to the apex and base in view of the sagittal plane.).

Regarding claim 5, Kumar further discloses calculating an optimal approach and order for a set of biopsy points is determined from the PCS (Kumar ¶0046 “The zone identification subsystem in presented inventions not only computes the zones based on a urethra delineation or segmentation, but also can compute and report the zones for sampled biopsy cores. The definition of zones may be adjustable as per user's preferences, for example if the user wishes to only distinguish between left and right, the system can be adjusted by the user accordingly.”; ¶0048 “The three points may be computed automatically for selection of most robust set of points”).

Regarding claim 6, Kumar further discloses the processing device being programmed with a systematic or targeted biopsy plan (Kumar ¶0060 “The grid may have a user-adjustable spacing and may automatically place planned sites on all grid elements lying inside the prostate that avoid certain regions. The user may then, either manually select sites based on this place for a plan, or automatically load a system generated plan customized to the prostate.”; ¶0026).
Regarding claim 7, Kumar further discloses wherein the robot controller allows for computer control of the ultrasound probe and manipulation arm (Kumar ¶0048 “This image may be captured and a deformable model may be fitted on it to automatically detect the urethra 60, or it may be detected semi-automatically by user clicking on the base P.sub.0 and apex points P.sub.2.” Showing how the physician may manually provide inputs for the imaging and selection of data to create a biopsy plan, where the machine is also able to do so automatically).

Regarding claim 8, Kumar further discloses a graphical user interface, such that a physician can visualize and control the actuated, robotic manipulation arm and in turn the ultrasound probe, and such that the physician can view image data and position data (Kumar ¶0061 “The invention contains a subsystem for providing visual feedback to the user for reaching a planned target for biopsy or dose delivery procedure.”; ¶0042).

Regarding claim 9, Kumar in combination with Kamen discloses the limitations of claim 1 as described above, but Kumar does not further disclose that the actuated, robotic manipulation arm moves the probe with 4-degrees-of-freedom. Kamen further teaches the actuated, robotic manipulation arm moves the probe with 4-degrees-of-freedom (Kamen ¶0017 “The multi-link robotic arm 101 includes a plurality of joints 103, an end-effector 105 and an ultrasound probe 107.” Where each joint can represent 1-2 degrees of freedom in the system). Before the effective filing date of the claimed invention, it would have been obvious to a person of skill in the art to modify the system of Kumar such that the actuated, robotic manipulation arm moves the probe with 4-degrees-of-freedom, as taught by Kamen, for the purposes of enhancing the robot's dexterity and ability to manipulate objects.
Regarding claim 11, Kumar further discloses wherein the ultrasound probe is configured to apply minimal pressure over a prostate gland to avoid prostate deformations and skewed imaging (Kumar ¶0060 “automatically load a system generated plan customized to the prostate. The custom plan may be computed such that the user-specified regions such as urethra and neighboring organs and nerve bundles may be avoided during plan generation. This helps avoid accidental placement of needle for either biopsy”; ¶0021 “For a given target point and needle type, the system displays how deep the needle should be inserted before firing. It may be desirable to know this information before firing, since there are cases when a physician may not want to overshoot the needle and damage neighboring organ by placing a radioactive seed there or piercing it.”; ¶0063-¶0064).

Regarding claim 12, Kumar further discloses wherein the prostate can be approached with minimal pressure and deformations also for biopsy (Kumar ¶0021; ¶0060; ¶0063).

Regarding claim 14, Kumar further discloses wherein the images are acquired for a purpose of documenting a clinical measure (Kumar ¶0019 “also provide methods for identification and reporting the zone for a sampled core following a procedure and recording of actual location of a sampled or treated site.”; ¶0023).

Claims 10 & 13 are rejected under 35 U.S.C. 103 as being unpatentable over Kumar et al. (US Publication No. 20130116548; Previously Cited) in view of Kamen et al. (US Publication No. 20150366546), and Glossup et al. (US Publication No. 20130338477; Previously Cited).

Regarding claim 10, Kumar in combination with Kamen teaches the limitations of claim 1, but does not disclose a microphone, wherein the microphone triggers automatic acquisition of ultrasound images based on a firing noise or a signal from the biopsy needle.
Glossup, in a similar field of prostate imaging, teaches a microphone, wherein the microphone triggers automatic acquisition of ultrasound images based on a firing noise or a signal from the biopsy needle (Glossup ¶0061 “Microphone 401 may be for example a microelectromechanical system (MEMS) microphone, piezoelectric, carbon or other sound sensitive microphone that may be used to "listen" for the sound of the biopsy gun click. Sound may be conducted through signal cable 403 to a processor capable of receiving and interpreting signal 404, thereby triggering the correct sampling of the data and images.”; Figures 4 & 5; ¶0064).

Before the effective filing date of the claimed invention, it would have been obvious to a person of skill in the art to modify the system of Kumar combined with Kamen with the aspects of Glossup by integrating a microphone that triggers automatic acquisition of ultrasound images based on a firing noise or a signal from the biopsy needle into the biopsy device of Kumar combined with Kamen. The motivation to integrate this into the system of Kumar combined with Kamen would be to have an imaging system that can automatically acquire images during a biopsy at the times of sample retrieval to improve tracking of biopsy locations (Glossup ¶0061; ¶0009). This means with the integration of the microphone to trigger ultrasound imaging at the sound of sampling, it is possible to determine from any coordinate on the ultrasound image the location that the point represents in patient space (Glossup ¶0082), and aids in tracking the locations of biopsies for targeted sample acquisition and later revisiting of sites.

Regarding claim 13, Kumar combined with Kamen does not further disclose automatically acquiring images from medical imaging equipment based on a firing noise of a biopsy needle, or signal from another medical instrument.
Glossup teaches automatically acquiring images from medical imaging equipment based on a firing noise of a biopsy needle, or signal from another medical instrument (Glossup ¶0061 “Microphone 401 may be for example a microelectromechanical system (MEMS) microphone, piezoelectric, carbon or other sound sensitive microphone that may be used to "listen" for the sound of the biopsy gun click. Sound may be conducted through signal cable 403 to a processor capable of receiving and interpreting signal 404, thereby triggering the correct sampling of the data and images.”; Figures 4 & 5; ¶0064 “An accelerometer or vibration detector may be substituted for the microphone.”).

Before the effective filing date of the claimed invention, it would have been obvious to a person of skill in the art to modify the system of Kumar combined with Kamen with the aspects of Glossup by integrating the ability for automatically acquiring images from medical imaging equipment based on firing noise of a biopsy needle, or signal from another medical instrument into the biopsy device of Kumar combined with Kamen. The motivation to integrate this into the system of Kumar combined with Kamen would be to have an imaging system that can automatically acquire images during a biopsy at the times of sample retrieval to improve tracking of biopsy locations (Glossup ¶0061; ¶0009). This means with the integration of the microphone or another device to trigger ultrasound imaging at the instant of sampling, it is possible to determine from any coordinate on the ultrasound image the location that the point represents in patient space (Glossup ¶0082), and aids in tracking the locations of biopsies for targeted sample acquisition and later revisiting of sites.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a).
Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MEGAN FEDORKY whose telephone number is (571)272-2117. The examiner can normally be reached M-F 9:30-4:30.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jennifer McDonald, can be reached M-F 9:30-4:30. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.

For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MEGAN T FEDORKY/
Examiner, Art Unit 3796

/Jennifer Pitrak McDonald/
Supervisory Patent Examiner, Art Unit 3796

Prosecution Timeline

Apr 27, 2021
Application Filed
Aug 09, 2024
Non-Final Rejection — §103, §112
Feb 17, 2025
Response Filed
Feb 17, 2025
Response after Non-Final Action
Jul 09, 2025
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12527959
Compliance Voltage Monitoring and Adjustment in an Implantable Medical Device Using Low Side Sensing
2y 5m to grant • Granted Jan 20, 2026
Patent 12396787
CATHETER WITH INTEGRATED THIN-FILM MICROSENSORS
2y 5m to grant • Granted Aug 26, 2025
Patent 12376904
DYNAMIC LASER STABILIZATION AND CALIBRATION SYSTEM
2y 5m to grant • Granted Aug 05, 2025
Patent 12350026
PHOTOPLETHYSMOGRAPHY SENSOR AND SEMICONDUCTOR DEVICE INCLUDING THE SAME
2y 5m to grant • Granted Jul 08, 2025
Patent 12295647
HIGH DENSITY MAPPING CATHETER FOR CRYOBALOON ABLATION
2y 5m to grant • Granted May 13, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 32%
With Interview: 74% (+41.9%)
Median Time to Grant: 4y 2m
PTA Risk: Moderate
Based on 31 resolved cases by this examiner. Grant probability derived from career allow rate.
