Prosecution Insights
Last updated: April 19, 2026
Application No. 18/790,529

Secure Exclaves

Non-Final OA (§101, §103)
Filed
Jul 31, 2024
Examiner
NIPA, WASIKA
Art Unit
2433
Tech Center
2400 — Computer Networks
Assignee
Apple Inc.
OA Round
1 (Non-Final)
75%
Grant Probability
Favorable
1-2
OA Rounds
2y 11m
To Grant
99%
With Interview

Examiner Intelligence

Grants 75% — above average
75%
Career Allow Rate
226 granted / 302 resolved
+16.8% vs TC avg
Strong +30% interview lift
+29.7%
Interview Lift
(with vs. without interview, among resolved cases)
Typical timeline
2y 11m
Avg Prosecution
18 currently pending
Career history
320
Total Applications
across all art units
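The tile figures above follow directly from the stated counts. A quick recomputation (illustrative only; the "implied TC average" is simply the allow rate minus the stated delta, an assumption about how the dashboard derives it):

```python
# Recompute the examiner-intelligence tiles from the stated counts.
granted, resolved = 226, 302

allow_rate = granted / resolved * 100
print(f"Career allow rate: {allow_rate:.1f}%")  # ~74.8%, displayed as 75%

# The "+16.8% vs TC avg" delta implies a Tech Center baseline of:
tc_delta = 16.8
tc_avg = allow_rate - tc_delta
print(f"Implied TC average: {tc_avg:.1f}%")     # ~58.0%
```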

Statute-Specific Performance

§101
13.5%
-26.5% vs TC avg
§103
50.6%
+10.6% vs TC avg
§102
3.3%
-36.7% vs TC avg
§112
17.4%
-22.6% vs TC avg
Black line = Tech Center average estimate • Based on career data from 302 resolved cases
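The per-statute deltas above are internally consistent with a single Tech Center baseline; subtracting each stated delta from the examiner's rate recovers the same figure every time (illustrative check, assuming the deltas are simple differences):

```python
# Statute-specific rates and their stated deltas vs. the Tech Center average.
stats = {
    "§101": (13.5, -26.5),
    "§103": (50.6, +10.6),
    "§102": (3.3, -36.7),
    "§112": (17.4, -22.6),
}
for statute, (rate, delta) in stats.items():
    tc_avg = round(rate - delta, 1)
    print(f"{statute}: examiner {rate}%, implied TC average {tc_avg}%")
# Every statute implies the same 40.0% Tech Center baseline.
```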

Office Action

§101 §103
Detailed Action

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA. This is the initial Office action issued in response to patent application 18/790,529, filed on 07/31/2024. Claims 1-20, as originally filed, are currently pending and have been considered below. Claims 1, 16, and 19 are independent claims.

Priority

This application claims priority to provisional applications 63/584,032, 63/584,029, and 63/584,037, each filed on 09/20/2023.

Drawings

The drawings filed on 07/31/2024 are accepted by the examiner.

Specification

The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.

Information Disclosure Statement

The information disclosure statements (IDSs) submitted on 07/31/2024 and 09/15/2025 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: "Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title."

The claimed invention is not directed to patent-eligible subject matter. Based upon consideration of all of the relevant factors with respect to each claim as a whole, claims 1-20 are determined to be directed to an abstract idea. The rationale for this determination is explained below.

Claims 1-20 are directed to an abstract idea, an "idea of itself." Claims 1-20 recite executing trusted and untrusted processes, restricting tasks associated with untrusted processes, and using security criteria.
The limitations in the claims can be related to different situations, such as collecting new information and comparing new and stored information. Claim 1 recites isolating trusted and untrusted tasks and implementing security measures. All of these concepts relate to comparing new and stored information and using rules to identify options. The steps can also be related to collecting information, analyzing it, and displaying certain results of the collection and analysis (Electric Power Group, LLC v. Alstom S.A. (Fed. Cir. 2016)).

The claims do not require the use of a machine and may simply be performed mentally or with pen and paper. A person can do the calculations and make the decisions without the assistance of a computer. The limitations, under the broadest reasonable interpretation, cover performance in the mind but for the recitation of generic computer components. That is, other than the recited "memory/processing circuitry" and "non-transitory medium," nothing in the claim elements precludes the steps from practically being performed in the mind (or with pen and paper). For example, but for the generically claimed hardware language, the steps amount to a user manually reviewing the entitlements the user currently has and using this data to perform a corrective action. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components, then it falls within the "mental process" grouping of abstract ideas. Therefore, the claims are abstract.

The machines claimed to implement the abstract idea are merely generic computer components, including the processor, modules, computing system, and computer-readable medium. There is nothing present in the claims to indicate that these computing elements differ from any standard computer setup that can execute code with instructions as detailed in the claims.
Furthermore, with respect to the claimed details of performing a set of neural network operations, these do not provide a practical application nor significantly more than the abstract idea. The neural network model amounts to automating steps previously performed by a human (where a human can generate prompts, e.g., questions related to the entitlements). The specification further shows that the neural network models are conventional pre-trained models. Claims directed to an abstract idea that stands alone, without the technical implementation required to execute it, are considered abstract. Even if the steps of collecting data, comparing data, updating data, and sending the updated data back are implemented via computer instructions, the step of "a user pressing a button to initiate the process" is directed to a conventional human activity that is irrelevant to the patentable features of the instant claims (Mayo v. Prometheus).

The claims do not recite additional elements that are sufficient to amount to significantly more than the judicial exception, because the limitations are merely instructions to implement the abstract idea on a computer and require no more than a generic computer to perform generic computer functions that are well-understood, routine, and conventional activities previously known to the industry. Viewed as a whole, these additional claim elements do not provide meaningful limitations that transform the abstract idea into a patent-eligible application of the abstract idea such that the claims amount to significantly more than the abstract idea itself. Therefore, the claims are rejected under 35 U.S.C. 101 as being directed to non-statutory subject matter. See Alice Corporation Pty. Ltd. v. CLS Bank International.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C.
102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: "A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made."

Claims 1-6, 9, and 11-20 are rejected under 35 U.S.C. 103 as being unpatentable over Bali (US Patent Application Publication No. 2016/0364559 A1) in view of Yitbarek (US Patent Application Publication No. 2020/0145419 A1).

Regarding Claim 1, Bali discloses a computing device, comprising: one or more processors configured to (Bali, Fig-1, ¶[0019], one or more processors): co-execute trusted processes and untrusted processes in an isolated manner that includes implementing a secure environment in which a set of security criteria is enforced for data of the trusted processes (Bali, Fig-1, ¶[0020]-¶[0022], the application execution environment includes an untrusted application execution environment and a trusted application execution environment. Unauthorized or unauthenticated users may access or modify data and applications in the untrusted application execution environment. For protecting confidential information from theft or misuse, the untrusted application execution environment may not be suitable for storing or processing biometric data.
¶[0023], the trusted execution environment includes data storage that is isolated from memory used by unauthorized processes. The trusted environment securely quarantines certain data, including biometric data); and

Bali does not explicitly discuss the following limitation, which Yitbarek teaches: multiple heterogenous hardware accelerators configured to (Yitbarek, Fig-1, ¶[0034], the accelerator may be an FPGA, ASIC, or GPU): implement exclaves of the secure environment that extend enforcement of one or more of the set of security criteria within the hardware accelerators for data distributed to the hardware accelerators for performance of tasks associated with the trusted processes (Yitbarek, ¶[0041], the provisioning agent may be configured to determine whether any other trusted execution environment is using the TIO device and securely command the TIO device to exit the trusted I/O mode. ¶[0044], each of the TEEs may be embodied as a trust domain. Each TEE is isolated or otherwise protected from the VMM, host partition, and other TEEs with hardware support of the computing device).

Bali and Yitbarek are analogous art because they are from the "same field of endeavor" and the same "problem-solving area"; namely, they pertain to the field of "trusted and untrusted communication." It would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the invention of Bali in view of Yitbarek to include the idea of accelerator devices that typically communicate directly with host devices and also directly with each other, to ensure that only accelerators within the same trust boundary can communicate and that a malicious accelerator cannot extract sensitive data (Yitbarek, ¶[0005]).
Regarding Claim 2, Bali in view of Yitbarek discloses the computing device of claim 1, wherein, to extend the enforcement of the one or more security criteria, one or more of the hardware accelerators are further configured to: restrict tasks associated with the untrusted processes from accessing memory regions assigned to the trusted processes (Bali, Fig-3, ¶[0032], the device does not permit applications or other processes executing in the untrusted environment, including the untrusted application, to directly access sensitive data that unauthorized users could exploit for improper purposes).

Regarding Claim 3, Bali in view of Yitbarek discloses the computing device of claim 2, wherein, to restrict the tasks, the one or more hardware accelerators are further configured to: store virtual-to-physical address translations for the memory regions assigned to the trusted processes and for memory regions assigned to the untrusted processes (Yitbarek, ¶[0031], the computing device may control access to particular memory pages by configuring one or more page tables. ¶[0112], the VMM, the host operating system, and the host driver may set registers and read and write memory addresses); and prevent the tasks associated with the untrusted processes from accessing the virtual-to-physical address translations for the memory regions assigned to the trusted processes (Bali, Fig-3, ¶[0032], the device does not permit applications or other processes executing in the untrusted environment, including the untrusted application, to directly access sensitive data that unauthorized users could exploit for improper purposes).
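As an editorial aside, the claim-3 mechanism discussed above (separate virtual-to-physical translations per trust domain, with untrusted tasks barred from the trusted mappings) can be sketched as a toy model. Everything below is illustrative; the class and method names are hypothetical and appear in neither the application nor the cited references:

```python
# Toy model of per-domain translation tables: an untrusted task can only
# consult the untrusted table, so trusted mappings are unreachable by
# construction rather than by a runtime permission check on each page.
class TranslationTables:
    def __init__(self):
        self.tables = {"trusted": {}, "untrusted": {}}

    def map_page(self, domain, vaddr, paddr):
        self.tables[domain][vaddr] = paddr

    def translate(self, domain, vaddr):
        table = self.tables[domain]  # tasks never see the other domain's table
        if vaddr not in table:
            raise PermissionError(f"{domain} task has no mapping for {vaddr:#x}")
        return table[vaddr]

mmu = TranslationTables()
mmu.map_page("trusted", 0x1000, 0x8000)
assert mmu.translate("trusted", 0x1000) == 0x8000

# An untrusted task attempting the same translation fails:
try:
    mmu.translate("untrusted", 0x1000)
except PermissionError as exc:
    print(exc)  # untrusted task has no mapping for 0x1000
```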
Regarding Claim 4, Bali in view of Yitbarek discloses the computing device of claim 2, wherein the one or more hardware accelerators include a graphics processing unit (GPU) configured to (Yitbarek, ¶[0034], ¶[0147], accelerators may be implemented as a GPU): maintain a table identifying the memory regions assigned to the trusted processes and memory regions assigned to the untrusted processes (Yitbarek, ¶[0127], the trusted agent may maintain a requester ID table or other data structure with bindings); and perform memory accesses in accordance with the table (Yitbarek, ¶[0099], a page table for each trust domain).

Regarding Claim 5, Bali in view of Yitbarek discloses the computing device of claim 1, wherein, to extend the enforcement of the one or more security criteria, one or more of the hardware accelerators are further configured to: physically isolate the distributed data associated with the trusted processes from distributed data associated with the untrusted processes (Bali, Fig-3, ¶[0032], the device does not permit applications or other processes executing in the untrusted environment, including the untrusted application, to directly access sensitive data that unauthorized users could exploit for improper purposes. Also Yitbarek, Fig-1, ¶[0034]).
Regarding Claim 6, Bali in view of Yitbarek discloses the computing device of claim 5, wherein the one or more hardware accelerators include pipelines configured to: perform tasks associated with the untrusted processes and tasks associated with the trusted processes, wherein the pipelines include one or more additional pipeline stages that process the distributed data associated with the trusted processes and do not process the distributed data associated with the untrusted processes (Bali, Fig-3, ¶[0033], in response to receiving a biometrics request from the untrusted application, the driver sends a command to the trusted biometric application via the trusted biometric service to generate a cryptographic key for encrypting biometric data captured from the biometric fingerprint sensor).

Regarding Claim 9, Bali in view of Yitbarek discloses the computing device of claim 5, wherein the one or more hardware accelerators include one or more additional data buffers configured to: store the distributed data associated with the trusted processes, wherein the one or more additional buffers do not store the distributed data associated with the untrusted processes (Bali, Fig-3, ¶[0032], the device does not permit applications or other processes executing in the untrusted environment, including the untrusted application, to directly access sensitive data that unauthorized users could exploit for improper purposes. Also Yitbarek, Fig-1, ¶[0034]).
Regarding Claim 11, Bali in view of Yitbarek discloses the computing device of claim 1, wherein, to extend the enforcement of the one or more security criteria, one or more of the hardware accelerators are further configured to: restrict sensor data provided to trusted processes from being provided to untrusted processes (Bali, Fig-3, ¶[0032], the device does not permit applications or other processes executing in the untrusted environment, including the untrusted application, to directly access sensitive data that unauthorized users could exploit for improper purposes).

Regarding Claim 12, Bali in view of Yitbarek discloses the computing device of claim 11, wherein the one or more hardware accelerators include an image signal processor configured to: process sensor data received from a camera sensor, wherein the image signal processor includes a cutoff switch configured to enable or disable providing the sensor data to an untrusted process in response to the one or more security criteria being satisfied (Bali, ¶[0020], the biometric component includes a sensor hub. Fig-3, ¶[0032], the device does not permit applications or other processes executing in the untrusted environment, including the untrusted application, to directly access sensitive data that unauthorized users could exploit for improper purposes).

Regarding Claim 13, Bali in view of Yitbarek discloses the computing device of claim 12, wherein the one or more security criteria include an indication being present in a frame being displayed to a user to indicate that the camera sensor is currently in use (Bali, ¶[0068], the biometric component includes a biometric sensor).
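The cutoff-switch limitation in claims 12-14 (sensor data flows to an untrusted process only while the security criteria hold, e.g. an in-use indicator is displayed) admits a minimal sketch. This is an editorial illustration under assumed semantics; the function and variable names are hypothetical:

```python
# Minimal sketch of the cutoff switch: a frame reaches an untrusted
# consumer only if every security criterion currently evaluates true.
def cutoff_switch(frame, criteria):
    """Return the frame for the untrusted process only if all criteria pass."""
    if all(check() for check in criteria):
        return frame
    return None  # switch open: data withheld from the untrusted process

indicator_shown = True                      # camera-in-use indicator on screen
criteria = [lambda: indicator_shown]

assert cutoff_switch(b"frame-data", criteria) == b"frame-data"

indicator_shown = False                     # indicator removed from the display
assert cutoff_switch(b"frame-data", criteria) is None
```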
Regarding Claim 14, Bali in view of Yitbarek discloses the computing device of claim 11, wherein the one or more hardware accelerators include an audio unit configured to: process sensor data received from a microphone sensor, wherein the audio unit includes a cutoff switch configured to enable or disable providing the sensor data to an untrusted process in response to the one or more security criteria being satisfied (Bali, ¶[0014], voice recognition sensor).

Regarding Claim 15, Bali in view of Yitbarek discloses the computing device of claim 1, further comprising: a system on a chip (SoC) that includes the one or more processors and the one or more hardware accelerators (Yitbarek, ¶[0032], the I/O subsystem may form a portion of a system on chip (SoC)).

Regarding Claim 16, Bali discloses a computing device, comprising: one or more processors (Bali, Fig-1, ¶[0019], one or more processors); memory having program instructions stored therein that are executable by the one or more processors to (Bali, Fig-1, ¶[0019], one or more processors): isolate co-executing trusted processes and untrusted processes (Bali, Fig-1, ¶[0020]-¶[0022], the application execution environment includes an untrusted application execution environment and a trusted application execution environment. Unauthorized or unauthenticated users may access or modify data and applications in the untrusted application execution environment. For protecting confidential information from theft or misuse, the untrusted application execution environment may not be suitable for storing or processing biometric data. ¶[0023], the trusted execution environment includes data storage that is isolated from memory used by unauthorized processes.
The trusted environment securely quarantines certain data, including biometric data); and

Bali does not explicitly discuss the following limitation, which Yitbarek teaches: a plurality of heterogenous hardware accelerators (Yitbarek, Fig-1, ¶[0034], the accelerator may be an FPGA, ASIC, or GPU); and distribute data to ones of the plurality of heterogenous hardware accelerators to perform tasks requested by the processes (Yitbarek, Fig-1, ¶[0034], the accelerator may be an FPGA, ASIC, or GPU); and wherein the heterogenous hardware accelerators are configured to: receive indications of a manner in which the trusted processes and untrusted processes are isolated (Yitbarek, ¶[0041], the provisioning agent may be configured to determine whether any other trusted execution environment is using the TIO device and securely command the TIO device to exit the trusted I/O mode. ¶[0044], each of the TEEs may be embodied as a trust domain. Each TEE is isolated or otherwise protected from the VMM, host partition, and other TEEs with hardware support of the computing device); and based on the received indications, extend isolation of the trusted and untrusted processes for co-executing tasks operating on the distributed data (Yitbarek, ¶[0041], the provisioning agent may be configured to determine whether any other trusted execution environment is using the TIO device and securely command the TIO device to exit the trusted I/O mode. ¶[0044], each of the TEEs may be embodied as a trust domain. Each TEE is isolated or otherwise protected from the VMM, host partition, and other TEEs with hardware support of the computing device).
Regarding Claim 17, Bali in view of Yitbarek discloses the computing device of claim 16, wherein, to extend isolation of the trusted and untrusted processes, one or more of the hardware accelerators are further configured to: store virtual-to-physical address translations for memory regions assigned to the trusted processes and for memory regions assigned to the untrusted processes (Yitbarek, ¶[0031], the computing device may control access to particular memory pages by configuring one or more page tables. ¶[0112], the VMM, the host operating system, and the host driver may set registers and read and write memory addresses); and prevent the tasks associated with the untrusted processes from accessing the virtual-to-physical address translations for the memory regions assigned to the trusted processes, to restrict tasks associated with the untrusted processes from accessing memory regions assigned to the trusted processes (Bali, Fig-3, ¶[0032], the device does not permit applications or other processes executing in the untrusted environment, including the untrusted application, to directly access sensitive data that unauthorized users could exploit for improper purposes).

Regarding Claim 18, Bali in view of Yitbarek discloses the computing device of claim 16, wherein, to extend isolation of the trusted and untrusted processes, one or more of the hardware accelerators are further configured to: physically isolate the distributed data associated with the trusted processes from distributed data associated with the untrusted processes (Bali, Fig-3, ¶[0032], the device does not permit applications or other processes executing in the untrusted environment, including the untrusted application, to directly access sensitive data that unauthorized users could exploit for improper purposes. Also Yitbarek, Fig-1, ¶[0034]).
Regarding Claim 19, Bali discloses a computing device, comprising: one or more processors configured to: co-execute trusted processes and untrusted processes such that the trusted processes are isolated from the untrusted processes (Bali, Fig-1, ¶[0020]-¶[0022], the application execution environment includes an untrusted application execution environment and a trusted application execution environment. Unauthorized or unauthenticated users may access or modify data and applications in the untrusted application execution environment. For protecting confidential information from theft or misuse, the untrusted application execution environment may not be suitable for storing or processing biometric data. ¶[0023], the trusted execution environment includes data storage that is isolated from memory used by unauthorized processes. The trusted environment securely quarantines certain data, including biometric data); and

Bali does not explicitly discuss the following limitation, which Yitbarek teaches: multiple heterogenous hardware accelerators configured to (Yitbarek, Fig-1, ¶[0034], the accelerator may be an FPGA, ASIC, or GPU): perform tasks requested by the trusted processes (Yitbarek, ¶[0041], the provisioning agent may be configured to determine whether any other trusted execution environment is using the TIO device and securely command the TIO device to exit the trusted I/O mode. ¶[0044], each of the TEEs may be embodied as a trust domain. Each TEE is isolated or otherwise protected from the VMM, host partition, and other TEEs with hardware support of the computing device); and negotiate conditions in which tasks requested by the untrusted processes are permitted to be performed (Yitbarek, ¶[0041], the provisioning agent may be configured to determine whether any other trusted execution environment is using the TIO device and securely command the TIO device to exit the trusted I/O mode. ¶[0044], each of the TEEs may be embodied as a trust domain.
Each TEE is isolated or otherwise protected from the VMM, host partition, and other TEEs with hardware support of the computing device).

Bali and Yitbarek are analogous art because they are from the "same field of endeavor" and the same "problem-solving area"; namely, they pertain to the field of "trusted and untrusted communication." It would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the invention of Bali in view of Yitbarek to include the idea of accelerator devices that typically communicate directly with host devices and also directly with each other, to ensure that only accelerators within the same trust boundary can communicate and that a malicious accelerator cannot extract sensitive data (Yitbarek, ¶[0005]).

Regarding Claim 20, Bali in view of Yitbarek discloses the computing device of claim 19, wherein one of the negotiated conditions includes a notification to a user indicative of a task requested by one of the untrusted processes (Bali, Fig-3, ¶[0032], the device does not permit applications or other processes executing in the untrusted environment, including the untrusted application, to directly access sensitive data that unauthorized users could exploit for improper purposes).

Claims 7-8 are rejected under 35 U.S.C. 103 as being unpatentable over Bali (US Patent Application Publication No. 2016/0364559 A1) in view of Yitbarek (US Patent Application Publication No. 2020/0145419 A1), and further in view of Bates (US Patent No. 8,493,399 B1).
Regarding Claim 7, Bali in view of Yitbarek does not disclose the following limitation, which Bates teaches: the computing device of claim 6, wherein the one or more hardware accelerators include a display unit having a display pipeline configured to: render frames for a display of the computing device, wherein the display pipeline includes a blend pipeline stage configured to: blend pixel data received from a trusted process into a frame being rendered by the display pipeline and including pixel data from an untrusted process (Bates, col. 3, lines 10-25, application 110 may be a graphics application that processes trusted and untrusted application content. Application 110 may include an FPU that renders frames from application content and an SPU that renders frames from application content. Col. 3, lines 50-60, the FPU may render or generate a frame).

Bali in view of Yitbarek and Bates are analogous art because they are from the "same field of endeavor" and the same "problem-solving area"; namely, they pertain to the field of "trusted and untrusted communication." It would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the invention of Bali in view of Yitbarek and Bates to include the idea of a rendering process that renders a frame of data from an application onto a buffer, after which a GPU renders the buffered data (Bates, col. 1, lines 10-20).
Regarding Claim 8, Bali in view of Yitbarek and Bates discloses the computing device of claim 6, wherein the one or more hardware accelerators include a display unit having a display pipeline configured to: render frames for a display of the computing device, wherein the display pipeline includes an extraction pipeline stage configured to: extract pixel data from a frame being rendered by the display pipeline; and provide the extracted pixel data to a trusted process (Bates, col. 3, lines 10-25, application 110 may be a graphics application that processes trusted and untrusted application content. Application 110 may include an FPU that renders frames from application content and an SPU that renders frames from application content. Col. 3, lines 50-60, the FPU may render or generate a frame).

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Bali (US Patent Application Publication No. 2016/0364559 A1) in view of Yitbarek (US Patent Application Publication No. 2020/0145419 A1), and further in view of Chen (US Patent Application Publication No. 2020/0050945 A1).

Regarding Claim 10, Bali in view of Yitbarek does not disclose the following limitation, which Chen discloses: the computing device of claim 9, wherein the one or more hardware accelerators include a neural engine configured to: perform a set of neural network operations, wherein the neural engine includes the one or more data buffers to store distributed data for tasks associated with the trusted processes (Chen, ¶[0048], a deep learning method to train an initial neural network based on a training set, where the training set is an untrusted data set (i.e., not from a trusted source)).

Bali in view of Yitbarek and Chen are analogous art because they are from the "same field of endeavor" and the same "problem-solving area"; namely, they pertain to the field of "trusted and untrusted communication."
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the invention of Bali in view of Yitbarek and Chen to include the idea of training a machine learning model to perform a particular task using trusted and untrusted data (Chen, ¶[0026]).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure (see form PTO-892).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to WASIKA NIPA, whose telephone number is (571) 272-8923. The examiner can normally be reached M-F, 8 am to 5 pm. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jeffrey Pwu, can be reached at 571-272-6798. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, Applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/WASIKA NIPA/
Primary Examiner, Art Unit 2433

Prosecution Timeline

Jul 31, 2024
Application Filed
Sep 11, 2025
Response after Non-Final Action
Feb 21, 2026
Non-Final Rejection — §101, §103
Apr 13, 2026
Examiner Interview Summary
Apr 13, 2026
Applicant Interview (Telephonic)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12592965
SECURITY SCORING FOR TYPOGRAPHICAL ERRORS
2y 5m to grant Granted Mar 31, 2026
Patent 12587857
SIGNAL SPOOF DETECTION AT BASE STATIONS
2y 5m to grant Granted Mar 24, 2026
Patent 12585807
AUTHORIZATION AUDIT FOR ACCESS TO PRIVILEGED USER DATA
2y 5m to grant Granted Mar 24, 2026
Patent 12587847
ENABLING COORDINATED IDENTITY MANAGEMENT BETWEEN AN OPERATOR-MANAGED MOBILE-EDGE PLATFORM AND AN EXTERNAL NETWORK
2y 5m to grant Granted Mar 24, 2026
Patent 12574367
ESTABLISHING A DATA SUBSCRIPTION FOR UTILITY USAGE INFORMATION
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
75%
Grant Probability
99%
With Interview (+29.7%)
2y 11m
Median Time to Grant
Low
PTA Risk
Based on 302 resolved cases by this examiner. Grant probability derived from career allow rate.
