Prosecution Insights
Last updated: April 19, 2026
Application No. 18/891,195

THREAT MITIGATION SYSTEM AND METHOD

Non-Final OA: §103, §DP
Filed: Sep 20, 2024
Examiner: TOLENTINO, RODERICK
Art Unit: 2439
Tech Center: 2400 — Computer Networks
Assignee: Reliaquest Holdings LLC
OA Round: 1 (Non-Final)
Grant Probability: 77% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 4m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 77% (545 granted / 705 resolved; +19.3% vs. TC average) — above average
Interview Lift: +35.4% (allowance rate with vs. without an interview, across resolved cases)
Typical Timeline: 3y 4m average prosecution; 25 applications currently pending
Career History: 730 total applications across all art units

Statute-Specific Performance

§101: 15.7% (-24.3% vs. TC average)
§103: 56.2% (+16.2% vs. TC average)
§102: 11.9% (-28.1% vs. TC average)
§112: 8.3% (-31.7% vs. TC average)
Tech Center averages are estimates. Based on career data from 705 resolved cases.
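These per-statute figures are internally consistent. Assuming each delta is the examiner's rate minus the Tech Center average (an assumption about the dashboard's methodology, not a documented definition), every row implies the same TC baseline of roughly 40%, which is consistent with a single TC-average reference value. A quick arithmetic check, with the figures copied from the card above:

```python
# Examiner per-statute rejection rates (%) and reported deltas vs. the
# Tech Center average, copied from the Statute-Specific Performance card.
rates = {"101": 15.7, "103": 56.2, "102": 11.9, "112": 8.3}
deltas = {"101": -24.3, "103": 16.2, "102": -28.1, "112": -31.7}

# Implied TC average per statute: examiner rate minus the reported delta.
implied_tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
# Every statute implies the same ~40.0% baseline.
```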

Office Action

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Detailed Action

This Office Action is in response to the instant Application 18/891,195 filed on 9/20/2024. Claims 61-90 are pending. This Office Action is Non-Final.

Information Disclosure Statement

The information disclosure statements (IDSs), submitted on 10/2/2024, 12/27/2024, 3/3/2025, 6/2/2025, 9/23/2025, 12/8/2025 and 2/13/2026, are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission.
For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 61, 71 and 81 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 9 and 17 of U.S. Patent No. 11,588,838. Although the claims at issue are not identical, they are not patentably distinct from each other because all the limitations of claims 61, 71 and 81 of the instant Application, with regard to a threat mitigation system using a holistic/comprehensive report and the limitations therein, are met and anticipated by the limitations recited in claims 1, 9 and 17 of U.S. Patent No. 11,588,838. Claims 62-70, 72-80 and 82-90 are also rejected under double patenting for similar reasons, respectively, as they depend from claims 61, 71 and 81 and therefore inherit the rejection of the independent claims.

Claims 61, 71 and 81 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 10 and 19 of U.S. Patent No. 11,687,659. Although the claims at issue are not identical, they are not patentably distinct from each other because all the limitations of claims 61, 71 and 81 of the instant Application, with regard to a threat mitigation system using a holistic/comprehensive report and the limitations therein, are met and anticipated by the limitations recited in claims 1, 10 and 19 of U.S. Patent No. 11,687,659. Claims 62-70, 72-80 and 82-90 are also rejected under double patenting for similar reasons, respectively, as they depend from claims 61, 71 and 81 and therefore inherit the rejection of the independent claims.

Claims 61, 71 and 81 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 11 and 21 of U.S. Patent No. 12,229,276.
Although the claims at issue are not identical, they are not patentably distinct from each other because all the limitations of claims 61, 71 and 81 of the instant Application, with regard to a threat mitigation system using a holistic/comprehensive report and the limitations therein, are met and anticipated by the limitations recited in claims 1, 11 and 21 of U.S. Patent No. 12,229,276. Claims 62-70, 72-80 and 82-90 are also rejected under double patenting for similar reasons, respectively, as they depend from claims 61, 71 and 81 and therefore inherit the rejection of the independent claims.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors.
In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 61, 63-65, 68-71, 73-75, 78-81, 83-85 and 88-90 are rejected under 35 U.S.C. 103 as being unpatentable over Herrod (US 2010/0333168) in view of Patnaik et al. (US 2017/0329701).

As per claim 61, Herrod teaches a computer-implemented method, executed on a computing device, comprising: obtaining system-defined consolidated platform information for a computing platform from an independent information source, the system-defined consolidated platform information defining one or more security-relevant subsystems present on the computing platform (Herrod, Paragraph 0109 recites “The system 400 includes a Settings Objects application module 410, a settings log 420, a Settings Objects verification module 440, platform and application subsystems 450, an automated security assessment module 470 and a security assessment log 490. In one implementation, modules 410, 440, 470 are executable code that might be deployed as part of a single or multiple Control Modules, depending on how the functionality is partitioned for deployment.”).
Herrod, however, fails to teach obtaining application performance information concerning one or more applications deployed within the computing platform; generating a holistic platform report concerning the computing platform based, wherein the holistic platform report identifies one or more known conditions concerning the computer platform; and effectuating one or more remedial operations concerning the one or more known conditions.

However, in an analogous art, Patnaik teaches obtaining application performance information concerning one or more applications deployed within the computing platform; generating a holistic platform report concerning the computing platform based (Patnaik, Paragraph 0024 recites “In operation, when an application developer develops an application, such as a mobile app, the application can be uploaded to the server 102 for analysis by the pre-release analysis tool 112. Upon completion of the application analysis, the report 114 is generated and can be made available for the application developer. The report is accessed through a robust, informative, and rich user interface that provides comprehensive and detailed information regarding such things as application crashes, performance, security, usability, statistics, code warnings, localization issues, network issues, and the like. In addition, the report provides actionable information that can be used by developer to ensure that their application is more secure, reliable, efficient, and performant.”), wherein the holistic platform report identifies one or more known conditions concerning the computer platform (Patnaik, Paragraph 0031 recites “Dynamic analysis can also test applications for malicious behavior. Vulnerability scanning can be used to detect things such as the use of bad libraries, exposed private keys, and improper SSL. JavaScript injection and penetration tests can also be performed during dynamic analysis, as well as analysis of various data storage parameter issues such as unsafe local data storage, unencrypted storage, whether external storage is used, and the like. Dynamic analysis can also be used to check advertising ID policies, detect API violations, blacklisted advertising SDKs, advertising SDKs that violate particular policies, malformed files such as manifest XML files, transmission of unauthorized data over unencrypted connections, and the like. Dynamic analysis can also test for malware running on the device, check various compliances, whether certificates are valid and being used properly.”); and effectuating one or more remedial operations concerning the one or more known conditions (Patnaik, Paragraph 0022 recites “Actionable feedback can include feedback that suggests various remedial measures that an application developer may put in place in order to improve their applications.”).

It would have been obvious to a person of ordinary skill in the art, before the earliest effective filing date, to use Patnaik’s Application Pre-Release Report with Herrod’s Methods and apparatus for rating device security and automatically assessing security compliance because it offers the advantage of making a comprehensive analysis of applications before being launched.

As per claim 63, Herrod in combination with Patnaik teaches the computer-implemented method of claim 61. Patnaik further teaches obtaining hardware performance information concerning operation and/or functionality of one or more hardware systems deployed within the computing platform (Patnaik, Paragraph 0024 recites “In operation, when an application developer develops an application, such as a mobile app, the application can be uploaded to the server 102 for analysis by the pre-release analysis tool 112. Upon completion of the application analysis, the report 114 is generated and can be made available for the application developer. The report is accessed through a robust, informative, and rich user interface that provides comprehensive and detailed information regarding such things as application crashes, performance, security, usability, statistics, code warnings, localization issues, network issues, and the like. In addition, the report provides actionable information that can be used by developer to ensure that their application is more secure, reliable, efficient, and performant.”). It would have been obvious to a person of ordinary skill in the art, before the earliest effective filing date, to use Patnaik’s Application Pre-Release Report with Herrod’s Methods and apparatus for rating device security and automatically assessing security compliance because it offers the advantage of making a comprehensive analysis of applications before being launched.

As per claim 64, Herrod in combination with Patnaik teaches the computer-implemented method of claim 61. Patnaik further teaches obtaining platform performance information concerning operation and/or functionality of the computing platform (Patnaik, Paragraph 0024 recites “In operation, when an application developer develops an application, such as a mobile app, the application can be uploaded to the server 102 for analysis by the pre-release analysis tool 112. Upon completion of the application analysis, the report 114 is generated and can be made available for the application developer. The report is accessed through a robust, informative, and rich user interface that provides comprehensive and detailed information regarding such things as application crashes, performance, security, usability, statistics, code warnings, localization issues, network issues, and the like. In addition, the report provides actionable information that can be used by developer to ensure that their application is more secure, reliable, efficient, and performant.”). It would have been obvious to a person of ordinary skill in the art, before the earliest effective filing date, to use Patnaik’s Application Pre-Release Report with Herrod’s Methods and apparatus for rating device security and automatically assessing security compliance because it offers the advantage of making a comprehensive analysis of applications before being launched.

As per claim 65, Herrod in combination with Patnaik teaches the computer-implemented method of claim 61. Herrod further teaches obtaining platform performance information including: obtaining client-defined consolidated platform information for the computing platform from a client information source, the client-defined consolidated platform information defining one or more security-relevant subsystems that the client believes are present on the computing platform (Herrod, Paragraph 0109 recites “The system 400 includes a Settings Objects application module 410, a settings log 420, a Settings Objects verification module 440, platform and application subsystems 450, an automated security assessment module 470 and a security assessment log 490. In one implementation, modules 410, 440, 470 are executable code that might be deployed as part of a single or multiple Control Modules, depending on how the functionality is partitioned for deployment.” and Paragraph 0006 recites “Once mobile devices are up and running, MSP includes automated provisioning functionality to keep applications, device settings, operating systems and firmware on all mobile devices up to date with minimal effort or interaction from the end-user. Policy-based provisioning and over-the-air update capabilities can greatly reduce time and cost required to keep devices updated. Network administrators can keep devices updated by setting policies that define when mobile devices should upload their current status information (e.g., a complete inventory of all software on the device including applications as well as operating system information and device settings) to their associated relay server. Devices can be grouped by device type, type of user, operating system and location thereby providing the granular management capabilities needed to achieve maximum efficiency in the provisioning function.”).

As per claim 68, Herrod in combination with Patnaik teaches the computer-implemented method of claim 61. Herrod further teaches obtaining platform performance information concerning operation of the computing platform including: obtaining consolidated platform information to identify current security-relevant capabilities for the computing platform; determining possible security-relevant capabilities for the computing platform; and generating comparison information that compares the current security-relevant capabilities of the computing platform to the possible security-relevant capabilities of the computing platform to identify security-relevant deficiencies (Herrod, Paragraph 0111 recites “The set of Settings Objects 225-1 . . . 225-n having an expected overall device security rating (ODSR) 350, the particular security interaction template 320 and the overall security test cases 360 are generated using the systems and processing described above with respect to FIGS. 2 and 3, and are deployed to the device 120. It is assumed that the set of Settings Objects 225-1 . . . 225-n have an expected ODSR 350 that has been determined to be acceptable by the user of computer 130. The device 120 can use the SIT 320 to calculate an Actual ODSR based on the Settings Objects that are actually applied on the device 120. If the Actual ODSR is sent to the control server, it can be compared with the expected ODSR 350 to see how well actual security of the device 120 compares to the expected security.”).

As per claim 69, Herrod in combination with Patnaik teaches the computer-implemented method of claim 61. Herrod further teaches obtaining platform performance information concerning operation of the computing platform including: obtaining consolidated platform information to identify current security-relevant capabilities for the computing platform; determining comparative platform information that identifies security-relevant capabilities for a comparative platform; and generating comparison information that compares the current security-relevant capabilities of the computing platform to the comparative platform information of the comparative platform to identify a threat context indicator (Herrod, Paragraph 0111 recites “The set of Settings Objects 225-1 . . . 225-n having an expected overall device security rating (ODSR) 350, the particular security interaction template 320 and the overall security test cases 360 are generated using the systems and processing described above with respect to FIGS. 2 and 3, and are deployed to the device 120. It is assumed that the set of Settings Objects 225-1 . . . 225-n have an expected ODSR 350 that has been determined to be acceptable by the user of computer 130. The device 120 can use the SIT 320 to calculate an Actual ODSR based on the Settings Objects that are actually applied on the device 120. If the Actual ODSR is sent to the control server, it can be compared with the expected ODSR 350 to see how well actual security of the device 120 compares to the expected security.”).
As per claim 70, Herrod in combination with Patnaik teaches the computer-implemented method of claim 61. Herrod further teaches wherein the application performance information concerns operation and/or functionality of one or more software applications deployed within the computing platform (Herrod, Paragraph 0109 recites “The system 400 includes a Settings Objects application module 410, a settings log 420, a Settings Objects verification module 440, platform and application subsystems 450, an automated security assessment module 470 and a security assessment log 490. In one implementation, modules 410, 440, 470 are executable code that might be deployed as part of a single or multiple Control Modules, depending on how the functionality is partitioned for deployment.”).

Regarding claims 71 and 81, claims 71 and 81 are directed to a non-transitory readable medium and a system associated with the method of claim 61. Claims 71 and 81 are of similar scope to claim 61, and are therefore rejected under similar rationale.

Regarding claims 73 and 83, claims 73 and 83 are directed to a non-transitory readable medium and a system associated with the method of claim 63. Claims 73 and 83 are of similar scope to claim 63, and are therefore rejected under similar rationale.

Regarding claims 74 and 84, claims 74 and 84 are directed to a non-transitory readable medium and a system associated with the method of claim 64. Claims 74 and 84 are of similar scope to claim 64, and are therefore rejected under similar rationale.

Regarding claims 75 and 85, claims 75 and 85 are directed to a non-transitory readable medium and a system associated with the method of claim 65. Claims 75 and 85 are of similar scope to claim 65, and are therefore rejected under similar rationale.

Regarding claims 78 and 88, claims 78 and 88 are directed to a non-transitory readable medium and a system associated with the method of claim 68.
Claims 78 and 88 are of similar scope to claim 68, and are therefore rejected under similar rationale.

Regarding claims 79 and 89, claims 79 and 89 are directed to a non-transitory readable medium and a system associated with the method of claim 69. Claims 79 and 89 are of similar scope to claim 69, and are therefore rejected under similar rationale.

Regarding claims 80 and 90, claims 80 and 90 are directed to a non-transitory readable medium and a system associated with the method of claim 70. Claims 80 and 90 are of similar scope to claim 70, and are therefore rejected under similar rationale.

Claims 62, 66, 67, 72, 76, 77, 82, 86 and 87 are rejected under 35 U.S.C. 103 as being unpatentable over Herrod (US 2010/0333168) and Patnaik et al. (US 2017/0329701), and further in view of Kraft (US 2013/0191898).

As per claim 62, Herrod in combination with Patnaik teaches the computer-implemented method of claim 61 but fails to teach providing the holistic report to a third-party. However, in an analogous art, Kraft teaches providing the holistic report to a third-party (Kraft, Paragraph 0025 recites “As described below, a True Online Credential (TOC) may be generated, which may serve as a comprehensive report or body of data summarizing the information stored in first/second/third-party data sources and which may otherwise be available to others about the subject.”). It would have been obvious to a person of ordinary skill in the art, before the earliest effective filing date, to use Kraft’s Identity verification credential with continuous verification and intention-based authentication systems and methods with Herrod’s Methods and apparatus for rating device security and automatically assessing security compliance because it offers the advantage of storing the report in a secure location.
As per claim 66, Herrod in combination with Patnaik teaches the computer-implemented method of claim 65. Herrod further teaches wherein obtaining platform performance information concerning operation of the computing platform includes: obtaining the system-defined consolidated platform information for the computing platform from the independent information source; obtaining the client-defined consolidated platform information for the computing platform from the client information source (Herrod, Paragraph 0109 recites “The system 400 includes a Settings Objects application module 410, a settings log 420, a Settings Objects verification module 440, platform and application subsystems 450, an automated security assessment module 470 and a security assessment log 490. In one implementation, modules 410, 440, 470 are executable code that might be deployed as part of a single or multiple Control Modules, depending on how the functionality is partitioned for deployment.”).

Herrod, however, fails to teach presenting differential consolidated platform information for the computing platform to a third-party. However, in an analogous art, Kraft teaches presenting differential consolidated platform information for the computing platform to a third-party (Kraft, Paragraph 0025 recites “As described below, a True Online Credential (TOC) may be generated, which may serve as a comprehensive report or body of data summarizing the information stored in first/second/third-party data sources and which may otherwise be available to others about the subject.”). It would have been obvious to a person of ordinary skill in the art, before the earliest effective filing date, to use Kraft’s Identity verification credential with continuous verification and intention-based authentication systems and methods with Herrod’s Methods and apparatus for rating device security and automatically assessing security compliance because it offers the advantage of storing the report in a secure location.
As per claim 67, Herrod in combination with Patnaik teaches the computer-implemented method of claim 61. Herrod further teaches obtaining platform performance information concerning operation of the computing platform including: obtaining consolidated platform information for the computing platform to identify one or more deployed security-relevant subsystems; processing the consolidated platform information to identify one or more non-deployed security-relevant subsystems; generating a list of ranked & recommended security-relevant subsystems that ranks the one or more non-deployed security-relevant subsystems (Herrod, Paragraph 0109 recites “The system 400 includes a Settings Objects application module 410, a settings log 420, a Settings Objects verification module 440, platform and application subsystems 450, an automated security assessment module 470 and a security assessment log 490. In one implementation, modules 410, 440, 470 are executable code that might be deployed as part of a single or multiple Control Modules, depending on how the functionality is partitioned for deployment.”).

Herrod, however, fails to teach providing the list of ranked & recommended security-relevant subsystems to a third-party. However, in an analogous art, Kraft teaches providing the list of ranked & recommended security-relevant subsystems to a third-party (Kraft, Paragraph 0025 recites “As described below, a True Online Credential (TOC) may be generated, which may serve as a comprehensive report or body of data summarizing the information stored in first/second/third-party data sources and which may otherwise be available to others about the subject.”).
It would have been obvious to a person of ordinary skill in the art, before the earliest effective filing date, to use Kraft’s Identity verification credential with continuous verification and intention-based authentication systems and methods with Herrod’s Methods and apparatus for rating device security and automatically assessing security compliance because it offers the advantage of storing the report in a secure location.

Regarding claims 72 and 82, claims 72 and 82 are directed to a non-transitory readable medium and a system associated with the method of claim 62. Claims 72 and 82 are of similar scope to claim 62, and are therefore rejected under similar rationale.

Regarding claims 76 and 86, claims 76 and 86 are directed to a non-transitory readable medium and a system associated with the method of claim 66. Claims 76 and 86 are of similar scope to claim 66, and are therefore rejected under similar rationale.

Regarding claims 77 and 87, claims 77 and 87 are directed to a non-transitory readable medium and a system associated with the method of claim 67. Claims 77 and 87 are of similar scope to claim 67, and are therefore rejected under similar rationale.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RODERICK TOLENTINO, whose telephone number is (571) 272-2661. The examiner can normally be reached Mon-Fri, 8am-4pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Luu Pham, can be reached at 571-270-5002. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RODERICK TOLENTINO/
Primary Examiner, Art Unit 2439
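For orientation, the rejection characterizes independent claim 61 as a four-step flow: obtain system-defined consolidated platform information identifying security-relevant subsystems, obtain application performance information, generate a holistic platform report identifying known conditions, and effectuate remedial operations for those conditions. The sketch below illustrates that flow in Python; every function name, threshold, and data shape is hypothetical, invented purely for illustration, and is not the applicant's implementation or the cited art's:

```python
# Hypothetical sketch of the claim 61 flow; all names are invented for illustration.

def obtain_consolidated_platform_info(source):
    # System-defined consolidated platform information from an independent
    # information source, listing security-relevant subsystems.
    return {"subsystems": source.get("subsystems", [])}

def obtain_app_performance_info(apps):
    # Application performance information for applications deployed on the platform.
    return {name: {"crash_rate": data.get("crash_rate", 0.0)}
            for name, data in apps.items()}

def generate_holistic_report(platform_info, perf_info):
    # Holistic platform report identifying one or more known conditions.
    conditions = []
    if not platform_info["subsystems"]:
        conditions.append("no security-relevant subsystems detected")
    for app, stats in perf_info.items():
        if stats["crash_rate"] > 0.05:  # illustrative threshold only
            conditions.append(f"{app}: elevated crash rate")
    return {"known_conditions": conditions}

def effectuate_remediation(report):
    # One remedial operation per known condition.
    return [f"remediate: {c}" for c in report["known_conditions"]]

source = {"subsystems": ["firewall", "ids"]}
apps = {"agent": {"crash_rate": 0.10}, "console": {"crash_rate": 0.01}}
report = generate_holistic_report(
    obtain_consolidated_platform_info(source),
    obtain_app_performance_info(apps),
)
actions = effectuate_remediation(report)
```

The sketch only makes the claim elements concrete; the claim itself is not limited to any particular data shapes, thresholds, or remediation mechanics.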

Prosecution Timeline

Sep 20, 2024: Application Filed
Mar 26, 2026: Non-Final Rejection, §103 and §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603907, SERVER AND METHOD FOR PROVIDING ONLINE THREAT DATA BASED ON USER-CUSTOMIZED KEYWORDS FOR PRIVATE CHANNEL (granted Apr 14, 2026; 2y 5m to grant)
Patent 12592915, INFERENCE-BASED SELECTIVE FLOW INSPECTION (granted Mar 31, 2026; 2y 5m to grant)
Patent 12580946, SYSTEMS AND METHODS FOR TRIGGERING TOKEN ALERTS (granted Mar 17, 2026; 2y 5m to grant)
Patent 12580948, CYBERSECURITY OPERATIONS MITIGATION MANAGEMENT (granted Mar 17, 2026; 2y 5m to grant)
Patent 12572632, SYSTEMS AND METHODS FOR DATA SECURITY MODEL MODIFICATION AND ANOMALY DETECTION (granted Mar 10, 2026; 2y 5m to grant)
Study what changed to get these cases past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 77%
With Interview: 99% (+35.4%)
Median Time to Grant: 3y 4m
PTA Risk: Low

Based on 705 resolved cases by this examiner. Grant probability derived from career allow rate.
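The headline figures can be reproduced from the reported counts. The with/without-interview split below assumes the dashboard's "+35.4% lift" is a percentage-point difference between the two outcome subgroups; that decomposition is an assumption about its methodology, not a documented definition:

```python
# Career allow rate from the reported counts: 545 granted of 705 resolved.
granted, resolved = 545, 705
allow_rate = granted / resolved  # ~0.773, displayed as 77%

# Reported with-interview probability and lift (assumed to be percentage
# points); the implied without-interview rate follows by subtraction.
with_interview = 0.99
lift = 0.354
without_interview = with_interview - lift  # ~0.636 under this assumption
```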
