Prosecution Insights
Last updated: April 19, 2026
Application No. 17/572,730

Forensics Analysis for Malicious Insider Attack Attribution based on Activity Monitoring and Behavioral Biometrics Profiling

Status: Final Rejection — §103
Filed: Jan 11, 2022
Examiner: CATTUNGAL, DEREENA T
Art Unit: 2431
Tech Center: 2400 — Computer Networks
Assignee: Plurilock Security Solutions Inc.
OA Round: 4 (Final)
Grant Probability: 80% (Favorable)
Expected OA Rounds: 5-6
Time to Grant: 2y 9m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 80% (above average; 218 granted / 272 resolved; +22.1% vs TC avg)
Interview Lift: +30.0% (allow rate among resolved cases with an interview vs. without)
Typical Timeline: 2y 9m average prosecution; 28 applications currently pending
Career History: 300 total applications across all art units

Statute-Specific Performance

§101: 7.0% (-33.0% vs TC avg)
§103: 48.9% (+8.9% vs TC avg)
§102: 14.3% (-25.7% vs TC avg)
§112: 14.1% (-25.9% vs TC avg)
Deltas are against the Tech Center average estimate • Based on career data from 272 resolved cases

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

2. In the response filed 01/20/2026, independent claims 1, 15, and 20 were amended and new claims 22-24 were added; this is hereby acknowledged.

3. Applicant's arguments with respect to independent claims 1, 15, and 20 have been fully considered but are moot in view of the new ground of rejection.

4. Applicant argues that the prior art of record does not disclose the newly amended features of the independent claims, which recite: "identifying biometric data associated with the electronic device during the intrusion, wherein the biometric data comprises one or more behavioral signals; identifying a subset of organization insiders potentially involved in the intrusion based on the biometric data, the subset comprising the second user, wherein identifying the subset is based on comparing intrusion biometric data captured during the intrusion against stored biometric profiles for multiple organization insiders to determine similarity scores for the multiple organization insiders".

5. Examiner would like to point out that the new secondary reference Wu (US Pub. No. 2021/0152549) teaches the above claimed limitation (see the rejection below).

Claim Rejections - 35 USC § 103

6. The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

7. Claims 1-15 and 19-24 are rejected under 35 U.S.C. 103 as being unpatentable over Mosby (US Pub. No. 2023/0070546) in view of Wu (US Pub. No. 2021/0152549).

8. Regarding claims 1, 15, and 20, Mosby teaches a method, a system, and a non-transitory computer-readable storage medium comprising: at a processor, detecting an intrusion at an electronic device accessing non-public information or systems of an organization, wherein the intrusion comprises a deviation from expected activity at the electronic device; wherein the intrusion comprises a second user accessing the non-public information or systems using access information of a first user inside the organization; identifying biometric data associated with the electronic device during the intrusion, wherein the data comprises one or more behavioral signals; identifying that the intrusion involved a subset of organization insiders based on the biometric data; and providing a report based on the subset of organization insiders.

(Fig. 7 and Para: 0097-0098 teach an enterprise in which many users bring their own devices 104 and in which the devices 104 access the data and services 110 of the enterprise by way of external networks. Access from external networks and by possibly compromised devices 104 leaves an enterprise vulnerable to attacks by malicious insiders, compromised accounts, data exfiltration, and account privilege escalation. To improve the security of a device 104, a machine learning model 702 may be trained using data describing typical or baseline behavior of the user, such as historical authentication behavior 704 and historical usage behavior 706.

Para: 0099 teaches that historical authentication behavior 704 describes aspects of the user's behavior when authenticating and may include such things as a number of failed attempts (e.g., per week, month, or other time interval), typical login times, typical login locations, the network from which the user logs in using the device 104, or other behavior describing circumstances of the user logging in; the authentication behavior also includes biometric data (see claims 9-11).

Para: 0100-0101 teach that historical usage behavior 706 may include data describing such things as typical device holding, typical device posture during use, typical typing behavior, typical tapping behavior, typical application usage (which applications, what times, how often, etc.), and typical browsing behaviors (typical sites, categories of sites, topics of sites, etc.). As known in the art, some behaviors (which include various behavioral biometrics) may indicate theft of a device. Accordingly, baseline values for these threat-indicating behaviors may be included in the usage behavior 706. The data 704, 706 may be input to a machine learning algorithm executing on the device 104 of the user to train the machine learning model 702 to identify typical authentication and usage behavior. The machine learning algorithm and machine learning model 702 may include any approach known in the art for performing anomaly detection. The machine learning model 702 may be trained over time as additional data 704, 706 is collected.

Fig. 9 and Para: 0155-0158 teach that the method 900 may include locally training 904 a machine learning model 702 to detect anomalies in authentication behavior and usage behavior on the device 104. The current activities 708 of the user may then be collected periodically over a predefined time window and evaluated 906 with respect to the machine learning model 702 to obtain a behavior security score 710, which may be a pair of scores (authentication score and usage score). The method 900 may include evaluating 908, 910, 912 application risk, network risk, and user behavior risk according to the application behavior 716, network behavior 718, and browsing behavior 720. The behavior security score 710 may be combined with sub-scores based on the application risk, network risk, and behavior risk to generate 914 the device security score 712. Note that any one of the scores, including the behavior security score 710 and the sub-scores indicating application risk, network risk, and behavior risk, can be valuable to characterize the security risk of a device 104. The device security score 712 may be transmitted 916 to the server 102. The server 102 receives the security score 712 and selects corresponding security actions for the access controls 802, 804. The server 102 transmits a response to the device security score 712 to the device 104. The device 104 receives 918 the response and implements 920 the security actions indicated in the response. This may include programming the access controls 802, 804 to remove or add restrictions on access to third-party sites and enterprise services 110 or third-party services provided to the enterprise.)
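For orientation, the following is a minimal, illustrative sketch of the kind of pipeline Mosby's cited paragraphs describe: a model of baseline authentication/usage behavior scores a current activity window, and the behavior score is combined with risk sub-scores into a device security score. This is not Mosby's implementation; the feature set, the z-score anomaly detector, and the combination weights are assumptions for illustration only.

```python
# Illustrative only: a toy version of the scoring pipeline described in
# Mosby's cited paragraphs. The features, the z-score detector, and the
# weights below are assumptions, not Mosby's implementation.
import numpy as np

FEATURES = ["failed_logins", "login_hour", "typing_speed_cpm", "taps_per_min"]

def fit_baseline(history: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Learn per-feature mean/std from historical behavior (rows = time windows)."""
    return history.mean(axis=0), history.std(axis=0) + 1e-9

def behavior_score(window: np.ndarray, mu: np.ndarray, sigma: np.ndarray) -> float:
    """Score in 0..1; higher means the window deviates more from baseline."""
    z = np.abs((window - mu) / sigma)      # per-feature deviation from baseline
    return float(1.0 - np.exp(-z.mean()))  # squash the mean |z| into 0..1

def device_security_score(behavior: float, app_risk: float,
                          network_risk: float, browsing_risk: float) -> float:
    """Combine the behavior score with risk sub-scores (weights illustrative)."""
    w = np.array([0.4, 0.2, 0.2, 0.2])
    return float(w @ np.array([behavior, app_risk, network_risk, browsing_risk]))

# Example: a month of baseline windows, then one suspicious window.
rng = np.random.default_rng(0)
baseline = rng.normal([1, 14, 280, 30], [1, 2, 25, 5], size=(30, 4))
mu, sigma = fit_baseline(baseline)
current = np.array([9, 3, 120, 70])        # many failures, 3 AM, atypical cadence
b = behavior_score(current, mu, sigma)
print(f"behavior={b:.2f}  device={device_security_score(b, 0.3, 0.5, 0.2):.2f}")
```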
Mosby teaches all of the above claimed limitations but does not expressly teach identifying biometric data associated with the electronic device during the intrusion, wherein the biometric data comprises one or more behavioral signals; identifying a subset of organization insiders potentially involved in the intrusion based on the biometric data, the subset comprising the second user, wherein identifying the subset is based on comparing intrusion biometric data captured during the intrusion against stored biometric profiles for multiple organization insiders to determine similarity scores for the multiple organization insiders.

Wu teaches the intrusion comprising a second user accessing the non-public information or systems using access information of a first user; identifying biometric data associated with the electronic device during the intrusion, wherein the biometric data comprises one or more behavioral signals; and identifying a subset of organization insiders potentially involved in the intrusion based on the biometric data, the subset comprising the second user, wherein identifying the subset is based on comparing intrusion biometric data captured during the intrusion against stored biometric profiles for multiple organization insiders to determine similarity scores for the multiple organization insiders.

(Para: 0002-0003 teach that a biometric system may operate either in an authentication/verification mode or in an identification mode. Authentication answers the question: "is the user who they say they are?" Authentication mode operates by comparing the captured biometric data to the user's own biometric template(s) in the database to determine resemblance. In more detail, a similarity module compares the features extracted during recognition against the stored templates to generate similarity scores. The similarity module may also make a decision as to whether the user's claimed identity is confirmed. The identification mode is the process of determining the identity of a user; it answers the question "Who is the user?" In the identification mode, the system recognizes a user by searching the templates of all the users in the database for a match. In this case, the user is identified as one among a group of others, which is a one-to-many comparison. The identification mode also uses the similarity module to compare the features extracted during recognition against the stored templates to generate similarity scores.

Para: 0007 and Para: 0029-0031 teach improving the security of a biometrics-based authentication system using an enrolled biometric dataset to detect adversarial examples. Aspects of the embodiments include receiving enrolled biometric samples of a user enrolled during the enrollment stage of the biometrics-based authentication system. Augmented biometric samples are created by adding perturbations to the enrolled biometric samples of the enrolled user. During a request for authentication, submitted biometric samples are received from a second user. The submitted biometric samples of the second user are compared to the enrolled biometric samples and to the augmented biometric samples of the enrolled user based on predefined metrics. Based on the comparison, it is determined whether the submitted biometric samples of the second user have been modified to impersonate the enrolled user.

Figs. 1-2 and Para: 0025-0027 teach that the biometric authentication system 12 receives the enrolled biometric samples 30, represented as x_i, of the enrolled user. The enrolled biometric samples x_i may be one or many samples. The projection DNN 22 applies a mapping function f(·) to the enrolled biometric samples 30 to generate a biometric template f(x_i) 200, which can be a one-dimensional feature vector or a high-dimensional tensor. One or more biometric templates may be generated; that is, each biometric sample may generate one template, or a single composite biometric template may be generated from different biometric samples. The enrolled biometric samples x_i and the biometric template f(x_i) 200 are stored as the enrolled biometric dataset 32 in the enrollment database 24. Similarly, the biometric authentication system 12 receives the submitted biometric sample x′ 36 (one or many) from the same or a second user (attacker). The projection DNN 22 applies the same mapping function f(·) to the submitted biometric samples 36 to generate a biometric template f(x′) 202. The mapping function f(·) projects x_i and x′ to a common embedding subspace. The similarity comparator 26 then compares the biometric template f(x_i) 200 with the biometric template f(x′) 202 in the embedding subspace. This is done by computing a distance between biometric template 200 and biometric template 202. The distance may be represented as a similarity or distance score D = Σ_i d(f(x_i), f(x′)). If the similarity comparator 26 determines that the distance D is greater than a first threshold (t), the authorization request is rejected; otherwise, the authentication request is authorized.)

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Mosby to include identifying biometric data associated with the electronic device during the intrusion, wherein the biometric data comprises one or more behavioral signals; identifying a subset of organization insiders potentially involved in the intrusion based on the biometric data, the subset comprising the second user, wherein identifying the subset is based on comparing intrusion biometric data captured during the intrusion against stored biometric profiles for multiple organization insiders to determine similarity scores for the multiple organization insiders, as taught by Wu; in such a setup, the adversarial defense system uses the enrolled biometric samples from the enrolled user to detect any fraudulently submitted biometric samples (Para: 0032).
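Again for orientation only, here is a minimal sketch of the one-to-many identification that Wu's quoted passages describe: samples are projected into a common embedding space by a mapping f(·), a distance D = Σ_i d(f(x_i), f(x′)) is accumulated against each insider's stored templates, and insiders are ranked by similarity. The stand-in embedding and Euclidean distance are assumptions; Wu's system uses a trained projection DNN.

```python
# Illustrative only: the one-to-many comparison Wu describes, with a stand-in
# embedding in place of Wu's trained projection DNN. Distances are accumulated
# as D = sum_i d(f(x_i), f(x')) per insider, and insiders are ranked by D.
import numpy as np

def f(sample: np.ndarray) -> np.ndarray:
    """Stand-in for the mapping f(.): here, simple L2 normalization."""
    v = sample.astype(float)
    return v / (np.linalg.norm(v) + 1e-9)

def d(a: np.ndarray, b: np.ndarray) -> float:
    """Euclidean distance in the common embedding subspace."""
    return float(np.linalg.norm(a - b))

def rank_insiders(intrusion_sample: np.ndarray,
                  profiles: dict[str, list[np.ndarray]]) -> list[tuple[str, float]]:
    """Return (insider, D) pairs, most similar (smallest D) first."""
    q = f(intrusion_sample)
    scores = {name: sum(d(f(x), q) for x in templates)
              for name, templates in profiles.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])

# Stored biometric profiles (one or more templates per insider), then an
# intrusion-time sample captured at the compromised device.
profiles = {"insider_a": [np.array([1, 0, 0])], "insider_b": [np.array([0, 1, 1])]}
print(rank_insiders(np.array([0.1, 0.9, 1.1]), profiles))  # insider_b ranks first
```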
9. Regarding claims 2 and 16, Mosby teaches the method and the system wherein the intrusion comprises a malicious insider using the credentials of another user to gain access to one or more electronic devices to steal confidential data or perform an unauthorized action (Para: 0097-0103 and Para: 0150-0155 teach a malicious insider using the credential (passcode) of another user to steal data, and capturing the baseline activity of the user, which may include baseline measurements of the authentication behavior and usage behavior; this may occur over a time period, such as a week, month, or some other time period).

10. Regarding claims 3 and 17, Mosby teaches the method and the system wherein biometrics of the organization insiders are tracked using one or more agents that monitor activities on electronic devices and systems accessed by the organization insiders (Figs. 7, 9, Para: 0097-0103 and Para: 0153-0155 teach monitoring or tracking the activities on electronic devices accessed by the organization insiders).

11. Regarding claim 4, Mosby teaches the method further comprising developing biometric profiles using data tracked by agents, wherein the data comprises: a. behavior biometric data; b. foreground process data; c. operating system events data; d. contextual data; e. application-specific data; f. open network connections data; or g. network topology data (Figs. 7, 9 and Para: 0097-0103 and Para: 0153-0158 teach developing a biometric profile, application-specific data, and network topology data for tracking).

12. Regarding claims 5 and 18, Mosby in view of Wu teaches the method and the system further comprising identifying one or more potential perpetrators behind an intrusion using behavioral biometric user identification (Mosby: Figs. 7, 9, Para: 0097-0103 and Para: 0153-0155 teach identifying an intrusion based on behavioral data of the user; Wu: Para: 0001-0003 teaches identifying an anomaly using behavioral biometric user identification).

13. Regarding claims 6 and 19, Mosby teaches the method and the system wherein identifying the biometric data comprises extracting the biometric data of multiple profiles within a threshold amount of time of the intrusion (Para: 0090-0092 and Para: 0031-0039 teach extracting the biometric data of multiple profiles; the claims do not specifically define the "threshold amount of time", so under the broadest reasonable interpretation any time used by the system to identify the user profile would meet the claim limitation).

14. Regarding claim 7, Mosby teaches the method wherein identifying the subset of organization insiders comprises selecting suspects based on scores determined using intrusion data and stored profiles (Para: 0031-0039 teaches determining suspicion based on a score).

15. Regarding claim 8, Mosby teaches the method wherein the scores are determined based on a behavioral characteristic exhibited by the intruder (Para: 0031-0039 teaches that scores are determined based on a behavioral characteristic).

16. Regarding claim 9, Mosby teaches the method wherein the scores are determined based on a characteristic recorded as having been exhibited by an insider in the past or identified as appropriate for an insider's profile (Fig. 7, Para: 0097-0103 and Para: 0150-0155 teach that scores are determined based on a characteristic recorded as having been exhibited by an insider in the past).

17. Regarding claim 10, Mosby teaches the method wherein identifying the subset of organization insiders comprises selecting suspects and performing one or more queries for each of the suspects (Para: 00162-0136 teaches performing a query).

18. Regarding claim 11, Mosby teaches the method wherein the intrusion is an in-person intrusion with local data exfiltration and the one or more queries comprise: whether a suspect's electronic device was idle or not during a time of intrusion, and if not, whether scores during the intrusion deviate from that user's historical scores; or whether any device was plugged into a data port during the intrusion and whether said device was ever plugged into the suspect's electronic device (Fig. 3, Para: 0049-0058 and Para: 0097-0101 teach that the query includes monitoring the activities of the users and checking whether they deviate from the user's historical scores).
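The per-suspect forensic queries recited in claims 10-11 above (and claims 12-13 below) lend themselves to a simple checklist-style evaluation. The sketch below is hypothetical and not drawn from either reference; the field names and the deviation threshold are assumptions for illustration.

```python
# Hypothetical sketch of per-suspect forensic queries in the style of claims
# 10-13: was the suspect's device idle during the intrusion window, and if
# not, did behavior scores deviate from that user's history? Field names and
# the z-score threshold are assumptions, not taken from Mosby or Wu.
from dataclasses import dataclass

@dataclass
class SuspectActivity:
    name: str
    idle_during_intrusion: bool
    intrusion_score: float        # behavior score during the intrusion window
    historical_mean: float
    historical_std: float

def flag_suspect(a: SuspectActivity, z_threshold: float = 2.0) -> bool:
    """True if the suspect's device was active and behavior deviated."""
    if a.idle_during_intrusion:
        return False              # idle device: this query clears the suspect
    z = abs(a.intrusion_score - a.historical_mean) / (a.historical_std or 1e-9)
    return z > z_threshold

suspects = [
    SuspectActivity("alice", idle_during_intrusion=True,
                    intrusion_score=0.2, historical_mean=0.2, historical_std=0.05),
    SuspectActivity("bob", idle_during_intrusion=False,
                    intrusion_score=0.9, historical_mean=0.3, historical_std=0.1),
]
print([s.name for s in suspects if flag_suspect(s)])   # ['bob']
```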
19. Regarding claim 12, Mosby teaches the method wherein the intrusion is an in-person intrusion with installation of malicious software and the one or more queries comprise: whether a suspect's electronic device was idle or not during a time of intrusion, and if not, whether scores during the intrusion deviate from that user's historical scores; whether applications executed during the intrusion opened any new connections during the intrusion; or whether any device was plugged into a data port during the intrusion and whether said device was ever plugged into the suspect's electronic device (Fig. 3, Para: 0049-0058 and Para: 0097-0101 teach that the query includes monitoring the activities of the users and checking whether they deviate from the user's historical scores).

20. Regarding claim 13, Mosby teaches the method wherein the intrusion is a remote intrusion with unauthorized actions and the one or more queries comprise: whether the agents on a suspect's electronic device were running or not during the time of intrusion; whether the suspect's electronic device was idle or not during the time of intrusion, and if not, whether the data collected locally during a time of intrusion matches intrusion data from a remote electronic device; whether, during the intrusion, a virtual desktop application was used; whether a local virtual desktop application logged usage or authentication of a user login; whether there are any reports of a connection from the suspect's IP address in the remote OS event logs during the time of intrusion; or whether the remote electronic device network data contains connections between, from, or to the suspect's IP address (Para: 0051 teaches detecting anomalies by analyzing connection features such as browser and OS type).

21. Regarding claim 14, Mosby teaches the method wherein the report ranks the suspects based on a score calculated or one or more queries performed for each of the suspects (Fig. 4, Para: 0059-0061 and Para: 0097-0101 teach reporting and ranking the suspect as higher, medium, or lower based on the device security score (element 404) and generating a user score (element 406), e.g., if the score meets and/or exceeds the threat detection threshold value).

22. Regarding claim 21, Mosby teaches the method wherein the access information comprises login credentials of the first user (Mosby: Para: 0152 teaches that the access information comprises the user credential).

23. Regarding claim 22, Wu teaches the method wherein the report identifies multiple organization insiders of the subset, ordered according to the similarity scores (Para: 0002-0003, Para: 0007 and Para: 0029-0031, and Figs. 1-2 with Para: 0025-0027, as quoted in paragraph 8 above; in particular, in the identification mode the system recognizes a user by searching the templates of all the users in the database, a one-to-many comparison in which the similarity module generates similarity scores against the stored templates).

24. Regarding claim 23, Wu teaches the method wherein the report identifies a single organization insider of the subset based on the similarity scores (same passages of Wu as quoted in paragraph 8 above; in the identification mode the user is identified as one among a group of others based on the generated similarity scores).
25. Regarding claim 24, Wu teaches the method wherein the report identifies the second user as being involved in the intrusion based on the similarity scores (same passages of Wu as quoted in paragraph 8 above; the submitted biometric samples of the second user are compared against the enrolled templates to generate similarity scores, and the distance D = Σ_i d(f(x_i), f(x′)) relative to the threshold (t) determines whether the second user is identified).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DEREENA T CATTUNGAL, whose telephone number is (571) 270-0506. The examiner can normally be reached Mon-Fri, 7:30 AM-5 PM EST. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Lynn Feild, can be reached at 571-272-2092. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DEREENA T CATTUNGAL/
Primary Examiner, Art Unit 2431

Prosecution Timeline

Jan 11, 2022
Application Filed
Sep 27, 2024
Non-Final Rejection — §103
Feb 10, 2025
Applicant Interview (Telephonic)
Feb 22, 2025
Examiner Interview Summary
Feb 28, 2025
Response Filed
Mar 22, 2025
Final Rejection — §103
Sep 29, 2025
Request for Continued Examination
Sep 30, 2025
Response after Non-Final Action
Oct 17, 2025
Non-Final Rejection — §103
Jan 12, 2026
Applicant Interview (Telephonic)
Jan 20, 2026
Response Filed
Jan 23, 2026
Examiner Interview Summary
Feb 07, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596800
TECHNIQUES FOR CROSS-SOURCE ALERT PRIORITIZATION AND REMEDIATION
2y 5m to grant • Granted Apr 07, 2026
Patent 12592930
Generating zero-trust policy for application access based on sequence-based application segmentation
2y 5m to grant • Granted Mar 31, 2026
Patent 12579284
TRACEABLE DECENTRALIZED CONTROL OF NETWORK ACCESS TO PRIVATE INFORMATION
2y 5m to grant • Granted Mar 17, 2026
Patent 12580921
Generating zero-trust policy for application access utilizing knowledge graph based application segmentation
2y 5m to grant • Granted Mar 17, 2026
Patent 12547712
TECHNIQUES FOR CROSS-SOURCE ALERT PRIORITIZATION AND REMEDIATION
2y 5m to grant • Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 80%
With Interview: 99% (+30.0%)
Median Time to Grant: 2y 9m
PTA Risk: High
Based on 272 resolved cases by this examiner. Grant probability derived from career allow rate.
