Prosecution Insights
Last updated: April 19, 2026
Application No. 18/651,379

REAL-TIME STREAMING EVENT ENRICHMENT FOR SECURITY ENDPOINTS

Final Rejection §103
Filed: Apr 30, 2024
Examiner: NGUYEN, ANH
Art Unit: 2458
Tech Center: 2400 — Computer Networks
Assignee: CrowdStrike, Inc.
OA Round: 2 (Final)
Grant Probability: 79% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 9m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 79% — above average (282 granted / 359 resolved; +20.6% vs TC avg)
Interview Lift: +24.9% among resolved cases with interview (strong)
Avg Prosecution: 2y 9m typical; 23 applications currently pending
Total Applications: 382 across all art units
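The headline rate above follows directly from the reported counts; a quick sanity check, using only the figures shown on this page:

```python
# Career allow rate, recomputed from the counts reported above.
granted = 282
resolved = 359

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")   # ~78.6%, displayed as 79%

# The +20.6% delta vs the Tech Center average implies a TC 2400
# average allow rate of roughly:
tc_average = allow_rate - 0.206
print(f"Implied TC average: {tc_average:.1%}")  # ~58%
```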

Statute-Specific Performance

§101: 12.8% (-27.2% vs TC avg)
§103: 58.6% (+18.6% vs TC avg)
§102: 9.0% (-31.0% vs TC avg)
§112: 12.1% (-27.9% vs TC avg)
Tech Center averages are estimates • Based on career data from 359 resolved cases

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This communication is in response to the amendment filed on 12/15/2025. Claims 1-20 are rejected.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 12/15/2025 was filed after the mailing date of the Non-Final rejection on 09/15/2025. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Response to Arguments

Applicant’s arguments with respect to claims 1, 10, and 16 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. 
Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 6-8, 10-11, 14, 16, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Lasky (US 20250068641 A1, provisional application No. 63/534030) in view of Mondragon (US 20150348076 A1) and further in view of Sweeney (US 20190207981 A1).

Regarding claim 1, Lasky teaches a computer-implemented method, comprising: receiving, by a host in a digital security system, event data sent by a sensor executing on a computing endpoint ([0071] the enrichment pipeline augments or enriches the received data stream or event stream by appending to it), wherein: the event data indicates information associated with an occurrence of a computing event, detected by the security agent, on the computing endpoint ([0024] the stream enrichment system creates a source corresponding to a table in a customer data warehouse, the source associated with an entity model); and generating, by the host, enriched event data by adding the one or more types of the enrichment data, from the enrichment cache, to the event data ([0074] If a matching stored data object (e.g., object ID, entity ID, etc.) is identified for a data object mention at a particular stream location, Entities_API/Enrich 518 enriches the data stream or event stream by appending, at the particular stream location). 
Lasky does not explicitly teach event data sent by a sensor executing on a computing endpoint; the sensor comprises a security agent configured to detect computing events that occur on the computing endpoint; the host maintains an enrichment cache that stores enrichment data associated with the computing endpoint; or determining, by the host, that one or more types of the enrichment data associated with the computing endpoint, indicated by the enrichment cache, are absent from the event data sent by the sensor.

Mondragon teaches determining, by the host, that one or more types of the enrichment data associated with the endpoint, indicated by an enrichment cache maintained by the host, is absent from the event data sent by the sensor ([0043] For example, data enrichment subsystem 308 can identify missing attributes that can be useful for data validation and user experience improvement purposes. Second, it needs to identify optimal sources to augment the missing information). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to include in the Lasky disclosure that missing data is determined for data enrichment, as taught by Mondragon. One would be motivated to do so to perform validation on venue data from the data ingestion subsystem and integrated data from the data integration subsystem using various validation rules, and to mine existing venue data for detection of new validation rules and validation processes.

Lasky and Mondragon do not explicitly teach event data sent by a sensor executing on a computing endpoint; the sensor comprises a security agent configured to detect computing events that occur on the computing endpoint; and the host maintains an enrichment cache that stores enrichment data associated with the computing endpoint. Sweeney teaches event data sent by a sensor executing on a computing endpoint ([0074], fig. 
1, event processor 106 can be configured to generate enriched events based on security events received from sensors 136; [0070] IT environment 130 include assets 134A-C or computers (computing endpoint)); the sensor comprises a security agent configured to detect computing events that occur on the computing endpoint ([0071] of sensors 136 may include security appliances (security agent); [0080] each sensor can be configured to detect a type of security information in the IT environment), and the host maintains an enrichment cache that stores enrichment data associated with the computing endpoint ([0095] event manager can be configured to receive an enriched event corresponding to security event from event enrichment processor and store enriched event in security event index database). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to include in the Lasky and Mondragon disclosure a sensor executing on a computing endpoint, as taught by Sweeney. One would be motivated to do so to detect a type of security information associated with an asset within a network domain.

Regarding claim 2, Lasky, Mondragon, and Sweeney teach the computer-implemented method of claim 1, wherein Lasky further teaches the enrichment data associated with the computing endpoint, stored in the enrichment cache maintained by the host, indicates at least one of: a name of the computing endpoint ([0065] entities-model-<entity name>), a type of the computing endpoint ([0075] entity type or entity class), an Internet Protocol (IP) address used by the computing endpoint ([0125] internet protocol), a physical address of the computing endpoint, or a mapping of a username to a user identifier associated with the computing endpoint. 
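The independent claims recite a three-step flow: receive event data from a sensor, determine which enrichment types held in the host's cache are absent from that event data, and append the missing types. A minimal sketch of that flow, assuming a dict-based cache; this is an illustrative reconstruction of the claim language only, not the applicant's implementation, and all identifiers and sample values (`enrich_event`, `"endpoint-42"`, the cache layout) are hypothetical:

```python
# Hypothetical sketch of the claimed method: the host keeps a per-endpoint
# enrichment cache mapping endpoint ID -> {enrichment type -> value}.
enrichment_cache = {
    "endpoint-42": {
        "hostname": "build-server-07",
        "endpoint_type": "server",
        "ip_address": "10.0.3.17",
    }
}

def enrich_event(event: dict) -> dict:
    """Append any cached enrichment types absent from the sensor's event data."""
    cached = enrichment_cache.get(event["endpoint_id"], {})
    # Determine which enrichment types the sensor did not include.
    missing = {k: v for k, v in cached.items() if k not in event}
    # Generate enriched event data without mutating the original event.
    return {**event, **missing}

sensor_event = {"endpoint_id": "endpoint-42", "event_type": "process_start"}
enriched = enrich_event(sensor_event)
# enriched now carries hostname, endpoint_type, and ip_address from the cache.
```

Claim 2's enrichment types (endpoint name, type, IP address, and so on) map naturally onto the keys of such a cache entry.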
Regarding claims 3 and 11, Lasky, Mondragon, and Sweeney teach all limitations of parent claims 1 and 10. Lasky further teaches providing, by the host, the enriched event data to at least one other system in the digital security system ([0076] once the data stream sample and/or point has been enriched, the enriched data stream sample (e.g., entities result 508) is delivered to a service component (e.g., integrations monoservice 510) that maps the enriched data stream sample onto a payload and/or configuration of a downstream destination).

Regarding claim 6, Lasky, Mondragon, and Sweeney teach the computer-implemented method of claim 1, wherein Lasky further teaches the enrichment cache is maintained in local memory of the host ([0052], fig. 2, The ingest pipeline 204 loads data from a customer data warehouse (DWH) 210 into a high-performance cache 212).

Regarding claim 7, Lasky, Mondragon, and Sweeney teach all limitations of parent claim 1, wherein Lasky further teaches: the host is one of a plurality of hosts of an enrichment system within the digital security system ([0055] The data plane refers to the components that process and route customer event data, such as real-time data pipelines that transform and analyze data streams: data processing engine(s) and infrastructure in the enrichment pipeline that helps enrich an incoming data stream, a high-speed cache as implemented for example by one or more databases), different hosts, of the plurality of hosts, respectively maintain different enrichment caches that correspond to different sets of computing endpoints ([0144] the data stream corresponds to an event stream and the storage component corresponds to a cache component), and the host is, within the plurality of hosts, associated with a set of computing endpoints that includes the computing endpoint ([0045] A networked system in the example form of a cloud computing service, such as Microsoft Azure or other cloud service, provides server-side functionality, 
via a network (e.g., the Internet or Wide Area Network (WAN)) to one or more endpoints).

Regarding claim 8, Lasky, Mondragon, and Sweeney teach all limitations of parent claim 7, wherein Lasky further teaches: the different sets of computing endpoints are respectively associated with different shards, of a plurality of shards, in the digital security system ([0135] neurons (or nodes) may be arranged hierarchically into a number of layers, including an input layer, an output layer, and multiple hidden layers), an event data ingestor, of the digital security system, determines that the event data is associated with a particular shard, of the plurality of shards, and routes the event data to the particular shard ([0088], fig. 1, the Enrichments table enables additional types of enrichments: flexible target-type enrichments (e.g., source and/or insert-level enrichments), multiple enrichments configured per subscription (e.g., stream enrichment with data from multiple entity tables), many-to-one action subscriptions, destination-level enrichments (e.g., enrichments that take place at a particular pre-specified destination), and so forth), and the host corresponds to the particular shard ([0046], fig. 1, an API server 120 and a web server 126 are coupled to, and provide programmatic and web interfaces respectively to, one or more software services, which may be hosted on a software-as-a-service (SaaS) layer or platform 102).

Regarding claim 10, Lasky teaches a computing system, comprising: one or more processors; and memory storing computer-executable instructions associated with a host of a digital security system that ([0111], fig. 
12, processors 1304, memory/storage 1306), when executed by the one or more processors, cause the host to: receive event data sent by a sensor executing on a computing endpoint ([0071] the enrichment pipeline augments or enriches the received data stream or event stream by appending to it), wherein: the event data indicates information associated with an occurrence of a computing event, detected by the security agent, on the computing endpoint ([0024] the stream enrichment system creates a source corresponding to a table in a customer data warehouse, the source associated with an entity model), and generate enriched event data by adding the one or more types of the enrichment data, from the enrichment cache, to the event data ([0074] If a matching stored data object (e.g., object ID, entity ID, etc.) is identified for a data object mention at a particular stream location, Entities_API/Enrich enriches the data stream or event stream by appending, at the particular stream location).

Lasky does not explicitly teach event data sent by a sensor executing on a computing endpoint; the sensor comprises a security agent configured to detect computing events that occur on the computing endpoint; the host maintains an enrichment cache that stores enrichment data associated with the computing endpoint; or determine that one or more types of the enrichment data associated with the computing endpoint, indicated by the enrichment cache, are absent from the event data sent by the sensor.

Mondragon teaches determining that enrichment data associated with the endpoint, indicated by an enrichment cache maintained by the host, is absent from the event data sent by the sensor (For example, data enrichment subsystem 308 can identify missing attributes that can be useful for data validation and user experience improvement purposes. Second, it needs to identify optimal sources to augment the missing information). 
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to include in the Lasky disclosure that missing data is determined for data enrichment, as taught by Mondragon. One would be motivated to do so to perform validation on venue data from the data ingestion subsystem and integrated data from the data integration subsystem using various validation rules, and to mine existing venue data for detection of new validation rules and validation processes.

Lasky and Mondragon do not explicitly teach event data sent by a sensor executing on a computing endpoint; the sensor comprises a security agent configured to detect computing events that occur on the computing endpoint; and the host maintains an enrichment cache that stores enrichment data associated with the computing endpoint.

Sweeney teaches event data sent by a sensor executing on a computing endpoint ([0074], fig. 1, event processor 106 can be configured to generate enriched events based on security events received from sensors 136; [0070] IT environment 130 include assets 134A-C or computers (computing endpoint)); the sensor comprises a security agent configured to detect computing events that occur on the computing endpoint ([0071] of sensors 136 may include security appliances (security agent); [0080] each sensor can be configured to detect a type of security information in the IT environment); the host maintains an enrichment cache that stores enrichment data associated with the computing endpoint ([0095] event manager can be configured to receive an enriched event corresponding to security event from event enrichment processor and store enriched event in security event index database). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to include in the Lasky and Mondragon disclosure a sensor executing on a computing endpoint, as taught by Sweeney. 
One would be motivated to do so to detect a type of security information associated with an asset within a network domain.

Regarding claim 14, Lasky, Mondragon, and Sweeney teach the computing system of claim 10, wherein Lasky further teaches: the host is one of a plurality of hosts of an enrichment system within the digital security system ([0055] The data plane refers to the components that process and route customer event data, such as real-time data pipelines that transform and analyze data streams: data processing engine(s) and infrastructure in the enrichment pipeline that helps enrich an incoming data stream, a high-speed cache as implemented for example by one or more databases), different hosts, of the plurality of hosts, respectively maintain different enrichment caches that correspond to different sets of endpoints ([0144] the data stream corresponds to an event stream and the storage component corresponds to a cache component), the host is, within the plurality of hosts, associated with a set of computing endpoints that includes the computing endpoint ([0045] A networked system in the example form of a cloud computing service, such as Microsoft Azure or other cloud service, provides server-side functionality, via a network (e.g., the Internet or Wide Area Network (WAN)) to one or more endpoints), the different sets of computing endpoints are respectively associated with different shards, of a plurality of shards, in the digital security system ([0135] neurons (or nodes) may be arranged hierarchically into a number of layers, including an input layer, an output layer, and multiple hidden layers), an event data ingestor, of the digital security system, determines that the event data is associated with a particular shard, of the plurality of shards, and routes the event data to the particular shard ([0088], fig. 
1, the Enrichments table enables additional types of enrichments: flexible target-type enrichments (e.g., source and/or insert-level enrichments), multiple enrichments configured per subscription (e.g., stream enrichment with data from multiple entity tables), many-to-one action subscriptions, destination-level enrichments (e.g., enrichments that take place at a particular pre-specified destination), and so forth), and the host corresponds to the particular shard ([0046], fig. 1, an API server 120 and a web server 126 are coupled to, and provide programmatic and web interfaces respectively to, one or more software services, which may be hosted on a software-as-a-service (SaaS) layer or platform 102).

Regarding claim 16, Lasky teaches one or more non-transitory computer-readable media storing computer-executable instructions associated with a host of a digital security system that, when executed by one or more processors, cause the host to: receive event data sent by a sensor executing on a computing endpoint ([0071] the enrichment pipeline augments or enriches the received data stream or event stream by appending to it), the event data indicates information associated with an occurrence of a computing event, detected by the security agent, on the computing endpoint ([0024] the stream enrichment system creates a source corresponding to a table in a customer data warehouse, the source associated with an entity model), and generate enriched event data by adding the one or more types of the enrichment data, from the enrichment cache, to the event data ([0074] If a matching stored data object (e.g., object ID, entity ID, etc.) is identified for a data object mention at a particular stream location, Entities_API/Enrich 518 enriches the data stream or event stream by appending, at the particular stream location). 
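Claims 7-8, 14, and 19 recite an ingestor that maps each endpoint to a shard and routes its event data to the host maintaining that shard's enrichment cache. One common way such routing is implemented is a stable hash over the endpoint identifier, sketched below; the claims do not specify a mapping scheme, and every name here (`shard_for`, `route`, the host labels) is hypothetical:

```python
import hashlib

NUM_SHARDS = 4
# Hypothetical: each host maintains the enrichment cache for one shard's endpoints.
hosts = {shard: f"enrichment-host-{shard}" for shard in range(NUM_SHARDS)}

def shard_for(endpoint_id: str) -> int:
    """Map an endpoint to a shard via a stable hash (one possible scheme)."""
    digest = hashlib.sha256(endpoint_id.encode()).hexdigest()
    return int(digest, 16) % NUM_SHARDS

def route(event: dict) -> str:
    """Ingestor-side routing: send event data to the host for the endpoint's shard."""
    return hosts[shard_for(event["endpoint_id"])]

# A given endpoint always routes to the same host, so each host's enrichment
# cache only ever needs entries for its own set of endpoints.
```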
Lasky does not explicitly teach event data sent by a sensor executing on a computing endpoint; the sensor comprises a security agent configured to detect computing events that occur on the computing endpoint; the host maintains an enrichment cache that stores enrichment data associated with the computing endpoint; or determine that one or more types of the enrichment data associated with the computing endpoint, indicated by the enrichment cache, are absent from the event data sent by the sensor.

Mondragon teaches determining, by the host, that enrichment data associated with the endpoint, indicated by an enrichment cache maintained by the host, is absent from the event data sent by the sensor (For example, data enrichment subsystem 308 can identify missing attributes that can be useful for data validation and user experience improvement purposes. Second, it needs to identify optimal sources to augment the missing information). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to include in the Lasky disclosure that missing data is determined for data enrichment, as taught by Mondragon. One would be motivated to do so to perform validation on venue data from the data ingestion subsystem and integrated data from the data integration subsystem using various validation rules, and to mine existing venue data for detection of new validation rules and validation processes.

Lasky and Mondragon do not explicitly teach event data sent by a sensor executing on a computing endpoint; and the host maintains an enrichment cache that stores enrichment data associated with the computing endpoint. Sweeney teaches event data sent by a sensor executing on a computing endpoint ([0074], fig. 
1, event processor 106 can be configured to generate enriched events based on security events received from sensors 136; [0070] IT environment 130 include assets 134A-C or computers (computing endpoint)); the sensor comprises a security agent configured to detect computing events that occur on the computing endpoint ([0071] of sensors 136 may include security appliances (security agent); [0080] each sensor can be configured to detect a type of security information in the IT environment); the host maintains an enrichment cache that stores enrichment data associated with the computing endpoint ([0095] event manager can be configured to receive an enriched event corresponding to security event from event enrichment processor and store enriched event in security event index database). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to include in the Lasky and Mondragon disclosure a sensor executing on a computing endpoint, as taught by Sweeney. One would be motivated to do so to detect a type of security information associated with an asset within a network domain. 
Regarding claim 19, Lasky, Mondragon, and Sweeney teach the one or more non-transitory computer-readable media of claim 16, wherein Lasky further teaches: the host is one of a plurality of hosts of an enrichment system within the digital security system ([0055] The data plane refers to the components that process and route customer event data, such as real-time data pipelines that transform and analyze data streams: data processing engine(s) and infrastructure in the enrichment pipeline that helps enrich an incoming data stream, a high-speed cache as implemented for example by one or more databases), different hosts, of the plurality of hosts, respectively maintain different enrichment caches that correspond to different sets of endpoints ([0144] the data stream corresponds to an event stream and the storage component corresponds to a cache component), the host is, within the plurality of hosts, associated with a set of computing endpoints that includes the computing endpoint ([0045] A networked system in the example form of a cloud computing service, such as Microsoft Azure or other cloud service, provides server-side functionality, via a network (e.g., the Internet or Wide Area Network (WAN)) to one or more endpoints), the different sets of computing endpoints are respectively associated with different shards, of a plurality of shards, in the digital security system ([0135] neurons (or nodes) may be arranged hierarchically into a number of layers, including an input layer, an output layer, and multiple hidden layers), an event data ingestor, of the digital security system, determines that the event data is associated with a particular shard, of the plurality of shards, and routes the event data to the particular shard ([0088], fig. 
1, the Enrichments table enables additional types of enrichments: flexible target-type enrichments (e.g., source and/or insert-level enrichments), multiple enrichments configured per subscription (e.g., stream enrichment with data from multiple entity tables), many-to-one action subscriptions, destination-level enrichments (e.g., enrichments that take place at a particular pre-specified destination), and so forth), and the host corresponds to the particular shard ([0046], fig. 1, an API server 120 and a web server 126 are coupled to, and provide programmatic and web interfaces respectively to, one or more software services, which may be hosted on a software-as-a-service (SaaS) layer or platform 102).

Claims 4-5, 12-13, and 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over Lasky (US 20250068641 A1) in view of Mondragon (US 20150348076 A1) in view of Sweeney (US 20190207981 A1) and further in view of Zhao (US 8874524 B1).

Regarding claims 4, 12, and 17, Lasky, Mondragon, and Sweeney teach all limitations of parent claims 1, 10, and 16. Lasky does not explicitly teach: determining, by the host, that the event data indicates new enrichment data associated with the computing endpoint that is not indicated by the enrichment cache maintained by the host; and updating, by the host, the enrichment cache maintained by the host to indicate the new enrichment data. Zhao teaches determining, by the host, that the event data indicates new enrichment data associated with the endpoint that is not indicated by the enrichment cache (col. 7, lines 38-41, the control circuitry of the electronic apparatus checks the first update table upon receipt of each write instruction to determine whether new data to be written creates a CoFW situation); and updating, by the host, the enrichment cache to indicate the new enrichment data (col. 
7, lines 52-56, after the control circuitry copies the original data from the cache memory to the snapshot storage, the control circuitry updates the data block in the cache memory with the new data). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to include in the Lasky disclosure updating the cache when new data is received, as taught by Zhao. One would be motivated to do so to quickly identify the Copy on First Write data in the file system buffer cache and pass the Copy on First Write data from the file system buffer cache to the snapshot save area.

Regarding claims 5, 13, and 18, Lasky, Mondragon, and Sweeney teach all limitations of parent claims 4, 12, and 17. Lasky does not explicitly teach: generating, by the host, a backup of the enrichment cache at a first time after the updating of the enrichment cache based on the event data; updating, by the host, the enrichment cache at a second time based on new second enrichment data indicated by second event data received from the sensor; restoring, by the host, the enrichment cache at a third time based on the backup generated at the first time; and re-processing, by the host, the second event data after the third time, wherein re-processing the second event data updates the enrichment cache, restored based on the backup generated at the first time, based on the new second enrichment data indicated by the second event data. Zhao teaches generating, by the host, a backup of the enrichment cache at a first time after the updating of the enrichment cache based on the event data (col. 7, lines 41-44, if so, the control circuitry copies the original data from the cache memory to the snapshot storage and updates the first update table and the pointer table); updating, by the host, the enrichment cache at a second time based on new second enrichment data indicated by second event data received from the sensor (col. 
7, lines 45-48, the control circuitry changes the value in a first update entry corresponding to the data block in the cache memory (e.g., sets the contents to "1") and updates a pointer entry in the pointer table); restoring, by the host, the enrichment cache at a third time based on the backup generated at the first time (col. 7, lines 50-52, accordingly, if the control circuitry later needs to restore a snapshot, the control circuitry 84 is able to find the original data in the snapshot storage); and re-processing, by the host, the second event data after the third time, wherein re-processing the second event data updates the enrichment cache, restored based on the backup generated at the first time, based on the new second enrichment data indicated by the second event data (col. 8, lines 28-30, if the electronic apparatus 40 determines that the new data is not a first update of the original data block since the snapshot was taken; col. 8, lines 43-46, the electronic apparatus updates the original data block in the cache memory with the new data of the write instruction). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to include in the Lasky disclosure updating the cache when new data is received, as taught by Zhao. One would be motivated to do so to quickly identify the Copy on First Write data in the file system buffer cache and pass the Copy on First Write data from the file system buffer cache to the snapshot save area.

Claims 9, 15, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Lasky (US 20250068641 A1) in view of Mondragon (US 20150348076 A1) in view of Sweeney (US 20190207981 A1) and further in view of Godowski (US 20180365418 A1). 
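The dependent claims just mapped (5, 13, and 18) recite a four-step cache sequence: back up the cache at a first time, apply a later update at a second time, restore from the backup at a third time, then re-process the second event so its enrichment data is re-applied on top of the restored state. A minimal sketch under those steps; the class and all names are hypothetical, not the applicant's implementation:

```python
import copy

class EnrichmentCache:
    """Hypothetical host-side cache supporting the claimed backup/restore flow."""

    def __init__(self):
        self.data = {}       # endpoint ID -> {enrichment type -> value}
        self._backup = None

    def update(self, endpoint_id, enrichment):
        self.data.setdefault(endpoint_id, {}).update(enrichment)

    def snapshot(self):
        # First time: generate a backup of the current cache state.
        self._backup = copy.deepcopy(self.data)

    def restore(self):
        # Third time: discard later updates and fall back to the backup.
        self.data = copy.deepcopy(self._backup)

cache = EnrichmentCache()
cache.update("ep-1", {"hostname": "a"})
cache.snapshot()                          # first time: backup generated
cache.update("ep-1", {"ip": "10.0.0.9"})  # second time: new second enrichment data
cache.restore()                           # third time: restored; "ip" is gone
cache.update("ep-1", {"ip": "10.0.0.9"})  # re-processing the second event data
# The restored cache again reflects the second event's enrichment data.
```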
Regarding claims 9, 15, and 20, Lasky, Mondragon, and Sweeney teach all limitations of parent claims 1, 10, and 16. Lasky further teaches: retrieving, by the host, and via a network from at least one source within the digital security system, cache seed data that: is associated with a set of computing endpoints that corresponds to the host, and indicates pre-determined values of the enrichment data ([0021] The enrichment point appends, at an automatically determined insertion point into the data stream, a matching entity data payload retrieved from the high-performance cache); and filling, by the host, the enrichment cache with the pre-determined values of the enrichment data ([0083] entities service functions as a primary store of record for entities and/or as a matching module that performs), wherein the filling of the enrichment cache with the pre-determined values of the enrichment data configures the host to begin generating enriched event data instances based on corresponding event data instances received from sensors executing on the set of computing endpoints ([0091] the path for the enrichment setup data can correspond to a path within the incoming data stream (e.g., event stream), indicating, for example, a key in an event payload on which a match should be attempted against one or more fields or columns associated with a stored enrichment entity). Lasky does not explicitly teach initially instantiating, by the host, the enrichment cache as an empty enrichment cache. Godowski teaches initially instantiating, by the host, the enrichment cache as an empty enrichment cache ([0046], fig. 4A, the monitor verifies whether the filtering queue contains any other events of the current file. 
If not, the monitor adds the current event to the filtering queue). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to include in the Lasky disclosure storing data to a queue and verifying whether the queue is empty, as taught by Godowski. One would be motivated to do so to use a queue implemented by a FIFO register capable of storing a fixed number of events.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANH NGUYEN whose telephone number is (571) 270-0657. The examiner can normally be reached M-F. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. 
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Umar Cheema, can be reached at 571-270-3037. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /ANH NGUYEN/Primary Examiner, Art Unit 2458

Prosecution Timeline

Apr 30, 2024
Application Filed
Sep 10, 2025
Non-Final Rejection — §103
Dec 15, 2025
Response Filed
Feb 04, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602480
DATA MANAGEMENT APPARATUS AND DATA MANAGEMENT METHOD
2y 5m to grant Granted Apr 14, 2026
Patent 12603908
SYSTEM FOR DETECTING ANOMALOUS NETWORK PATTERNS BASED ON ANALYZING NETWORK TRAFFIC DATA AND METHOD THEREOF
2y 5m to grant Granted Apr 14, 2026
Patent 12587558
SYSTEM AND METHOD OF ARTIFICIAL INTELLIGENCE ASSISTED CYBER THREAT IDENTIFICATION VIA WEBSERVER LOGS
2y 5m to grant Granted Mar 24, 2026
Patent 12578895
USING NETWORK DEVICE REPLICATION IN DISTRIBUTED STORAGE CLUSTERS
2y 5m to grant Granted Mar 17, 2026
Patent 12581310
PAIRING OF USER DEVICE WITH REMOTE SYSTEM
2y 5m to grant Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 79%
With Interview: 99% (+24.9%)
Median Time to Grant: 2y 9m
PTA Risk: Moderate
Based on 359 resolved cases by this examiner. Grant probability derived from career allow rate.
