Prosecution Insights
Last updated: April 19, 2026
Application No. 18/928,994

EVENT DETECTION MODEL

Status: Non-Final OA (§101, §102, §112)
Filed: Oct 28, 2024
Examiner: ZARRINEH, SHAHRIAR
Art Unit: 2496
Tech Center: 2400 — Computer Networks
Assignee: CrowdStrike, Inc.
OA Round: 1 (Non-Final)
Grant Probability: 79% (Favorable)
OA Rounds: 1-2
Time to Grant: 2y 8m
With Interview: 87%

Examiner Intelligence

Career Allow Rate: 79% (341 granted / 433 resolved), above average, +20.8% vs Tech Center average
Interview Lift: +7.8% (moderate), comparing resolved cases with and without an interview
Typical Timeline: 2y 8m average prosecution; 59 applications currently pending
Career History: 492 total applications across all art units
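As a quick check on how these headline numbers relate, here is a minimal sketch of the arithmetic (Python; the with/without-interview split is illustrative, since the page reports only the aggregate +7.8% lift):

    # Reproduce the examiner-level statistics shown above.
    granted, resolved = 341, 433

    career_allow_rate = granted / resolved            # 0.7875 -> the 79% shown
    print(f"Career allow rate: {career_allow_rate:.1%}")

    # The page says this rate sits +20.8% above the Tech Center average,
    # which implies a TC average allow rate of roughly 58%.
    implied_tc_avg = career_allow_rate - 0.208
    print(f"Implied TC average: {implied_tc_avg:.1%}")

    # Interview lift = allow rate with an interview minus the rate without.
    # These two inputs are hypothetical; only their difference is reported.
    rate_with_interview, rate_without_interview = 0.850, 0.772
    print(f"Interview lift: {rate_with_interview - rate_without_interview:+.1%}")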

Statute-Specific Performance

§101: 7.4% (-32.6% vs TC avg)
§103: 52.2% (+12.2% vs TC avg)
§102: 11.9% (-28.1% vs TC avg)
§112: 16.2% (-23.8% vs TC avg)

Deltas are measured against a Tech Center average estimate (shown as the black line in the original chart). Based on career data from 433 resolved cases.
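One detail worth making explicit: every row's delta points back to the same baseline. A small sketch of the arithmetic (illustrative; the page itself reports only the rates and deltas):

    # Recover the Tech Center average implied by each row: tc_avg = rate - delta.
    rows = {"§101": (7.4, -32.6), "§103": (52.2, 12.2),
            "§102": (11.9, -28.1), "§112": (16.2, -23.8)}

    for statute, (rate, delta) in rows.items():
        print(f"{statute}: implied TC average = {rate - delta:.1f}%")
    # All four rows imply the same ~40.0% Tech Center baseline estimate.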

Office Action

Rejections: §101, §102, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the communication filed on 10/28/2024, claims 1-20 are pending in this examination. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. This examination is in response to US Patent Application No. 18/928,994.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Claim Interpretation: Under the broadest reasonable interpretation, the terms of the claim are presumed to have their plain meaning consistent with the specification as it would be interpreted by one of ordinary skill in the art. See MPEP 2111. The claims recite: computing a first score corresponding to an event at a first host based on a first timestamp of the event, a second timestamp, and a base rate; computing, based on the first score exceeding a first threshold value, a second score based on the first timestamp, a third timestamp corresponding to an occurrence of the event at a second host, and the base rate; and outputting, by a processing device, an indication of the event based on the first score and the second score. The claim does not put any limits on how the scores are computed from the timestamps, nor on how the first threshold value is determined. The steps recited above are performed by "a processing device". The recited device is recited at a high level of generality, since the specification is devoid of adequate structure to perform the claimed steps/functions; it is merely recited as a generic computer performing generic computer functions.

Step 1: See MPEP 2106.03. The claim recites at least one step or act, including "computing [a] first score…", "computing [a] second score…", and "outputting, by the processing device, an indication of [the] event…". Thus, the claim is to a process, which is one of the statutory categories of invention. (Step 1: YES).

Step 2A, Prong One: As explained in MPEP 2106.04, subsection II, a claim "recites" a judicial exception when the judicial exception is "set forth" or "described" in the claim. The broadest reasonable interpretation of these steps is that they fall within the mental process grouping of abstract ideas, because they cover concepts performed in the human mind, including observation, evaluation, judgment, and opinion. See MPEP 2106.04(a)(2), subsection III. Under its broadest reasonable interpretation when read in light of the specification, the "computing" and "outputting" encompass mental observations or evaluations that can practically be performed in the human mind, for example, the claimed "computing [a] first score…", "computing [a] second score…", and "outputting, by the processing device, an indication of [the] event…".
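For orientation before the Prong Two analysis, a minimal sketch of the computation the claim recites. This is illustrative only: the variable names are ours, and the log-of-quotient form of the score is borrowed from dependent claims 7 and 11, since claim 1 itself does not restrict how the scores are computed.

    import math

    def score(t_event: float, t_reference: float, base_rate: float) -> float:
        # Logarithm of the quotient of the timestamp difference and the base
        # rate, per claims 7 and 11 (absolute difference assumed; the claims
        # do not specify sign handling or the base of the logarithm).
        return math.log(abs(t_event - t_reference) / base_rate)

    def indicate_event(t1: float, t2: float, t3: float,
                       base_rate: float, first_threshold: float):
        """Claim 1, sketched: a first score from the first and second
        timestamps, a threshold gate, then a second score from the first
        timestamp and the third timestamp (the event at a second host)."""
        first_score = score(t1, t2, base_rate)
        if first_score <= first_threshold:
            return None  # the second score is computed only if the gate is passed
        second_score = score(t1, t3, base_rate)
        # "outputting ... an indication of the event based on the first score
        # and the second score"
        return {"event": True, "scores": (first_score, second_score)}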
Step 2A, Prong Two: See MPEP 2106.04(d). The claim recites the additional element of a processing device that performs the "computing" and "outputting" steps. This judicial exception is not integrated into a practical application because the processing device is recited at a high level of generality. The device is used as a tool to perform the generic computer functions of receiving data and creating data. See MPEP 2106.05(f). In these limitations, the computer is used to perform an abstract idea, as discussed above in Step 2A, Prong One, such that the claim amounts to no more than mere instructions to apply the exception using a generic computer. See MPEP 2106.05(f). Even when viewed in combination, these additional elements do not integrate the recited judicial exception into a practical application (Step 2A, Prong Two: NO), and the claim is directed to the judicial exception. (Step 2A: YES).

Step 2B: See MPEP 2106.05. As explained with respect to Step 2A, Prong Two, the additional element of computing by a processor is at best mere instructions to "apply" the abstract idea, which cannot provide an inventive concept. See MPEP 2106.05(f). The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because, even when considered in combination, these additional elements represent mere instructions to implement an abstract idea or other exception on a computer and insignificant extra-solution activity, which do not provide an inventive concept. (Step 2B: NO). The claim is ineligible.

Claims 2-6, 8-10, and 12-16 all recite limitations such as: detecting an occurrence, obtaining an event stream, determining the event, computing the difference between the first timestamp and the second timestamp, combining the scores, transmitting the indication, a detect event, and a recentness of the event. These claims do not include additional elements that are sufficient to amount to significantly more under Step 2A and Step 2B, as analyzed above.

Claims 7 and 11 recite "computing the first score comprises computing a logarithm of a quotient of the difference and the base rate" and "computing the second score comprises computing a logarithm of a quotient of the difference and the base rate". The specification does not explain how these calculations are performed; it merely states that a mathematical method is used to compute the score. These limitations encompass mental choices or evaluations and the performance of mathematical calculations. These claims do not include additional elements that are sufficient to amount to significantly more under Step 2A and Step 2B, as analyzed above.

Regarding claims 17 and 19, the limitations "a processor device" and "non-transitory computer readable medium … when executed by a processing [device]" recite the same steps, in this case on a computer, at a high level of generality, as noted above for claim 1. In these limitations, the computer is used as a tool to perform the generic computer functions of receiving data and creating data. See MPEP 2106.05(f).
Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL. — The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 7 and 11 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112 the inventor(s), at the time the application was filed, had possession of the claimed invention.

In regard to claims 7 and 11, the specification fails to provide written description support for the claim limitations "computing the first score comprises computing a logarithm of a quotient of the difference and the base rate" and "computing the second score comprises computing a logarithm of a quotient of the difference and the base rate". There is no evidence in the disclosure as to how this mathematical calculation is performed. Paragraph [0052] states only that "[i]n some respects, computing the first score may include computing a logarithm of a quotient of the difference and the base rate," and paragraph [0055] states that "[i]n some respects, computing the second score may include computing a logarithm of a quotient of the difference and the base rate."

The level of detail required to satisfy the written description requirement varies depending on the nature and scope of the claims and on the complexity and predictability of the relevant technology. Ariad, 598 F.3d at 1351, 94 USPQ2d at 1172; Capon v. Eshhar, 418 F.3d 1349, 1357-58, 76 USPQ2d 1078, 1083-84 (Fed. Cir. 2005). Computer-implemented inventions are often disclosed and claimed in terms of their functionality. For computer-implemented inventions, the determination of the sufficiency of disclosure will require an inquiry into the sufficiency of both the disclosed hardware and the disclosed software, due to the interrelationship and interdependence of computer hardware and software. The critical inquiry is whether the disclosure of the application relied upon reasonably conveys to those skilled in the art that the inventor had possession of the claimed subject matter as of the filing date. Vasudevan Software, Inc. v. MicroStrategy, Inc., 782 F.3d 671, 682, 114 USPQ2d 1349, 1356 (Fed. Cir. 2015) (citing Ariad Pharm., Inc. v. Eli Lilly & Co., 598 F.3d 1336, 1351, 94 USPQ2d 1161, 1172 (Fed. Cir. 2010), in the context of determining possession of a claimed means of accessing disparate databases).

Applicant is kindly requested to show the examiner support in the original disclosure for the new or amended claims. See MPEP 714.02 and 2163.06 ("Applicant should specifically point out the support for any amendments made to the disclosure").
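In symbols, the limitation at issue in claims 7 and 11 amounts to the following (the notation is ours; the application states only the prose form quoted above):

    s_1 = \log\left(\frac{\Delta_1}{r}\right), \qquad s_2 = \log\left(\frac{\Delta_2}{r}\right)

where \Delta_1 is the difference based on the first and second timestamps, \Delta_2 is the difference based on the first and third timestamps, and r is the base rate. Neither the base of the logarithm nor the handling of negative differences is specified in the claims or in the quoted paragraphs [0052] and [0055].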
Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

First set of rejections: Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ulrich Buergi (US20200396252A1), hereinafter "Buergi".

Regarding claim 1, Buergi discloses a method, comprising computing a first score corresponding to an event at a first host based on a first timestamp of the event, a second timestamp, and a base rate [0031-0034: Looking at the requests and responses in the log of Table 1 in detail, the following network traffic events may be observed: 1) At 0.001178s a user with the IP address 172.16.55.5 (Client A) requests the web site with the IP address 10.88.88.88 (phishing.test). 2) At 0.018097s the user with IP address 172.16.55.5 requests an asset (here: /css/main.css) of the website with the IP address 10.22.22.22 (target.test). 3) At 0.018837s the user 172.16.55.5 addresses 10.22.22.22 to request an asset (here: /img/logo.png). [0038] While backtracking the network flow of Client A from a call to a targeted website 10.22.22.22, it may be observed that Client A accessed a different web site 10.88.88.88 (phishing.test/login.html) only 0.017s (0.018837 − 0.001178) (equated to the first score) before calling up assets from the targeted site.], and [0042: The short call pattern may be characterized as being short in time, e.g., as within a predefined time window (equated to the base rate) of preferably between 0.1 and 10 seconds, or being short in the registered network traffic, e.g., with very few or no intermediate calls, preferably less than 5 or 3, between the calls forming part of the call pattern, or both.]; computing, based on the first score exceeding a first threshold value [0038: While backtracking the network flow of Client A from a call to a targeted website 10.22.22.22, it may be observed that Client A accessed a different web site 10.88.88.88 (phishing.test/login.html) only 0.017s (0.018837 − 0.001178) before calling up assets from the targeted site (equated to exceeding the first threshold; more time was spent before reaching the target website 10.22.22.22)]; a second score based on: the first timestamp, a third timestamp corresponding to an occurrence of the event at a second host, and the base rate [0035-0038: 5) At 0.028197s the user with IP address 192.168.3.3 (Client B) requests an asset (here: /css/main.css) of the website with the IP address 10.22.22.22.
6) At 0.029218s the user with IP address 192.168.3.3 requests an asset (here: /img/logo.png) of the website with the IP address 10.22.22.22… When backtracking the network flow from and to Client B, who also accessed the targeted site, no such call of a different site is observed within the logged time window (0.029218 − 0.028197 = 0.001021, equated to the second score). The sudden change of accessed sites makes the initial IP address suspicious and hence this IP address and/or URL may be added to the list of potential phishing site for further analysis.]; and outputting, by a processing device, an indication of the event based on the first score and the second score [0032: While backtracking the network flow of Client A from a call to a targeted website 10.22.22.22, it may be observed that Client A accessed a different web site 10.88.88.88 (phishing.test/login.html) only 0.017s before calling up assets from the targeted site. When backtracking the network flow from and to Client B, who also accessed the targeted site, no such call of a different site is observed within the logged time window. The sudden change of accessed sites makes the initial IP address suspicious and hence this IP address and/or URL may be added to the list of potential phishing site for further analysis.].

Regarding claim 2, Buergi discloses wherein the second timestamp corresponds to at least one of: a second occurrence of the event at the first host, a time at which the first host was activated, or a time at which an event detection system was activated [0031: Looking at the requests and responses in the log of Table 1 in detail, the following network traffic events may be observed: 1) At 0.001178s a user with the IP address 172.16.55.5 (Client A) requests the web site with the IP address 10.88.88.88 (phishing.test). 2) At 0.018097s the user with IP address 172.16.55.5 requests an asset (here: /css/main.css) of the website with the IP address 10.22.22.22 (target.test). 3) At 0.018837s the user 172.16.55.5 addresses 10.22.22.22 to request an asset (here: /img/logo.png).].

Regarding claim 3, Buergi discloses detecting an occurrence of the event at the first host, wherein the computing the first score corresponding to the event is based on the detection [0031: Looking at the requests and responses in the log of Table 1 in detail, the following network traffic events may be observed: 1) At 0.001178s a user with the IP address 172.16.55.5 (Client A) requests the web site with the IP address 10.88.88.88 (phishing.test). 2) At 0.018097s the user with IP address 172.16.55.5 requests an asset (here: /css/main.css) of the website with the IP address 10.22.22.22 (target.test). 3) At 0.018837s the user 172.16.55.5 addresses 10.22.22.22 to request an asset (here: /img/logo.png).].
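As a quick check, the Table 1 deltas that the examiner maps onto the claimed first and second scores work out as follows (our sketch, not part of the Office Action):

    # Timestamps from Buergi's Table 1, in seconds, as quoted above.
    client_a_phishing = 0.001178  # Client A requests 10.88.88.88 (phishing.test)
    client_a_target   = 0.018837  # Client A requests /img/logo.png from 10.22.22.22
    client_b_first    = 0.028197  # Client B requests /css/main.css from 10.22.22.22
    client_b_second   = 0.029218  # Client B requests /img/logo.png from 10.22.22.22

    # Delta the examiner equates to the "first score" (Buergi rounds it to 0.017s).
    print(round(client_a_target - client_a_phishing, 6))   # 0.017659

    # Delta the examiner equates to the "second score".
    print(round(client_b_second - client_b_first, 6))      # 0.001021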
Regarding claim 4, Buergi discloses the method of claim 3, further comprising: obtaining an event stream comprising a type of the event, an identifier of the first host, and an identifier of an organization to which the first host belongs; and determining that the event has occurred previously at the first host based on the event stream and a repository comprising indications of events, wherein the computing the first score is based on the determination that the event has occurred previously at the first host [0027: FIG. 2 illustrates an example network traffic pattern, in accordance with the present disclosure. Shown in FIG. 2 is flow chart 200 representing example network traffic. In this regard, traffic from and to a specific user, client, or site ("Client A"), as identified by its IP address, is emphasized in the flow chart 200 using a black arrow within the multitude of other requests and responses in the network traffic. [0028] The example network traffic represented in FIG. 2, in the form of a network flow log of time-stamped network traffic events, is illustrated by Table 1, below, which illustrates an example network flow, for a particular duration (e.g., 0.032616 seconds), at network level, e.g., as passing through a network or gateway server of an ISP (Internet Service Provider).].

Regarding claim 5, Buergi discloses determining that the event has occurred previously at the second host based on the event stream and the repository comprising the indications of the events, wherein the computing the second score is based on the determination that the event has occurred previously at the second host [see Table 1: 4) At 0.027165s another client with IP address 192.168.3.3 (Client B) accesses the web site with the IP address 10.88.88.88. 5) At 0.028197s the user with IP address 192.168.3.3 requests an asset (here: /css/main.css) of the website with the IP address 10.22.22.22. 6) At 0.029218s the user with IP address 192.168.3.3 requests an asset (here: /img/logo.png) of the website with the IP address 10.22.22.22.].

Regarding claim 6, Buergi discloses the method of claim 1, further comprising: computing a difference based on the first timestamp and the second timestamp, wherein the computing the first score comprises computing the first score based on the base rate and the difference [0032: While backtracking the network flow of Client A from a call to a targeted website 10.22.22.22, it may be observed that Client A accessed a different web site 10.88.88.88 (phishing.test/login.html) only 0.017s before calling up assets from the targeted site. When backtracking the network flow from and to Client B, who also accessed the targeted site, no such call of a different site is observed within the logged time window. The sudden change of accessed sites makes the initial IP address suspicious and hence this IP address and/or URL may be added to the list of potential phishing site for further analysis.], and [0036: The short call pattern may be characterized as being short in time, e.g., as within a predefined time window of preferably between 0.1 and 10 seconds, or being short in the registered network traffic, e.g., with very few or no intermediate calls, preferably less than 5 or 3, between the calls forming part of the call pattern, or both.].

Regarding claim 7, Buergi discloses wherein the computing the first score comprises computing a logarithm of a quotient of the difference and the base rate [0031-0034: Looking at the requests and responses in the log of Table 1 in detail, the following network traffic events may be observed: 1) At 0.001178s a user with the IP address 172.16.55.5 (Client A) requests the web site with the IP address 10.88.88.88 (phishing.test). 2) At 0.018097s the user with IP address 172.16.55.5 requests an asset (here: /css/main.css) of the website with the IP address 10.22.22.22 (target.test). 3) At 0.018837s the user 172.16.55.5 addresses 10.22.22.22 to request an asset (here: /img/logo.png).
[0038] While backtracking the network flow of Client A from a call to a targeted website 10.22.22.22, it may be observed that Client A accessed a different web site 10.88.88.88 (phishing.test/login.html) only 0.017s (0.018837 − 0.001178) (equated to the first score) before calling up assets from the targeted site.], and [0042: The short call pattern may be characterized as being short in time, e.g., as within a predefined time window (equated to the base rate) of preferably between 0.1 and 10 seconds].

Regarding claim 8, Buergi discloses combining the first score and the second score to generate a combined score, wherein the outputting the indication of the event comprises outputting the indication of the event based on the combined score [0032: While backtracking the network flow of Client A from a call to a targeted website 10.22.22.22, it may be observed that Client A accessed a different web site 10.88.88.88 (phishing.test/login.html) only 0.017s before calling up assets from the targeted site. When backtracking the network flow from and to Client B, who also accessed the targeted site, no such call of a different site is observed within the logged time window. The sudden change of accessed sites makes the initial IP address suspicious and hence this IP address and/or URL may be added to the list of potential phishing site for further analysis.].

Regarding claim 9, Buergi discloses wherein the outputting the indication of the event comprises outputting the indication of the event based on the combined score exceeding a second threshold value [0032: While backtracking the network flow of Client A from a call to a targeted website 10.22.22.22, it may be observed that Client A accessed a different web site 10.88.88.88 (phishing.test/login.html) only 0.017s before calling up assets from the targeted site. When backtracking the network flow from and to Client B, who also accessed the targeted site, no such call of a different site is observed within the logged time window. The sudden change of accessed sites makes the initial IP address suspicious and hence this IP address and/or URL may be added to the list of potential phishing site for further analysis.].

Regarding claim 10, Buergi discloses computing a difference based on the first timestamp and the third timestamp corresponding to the occurrence of the event at the second host, wherein the computing the second score comprises computing the second score based on the base rate and the difference [see Table 1: 4) At 0.027165s another client with IP address 192.168.3.3 (Client B) accesses the web site with the IP address 10.88.88.88. 5) At 0.028197s the user with IP address 192.168.3.3 requests an asset (here: /css/main.css) of the website with the IP address 10.22.22.22. 6) At 0.029218s the user with IP address 192.168.3.3 requests an asset (here: /img/logo.png) of the website with the IP address 10.22.22.22. [0032] When backtracking the network flow from and to Client B, who also accessed the targeted site, no such call of a different site is observed within the logged time window. The sudden change of accessed sites makes the initial IP address suspicious and hence this IP address and/or URL may be added to the list of potential phishing site for further analysis].
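Claims 8 and 9 above add a combining step and a second threshold. Continuing the sketch from the §101 section (the additive combination is our assumption; the claims do not say how the two scores are combined):

    def output_indication(first_score: float, second_score: float,
                          second_threshold: float) -> bool:
        # Claims 8-9: generate a combined score and output the indication
        # of the event only when it exceeds a second threshold value.
        # Addition is a placeholder; any combining function fits the claim text.
        combined_score = first_score + second_score
        return combined_score > second_threshold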
Regarding claim 11, Buergi discloses wherein the computing the second score comprises computing a logarithm of a quotient of the difference and the base rate [0035-0038: 5) At 0.028197s the user with IP address 192.168.3.3 (Client B) requests an asset (here: /css/main.css) of the website with the IP address 10.22.22.22. 6) At 0.029218s the user with IP address 192.168.3.3 requests an asset (here: /img/logo.png) of the website with the IP address 10.22.22.22… When backtracking the network flow from and to Client B, who also accessed the targeted site, no such call of a different site is observed within the logged time window (0.029218 − 0.028197 = 0.001021, equated to the second score). The sudden change of accessed sites makes the initial IP address suspicious and hence this IP address and/or URL may be added to the list of potential phishing site for further analysis].

Regarding claim 12, Buergi discloses wherein outputting the indication of the event comprises: transmitting the indication of the event for presentation in a user interface (UI) [0039: The network traffic flow monitoring 117 and the HTTP flow correlator 118 may be performed wherever client network traffic may be monitored. This may be at a client's computer (e.g., as a web browser extension or as part of a software firewall), within a client's network (e.g., hardware firewall, router, or modem), or within an ISP's network.].

Regarding claim 13, Buergi discloses wherein the event comprises an indicator event or a detect event [0027: FIG. 2 illustrates an example network traffic pattern, in accordance with the present disclosure. Shown in FIG. 2 is flow chart 200 representing example network traffic. In this regard, traffic from and to a specific user, client, or site ("Client A"), as identified by its IP address, is emphasized in the flow chart 200 using a black arrow within the multitude of other requests and responses in the network traffic. [0028] The example network traffic represented in FIG. 2, in the form of a network flow log of time-stamped network traffic events, is illustrated by Table 1, below, which illustrates an example network flow, for a particular duration (e.g., 0.032616 seconds), at network level, e.g., as passing through a network or gateway server of an ISP (Internet Service Provider).].

Regarding claim 14, Buergi discloses wherein the first host comprises at least one of a server or a user device [0027: FIG. 2 illustrates an example network traffic pattern, in accordance with the present disclosure. Shown in FIG. 2 is flow chart 200 representing example network traffic. In this regard, traffic from and to a specific user, client, or site ("Client A"), as identified by its IP address, is emphasized in the flow chart 200 using a black arrow within the multitude of other requests and responses in the network traffic.].

Regarding claim 15, Buergi discloses wherein the first host and the second host belong to an organization [0027: FIG. 2 illustrates an example network traffic pattern, in accordance with the present disclosure. Shown in FIG. 2 is flow chart 200 representing example network traffic. In this regard, traffic from and to a specific user, client, or site ("Client A"), as identified by its IP address, is emphasized in the flow chart 200 using a black arrow within the multitude of other requests and responses in the network traffic.
[0028] The example network traffic represented in FIG. 2, in the form of a network flow log of time-stamped network traffic events, is illustrated by Table 1, below, which illustrates an example network flow, for a particular duration (e.g., 0.032616 seconds), at network level, e.g., as passing through a network or gateway server of an ISP (Internet Service Provider).].

Regarding claims 16, 18, and 20, Buergi discloses wherein the first score is indicative of a recentness of the event occurring at the first host, and wherein the second score is indicative of a recentness of the event occurring within an organization [see Table 1, example network traffic, columns: time, source, and destination].

Regarding claims 17 and 19, these claims are interpreted and rejected for the same rationale set forth in claim 1.

Second set of rejections:

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Dennison (US9043894).

Regarding claim 1, Dennison discloses a method comprising computing a first score corresponding to an event at a first host based on a first timestamp of the event, a second timestamp, and a base rate [Col. 11 lines 38-56: At block 313, the system may generate internal-external connection pairs. Each of the internal-external connection pairs may include a particular internal IP address and a particular external IP address and/or domain that was contacted by the internal IP address (equated to the first host). At block 314, time series of the generated internal-external connection pairs may be generated. For example, the system may determine sets of connection pairs that have common internal IP addresses and external IP addresses or domains. Then, for each set, a time series may be generated that represents each point in time that the same or a similar connection is made between a particular internal IP address and external IP address or domains. Each of the time series may span a particular time period. For example, each time series may span a number of days, weeks, months, or years. Thus, a connection pair time-series (or simply "connection pair series" or "connection series") may indicate multiple connections made between a particular internal and external IP address (or domain or other device identifier) and/or a periodicity or other pattern indicating when the connections were made. The internal-external connection pairs may be plotted along each time series for the particular time period.]. At block 316, the beaconing malware pre-filter system may filter out any noise in each time series. For example, the connection pairs in each connection series may be analyzed in order to identify any connection pairs of the particular connection series that should be indicated as noise. Noise in a connection series may include, for example, any internal-external connection pairs that have a low likelihood of being related to beaconing activity and/or to malicious activity. Various filter criteria may be applied to filter out noise.
Examples of noise filtering criteria may include, but are not limited to: filter 316A, which detects frequently established connections, such as the same or similar connection pairs (for example, multiple connection pairs from the same internal IP to the same external IP and/or domain) that occur with short intervals (or deltas) of time between them (for example, intervals on the order of seconds, or intervals that are shorter than are typically employed by beaconing malware); filter 316B, which detects connection pairs that have only been occurring for a short period of time (for example, for a week or less)], and [see FIG. 2A, Col. 9 lines 11-14: the connection records can be limited to those connection records occurring within a certain period of time (e.g., a 1-minute block, a 5-minute block, a 15-minute block, an hour block, etc.)], and [see FIG. 1: the filtering system identifies beaconing activity and/or malicious activity in connection pairs via various filter methods, and the process passes through the scoring processor; Col. 7 lines 52-59: A scoring processor 106 executes a scoring process on the identified subset of data items. The scoring process can implement machine learning. The score indicates the relative likelihood that a particular data item is associated with a cyber threat, such as being transmitted in response to a command by malicious software. For example, data items with a high score can be more likely to be malicious than items with a low score, or vice versa]; computing, based on the first score exceeding a first threshold value [Col. 11 lines 38-65: At block 316, the beaconing malware pre-filter system may filter out any noise in each time series. For example, the connection pairs in each connection series may be analyzed in order to identify any connection pairs of the particular connection series that should be indicated as noise. Noise in a connection series may include, for example, any internal-external connection pairs that have a low likelihood of being related to beaconing activity and/or to malicious activity. Various filter criteria may be applied to filter out noise. Examples of noise filtering criteria may include, but are not limited to: filter 316A, which detects frequently established connections, such as the same or similar connection pairs (for example, multiple connection pairs from the same internal IP to the same external IP and/or domain) that occur with short intervals (or deltas) of time between them (for example, intervals on the order of seconds, or intervals that are shorter than are typically employed by beaconing malware); filter 316B, which detects connection pairs that have only been occurring for a short period of time (for example, for a week or less)]; a second score based on: the first timestamp, a third timestamp corresponding to an occurrence of the event at a second host, and the base rate [Col. 11 lines 38-56: At block 313, the system may generate internal-external connection pairs. Each of the internal-external connection pairs may include a particular internal IP address and a particular external IP address and/or domain that was contacted by the internal IP address (equated to the second host). At block 314, time series of the generated internal-external connection pairs may be generated. For example, the system may determine sets of connection pairs that have common internal IP addresses and external IP addresses or domains.
Then, for each set, a time series may be generated that represents each point in time that the same or a similar connection is made between a particular internal IP address and external IP address or domains. Each of the time series may span a particular time period. For example, each time series may span a number of days, weeks, months, or years. Thus, a connection pair time-series (or simply "connection pair series" or "connection series") may indicate multiple connections made between a particular internal and external IP address (or domain or other device identifier) and/or a periodicity or other pattern indicating when the connections were made. The internal-external connection pairs may be plotted along each time series for the particular time period.]. At block 316, the beaconing malware pre-filter system may filter out any noise in each time series. For example, the connection pairs in each connection series may be analyzed in order to identify any connection pairs of the particular connection series that should be indicated as noise. Noise in a connection series may include, for example, any internal-external connection pairs that have a low likelihood of being related to beaconing activity and/or to malicious activity. Various filter criteria may be applied to filter out noise. Examples of noise filtering criteria may include, but are not limited to: filter 316A, which detects frequently established connections, such as the same or similar connection pairs (for example, multiple connection pairs from the same internal IP to the same external IP and/or domain) that occur with short intervals (or deltas) of time between them (for example, intervals on the order of seconds, or intervals that are shorter than are typically employed by beaconing malware); filter 316B, which detects connection pairs that have only been occurring for a short period of time (for example, for a week or less)], and [see FIG. 2A, Col. 9 lines 11-14: the connection records can be limited to those connection records occurring within a certain period of time (e.g., a 1-minute block, a 5-minute block, a 15-minute block, an hour block, etc.)], and [see FIG. 1: the filtering system identifies beaconing activity and/or malicious activity in connection pairs via various filter methods, and the process passes through the scoring processor; Col. 7 lines 52-59: A scoring processor 106 executes a scoring process on the identified subset of data items. The scoring process can implement machine learning. The score indicates the relative likelihood that a particular data item is associated with a cyber threat, such as being transmitted in response to a command by malicious software. For example, data items with a high score can be more likely to be malicious than items with a low score, or vice versa]; and outputting, by a processing device, an indication of the event based on the first score and the second score [Col. 11 lines 38-65: At block 316, the beaconing malware pre-filter system may filter out any noise in each time series. For example, the connection pairs in each connection series may be analyzed in order to identify any connection pairs of the particular connection series that should be indicated as noise. Noise in a connection series may include, for example, any internal-external connection pairs that have a low likelihood of being related to beaconing activity and/or to malicious activity. Various filter criteria may be applied to filter out noise.
Examples of noise filtering criteria may include, but are not limited to: filter 316A, which detects frequently established connections, such as the same or similar connection pairs (for example, multiple connection pairs from the same internal IP to the same external IP and/or domain) that occur with short intervals (or deltas) of time between them (for example, intervals on the order of seconds, or intervals that are shorter than are typically employed by beaconing malware); filter 316B, which detects connection pairs that have only been occurring for a short period of time (for example, for a week or less); filter 316C, which detects connection pairs with popular or well-known legitimate external domains (for example, a third-party produced list of popular domains may be used by the system); and/or filter 316D, which detects connection pairs made by legitimate software for, for example, software updates (in an embodiment, this filter criteria may be applied on a per-computer system basis, such that a determination may be made regarding the legitimacy of particular pieces of software on each individual computer system)].

Regarding claim 2, Dennison discloses wherein the second timestamp corresponds to at least one of: a second occurrence of the event at the first host, a time at which the first host was activated, or a time at which an event detection system was activated [Col. 11 lines 38-56: At block 313, the system may generate internal-external connection pairs. Each of the internal-external connection pairs may include a particular internal IP address and a particular external IP address and/or domain that was contacted by the internal IP address (equated to the first host). At block 314, time series of the generated internal-external connection pairs may be generated. For example, the system may determine sets of connection pairs that have common internal IP addresses and external IP addresses or domains. Then, for each set, a time series may be generated that represents each point in time that the same or a similar connection is made between a particular internal IP address and external IP address or domains. Each of the time series may span a particular time period. For example, each time series may span a number of days, weeks, months, or years. Thus, a connection pair time-series (or simply "connection pair series" or "connection series") may indicate multiple connections made between a particular internal and external IP address (or domain or other device identifier) and/or a periodicity or other pattern indicating when the connections were made. The internal-external connection pairs may be plotted along each time series for the particular time period.].

Regarding claim 3, Dennison discloses detecting an occurrence of the event at the first host, wherein the computing the first score corresponding to the event is based on the detection [Col. 11 lines 38-56: At block 313, the system may generate internal-external connection pairs. Each of the internal-external connection pairs may include a particular internal IP address and a particular external IP address and/or domain that was contacted by the internal IP address (equated to the first host). At block 314, time series of the generated internal-external connection pairs may be generated. For example, the system may determine sets of connection pairs that have common internal IP addresses and external IP addresses or domains.
Then, for each set, a time series may be generated that represents each point in time that the same or a similar connection is made between a particular internal IP address and external IP address or domains. Each of the time series may span a particular time period. For example, each time series may span a number of days, weeks, months, or years. Thus, a connection pair time-series (or simply "connection pair series" or "connection series") may indicate multiple connections made between a particular internal and external IP address (or domain or other device identifier) and/or a periodicity or other pattern indicating when the connections were made. The internal-external connection pairs may be plotted along each time series for the particular time period.], and [see FIG. 1: the filtering system identifies beaconing activity and/or malicious activity in connection pairs via various filter methods, and the process passes through the scoring processor; Col. 7 lines 52-59: A scoring processor 106 executes a scoring process on the identified subset of data items. The scoring process can implement machine learning. The score indicates the relative likelihood that a particular data item is associated with a cyber threat, such as being transmitted in response to a command by malicious software. For example, data items with a high score can be more likely to be malicious than items with a low score, or vice versa].

Regarding claim 4, Dennison discloses obtaining an event stream comprising a type of the event, an identifier of the first host, and an identifier of an organization to which the first host belongs; and determining that the event has occurred previously at the first host based on the event stream and a repository comprising indications of events, wherein the computing the first score is based on the determination that the event has occurred previously at the first host [Col. 7 lines 5-31: The outbound data connection log 102 includes a large plurality of data items, such as thousands, millions, tens of millions, hundreds of millions, or even billions of data items. In one embodiment, such data items include the IP addresses of internal resources, within the local network, that have attempted to communicate with an external resource outside the local network. The outbound data connection log 102 can also include a time, such as a time stamp indicating year, month, day, hour, minute, and/or second, associated with each attempted connection. The outbound data connection log 102 can also include a character string relating to the attempted connection. An example character string may be a URL. Such a URL can generally resemble the form: schm://3LD.2LD.TLD/filepath. The portion "schm" represents the scheme or prefix, such as ftp, http, mailto, and the like. The portion "3LD" is a combination of alphabetic characters, numbers, and/or hyphens representing the third level domain. The portion "2LD" is a combination of alphabetic characters, numbers, and/or hyphens representing the second level domain. The portion "TLD" represents the top-level domain, such as com, org, edu, gov, and the like. The portion "filepath" is a textual string that can include numeric, alphabetic, and punctuation characters such as backslashes, hyphens, question marks, periods, and the like. As used herein, and unless specified otherwise, the term "domain name" refers to the combination of the 2LD and the TLD. An example domain name has the form example.com.], and
[see FIG. 1: the filtering system identifies beaconing activity and/or malicious activity in connection pairs via various filter methods, and the process passes through the scoring processor; Col. 7 lines 52-59: A scoring processor 106 executes a scoring process on the identified subset of data items. The scoring process can implement machine learning. The score indicates the relative likelihood that a particular data item is associated with a cyber threat, such as being transmitted in response to a command by malicious software. For example, data items with a high score can be more likely to be malicious than items with a low score, or vice versa].

Regarding claim 5, Dennison discloses determining that the event has occurred previously at the second host based on the event stream and the repository comprising the indications of the events, wherein the computing the second score is based on the determination that the event has occurred previously at the second host [Col. 7 lines 5-31: The outbound data connection log 102 includes a large plurality of data items, such as thousands, millions, tens of millions, hundreds of millions, or even billions of data items. In one embodiment, such data items include the IP addresses of internal resources, within the local network, that have attempted to communicate with an external resource outside the local network. The outbound data connection log 102 can also include a time, such as a time stamp indicating year, month, day, hour, minute, and/or second, associated with each attempted connection. The outbound data connection log 102 can also include a character string relating to the attempted connection. An example character string may be a URL. Such a URL can generally resemble the form: schm://3LD.2LD.TLD/filepath. The portion "schm" represents the scheme or prefix, such as ftp, http, mailto, and the like. The portion "3LD" is a combination of alphabetic characters, numbers, and/or hyphens representing the third level domain. The portion "2LD" is a combination of alphabetic characters, numbers, and/or hyphens representing the second level domain. The portion "TLD" represents the top-level domain, such as com, org, edu, gov, and the like. The portion "filepath" is a textual string that can include numeric, alphabetic, and punctuation characters such as backslashes, hyphens, question marks, periods, and the like. As used herein, and unless specified otherwise, the term "domain name" refers to the combination of the 2LD and the TLD. An example domain name has the form example.com.], and [see FIG. 1: the filtering system identifies beaconing activity and/or malicious activity in connection pairs via various filter methods, and the process passes through the scoring processor; Col. 7 lines 52-59: A scoring processor 106 executes a scoring process on the identified subset of data items. The scoring process can implement machine learning. The score indicates the relative likelihood that a particular data item is associated with a cyber threat, such as being transmitted in response to a command by malicious software. For example, data items with a high score can be more likely to be malicious than items with a low score, or vice versa].

Regarding claim 6, Dennison discloses computing a difference based on the first timestamp and the second timestamp, wherein the computing the first score comprises computing the first score based on the base rate and the difference
[Col. 11 lines 38-56: At block 313, the system may generate internal-external connection pairs. Each of the internal-external connection pairs may include a particular internal IP address and a particular external IP address and/or domain that was contacted by the internal IP address (equated to the first host). At block 314, time series of the generated internal-external connection pairs may be generated. For example, the system may determine sets of connection pairs that have common internal IP addresses and external IP addresses or domains. Then, for each set, a time series may be generated that represents each point in time that the same or a similar connection is made between a particular internal IP address and external IP address or domains. Each of the time series may span a particular time period. For example, each time series may span a number of days, weeks, months, or years. Thus, a connection pair time-series (or simply "connection pair series" or "connection series") may indicate multiple connections made between a particular internal and external IP address (or domain or other device identifier) and/or a periodicity or other pattern indicating when the connections were made. The internal-external connection pairs may be plotted along each time series for the particular time period.], and [see FIG. 2A, Col. 9 lines 11-14: the connection records can be limited to those connection records occurring within a certain period of time (e.g., a 1-minute block, a 5-minute block, a 15-minute block, an hour block, etc.)].

Regarding claim 7, Dennison discloses wherein the computing the first score comprises computing a logarithm of a quotient of the difference and the base rate [Col. 11 lines 38-56: At block 313, the system may generate internal-external connection pairs. Each of the internal-external connection pairs may include a particular internal IP address and a particular external IP address and/or domain that was contacted by the internal IP address (equated to the first host). At block 314, time series of the generated internal-external connection pairs may be generated. For example, the system may determine sets of connection pairs that have common internal IP addresses and external IP addresses or domains. Then, for each set, a time series may be generated that represents each point in time that the same or a similar connection is made between a particular internal IP address and external IP address or domains. Each of the time series may span a particular time period. For example, each time series may span a number of days, weeks, months, or years. Thus, a connection pair time-series (or simply "connection pair series" or "connection series") may indicate multiple connections made between a particular internal and external IP address (or domain or other device identifier) and/or a periodicity or other pattern indicating when the connections were made. The internal-external connection pairs may be plotted along each time series for the particular time period.], and [see FIG. 2A, Col. 9 lines 11-14: the connection records can be limited to those connection records occurring within a certain period of time (e.g., a 1-minute block, a 5-minute block, a 15-minute block, an hour block, etc.)].

Regarding claim 8, Dennison discloses combining the first score and the second score to generate a combined score, wherein the outputting the indication of the event comprises outputting the indication of the event based on the combined score [see FIG. 1 and corresponding text for more detail;
Col. 7 lines 60-67 and Col. 8 lines 1-13: Optionally, suitable program instructions stored on a non-transitory computer readable storage medium can be executed by a computer processor in order to cause the computing system of FIG. 12 to run one or more post-filters 108A, 108B on one or more of the scored data items returned from the scoring processor 106. The post-filters can identify a subset of data items from the scored data items as likely malicious URLs. Again, the post-filters can be executed in series or in parallel. The post-filters can be processed without any intervention by a human analyst or in response to specific commands by a human analyst. In any event, the data items output from the post-filter are likely to be associated with malicious software. An output group of data items from the subset of the post-filters 108A, 108B is then passed to output 110. … The output 110 can be used, for example, to alert system administrators when a computer is likely to be infected with malicious software. The output 110 can also be used as feedback for improving the scoring process].

Regarding claim 9, Dennison discloses wherein the outputting the indication of the event comprises outputting the indication of the event based on the combined score exceeding a second threshold value [Col. 11 lines 37-65: At block 316, the beaconing malware pre-filter system may filter out any noise in each time series. For example, the connection pairs in each connection series may be analyzed in order to identify any connection pairs of the particular connection series that should be indicated as noise. Noise in a connection series may include, for example, any internal-external connection pairs that have a low likelihood of being related to beaconing activity and/or to malicious activity. Various filter criteria may be applied to filter out noise. Examples of noise filtering criteria may include, but are not limited to: filter 316A, which detects frequently established connections, such as the same or similar connection pairs (for example, multiple connection pairs from the same internal IP to the same external IP and/or domain) that occur with short intervals (or deltas) of time between them (for example, intervals on the order of seconds, or intervals that are shorter than are typically employed by beaconing malware); filter 316B, which detects connection pairs that have only been occurring for a short period of time (for example, for a week or less); filter 316C, which detects connection pairs with popular or well-known legitimate external domains (for example, a third-party produced list of popular domains may be used by the system); and/or filter 316D, which detects connection pairs made by legitimate software for, for example, software updates (in an embodiment, this filter criteria may be applied on a per-computer system basis, such that a determination may be made regarding the legitimacy of particular pieces of software on each individual computer system)], and [Col. 12 lines 28-32: At block 318, the system may determine which connection pairs have beaconing scores that satisfy a particular threshold. For example, the system may determine that any beaconing pairs having beaconing scores below a particular variance are likely to represent malware beaconing activity], and [Col. 14 lines 55-67: In an embodiment, the beaconing malware pre-filter system may automatically evaluate the generated clusters to determine a likelihood that a given cluster represents beaconing malware activity. For example, the system may determine that a cluster having a metascore below a particular threshold is likely not related to beaconing malware activity, while a cluster having a metascore above another particular threshold likely is beaconing malware activity. In other words, based on the various scores and metascores, a cluster that is more likely to be associated with beaconing malware can be passed to the scoring processor of FIG. 1. In this way, the beaconing malware pre-filter can improve processing speed by reducing the number of data items passed to the scoring processor], and [see FIG. 1 and corresponding text for more detail; Col. 7 lines 60-67 and Col. 8 lines 1-13: Optionally, suitable program instructions stored on a non-transitory computer readable storage medium can be executed by a computer processor in order to cause the computing system of FIG. 12 to run one or more post-filters 108A, 108B on one or more of the scored data items returned from the scoring processor 106. The post-filters can identify a subset of data items from the scored data items as likely malicious URLs. Again, the post-filters can be executed in series or in parallel. The post-filters can be processed without any intervention by a human analyst or in response to specific commands by a human analyst. In any event, the data items output from the post-filter are likely to be associated with malicious software. An output group of data items from the subset of the post-filters 108A, 108B is then passed to output 110. … The output 110 can be used, for example, to alert system administrators when a computer is likely to be infected with malicious software. The output 110 can also be used as feedback for improving the scoring process].
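To make the quoted thresholding concrete: Col. 12 treats a beaconing score "below a particular variance" as indicating beaconing, which suggests the score reflects how evenly spaced a pair's connections are. A minimal sketch of that idea, with names and structure ours rather than Dennison's:

    from statistics import pvariance

    def beaconing_score(connection_times: list[float]) -> float:
        """Variance of the intervals between successive connections for one
        internal-external pair. Low variance means evenly spaced, periodic
        traffic, which the quoted passage treats as beacon-like."""
        intervals = [b - a for a, b in zip(connection_times, connection_times[1:])]
        return pvariance(intervals)

    # A host phoning home every ~60 s scores near zero (suspiciously periodic):
    print(beaconing_score([0.0, 60.1, 119.9, 180.0, 240.2]))
    # Irregular, human-driven browsing scores much higher:
    print(beaconing_score([0.0, 4.8, 310.5, 320.0, 1900.7]))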
Regarding claim 10, Dennison discloses computing a difference based on the first timestamp and the third timestamp corresponding to the occurrence of the event at the second host, wherein the computing the second score comprises computing the second score based on the base rate and the difference [Col. 11 lines 38-56: At block 313, the system may generate internal-external connection pairs. Each of the internal-external connection pairs may include a particular internal IP address and a particular external IP address and/or domain that was contacted by the internal IP address (equated to the second host). At block 314, time series of the generated internal-external connection pairs may be generated. For example, the system may determine sets of connection pairs that have common internal IP addresses and external IP addresses or domains. Then, for each set, a time series may be generated that represents each point in time that the same or a similar connection is made between a particular internal IP address and external IP address or domains. Each of the time series may span a particular time period. For example, each time series may span a number of days, weeks, months, or years. Thus, a connection pair time-series (or simply "connection pair series" or "connection series") may indicate multiple connections made between a particular internal and external IP address (or domain or other device identifier) and/or a periodicity or other pattern indicating when the connections were made. The internal-external connection pairs may be plotted along each time series for the particular time period.]. At block 316, the beaconing malware pre-filter system may filter out any noise in each time series.
For example, the connection pairs in each connection series may be analyzed in order to identify any connection pairs of the particular connection series that should be indicated as noise. Noise in a connection series may include, for example, any internal-external connection pairs that have a low likelihood of being related to beaconing activity and/or to malicious activity. Various filter criteria may be applied to filter out noise. Examples of noise filtering criteria may include, but are not limited to: filter 316A, which detects frequently established connections, such as the same or similar connection pairs (for example, multiple connection pairs from the same internal IP to the same external IP and/or domain) that occur with short intervals (or deltas) of time between them (for example, intervals on the order of seconds, or intervals that are shorter than are typically employed by beaconing malware); filter 316B, which detects connection pairs that have only been occurring for a short period of time (for example, for a week or less)], and [ see fig. 2A, Col. 9 lines 11-14, the connection records can be limited to those connection records occurring within a certain period of time (e.g., a 1 minute block, a 5 minute block, a 15 minute block, an hour block etc.)], and [ see FIG. 1, the filtering system in the application identifies beaconing activity and/or to malicious activity in connection pair via various filter method and the process passed through the scoring processor, [Col. 7 lines 52-59, A scoring processor 106 executes a scoring process on the identified subset of data items. The scoring process can implement machine learning. The score indicates the relative likelihood that a particular data item is associated with a cyber threat, such as being transmitted in response to a command by malicious software. For example, data items with a high score can be more likely to be malicious than items with a low score, or vice versa]. Regarding claim 11, Dennison discloses wherein the computing the second score comprises computing a logarithm of a quotient of the difference and the base rate [Col. 11 lines 38-56, At block 313, the system may generate internal-external connection pairs. Each of the internal-external connection pairs may include a particular internal IP address and a particular external IP address and/or domain that was contacted by the internal IP address.( equated to second host) At block 314, time series of the generated internal-external connection pairs may be generated. For example, the system may determine sets of connection pairs that have common internal IP addresses and external IP addresses or domains. Then, for each set, a time series may be generated that represents each point in time that the same or a similar connection is made between a particular internal IP address and external IP address or domains. . Each of the time series may span a particular time period. For example, each time series may span a number of days, weeks, months, or years. Thus, a connection pair time-series (or simply "connection pair series" or "connection series"), may indicate multiple connections made between a particular internal and external IP address (or domain or other device identifier) and/or a periodicity or other pattern indicating when the connections were made. The internal-external connection pairs may be plotted along each time series for the particular time period.], and [ see fig. 2A, Col. 
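To make the claimed computation concrete: as recited in claims 1 and 9-11, the second score is the logarithm of the quotient of a timestamp difference and a base rate, computed only when the first score exceeds a first threshold, with an indication output when a combined score exceeds a second threshold. The sketch below is a minimal Python illustration of that reading; all names are hypothetical, and the first-score formula, the seconds-based units of the base rate, and the additive combination are assumptions, since neither the claims as characterized above nor the quoted portions of Dennison fix those details.

```python
import math
from datetime import datetime

# Minimal sketch of the scoring recited in claims 1 and 9-11. All names are
# hypothetical; the claims (as characterized in this Office action) do not
# specify the first-score formula, the base rate's units, or how the two
# scores are combined, so those details are assumptions for illustration.

def recentness_score(event_ts: datetime, prior_ts: datetime, base_rate_s: float) -> float:
    """Claim 11 reading: logarithm of the quotient of the timestamp
    difference and the base rate. A gap that is large relative to the
    base rate yields a high score (the event is rare at this scope);
    a small gap yields a low score (the event is recent)."""
    delta_s = max(abs((event_ts - prior_ts).total_seconds()), 1e-9)  # guard against log(0)
    return math.log(delta_s / base_rate_s)

def indicate_event(t1: datetime, t2: datetime, t3: datetime,
                   base_rate_s: float, thr1: float, thr2: float):
    # First score: the event at the first host, from the first and second timestamps.
    first = recentness_score(t1, t2, base_rate_s)
    if first <= thr1:
        return None  # claim 1: the second score is computed only if the first exceeds a threshold
    # Second score: the same event at a second host, from the first and third timestamps.
    second = recentness_score(t1, t3, base_rate_s)
    if first + second > thr2:  # claim 9: output based on a combined score vs. a second threshold
        return {"first_score": first, "second_score": second}
    return None
```

On this reading the score behaves like a rarity measure: the longer an event has gone unseen relative to its base rate, the higher the score, which is consistent with the "recentness" characterization in claims 16, 18, and 20.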
Regarding claim 12, Dennison discloses wherein outputting the indication of the event comprises: transmitting the indication of the event for presentation in a user interface (UI) [Col. 5 lines 49-51, FIG. 3E illustrates an example cluster analysis user interface of the beaconing malware pre-filter system as applied to beaconing malware detection], and [FIGS. 11A-11C illustrate example user interfaces of the malicious software detection system and aspects thereof].

Regarding claim 13, Dennison discloses wherein the event comprises an indicator event or a detect event [Col. 7 lines 5-31, The outbound data connection log 102 includes a large plurality of data items, such as thousands, millions, tens of millions, hundreds of millions, or even billions of data items. In one embodiment, such data items include the IP addresses of internal resources, within the local network, that have attempted to communicate with an external resource outside the local network. The outbound data connection log 102 can also include a time, such as a time stamp indicating year, month, day, hour, minute, and/or second, associated with each attempted connection. The outbound data connection log 102 can also include a character string relating to the attempted connection. An example character string may be a URL. Such a URL can generally resemble the form: schm://3LD.2LD.TLD/filepath. The portion "schm" represents the scheme or prefix, such as ftp, http, mailto, and the like. The portion "3LD" is a combination of alphabetic characters, numbers, and/or hyphens representing the third level domain. The portion "2LD" is a combination of alphabetic characters, numbers, and/or hyphens representing the second level domain. The portion "TLD" represents the top-level domain, such as com, org, edu, gov, and the like. The portion "filepath" is a textual string that can include numeric, alphabetic, and punctuation characters such as backslashes, hyphens, question marks, periods, and the like. As used herein, and unless specified otherwise, the term "domain name" refers to the combination of the 2LD and the TLD. An example domain name has the form example.com].

Regarding claim 14, Dennison discloses wherein the first host comprises at least one of a server or a user device [Col. 8 lines 62-67, The local IP addresses, URLs, and times can be logically associated as connection records indicating a particular communication from a particular computerized device to a particular external resource at a particular time, such that each of the connection records is associated with a particular device identifier, a particular URL, and a particular time].

Regarding claim 15, Dennison discloses wherein the first host and the second host belong to an organization [Col. 1 lines 37-45, Disclosed herein are various systems, methods, and computer-readable media for detecting malicious software and/or otherwise undesirable access of online resources in a computing system, such as among a network of computers of an organization. At least some of the systems, methods, and media can analyze data, such as URL data items, transmitted by computing systems within a local network in order to identify the infected systems and/or systems that have or are likely to access undesirable online resources], and [Col. 24 lines 47-51, FIG. 11B illustrates an example interface for marking or tagging data from the listing of FIG. 11A. When reviewing the listing of FIG. 11A, an analyst may determine that the first three listings warrant further investigation, because they were registered by the same organization on the same date].
Regarding claims 16, 18, and 20, Dennison discloses wherein the first score is indicative of a recentness of the event occurring at the first host, and wherein the second score is indicative of a recentness of the event occurring within an organization [Col. 11 lines 38-56, the connection-pair and time-series passage quoted above regarding claim 10, with the internal IP address equated to the first host; at block 316 the beaconing malware pre-filter system filters out noise in each time series, including connection pairs occurring at short intervals of time or only for a short period of time], and [see FIG. 1, the filtering system identifies beaconing and/or malicious activity in connection pairs via various filter methods, and the results are passed to the scoring processor; Col. 7 lines 52-59, A scoring processor 106 executes a scoring process on the identified subset of data items. The scoring process can implement machine learning. The score indicates the relative likelihood that a particular data item is associated with a cyber threat, such as being transmitted in response to a command by malicious software. For example, data items with a high score can be more likely to be malicious than items with a low score, or vice versa].
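A short worked example of the "recentness" reading in claims 16, 18, and 20, reusing the hypothetical recentness_score sketch above: an event unseen at the first host for a long time produces a high first score, while the same event seen recently at another host in the organization produces a low second score.

```python
from datetime import datetime, timedelta, timezone

# Assumes the hypothetical recentness_score() from the sketch above.
now = datetime(2026, 2, 20, tzinfo=timezone.utc)
base_rate_s = 3600.0  # assumed base rate: one occurrence per hour

# Event last seen at the first host 30 days ago: large gap, high first score.
first = recentness_score(now, now - timedelta(days=30), base_rate_s)   # log(720) ≈ 6.58
# Same event seen at a second host 2 hours ago: small gap, low second score.
second = recentness_score(now, now - timedelta(hours=2), base_rate_s)  # log(2) ≈ 0.69
```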
Regarding claims 17 and 19, these claims are interpreted and rejected for the same rationale set forth in claim 1.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See the attached form PTO-892 for more relevant references. Brew (US 2017/0132306) [Abstract, A plurality of first event instances of a first event and a plurality of second event instances of a second event are received based on the first event occurring and the second event occurring. Each event instance has an event identifier and a timestamp. A first event type of the plurality of first event instances and a second event type of the plurality of second event instances are identified. A time period of overlap between the first event and the second event is determined by detecting regular intervals between the plurality of first event instances, as compared to each other, and the plurality of second event instances, as compared to each other. A relationship between the first event and the second event is scored based on the time period of overlap. The first event and the second event are grouped based on the scored relationship].

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHAHRIAR ZARRINEH, whose telephone number is (571) 272-1207. The examiner can normally be reached Monday-Friday, 8:30 am-5:30 pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jorge Ortiz-Criado, can be reached at 571-272-7624. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SHAHRIAR ZARRINEH/
Primary Examiner, Art Unit 2496

Prosecution Timeline

Oct 28, 2024
Application Filed
Feb 20, 2026
Non-Final Rejection — §101, §102, §112 (current)

Precedent Cases

Applications with similar technology granted by this same examiner

Patent 12587392
SECURE COMMUNICATION METHOD AND APPARATUS IN PASSIVE OPTICAL NETWORK
2y 5m to grant · Granted Mar 24, 2026
Patent 12549527
MULTI-FACTOR AUTHENTICATION OF CLOUD-MANAGED SERVICES
2y 5m to grant · Granted Feb 10, 2026
Patent 12547755
TECHNIQUES FOR SECURELY EXECUTING ATTESTED CODE IN A COLLABORATIVE ENVIRONMENT
2y 5m to grant · Granted Feb 10, 2026
Patent 12543044
SYSTEMS AND METHODS OF AUTOMATIC OUT-OF-BAND (OOB) RESTRICTED CELLULAR CONNECTIVITY FOR SET UP PROVISIONING OF MANAGED CLIENT INFORMATION HANDLING SYSTEMS
2y 5m to grant · Granted Feb 03, 2026
Patent 12511435
DEVICE AND METHOD FOR ENFORCING A DATA POLICY
2y 5m to grant · Granted Dec 30, 2025
Study what changed in these applications to get past this examiner. Based on the examiner's 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
79%
Grant Probability
87%
With Interview (+7.8%)
2y 8m
Median Time to Grant
Low
PTA Risk
Based on 433 resolved cases by this examiner. Grant probability derived from career allow rate.
