Prosecution Insights
Last updated: April 19, 2026
Application No. 18/247,121

MALWARE INFECTION MITIGATION OF CRITICAL COMPUTER SYSTEMS

Final Rejection §103

Filed: Mar 29, 2023
Examiner: COLIN, CARL G
Art Unit: 2493
Tech Center: 2400 — Computer Networks
Assignee: British Telecommunications Public Limited Company
OA Round: 3 (Final)

Grant Probability: 48% (Moderate)
Expected OA Rounds: 4-5
Time to Grant: 4y 9m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 48% (65 granted / 136 resolved; -10.2% vs TC avg)
Interview Lift: +52.8% (strong lift among resolved cases with interview)
Avg Prosecution: 4y 9m (typical timeline)
Total Applications: 143 across all art units (7 currently pending)

Statute-Specific Performance

§101: 13.4% (-26.6% vs TC avg)
§103: 44.7% (+4.7% vs TC avg)
§102: 17.2% (-22.8% vs TC avg)
§112: 17.2% (-22.8% vs TC avg)

Tech Center averages are estimates. Based on career data from 136 resolved cases.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

DETAILED ACTION

In communications filed on 11/19/2025, claims 1-12 are presented for examination. Claims 1-12 are pending.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on 11/19/2025 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements have been considered by the examiner.

Response to Arguments

Applicant's arguments filed on 11/19/2025 with respect to claims 1-12 have been considered but are not persuasive. Applicant argues on pages 7-8 that the prior art Hartrell does not disclose "accessing a model of the set of computer systems, the model identifying interacting pairs of the computer systems in the set based on interactions corresponding to previous communication occurring between the computer systems in the pairs, and the model identifying the target computer system" because, to be such a target computer system consistent with the instant claims, the 10 computer systems must be part of and identified in a model of a set of computer systems. This argument is not persuasive because the claim merely recites a model that identifies interacting pairs of the computer systems in a set of computer systems. Hartrell discloses a model with interacting pairs of computer systems: each of the ten computer systems interacts with the web site, which is also considered a node or computer system. For this reason, Hartrell reads on the claim language "the model identifying interacting pairs of the computer systems in the set based on interactions corresponding to previous communication occurring between the computer systems in the pairs." Each computer system in Hartrell that becomes infected reads on a target computer system in the model disclosed by Hartrell.
Applicant's original specification, filed on 3/29/2023, states, "Such target computer system can be for example a critical computer system". Applicant argues that none of the prior art discloses triggering deployment of malware protection measures at a time period selected with reference to the identified time period so as to protect the target computer system from the malware. The arguments are not persuasive because Chen discloses a point in time, or moment in time, or previously saved state of the machine system to roll back to (paras 56-58 and 33).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2 and 4-11 are rejected under 35 U.S.C. 103 as being unpatentable over Chen (Pub. No. US-2018/0375885-A1) in view of Hartrell et al. (US Patent No. US-9886578-B2), hereafter referred to as Hartrell.
With respect to claim 1, Chen teaches a computer-implemented malware protection method to protect a target computer system in a set of computer systems from a malware (see Chen para 16, disclosing a system to prevent malware from entering the local network or machine), the method comprising:

simulating, over a plurality of time periods, a propagation of the malware originating from a predetermined source computer system in the model, the simulation being based on a number of interactions per time period between each interacting pair of computer systems in the set, and a rate of transmission of the malware per interaction. See para 32, disclosing modeling a time series; see also para 33, disclosing monitoring inputs (source computer system) to the machine learning system; and see para 35, disclosing: "The change-point of the scan statistics may be detected over the time-window or over a previous time window. In an example, the change-point represents the anomaly and corresponding to a value of a scan statistic of the scan statistics that is above a threshold. In an example, the scan statistic includes a count, such as a maximum of the number of connected edges of a subgraph of the plurality of subgraphs over the time-window, and wherein the number of connected edges away from a particular vertex of the subgraph includes a set of k-nearest neighbors' vertices from the particular vertex". This meets the recitation of the simulation being based on a number of interactions per time period between each interacting pair of computer systems in the set. The reference further discloses: "In yet another example, the scan statistic includes a weighted geometric average of locality statistics derived over the time-window and a scan statistic derived over a previous time-window. Deriving the scan statistic may include performing temporal normalization on the scan statistics to smooth the scan statistics over the time-window or over the previous time-window." This meets the recitation of a rate of transmission of the malware per interaction;

evaluating, for each of at least a subset of the time periods, a probability of infection of the target computer system in the time period (see para 48: the directed graph corresponds to a time-window, and the change-point represents an anomaly corresponding to a value of a scan statistic above a threshold);

responsive to the simulating, identifying an earliest time period during which the probability of infection of the target computer system meets a predetermined threshold probability (see paras 27 and 31, disclosing using a machine learning system that identifies anomalies to retrieve the maximum value at each time window where it meets a threshold, which meets the claimed recitation); and

triggering deployment of malware protection measures in respect of the target computer system at a time period selected with reference to the identified time period so as to protect the target computer system from the malware (see paras 54-56, disclosing some of the measures to protect the target system from the malware).

Although Chen discloses interactions between pairs of computer systems, Chen does not explicitly disclose accessing a model of the set of computer systems, the model identifying interacting pairs of the computer systems in the set based on interactions corresponding to previous communication occurring between the computer systems in the pairs, and the model identifying the target computer system.
However, Hartrell in an analogous art teaches accessing a model of the set of computer systems, the model identifying interacting pairs of the computer systems in the set based on interactions corresponding to previous communication occurring between the computer systems in the pairs, and the model identifying the target computer system (see column 11, lines 18-64, disclosing a model of 10 computers interacting with a web site and monitoring previous communication between each computer and the web site; each infected computer represents the target computer).

Therefore, it would have been obvious for a person having ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teaching of Chen with Hartrell to access a model of the set of computer systems, the model identifying interacting pairs of the computer systems in the set based on interactions corresponding to previous communication occurring between the computer systems in the pairs, and the model identifying the target computer system. One of ordinary skill in the art would have been motivated to do so in order to prevent future infections, as suggested by Hartrell (see column 11, lines 55-64).

With respect to claim 2, the references as combined above disclose the method of claim 1. Chen further discloses wherein the simulating, the evaluating, and the responsive to the simulating are repeated a plurality of times to establish the earliest time period during which the probability of infection of the target computer system exceeds the predetermined threshold probability with confidence intervals for selecting an earliest time period having a confidence meeting a threshold degree of confidence (see paras 27, 28, and 31, disclosing using a machine learning system that identifies anomalies to retrieve the maximum value at each time window where it meets a threshold, which meets the claimed recitation; fig. 4 illustrates scan statistics over time, showing that the process is repeated).

With respect to claim 4, the references as combined above disclose the method of claim 1. Chen further discloses wherein the malware protection measures include one or more of: an anti-malware facility; a malware filter; a malware detector; a block, preclusion or cessation of interaction; or a reconfiguration of one or more computer systems (see paras 54-56, disclosing some of the measures to protect the target system from the malware, including a block).

With respect to claim 5, the references as combined above disclose the method of claim 1. Chen further discloses wherein the simulating is performed a plurality of times for the source computer system and the responsive to the simulating is responsive to the plurality of simulatings (see paras 27, 28, and 31, disclosing using a machine learning system that identifies anomalies to retrieve the maximum value at each time window where it meets a threshold, which meets the claimed recitation; fig. 4 illustrates scan statistics over time, showing that the process is repeated).

With respect to claim 6, the references as combined above disclose the method of claim 1. Chen further discloses wherein the simulating is performed a plurality of times for each of multiple different source computer systems, and the responsive to the simulating is responsive to the plurality of simulatings (see paras 27, 28, and 31, as cited for claim 5; para 34 also discloses, after any of the operations, returning to an operation for further training).

With respect to claim 7, the references as combined above disclose the method of claim 1.
Chen further discloses wherein the number of interactions per time period between an interacting pair of computer systems is determined based on a statistical distribution (see paras 27, 28, and 31, disclosing using a machine learning system that identifies anomalies to retrieve the maximum value at each time window where it meets a threshold, which meets the claimed recitation; fig. 4 illustrates scan statistics over time).

With respect to claim 8, the references as combined above disclose the method of claim 1. Hartrell further discloses wherein the number of interactions per time period between an interacting pair of computer systems in the set is defined based on historical records of interactions between the interacting pair of computer systems (see column 9, lines 10-15; fig. 6 discloses interactions based on recorded activities; see also column 11, lines 55-64).

With respect to claim 9, the references as combined above disclose the method of claim 1. Hartrell further discloses wherein the model further identifies a class of interaction between interacting pairs of computer systems, the class of interaction being determined based on historical records of interactions between each computer system in an interacting pair, and wherein the rate of transmission of the malware per interaction is determined for each interacting pair of computer systems based on the class of interaction for the interacting pair (see column 11, lines 55-64, disclosing a model of a class of interaction based on historical data using, for instance, a snapshot of the last 5 minutes).

With respect to claim 10, Chen teaches a computer system comprising a processor and memory storing computer program code for implementing malware protection to protect a target computer system in a set of computer systems from a malware (see Chen para 16, disclosing a system to prevent malware from entering the local network or machine) by:
simulating, over a plurality of time periods, a propagation of the malware originating from a predetermined source computer system in the model, the simulation being based on a number of interactions per time period between each interacting pair of computer systems in the set, and a rate of transmission of the malware per interaction (see para 32, disclosing modeling a time series; para 33, disclosing monitoring inputs (source computer system) to the machine learning system; and the para 35 passages on change-point detection and scan statistics quoted in the treatment of claim 1 above, which meet the recitations of the simulation being based on a number of interactions per time period between each interacting pair of computer systems in the set and of a rate of transmission of the malware per interaction);

evaluating, for each of at least a subset of the time periods, a probability of infection of the target computer system in the time period (see para 48: the directed graph corresponds to a time-window, and the change-point represents an anomaly corresponding to a value of a scan statistic above a threshold);

responsive to the simulating, identifying an earliest time period during which the probability of infection of the target computer system meets a predetermined threshold probability (see paras 27 and 31, disclosing using a machine learning system that identifies anomalies to retrieve the maximum value at each time window where it meets a threshold, which meets the claimed recitation); and

triggering deployment of malware protection measures in respect of the target computer system at a time period selected with reference to the identified time period so as to protect the target computer system from the malware (see paras 54-56, disclosing some of the measures to protect the target system from the malware).

Although Chen discloses interactions between pairs of computer systems, Chen does not explicitly disclose accessing a model of the set of computer systems, the model identifying interacting pairs of the computer systems in the set based on interactions corresponding to previous communication occurring between the computer systems in the pairs, and the model identifying the target computer system. However, Hartrell in an analogous art teaches this limitation (see column 11, lines 18-64, disclosing a model of 10 computers interacting with a web site and monitoring previous communication between each computer and the web site; each infected computer represents the target computer).
Therefore, it would have been obvious for a person having ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teaching of Chen with Hartrell to access a model of the set of computer systems, the model identifying interacting pairs of the computer systems in the set based on interactions corresponding to previous communication occurring between the computer systems in the pairs, and the model identifying the target computer system. One of ordinary skill in the art would have been motivated to do so in order to prevent future infections, as suggested by Hartrell (see column 11, lines 55-64).

With respect to claim 11, Chen teaches a non-transitory computer-readable storage medium storing a computer program element comprising computer program code to, when loaded into a computer system and executed thereon, cause the computer system to implement malware protection to protect a target computer system in a set of computer systems from a malware by performing the simulating, evaluating, identifying, and triggering limitations addressed for claim 1 above; the same citations to Chen (paras 27, 31, 32-33, 35, 48, and 54-56) apply. As with claims 1 and 10, Chen does not explicitly disclose accessing a model of the set of computer systems, the model identifying interacting pairs of the computer systems in the set based on interactions corresponding to previous communication occurring between the computer systems in the pairs, and the model identifying the target computer system; however, Hartrell teaches this limitation (see column 11, lines 18-64), and it would have been obvious to combine Chen with Hartrell for the same reasons and with the same motivation given for claim 1 (see column 11, lines 55-64).

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Chen, in view of Hartrell, and further in view of Beachem et al. (Pub. No. US-2011/0078797-A1), hereafter referred to as Beachem.

With respect to claim 3, the references as combined above disclose the method of claim 1, but Chen does not explicitly disclose the limitations of claim 3.
However, Beachem in an analogous art teaches wherein deploying malware protection measures comprises provisioning a replacement computer system for the target computer system as a replica of the target computer system supplemented by the provision of protection measures such that the replacement computer system is protected from the malware, wherein the replacement computer system is provisioned in advance of the selected time period, the method further comprising deploying the replacement computer system as a substitute for the target computer system at the selected time period. ("A virtual representation is made from a cloned/replica image of the compromised device at least as of a time just before the compromised device became infected by the security threat. The virtual representation may be configured on a separate hardware platform as the compromised device. Threat assessment occurs by monitoring data flows relative to the computing device and, upon actual identification, threat type or severity is also attempted to be characterized. In the event the type or severity meets a predetermined threshold, a virtual representation of the compromised device is stood-up to operationally replace the original device, including installation of an active countermeasure." Beachem para 0006.)

Therefore, it would have been obvious for a person having ordinary skill in the art, before the effective filing date of the claimed invention and in the same field of endeavor, to utilize the teaching of Beachem to modify the method of Chen in view of Hartrell to prepare backup computer or router hardware with updated software and virus patches as they become known. In this way, when a node or computer hardware becomes infected, it can be rapidly replaced with replica hardware carrying all software and vulnerability updates. This is very beneficial for keeping downtime to a minimum while keeping services available to users nearly 100% of the time. For service providers, it minimizes loss of revenue and improves the customer satisfaction index.

Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Chen in view of Hartrell, and further in view of Priess et al. (Pub. No. US-2015/0026027-A1), hereafter referred to as Priess.

With respect to claim 12, the combination of Chen and Hartrell discloses the method of claim 7, but does not explicitly disclose the limitations of claim 12. However, Priess in an analogous art teaches wherein the statistical distribution is a Poisson distribution or a uniform distribution (the predictive user model ("PUM") for fraud detection uses uniform probability distributions to adjust probability distributions; Priess para 0096, lines 6-7). Therefore, it would have been obvious for a person having ordinary skill in the art, before the effective filing date of the claimed invention and in the same field of endeavor, to utilize the teaching of Priess to modify the method of Chen in view of Hartrell to provide the best recommendation to each machine: the most befitting countermeasures given all presently known information and associated predicted probabilistic information regarding a prospective intrusion or attack. If any systems are adversely affected, methods for repairing the damage are shared and redistributed throughout the network. The net impact of such an approach is that every machine on a network can benefit from security experience gained at any other point on the network. A high and uniform level of security is therefore assured to all systems attached to the network, and this security is updated in real time.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure, as many of the references listed on the PTO-892 disclose an earliest time period for protective measures against a malware. THIS ACTION IS MADE FINAL. Applicant is reminded of the extension-of-time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Carl G Colin, whose telephone number is (571) 272-3862. The examiner can normally be reached Monday-Thursday 8:00 AM-5:00 PM and Friday 8:00 AM-12:00 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Amy Cohen Johnson, can be reached at 571-272-2238. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/CARL G COLIN/
Supervisory Patent Examiner, Art Unit 2493
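For readers outside patent practice, the method disputed above is essentially a Monte Carlo epidemic simulation over a contact graph: interactions per time period between each pair are drawn from a statistical distribution (a Poisson distribution, per claims 7 and 12), each interaction transmits with some probability, and repeated runs estimate the earliest period in which the target's infection probability crosses a threshold. The sketch below is a hypothetical illustration of that technique only; it is not taken from the application or the cited references, and all names and parameters (`pairs`, `rates`, `earliest_risk_period`) are illustrative assumptions.

```python
import math
import random

def poisson_sample(lam):
    """Simple Poisson draw (Knuth's method), adequate for small lambda."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def earliest_risk_period(pairs, rates, source, target,
                         n_periods=20, threshold=0.5, n_runs=1000):
    """Estimate the earliest time period in which the probability that
    `target` is infected meets `threshold`.

    pairs : {(a, b): mean interactions per period} (Poisson means)
    rates : {(a, b): transmission probability per interaction}
    Returns (earliest_period or None, per-period infection probabilities).
    """
    hits = [0] * n_periods  # runs in which target is infected by period t
    for _ in range(n_runs):
        infected = {source}
        for t in range(n_periods):
            newly = set()
            for (a, b), lam in pairs.items():
                k = poisson_sample(lam)  # interactions this period
                # probability that at least one interaction transmits
                p_any = 1.0 - (1.0 - rates[(a, b)]) ** k
                if a in infected and b not in infected and random.random() < p_any:
                    newly.add(b)
                elif b in infected and a not in infected and random.random() < p_any:
                    newly.add(a)
            infected |= newly
            if target in infected:
                for u in range(t, n_periods):  # stays infected from t onward
                    hits[u] += 1
                break
    probs = [h / n_runs for h in hits]
    for t, p in enumerate(probs):
        if p >= threshold:
            return t, probs
    return None, probs
```

Protection measures would then be scheduled at or before the returned period, and repeating the whole estimate (as claims 2, 5, and 6 recite) tightens the confidence interval around it.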

Prosecution Timeline

Mar 29, 2023
Application Filed
Jan 03, 2025
Non-Final Rejection — §103
Apr 08, 2025
Response Filed
Aug 18, 2025
Non-Final Rejection — §103
Nov 19, 2025
Response Filed
Mar 04, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12592963: DETECTION DEVICE, DETECTION METHOD, AND DETECTION PROGRAM (2y 5m to grant; granted Mar 31, 2026)
Patent 12554808: PUBLIC KEY EMBEDDED IN CONTENT FOR VERIFICATION OF AUTHORSHIP (2y 5m to grant; granted Feb 17, 2026)
Patent 12547704: AUTOMATED DEPLOYMENT OF RELOCATABLE CODE BLOCKS AS AN ATTACK COUNTERMEASURE IN SOFTWARE (2y 5m to grant; granted Feb 10, 2026)
Patent 7996908: METHOD AND SYSTEM FOR COORDINATING CLIENT AND HOST SECURITY MODULES (2y 5m to grant; granted Aug 09, 2011)
Patent 7987512: BIOS BASED SECURE EXECUTION ENVIRONMENT (2y 5m to grant; granted Jul 26, 2011)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 4-5
Grant Probability: 48%
With Interview: 99% (+52.8%)
Median Time to Grant: 4y 9m
PTA Risk: High
Based on 136 resolved cases by this examiner. Grant probability derived from career allow rate.
