Prosecution Insights
Last updated: April 19, 2026
Application No. 18/342,588

RISK ANALYSIS BASED NETWORK AND SYSTEM MANAGEMENT

Final Rejection — §102, §103
Filed
Jun 27, 2023
Examiner
JAKOVAC, RYAN J
Art Unit
2445
Tech Center
2400 — Computer Networks
Assignee
Cisco Technology Inc.
OA Round
4 (Final)
66%
Grant Probability
Favorable
5-6
OA Rounds
3y 9m
To Grant
83%
With Interview

Examiner Intelligence

Grants 66% — above average
66%
Career Allow Rate
402 granted / 613 resolved
+7.6% vs TC avg
Strong +17% interview lift
Without
With
+17.4%
Interview Lift
resolved cases with interview
Typical timeline
3y 9m
Avg Prosecution
32 currently pending
Career history
645
Total Applications
across all art units
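The headline figures above can be cross-checked from the raw counts. This is a hypothetical reconstruction; the tool's actual model is not disclosed, so the formulas (allow rate = grants / resolved, interview figure = allow rate plus the lift in percentage points) are assumptions for illustration only.

```python
# Reconstructing the dashboard figures from the raw counts shown above.
# The combining formulas are assumptions, not the tool's disclosed model.

granted = 402          # from "402 granted / 613 resolved"
resolved = 613
interview_lift = 17.4  # percentage-point lift shown for interviews

allow_rate = granted / resolved * 100         # ~65.6%, displayed as 66%
with_interview = allow_rate + interview_lift  # ~83%

print(f"Career allow rate: {allow_rate:.1f}%")
print(f"With interview:    {with_interview:.1f}%")
```

Run as-is, this reproduces the rounded 66% and 83% figures shown in the Examiner Intelligence and Prosecution Projections panels.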

Statute-Specific Performance

§101
7.5%
-32.5% vs TC avg
§103
50.5%
+10.5% vs TC avg
§102
20.7%
-19.3% vs TC avg
§112
17.6%
-22.4% vs TC avg
Black line = Tech Center average estimate • Based on career data from 613 resolved cases
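The per-statute deltas imply a Tech Center baseline for each statute. A quick sanity check, under the assumption that the baseline is simply the examiner's rate minus the displayed delta (the chart's exact methodology is not stated):

```python
# Implied Tech Center averages from the statute table above.
# Assumption: TC average = examiner rate - "vs TC avg" delta.

stats = {  # statute: (examiner rate %, delta vs TC avg in points)
    "§101": (7.5, -32.5),
    "§103": (50.5, +10.5),
    "§102": (20.7, -19.3),
    "§112": (17.6, -22.4),
}

for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta
    print(f"{statute}: examiner {rate}% vs implied TC average {tc_avg:.1f}%")
```

Notably, all four rows back out to the same implied baseline of 40.0%, which suggests the deltas were computed against a single Tech Center average estimate rather than per-statute baselines.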

Office Action

§102 §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed 02/09/2026 have been fully considered. Applicant argues the prior art fails to teach: "prior to the network device experiencing a failure, transmitting, from the controller, a control signal to a second network device in the network, the control signal instructing the second network device to route data traffic away from the network device based on the likelihood that the network device is going to experience failure in the future".

The prior art to Sun discloses a system for assessing risk in an information technology system (Sun, abstract). Sun's system analyzes anomaly data related to devices, machines, and/or components of the computing system, determines a likelihood of failure for the machine, device, and/or component, and responsively transmits a "control signal" in order to apply corrective actions proactively (Sun, ¶ 44-48, 64, 74). Sun's corrective actions include rescheduling a time of certain operations due to high CPU utilization or storage use, and commands to preempt an operational failure by shifting certain operations to other devices or components due to the risk score being associated with vulnerabilities of the currently used devices or components (see Sun ¶ 49-50, 60, 74). Applicant's arguments that Sun fails to disclose the claimed control signal are not persuasive in light of Sun's signaling as described above.

Applicant argues that Sun's control signal is not sent "to a second network device in the network".
Applicant's arguments are not persuasive in this regard because Sun discloses that the control signal is output to a UI device 150 and/or an output node 155, both of which are examples of "a second network device in the network," as both the UI device and the output node are communicatively coupled via networked connections to the system (see Sun ¶ 44-50, 60, 74 and fig. 1). Moreover, Sun's disclosure is employed in a cloud platform (¶ 77) where the devices in Sun's architecture communicate with one another via a networked/cloud environment, e.g., data transmission occurs between devices over a network (¶ 92; ¶ 78-86).

Regarding dependent claim 4, applicant argues Sun fails to disclose "wherein the network device is a first network device that is a high risk network device" because, while Sun discloses applying risk scores to IT infrastructure components, Sun fails to categorize the components into risk tiers (e.g., high risk versus low risk). Applicant's arguments are not persuasive. As an initial matter related to claim construction and application of prior art, the method claim transmits data to a single device, where the data includes instructions for that single device to perform routing and, in particular, to route data away from a certain other device described as a "high-risk" device. Applicant's arguments here are merely semantic in nature, with the claim language amounting essentially to non-functionally descriptive material: applying the semantic label "high-risk" to the device does not import any step or function to the method and therefore does little to provide any patentable distinction over the prior art (similar rationale applies to the remaining independent claims, which recite similar language).
Nevertheless, Sun discloses "wherein the network device is a first network device that is a high risk network device" because: 1) one of ordinary skill in the art before the effective filing date of the claimed invention would understand that a device with an assigned risk score based on a recognized anomaly associated with the vulnerable device is an example of a "high risk" device; 2) devices are associated with risk scores indicative of an anomaly, and one of ordinary skill in the art before the effective filing date of the claimed invention would understand that scoring itself is a hierarchical construct that provides a tiered metric; and 3) one of ordinary skill in the art before the effective filing date of the claimed invention would understand that any device with a risk score indicative of an anomaly is an example of a "high risk" device relative to devices without anomalies.

Claim 4 recites: "wherein the network device is a first network device that is a high risk network device, and wherein the control signal is utilized to instruct the second network device to at least one of i) reroute the data traffic to a third network device that is a low risk network device, or ii) reroute high risk data traffic from among the data traffic". Applicant argues Sun fails to teach i) rerouting the data traffic to a third network device that is a low risk network device, or ii) rerouting high risk data traffic from among the data traffic.

Applicant's arguments are not persuasive, as Sun discloses rerouting high risk traffic. For example, Sun is operative to take corrective actions involving devices associated with anomalies and vulnerabilities by routing traffic away from such a device (Sun, ¶ 44-50). One of ordinary skill in the art before the effective filing date of the claimed invention would understand the traffic to be "high risk" because it is being directed to an at-risk device that has been identified as subject to vulnerabilities (see at least ¶ 32, 50 of Sun).
Additionally, applicant argues that Sun fails to disclose the contested features because the claims require both a classifying step and an identifying step with regard to the "high risk data". In response to applicant's argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., a classification function/step and an identification function/step related to the "high risk" data) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-5, 8-10, 12, and 14-19 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by US 20210084059 to Sun.
Regarding claim 1, Sun teaches a method, comprising: receiving, at a controller that manages a network, anomaly data associated with a network device in the network (¶ 4, 32, 44-50, received/analyzed anomaly data); identifying anomaly characteristic information associated with the anomaly data (¶ 4, 7, 29-30, 37-50, identification of anomalous data characteristics); computing, at the controller, a likelihood that the network device is going to experience a failure in a future based on the anomaly data (¶ 44-50, likelihood of failure based on anomaly data and characteristic information); and prior to the network device experiencing a failure, transmitting, from the controller, a control signal to a second network device in the network, the control signal instructing the second network device to route data traffic away from the network device based on the likelihood that the network device is going to experience failure in the future (abstract, ¶ 16, 35, 44-50, 60-64, predicted failure of network element and rerouting based on likelihood of failure; see also ¶ 74, 77, and 92-97).

Regarding claims 2 and 9, Sun teaches: identifying an anomaly indicated via the anomaly data, wherein the anomaly data includes at least one of an identifier, a classification, or a severity associated with the anomaly (¶ 44-48, identification of anomalous node and anomaly identification).
Regarding claims 3, 10, and 17, Sun teaches: wherein computing the likelihood that the network device is going to experience the failure further comprises: computing a severity level associated with an anomaly indicated via the anomaly data (¶ 50-51, severity level from temporary to catastrophic failures); identifying a number of occurrences of the anomaly (¶ 44-46, 51, anomalies 1-m); computing a risk weight based on the severity level and the number of occurrences (¶ 44-51, risk threat based on occurrences and severity of anomaly); computing an anomaly frequency of a classification associated with the anomaly (¶ 8-10, 34, 38, frequencies of anomalies over time); and computing the likelihood that the network device is going to experience the failure based on the risk weight and the anomaly frequency (¶ 44-48, likelihood of failure).

Regarding claim 4, Sun teaches: wherein the network device is a first network device that is a high risk network device, and wherein the control signal is utilized to instruct the second network device to at least one of i) reroute the data traffic to a third network device that is a low risk network device, or ii) reroute high risk data traffic from among the data traffic (¶ 48-50, instructions to reroute traffic).

Regarding claim 5, Sun teaches: wherein an anomaly indicated via the anomaly data comprises at least one of i) a behavior of a behavior type that is not included from among a group of approved behavior types, or ii) an operation of an operation type that is not included from among a group of approved operation types (¶ 4, 7, 29-30, 37-50).

Claim 8 is addressed by similar rationale as claim 1.

Regarding claim 12, Sun teaches: wherein the anomaly is included in a cluster from among a group of clusters, and the group of clusters includes a software critical cluster, a hardware critical cluster, and a consistency critical cluster (figs. 9-10, ¶ 92-94).
Regarding claim 14, Sun teaches: wherein the network device is a first network device, and transmitting the control signal further comprises: transmitting the control signal to a second network device, the control signal being utilized to instruct the second network device to at least one of i) reroute the data traffic to a third network device, or ii) reroute a portion of the data traffic (¶ 48-50).

Claim 15 is addressed by similar rationale as claim 1.

Regarding claim 16, Sun teaches: wherein the anomaly characteristic information includes an identifier, a classification, and a severity associated with an anomaly indicated in the anomaly data (¶ 44-52, 60, identifier, severity, category of anomaly).

Regarding claim 18, Sun teaches: wherein the network device is a first network device that is a high risk network device, and transmitting the control signal further comprises: transmitting the control signal to a second network device, the control signal being utilized to instruct the second network device to reroute the data traffic to a third network device that is a low risk network device (¶ 48-50, instructions to reroute traffic).

Regarding claim 19, Sun teaches: wherein the network device is a first network device, and transmitting the control signal further comprises: transmitting the control signal to a second network device, the control signal being utilized to instruct the second network device to reroute high risk data traffic from among the data traffic (¶ 50, instructions to reroute).

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Sun in view of US 20210203680 to Das.
Regarding claim 6, Sun teaches: identifying an anomaly indicated via the anomaly data (¶ 44-51, 60), wherein the anomaly data includes an anomaly classification from among a group of classifications (¶ 44-51, 60). Sun fails to teach, but Das teaches: the group of classifications includes at least one of a software error classification, a hardware error classification, or a consistency check classification (¶ 43, software application errors). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the teachings of Das. The motivation to do so is that the teachings of Das would have been advantageous in terms of facilitating anomaly detection and prevention (Das, ¶ 5, 43).

Claims 7, 11, and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Sun in view of AU 2015202706 A1 to Lefebvre, further in view of US 20210360407 to Obaidi.

Regarding claim 7, Sun teaches: wherein computing the estimated overall risk factor information further comprises: computing a severity level associated with a first anomaly indicated via the anomaly data (¶ 44-52, 60, severity levels); computing a number of occurrences of the first anomaly indicated via the anomaly data (¶ 44-46, 51, anomalies 1-m); and computing a first risk weight based on the severity level and the occurrences (¶ 44-51, risk threat based on occurrences and severity of anomaly). Sun fails to teach, but Lefebvre teaches: computing an estimated overall risk factor of the estimated overall risk factor information based on the first risk weight and a second risk weight, the second risk weight being associated with a second anomaly of a different type than the first anomaly (¶ 66-69, score based on secondary risks such as connection association, packet throughput per time, time variance of connections, etc.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the teachings of Lefebvre. The motivation to do so is that the teachings of Lefebvre would have been advantageous in terms of facilitating the control of network traffic (Lefebvre, ¶ 1-4). Lefebvre fails to teach that the number of occurrences is a "percentage of occurrences". However, Obaidi discloses the number of anomalous occurrences as a percentage of occurrences (¶ 38, percentage of anomalies). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the teachings of Obaidi. The motivation to do so is that the teachings of Obaidi would have been advantageous in terms of facilitating scam prevention and anomalous pattern detection (Obaidi, ¶ 9, 38).

Regarding claim 11, Sun teaches: wherein computing the estimated overall risk factor further comprises: identifying a number of occurrences of anomalies in a cluster that comprises the anomaly identified by the data, based at least in part on a time interval in which the anomalies occur (¶ 44-46, occurrences of anomaly per time period; ¶ 92-94, figs. 9-10, cluster). Sun fails to teach, but Obaidi teaches: identifying a total number of occurrences of anomalies during the time interval; computing a percentage of occurrences based at least in part on the number of occurrences and the total number of occurrences; and computing the estimated overall risk factor based at least in part on the percentage of occurrences (¶ 38, anomaly detection based on total/percentage of occurrences). The motivation to include Obaidi is the same as presented above.
Regarding claim 13, Sun teaches: wherein computing the estimated overall risk factor further comprises: computing a severity level associated with the anomaly (¶ 51-54, severity); computing occurrences of the anomaly (¶ 44-48, occurrences of anomaly); computing a risk weight of the anomaly based on the severity level and computing the estimated overall risk factor based at least in part on the risk weight (¶ 44-51); computing an anomaly frequency of a classification associated with the anomaly (¶ 8-10, 34, 38, 41, frequencies of anomalies over time); and computing an estimated overall risk factor of the estimated overall risk factor information based on the risk weight and the anomaly frequency (¶ 2-7, 24, 48-49, 66-72, 89, 102). Sun fails to teach that the number of occurrences is a "percentage of occurrences". However, Obaidi discloses the number of anomalous occurrences as a percentage of occurrences (¶ 38, percentage of anomalies). The motivation to include Obaidi is the same as presented above.

Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Sun in view of AU 2015202706 A1 to Lefebvre.

Regarding claim 20, Sun fails to teach, but Lefebvre teaches: wherein the network device is a first network device, the control signal is a first control signal, and transmitting the first control signal further comprises: transmitting, to the control device, the first control signal, the first control signal being routed by the control device to a second network device, the operations further comprising: transmitting, to the control device, a second control signal, the second control signal being routed by the control device to a third network device, the second network device and the third network device being utilized to control data traffic associated with the first network device (fig. 1, fig. 2, ¶ 57, 74, 76, routing between monitoring device and firewall and/or user device to control traffic associated with network devices).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the teachings of Lefebvre. The motivation to do so is that the teachings of Lefebvre would have been advantageous in terms of facilitating the control of network traffic (Lefebvre, ¶ 1-4).

CONCLUSION

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RYAN J JAKOVAC, whose telephone number is (571) 270-5003. The examiner can normally be reached 8-4 PM EST. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Oscar A. Louie, can be reached at 572-270-1684. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov.
Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RYAN J JAKOVAC/
Primary Examiner, Art Unit 2445
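Claims 1 and 3, as characterized in the rejection above, recite a concrete computation: a severity level and a number of occurrences yield a risk weight, which combines with an anomaly frequency into a likelihood of failure that triggers proactive rerouting. A minimal sketch of those recited steps follows; all names, scales, and the combining formula are illustrative assumptions, not the application's (or Sun's) actual implementation.

```python
# Hypothetical sketch of the risk computation recited in claims 1 and 3.
# Names, scales, and the combining formula are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Anomaly:
    classification: str   # e.g. "software", "hardware", "consistency"
    severity: float       # 0.0 (benign) .. 1.0 (catastrophic)

def failure_likelihood(anomalies: list[Anomaly], window_s: float) -> float:
    """Estimate the likelihood a device fails, per the claimed steps."""
    if not anomalies:
        return 0.0
    # Severity level and number of occurrences -> risk weight (claim 3).
    severity = max(a.severity for a in anomalies)
    occurrences = len(anomalies)
    risk_weight = severity * occurrences
    # Anomaly frequency of the dominant classification (claim 3).
    dominant = max({a.classification for a in anomalies},
                   key=lambda c: sum(a.classification == c for a in anomalies))
    freq = sum(a.classification == dominant for a in anomalies) / window_s
    # Combine risk weight and frequency into a bounded likelihood.
    score = risk_weight * freq
    return score / (1.0 + score)  # squash into [0, 1)

# Per claim 1, a controller would compare this likelihood to a threshold
# and, before any failure occurs, signal a second device to route traffic
# away from the at-risk device.
history = [Anomaly("software", 0.9), Anomaly("software", 0.7)]
likelihood = failure_likelihood(history, window_s=60.0)
print(f"likelihood of failure: {likelihood:.3f}")
```

This framing also makes the dispute over dependent claim 4 concrete: the sketch produces a continuous score, and whether thresholding such a score into "high risk" versus "low risk" tiers is taught by Sun is exactly what the applicant contested.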

Prosecution Timeline

Jun 27, 2023
Application Filed
Mar 06, 2025
Non-Final Rejection — §102, §103
Mar 18, 2025
Interview Requested
Mar 27, 2025
Applicant Interview (Telephonic)
Mar 28, 2025
Response Filed
Mar 28, 2025
Examiner Interview Summary
Jun 05, 2025
Final Rejection — §102, §103
Jun 13, 2025
Interview Requested
Sep 04, 2025
Request for Continued Examination
Sep 16, 2025
Response after Non-Final Action
Oct 04, 2025
Non-Final Rejection — §102, §103
Feb 09, 2026
Response Filed
Mar 07, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603906
ALERT MONITORING OF DATA BASED ON RECOMMENDED ATTRIBUTE VALUES
2y 5m to grant Granted Apr 14, 2026
Patent 12572634
ELECTRONIC DEVICE AND ENCRYPTION METHOD FOR ELECTRONIC DEVICE
2y 5m to grant Granted Mar 10, 2026
Patent 12549627
INTELLIGENT CLOUD-EDGE RESOURCE MANAGEMENT
2y 5m to grant Granted Feb 10, 2026
Patent 12526298
System and Method for Fraud Identification
2y 5m to grant Granted Jan 13, 2026
Patent 12500926
Executing Real-Time Message Monitoring to Identify Potentially Malicious Messages and Generate Instream Alerts
2y 5m to grant Granted Dec 16, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
66%
Grant Probability
83%
With Interview (+17.4%)
3y 9m
Median Time to Grant
High
PTA Risk
Based on 613 resolved cases by this examiner. Grant probability derived from career allow rate.
