Prosecution Insights
Last updated: April 18, 2026
Application No. 18/180,455

AUTOMATIC MITIGATION OF BIOS ATTACKS

Final Rejection §103
Filed: Mar 08, 2023
Examiner: LIU, ZHE
Art Unit: 2493
Tech Center: 2400 — Computer Networks
Assignee: DELL PRODUCTS, L.P.
OA Round: 4 (Final)

Grant Probability: 71% (Favorable)
OA Rounds: 5-6
To Grant: 3y 2m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 71% (above average; 96 granted / 136 resolved; +12.6% vs TC avg)
Interview Lift: +59.0% (allowance rate among resolved cases with an interview vs. without)
Typical Timeline: 3y 2m avg prosecution (23 currently pending)
Career History: 159 total applications across all art units
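The panel's headline figures can be reproduced from the raw counts it displays. A minimal sketch (Python; the per-cohort allowance rates behind the +59.0% interview lift are not shown in this report, so `interview_lift` is a definition only, with no assumed inputs):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved applications."""
    return 100.0 * granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Difference in allowance rate between resolved cases that had an
    examiner interview and those that did not."""
    return rate_with - rate_without

# Counts shown in the panel: 96 granted of 136 resolved.
rate = allow_rate(96, 136)
print(round(rate, 1))  # 70.6, displayed as 71%
```

The report does not state whether the +59.0% lift is an absolute or relative difference; the sketch assumes absolute percentage points.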

Statute-Specific Performance

§101: 5.3% (-34.7% vs TC avg)
§103: 59.6% (+19.6% vs TC avg)
§102: 5.0% (-35.0% vs TC avg)
§112: 23.5% (-16.5% vs TC avg)

Tech Center averages are estimates • Based on career data from 136 resolved cases
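The per-statute deltas can be sanity-checked against the displayed rates. Assuming each delta is simply the examiner's rate minus the estimated Tech Center average, the implied baseline comes out the same for every statute, consistent with a single estimated reference value (a sketch of the arithmetic only, not the vendor's methodology):

```python
# (examiner rate %, delta vs TC avg %) as displayed above
stats = {
    "§101": (5.3, -34.7),
    "§103": (59.6, 19.6),
    "§102": (5.0, -35.0),
    "§112": (23.5, -16.5),
}

for statute, (rate, delta) in stats.items():
    # delta = rate - tc_avg  =>  tc_avg = rate - delta
    tc_avg = round(rate - delta, 1)
    print(f"{statute}: implied TC average {tc_avg}%")  # 40.0% in every case
```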

Office Action

§103
DETAILED ACTION

The following claims are pending in this office action: 1-20
Claims 1, 11 and 20 are independent claims.
The following claims are amended: 1-8, 11-18 and 20
The following claims are new: -
The following claims are cancelled: -
Claims 1-20 are rejected. This rejection is FINAL.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Previous Objections and/or Rejections Withdrawn

The objections to claims 17 and 18 are withdrawn based on the amendments.

RESPONSE TO ARGUMENTS

Applicant’s arguments in the amendment filed 12/17/2025 have been fully considered but are moot in view of new grounds of rejection necessitated by amendment. Applicant notes: The art of record fails to disclose the features of the independent claims as amended. The amended limitations in the independent claims are disclosed by Gamble et al. (US Pub. 2019/0342307) as explained below and rejected accordingly. Dependent claims 2-10 and 12-19 depend on independent claims 1 and 11. The amended elements in the claims are disclosed by Gamble et al. (US Pub. 2019/0342307) as explained below, and so any additional features to the dependent claims are rejected accordingly.

Examiner notes that the rejection is made based on the reasonably broad interpretation of the changes, attributes, events, and threat chains. Examiner notes that the Application is directed towards mitigation of BIOS attacks, but there is nothing in the independent claims which indicates that the changes, attributes, events, and threat chains are in any way associated with BIOS. The only limitation directed towards BIOS is claim 8, where the limitation is that the attributes are associated with one of a BIOS and firmware of the computer.
However, it is unclear, even if this is incorporated into the independent claims, how the attributes are associated with the BIOS/firmware, which allows for the combination of references as explained below. Examiner suggests further clarifying how the changes, attributes, events and threat chains are related to computer BIOS and mitigation of BIOS attacks.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-9, 11-18 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Pike et al. (US Pub. 2020/0404016) (hereinafter “Pike”) in view of Petersen et al. (US Pub. 2012/0005542) (hereinafter “Petersen”), in view of Xiaoning et al. (US Pub. 2018/0096143) (hereinafter “Xiaoning”), and in view of Gamble et al. (US Pub. 2019/0342307) (hereinafter “Gamble”).

As per claim 1, Pike teaches an information handling system, comprising: ([Pike, para. 0052] “managed network 100 [a system] for monitoring root-level attacks”; [para. 0123] “The computer described herein may be used … to execute the described functions [making the managed network a computer/information handling system]”) a memory device configured to store code; and ([Pike, para. 0128] “instructions embodying any one or more of the methodologies or functions described herein … reside … within the main memory”) a processor configured to execute the code ([Pike, para.
0132] “The various operations of example methods described herein may be performed, at least partially, by one or more processors”) to instantiate a system health monitor, the system health monitor configured to: ([para. 0116] “a process for monitoring and preventing root level attacks [a health monitor] … are performed by the management system [making the monitor a system health monitor]”) detect a plurality of first changes to associated first attributes of the information handling system ([Pike, para. 0117] “The management system 130 detects 510 a change at the target device ... These may ... be changes [a plurality of first changes]”; [Fig. 1] The managed devices are of the managed network/system; [para. 0092] “A change … may generally refer to any access … refer to … changing of settings [associated first attributes: as attribute is not defined from the specification, examiner interprets “attribute” in accordance to its broadest reasonable interpretation to mean any characteristic of the computer]”) from associated first states to associated second states; ([para. 0092] “for example, where an access [setting change/associated first attribute] has occurred previously... the transmission of a heartbeat signal [association of first states as a number of states is necessarily associated with transmission of reoccurring signal] the omission of this access at a later time [to associated second states as the omission of such a signal is also necessarily associated reoccurring lack of states] is a change”) remediate the particular first change attribute in response to determining that the particular first attribute is a first critical setting. ([Pike, para. 0058] “If the change [a change of settings/account access/the particular first change attribute] is determined to be unauthorized [in response to determining that the particular first attribute is a first critical setting as it impacts security of the operation of the computer – see for example, para. 0057: “the changes caused by these ... user accounts... result in negative consequences, such as stealing confidential information”] … revert any changes [remediate the particular first change attribute] caused by the unapproved change”)

Pike does not clearly teach wherein detecting the first changes is based upon the system health monitor reading a secure event log; convert the detected first changes into associated first threat events; organize related first threat events into at least one first threat chain, wherein related first threat events are associated with a common set of related first attributes; compare the first threat chain to a policy related to the first attributes; determine whether a particular one of the first attributes is related to one of a preferred setting, an important setting, and a critical setting, wherein the preferred setting provides an event logging level for the information handling system, the important setting governs a performance level of the information handling system, and the critical setting impacts safety and security of the information handling system.

However, Petersen teaches determine whether a particular one of the first attributes ([Petersen, para. 0090] “computers regularly generate log entries for a variety of actions [the first change to the first attribute of a computer system], such as ... operating system errors [a particular one of the first attributes in view of its BRI]”; [para. 0152] “an event [the first change] consists of ... the log entry”) is related to one of a preferred setting, an important setting, and a critical setting, ([para. 0172] “control panel ... provides summary type information in relation to logs [is related to] ... provide users with a real-time visibility into compliance/audit [preferred setting as “auditing outputs ... output data to the log file so that the engineers may diagnose problems” – see para. 0003], security [critical setting as: “the event manager to automatically alert a user of a certain event ... that a user would find exceptionally important, such as a denial of service attack”] and operations [an important setting as “the system may automatically prioritize each event based its impact to ... operations” – see para. 0176] related events”) wherein the preferred setting provides an event logging level for the information handling system, ([para. 0105] “The log agent may monitor and forward textual log data of a variety of sources, such as ... audit logs”; as the audit classification determines which events are audit events and audit events are forwarded as a textual log, the classification provides an event logging of the auditing level) the important setting governs a performance level of the information handling system, ([Fig. 23] the “operation events classification” graph sets the events by performance level including “Critical”, “Error”, “Warning”, “Information”, “Network Allow”, “Network Deny”, “Network Traffic” and “Other Operations”) and the critical setting impacts safety and security of the information handling system. ([para. 0159] “the event manager 14 may generate an alarm that a denial of service attack [critical setting] is underway so that a system administrator may take appropriate steps to limit or prevent any damage [impacts safety and security of the information handling system]”; [para. 0174] “security events [critical setting] ... e.g. compromise, attack, denial of service”)

It would have been obvious before the effective filing date of the claimed invention for one of ordinary skill in the art to have modified the elements disclosed by Pike with the teachings of Petersen to include determining whether a particular one of the first attributes is related to one of a preferred setting, an important setting, and a critical setting, wherein the preferred setting provides an event logging level for the information handling system, the important setting governs a performance level of the information handling system, and the critical setting impacts safety and security of the information handling system. One of ordinary skill in the art would have been motivated to make this modification because in this way, the impact of the events/changes in settings may be assessed in multiple dimensions to allow the extraction of meaning from the events/changes in settings for what may otherwise appear to be simply isolated blocks. (Petersen, para. 0173)

Pike in view of Petersen does not clearly teach wherein detecting the first changes is based upon the system health monitor reading a secure event log; convert the detected first changes into associated first threat events; organize related first threat events into at least one first threat chain, wherein related first threat events are associated with a common set of related first attributes; and compare the first threat chain to a policy related to the first attributes.

However, Xiaoning teaches wherein detecting the first changes is based upon the system health monitor reading a secure event log. ([Xiaoning, para. 0036] “security code 504 [system health monitor] ... securely request and receive [read] information related to the change log 524 [secure event log] ... and determine a presence of [monitor] any unauthorized changes [system health monitor as unauthorized changes are attacks – see para. 0025] to the storage system 508 [the system being monitored] based on the information received”; [para. 0037] “the storage controller code ... store a record of all LBA write operations [first event changes] in the change log 524”; [para. 0038] “provide the list of changed LBAs [detecting the first changes] to the security code 504 in response to the request [based upon reading the secure event log]”; [para. 0038] “storage controller code may be configured to encrypt and decrypt the change log 524 [making the event log a secure event log]”)

It would have been obvious before the effective filing date of the claimed invention for one of ordinary skill in the art to have modified the elements disclosed by Pike in view of Petersen with the teachings of Xiaoning to include wherein detecting the first changes is based upon the system health monitor reading a secure event log. One of ordinary skill in the art would have been motivated to make this modification because this allows additional confidentiality and provides the benefit of an accurate change log without any manipulation from malicious OS components. (Xiaoning, para. 0034)

Pike in view of Petersen and Xiaoning does not clearly teach convert the detected first changes into associated first threat events; organize related first threat events into at least one first threat chain, wherein related first threat events are associated with a common set of related first attributes; and compare the first threat chain to a policy related to the first attributes.

However, Gamble teaches convert the detected first changes into associated first threat events; ([Gamble, para. 0048] “Security platform 100 collects data ... to ... generate security events [into first threat events]”; [para. 0093] “the security platform 100 is configured to extract ... permission data [settings/first changes] associated with one or more of the events”) organize related first threat events into at least one first threat chain, ([Gamble, para. 0066] “Graph generator 126 can use links to create meaningful connections between events [organize threat events] and security alert unit 124 can map graph data structures representing chains of events to between the security events [into at least one threat chain]”) wherein related first threat events are associated with a common set of related first attributes; and ([para. 0066] “The connections between events can occur when the graph generator 126 detects two (or more) events with something in common [a common set of related first attributes]”; [para. 0090] “Events are grouped together ... based upon common elements [attributes]”) compare the first threat chain to a policy related to the first attributes. ([Gamble, para. 0106] “Security platform 100 attempts to identify intrusions by constructing potential attack chains and then applies rules [compare the first threat chain to a policy] to determine the likelihood/severity of the incident”; [para. 0090] “common elements [first attributes] ... which may be codified as a rule [policies related to the first attributes] ... and flagged by security platform 100”)

It would have been obvious before the effective filing date of the claimed invention for one of ordinary skill in the art to have modified the elements disclosed by Pike in view of Petersen and Xiaoning with the teachings of Gamble to include convert the detected first changes into associated first threat events; organize related first threat events into at least one first threat chain, wherein related first threat events are associated with a common set of related first attributes; and compare the first threat chain to a policy related to the first attributes. One of ordinary skill in the art would have been motivated to make this modification because this can reduce false positives to use resources more efficiently to target actual threats and more complex chains of events. (Gamble, para.
0047)

As per claim 2, Pike in view of Petersen, Xiaoning and Gamble teaches claim 1. Pike also teaches wherein in remediating the particular first attribute, the system health monitor is further configured to attempt to restore the first particular attribute to the associated first state. ([Pike, para. 0058] “If the change is determined to be unauthorized [a change of settings/account access/the particular first change attribute] … revert any changes caused by the unapproved change [remediate the particular first attribute] … To revert the changes [an attempt to restore the first particular attribute], the management library 126 may rollback the operating system 122, and any associated executables 124 and data 125, to an earlier state”; [para. 0096] “in response to an instruction from the management system 130 [health monitor] … the rebuild layer 226 may replace the current operating system 122 with a device image of the operating system 122 that is of a known good state [attempt to restore the first particular attribute to the associated first state] … with reference to the build manager 236”)

As per claim 3, Pike in view of Petersen, Xiaoning and Gamble teaches claim 2. Pike also teaches wherein the system health monitor is further configured to: determine that the attempt to restore the particular first attribute to the associated first state was successful; and ([Pike, para. 0086] “the change is reverted and the managed device 120 is restored to a known good state [the attempt to restore the first attribute to the first state was successful] … The rebuild manager 236 [the health monitor – see Fig. 2] may instruct the rebuild layer 226 to ... replace the data [restore the particular first attribute] … restore … of the managed device 120 to a state as stored [to the associated first state] in the device image ... indicate ... the rollback is completed [determine attempt was successful]”) provide an indication that the attempt to restore the particular first attribute to the associated first state was successful. ([Pike, para. 0086] “Subsequently, the rebuild manager 236 may indicate in a log [provide an indication] that the rollback is completed in response to the change that was detected [attempt to restore the particular first attribute to the associated first state was successful as per above]”)

As per claim 4, Pike in view of Petersen, Xiaoning and Gamble teaches claim 3. Pike also teaches wherein the system health monitor is further configured to reboot the information handling system after the first particular attribute is restored to the associated first state. ([Pike, para. 0086] “The rebuild manager 236 may also restore the volatile storage of the managed device 120 to a state as stored in the device image … Subsequently, [after the first particular attribute is restored to the first associated state] the rebuild manager 236 may instruct the rebuild layer 226 to boot up or start the operating system 122 of the managed device [reboot the information handling system] to resume operations”)

As per claim 5, Pike in view of Petersen, Xiaoning and Gamble teaches claim 2. Pike also teaches wherein the system health monitor is further configured to: determine that the attempt to restore the particular first attribute to the associated first state was unsuccessful; and ([Pike, para. 0086-0087] “The rebuild manager ... may log [determine] ... the rollback [attempt to restore the particular first attribute to the first state as explained above] is not successful”) provide an indication that the attempt to restore the particular first attribute to the associated first state was unsuccessful. ([Pike, para. 0087] “the rebuild manager 236 may also send a message to an administrator … indicating that a rollback failed”)

As per claim 6, Pike in view of Petersen, Xiaoning and Gamble teaches claim 2. Pike also teaches wherein the system health monitor is further configured to: provide an indication that the change to the particular first attribute is a preferred change in response to determining that the first change is the preferred change. ([Pike, para. 0059] “If the change is listed in the exception list, [determining that the first change to the particular first attribute is the preferred change based on the comparison] the management system 130 may send a message to the management library 126 for that managed device 120 indicating to allow the change [provide an indication that the change is the preferred change]”)

As per claim 7, Pike in view of Petersen, Xiaoning and Gamble teaches claim 1. Pike also teaches wherein the system health monitor is further configured to: detect a plurality of second changes to associated second attributes of the information handling system ([Pike, para. 0117] “The management system 130 detects 510 a change at the target device”; [Fig. 1] The managed devices are of the managed network/system; [para. 0092] “A change … may generally refer to any access … refer to … modification of permissions [a second attribute]”) from associated third states to associated fourth states, ([para. 0092] “for example, where a device previously allowed access via a particular network connection [associated third states as it requires at least the state of the device and the state of the network connection], and now does not respond or refuses access [associated fourth states]”) wherein the policy further relates to the second attributes; ([para. 0094] “The exception list … include rules [a policy] regarding … accesses [related to the second attributes]”) remediate the particular second attribute in response to determining that the particular second attribute is a second critical setting. ([Pike, para. 0058] “If the change [modification of permissions/the particular second change attribute] is determined to be unauthorized [in response to determining that the particular second attribute is a second critical setting as it impacts security of the operation of the computer – see for example, para. 0057: “the changes caused by these ... user accounts... result in negative consequences, such as stealing confidential information”] … revert any changes [remediate the particular second change attribute] caused by the unapproved change”)

Pike in view of Xiaoning does not clearly teach convert the detected second changes into associated second threat events; organize related second threat events into at least one second threat chain, wherein related second threat events are associated with a common set of related second attributes; compare the second threat chain to the policy; and determine whether a particular one of the second attributes is related to one of the preferred setting, the important setting, and the critical setting.

However, Petersen teaches determine whether a particular one of the second attributes ([Petersen, para. 0090] “when a user incorrectly attempts to logon to a single computer [a particular one of the second attributes in view of its BRI] ... computers regularly generate log entries for a variety of actions”; [para. 0152] “an event [the second change] consists of ... the log entry”) is related to one of a preferred setting, an important setting, and a critical setting. ([para. 0172] “control panel ... provides summary type information in relation to logs [is related to] ... provide users with a real-time visibility into compliance/audit [preferred setting as “auditing outputs ... output data to the log file so that the engineers may diagnose problems” – see para. 0003], security [critical setting as: “the event manager to automatically alert a user of a certain event ... that a user would find exceptionally important, such as a denial of service attack”] and operations [an important setting as “the system may automatically prioritize each event based its impact to ... operations” – see para. 0176] related events”)

It would have been obvious before the effective filing date of the claimed invention for one of ordinary skill in the art to combine the teachings of Pike, Petersen and Xiaoning for the same reasons as disclosed above.

Pike in view of Petersen and Xiaoning does not clearly teach convert the detected second changes into associated second threat events; organize related second threat events into at least one second threat chain, wherein related second threat events are associated with a common set of related second attributes; compare the second threat chain to the policy.

However, Gamble teaches convert the detected second changes into associated second threat events; ([Gamble, para. 0048] “Security platform 100 collects data ... to ... generate security events [into first threat events]”; [para. 0093] “the security platform 100 is configured to extract ... device permissions [detected second changes] associated with one or more of the events”) organize related second threat events into at least one second threat chain, ([Gamble, para. 0066] “Graph generator 126 can use links to create meaningful connections between events [organize threat events] and security alert unit 124 can map graph data structures representing chains of events to between the security events [into at least one threat chain]”; [Fig. 5] A first threat chain is shown as Graph #1 and a second threat chain is shown as Graph #2) wherein related second threat events are associated with a common set of related second attributes; ([para. 0066] “The connections between events can occur when the graph generator 126 detects two (or more) events with something in common [a common set of related second attributes]”; [Fig. 6] Events in the second threat chain are grouped with a common set of attributes; [para. 0090] “Events are grouped together ... based upon common elements [attributes]”) compare the second threat chain to the policy. ([Gamble, para. 0106] “Security platform 100 attempts to identify intrusions by constructing potential attack chains and then applies rules [compare the second threat chain to a policy] to determine the likelihood/severity of the incident”; [para. 0090] “common elements [second attributes] ... which may be codified as a rule [policies related to the first attributes] ... and flagged by security platform 100”)

It would have been obvious before the effective filing date of the claimed invention for one of ordinary skill in the art to combine the teachings of Pike, Petersen, Xiaoning and Gamble for the same reasons as disclosed above.

As per claim 8, Pike in view of Petersen, Xiaoning and Gamble teaches claim 1. Pike also teaches wherein the first attributes are associated with one of a basic input/output system of the information handling system and firmware of the information handling system. ([Pike, para. 0095] “the detector 224 detects changes [the first attributes] … at … the firmware (BIOS, UEFI), [associated with BIOS and firmware] and reports any changes to these elements”)

As per claim 9, Pike in view of Petersen, Xiaoning and Gamble teaches claim 1. Pike also teaches wherein the processor is associated with a hosted environment instantiated on the information handling system. ([Pike, para.
0124] “a computing machine [the processor] in the example form of a computer system 700 [instantiated on the information handling system] … in a networked deployment the machine may operate … as a peer machine [associated with/instantiated on] a … distributed … network environment [a hosted environment]”)

As per claim 11, Pike teaches a method, comprising: ([Pike, para. 0132] “The various operations of example methods described herein may be performed”) instantiating, by a processor, a system health monitor; ([para. 0116] “a process for monitoring and preventing root level attacks [a health monitor] … are performed by the management system [making the monitor a system health monitor]”; [para. 0132] “The various operations of example methods described herein may be performed, at least partially, by one or more processors”) detecting, by the system health monitor, a plurality of first changes to associated first attributes of an information handling system ([Pike, para. 0117] “The management system 130 detects 510 a change at the target device ... These may ... be changes [a plurality of first changes]”; [Fig. 1] The managed devices are of the managed network/system; [para. 0092] “A change … may generally refer to any access … refer to … changing of settings [associated first attributes: as attribute is not defined from the specification, examiner interprets “attribute” in accordance to its broadest reasonable interpretation to mean any characteristic of the computer]”) from associated first states to associated second states; ([para. 0092] “for example, where an access [setting change] has occurred previously [from a first state], the omission of this access at a later time [to a second state] is a change”) remediating the particular first attribute in response to determining that the particular first attribute is a first critical setting. ([Pike, para. 0058] “If the change [a change of settings/account access/the particular first change attribute] is determined to be unauthorized [in response to determining that the particular first attribute is a first critical setting as it impacts security of the operation of the computer – see for example, para. 0057: “the changes caused by these ... user accounts... result in negative consequences, such as stealing confidential information”] … revert any changes [remediate the particular first change attribute] caused by the unapproved change”)

Pike does not clearly teach wherein detecting the first changes is based upon the system health monitor reading a secure event log; convert the detected first changes into associated first threat events; organize related first threat events into at least one first threat chain, wherein related first threat events are associated with a common set of related first attributes; compare the first threat chain to a policy related to the first attributes; determine whether a particular one of the first attributes is related to one of a preferred setting, an important setting, and a critical setting, wherein the preferred setting provides an event logging level for the information handling system, the important setting governs a performance level of the information handling system, and the critical setting impacts safety and security of the information handling system.

However, Petersen teaches determining whether a particular one of the first attributes ([Petersen, para. 0090] “computers regularly generate log entries for a variety of actions [the first change to the first attribute of a computer system], such as ... operating system errors [a particular one of the first attributes in view of its BRI]”; [para. 0152] “an event [the first change] consists of ... the log entry”) is related to one of a preferred setting, an important setting, and a critical setting, ([para. 0172] “provide users with a real-time visibility into compliance/audit [preferred setting as “auditing outputs ... output data to the log file so that the engineers may diagnose problems” – see para. 0003], security [critical setting as: “the event manager to automatically alert a user of a certain event ... that a user would find exceptionally important, such as a denial of service attack”] and operations [an important setting as “the system may automatically prioritize each event based its impact to ... operations” – see para. 0176] related events”) wherein the preferred setting provides an event logging level for the information handling system, ([para. 0105] “The log agent may monitor and forward textual log data of a variety of sources, such as ... audit logs”; as the audit classification determines which events are audit events and audit events are forwarded as a textual log, the classification provides an event logging of the auditing level) the important setting governs a performance level of the information handling system, ([Fig. 23] the “operation events classification” graph sets the events by performance level including “Critical”, “Error”, “Warning”, “Information”, “Network Allow”, “Network Deny”, “Network Traffic” and “Other Operations”) and the critical setting impacts safety and security of the information handling system. ([para. 0159] “the event manager 14 may generate an alarm that a denial of service attack [critical setting] is underway so that a system administrator may take appropriate steps to limit or prevent any damage [impacts safety and security of the information handling system]”; [para. 0174] “security events [critical setting] ... e.g. compromise, attack, denial of service”)

It would have been obvious before the effective filing date of the claimed invention for one of ordinary skill in the art to have modified the elements disclosed by Pike with the teachings of Petersen to include determining whether a particular one of the first attributes is related to one of a preferred setting, an important setting, and a critical setting, wherein the preferred setting provides an event logging level for the information handling system, the important setting governs a performance level of the information handling system, and the critical setting impacts safety and security of the information handling system. One of ordinary skill in the art would have been motivated to make this modification because in this way, the impact of the events/changes in settings may be assessed in multiple dimensions to allow the extraction of meaning from the events/changes in settings for what may otherwise appear to be simply isolated blocks. (Petersen, para. 0173)

Pike in view of Petersen does not clearly teach wherein detecting the first changes is based upon the system health monitor reading a secure event log; convert the detected first changes into associated first threat events; organize related first threat events into at least one first threat chain, wherein related first threat events are associated with a common set of related first attributes; and compare the first threat chain to a policy related to the first attributes.

However, Xiaoning teaches wherein detecting the first changes is based upon the system health monitor reading a secure event log. ([Xiaoning, para. 0036] “security code 504 [system health monitor] ... securely request and receive [read] information related to the change log 524 [secure event log] ... and determine a presence of [monitor] any unauthorized changes [system health monitor as unauthorized changes are attacks – see para. 0025] to the storage system 508 [the system being monitored] based on the information received”; [para. 0037] “the storage controller code ... store a record of all LBA write operations [first event changes] in the change log 524”; [para. 0038] “provide the list of changed LBAs [detecting the first changes] to the security code 504 in response to the request [based upon reading the secure event log]”; [para. 0038] “storage controller code may be configured to encrypt and decrypt the change log 524 [making the event log a secure event log]”)

It would have been obvious before the effective filing date of the claimed invention for one of ordinary skill in the art to have modified the elements disclosed by Pike in view of Petersen with the teachings of Xiaoning to include wherein detecting the first changes is based upon the system health monitor reading a secure event log. One of ordinary skill in the art would have been motivated to make this modification because this allows additional confidentiality and provides the benefit of an accurate change log without any manipulation from malicious OS components. (Xiaoning, para. 0034)

Pike in view of Petersen and Xiaoning does not clearly teach convert the detected first changes into associated first threat events; organize related first threat events into at least one first threat chain, wherein related first threat events are associated with a common set of related first attributes; and compare the first threat chain to a policy related to the first attributes.

However, Gamble teaches convert the detected first changes into associated first threat events; ([Gamble, para. 0048] “Security platform 100 collects data ... to ... generate security events [into first threat events]”; [para. 0093] “the security platform 100 is configured to extract ... permission data [settings/first changes] associated with one or more of the events”) organize related first threat events into at least one first threat chain, ([Gamble, para.
0066] “Graph generator 126 can use links to create meaningful connections between events [organize threat events] and security alert unit 124 can map graph data structures representing chains of events between the security events [into at least one threat chain]”) wherein related first threat events are associated with a common set of related first attributes; and ([para. 0066] “The connections between events can occur when the graph generator 126 detects two (or more) events with something in common [a common set of related first attributes]”; [para. 0090] “Events are grouped together ... based upon common elements [attributes]”) compare the first threat chain to a policy related to the first attributes. ([Gamble, para. 0106] “Security platform 100 attempts to identify intrusions by constructing potential attack chains and then applies rules [compare the first threat chain to a policy] to determine the likelihood/severity of the incident”; [para. 0090] “common elements [first attributes] ... which may be codified as a rule [policies related to the first attributes] ... and flagged by security platform 100”) It would have been obvious before the effective filing date of the claimed invention for one of ordinary skill in the art to have modified the elements disclosed by Pike in view of Petersen and Xiaoning with the teachings of Gamble to include convert the detected first changes into associated first threat events; organize related first threat events into at least one first threat chain, wherein related first threat events are associated with a common set of related first attributes; and compare the first threat chain to a policy related to the first attributes. One of ordinary skill in the art would have been motivated to make this modification because this can reduce false positives to use resources more efficiently to target actual threats and more complex chains of events. (Gamble, para. 
0047)

As per claim 12, the claim language is identical or substantially similar to that of claim 2. Therefore, it is rejected under the same rationale applied to claim 2.

As per claim 13, the claim language is identical or substantially similar to that of claim 3. Therefore, it is rejected under the same rationale applied to claim 3.

As per claim 14, the claim language is identical or substantially similar to that of claim 4. Therefore, it is rejected under the same rationale applied to claim 4.

As per claim 15, the claim language is identical or substantially similar to that of claim 5. Therefore, it is rejected under the same rationale applied to claim 5.

As per claim 16, the claim language is identical or substantially similar to that of claim 6. Therefore, it is rejected under the same rationale applied to claim 6.

As per claim 17, the claim language is identical or substantially similar to that of claim 7. Therefore, it is rejected under the same rationale applied to claim 7.

As per claim 18, the claim language is identical or substantially similar to that of claim 8. Therefore, it is rejected under the same rationale applied to claim 8.

As per claim 20, Pike teaches an information handling system, comprising: ([Pike, para. 0052] “managed network 100 [a system] for monitoring root-level attacks”; [para. 0123] “The computer described herein may be used … to execute the described functions [making the managed network a computer/information handling system]”) a memory configured to store code; and ([Pike, para. 0128] “instructions embodying any one or more of the methodologies or functions described herein … reside … within the main memory”) a processor configured to execute the code ([Pike, para. 0132] “The various operations of example methods described herein may be performed, at least partially, by one or more processors”) to instantiate a system health monitor, the system health monitor configured to: ([para. 
0116] “a process for monitoring and preventing root level attacks [a health monitor] … are performed by the management system [making the monitor a system health monitor]”) detect a plurality of first changes to associated first attributes of the information handling system ([Pike, para. 0117] “The management system 130 detects 510 a change at the target device ... These may ... be changes [a plurality of first changes]”; [Fig. 1] The managed devices are part of the managed network/system; [para. 0092] “A change … may generally refer to any access … refer to … changing of settings [associated first attributes: as attribute is not defined in the specification, examiner interprets “attribute” in accordance with its broadest reasonable interpretation to mean any characteristic of the computer]”) from associated first states to associated second states ([para. 0092] “for example, where an access [setting change/associated first attribute] has occurred previously... the transmission of a heartbeat signal [association of first states as a number of states is necessarily associated with the transmission of a recurring signal] the omission of this access at a later time [to associated second states as the omission of such a signal is also necessarily associated with a recurring lack of states] is a change”) from an attack on the information handling system; ([para. 0020] “the malicious attacker attempts to disable any local monitoring services of the computer, the security system is capable of detecting the change”) remediate the particular first change attribute in response to determining that the particular first attribute is a first critical setting. ([Pike, para. 0058] “If the change [a change of settings/account access/the particular first change attribute] is determined to be unauthorized [in response to determining that the particular first attribute is a first critical setting as it impacts security of the operation of the computer – see for example, para. 
0057: “the changes caused by these ... user accounts... result in negative consequences, such as stealing confidential information”] … revert any changes [remediate the particular first change attribute] caused by the unapproved change”) Pike does not clearly teach wherein detecting the first changes is based upon the system health monitor reading a secure event log; convert the detected first changes into associated first threat events; organize related first threat events into at least one first threat chain, wherein related first threat events are associated with a common set of related first attributes; compare the first threat chain to a policy related to the first attributes; determine whether a particular one of the first attributes is related to one of a preferred setting, an important setting, and a critical setting, wherein the preferred setting provides an event logging level for the information handling system, the important setting governs a performance level of the information handling system, and the critical setting impacts safety and security of the information handling system. However, Petersen teaches determine whether a particular one of the first attributes ([Petersen, para. 0090] “computers regularly generate log entries for a variety of actions [the first change to the first attribute of a computer system], such as ... operating system errors [a particular one of the first attributes in view of its BRI]”; [para. 0152] “an event [the first change] consists of ... the log entry”) is related to one of a preferred setting, an important setting, and a critical setting, ([para. 0172] “control panel ... provides summary type information in relation to logs [is related to] ... provide users with a real-time visibility into compliance/audit [preferred setting as “auditing outputs ... output data to the log file so that the engineers may diagnose problems” – see para. 
0003], security [critical setting as: “the event manager to automatically alert a user of a certain event ... that a user would find exceptionally important, such as a denial of service attack”] and operations [an important setting as “the system may automatically prioritize each event based on its impact to ... operations” – see para. 0176] related events”) wherein the preferred setting provides an event logging level for the information handling system, ([para. 0105] “The log agent may monitor and forward textual log data of a variety of sources, such as ... audit logs”; as the audit classification determines which events are audit events and audit events are forwarded as a textual log, the classification provides an event logging level for auditing) the important setting governs a performance level of the information handling system, ([Fig. 23] the “operation events classification” graph sets the events by performance level including “Critical”, “Error”, “Warning”, “Information”, “Network Allow”, “Network Deny”, “Network Traffic” and “Other Operations”) and the critical setting impacts safety and security of the information handling system. ([para. 0159] “the event manager 14 may generate an alarm that a denial of service attack [critical setting] is underway so that a system administrator may take appropriate steps to limit or prevent any damage [impacts safety and security of the information handling system]”; [para. 0174] “security events [critical setting] ... e.g. 
compromise, attack, denial of service”) It would have been obvious before the effective filing date of the claimed invention for one of ordinary skill in the art to have modified the elements disclosed by Pike with the teachings of Petersen to include determining whether a particular one of the first attributes is related to one of a preferred setting, an important setting, and a critical setting, wherein the preferred setting provides an event logging level for the information handling system, the important setting governs a performance level of the information handling system, and the critical setting impacts safety and security of the information handling system. One of ordinary skill in the art would have been motivated to make this modification because in this way, the impact of the events/changes in settings may be assessed in multiple dimensions to allow the extraction of meaning from the events/changes in settings for what may otherwise appear to be simply isolated blocks. (Petersen, para. 0173) Pike in view of Petersen does not clearly teach wherein detecting the first changes is based upon the system health monitor reading a secure event log; convert the detected first changes into associated first threat events; organize related first threat events into at least one first threat chain, wherein related first threat events are associated with a common set of related first attributes; and compare the first threat chain to a policy related to the first attributes. However, Xiaoning teaches wherein detecting the first changes is based upon the system health monitor reading a secure event log. ([Xiaoning, para. 0036] “security code 504 [system health monitor] ... securely request and receive [read] information related to the change log 524 [secure event log] ... and determine a presence of [monitor] any unauthorized changes [system health monitor as unauthorized changes are attacks – see para. 
0025] to the storage system 508 [the system being monitored] based on the information received”; [para. 0037] “the storage controller code ... store a record of all LBA write operations [first event changes] in the change log 524”; [para. 0038] “provide the list of changed LBAs [detecting the first changes] to the security code 504 in response to the request [based upon reading the secure event log]”; [para. 0038] “storage controller code may be configured to encrypt and decrypt the change log 524 [making the event log a secure event log]”) It would have been obvious before the effective filing date of the claimed invention for one of ordinary skill in the art to have modified the elements disclosed by Pike in view of Petersen with the teachings of Xiaoning to include wherein detecting the first changes is based upon the system health monitor reading a secure event log. One of ordinary skill in the art would have been motivated to make this modification because this allows additional confidentiality and provides the benefit of an accurate change log without any manipulation from malicious OS components. (Xiaoning, para. 0034) Pike in view of Petersen and Xiaoning does not clearly teach convert the detected first changes into associated first threat events; organize related first threat events into at least one first threat chain, wherein related first threat events are associated with a common set of related first attributes; and compare the first threat chain to a policy related to the first attributes. However, Gamble teaches convert the detected first changes into associated first threat events; ([Gamble, para. 0048] “Security platform 100 collects data ... to ... generate security events [into first threat events]”; [para. 0093] “the security platform 100 is configured to extract ... permission data [settings/first changes] associated with one or more of the events”) organize related first threat events into at least one first threat chain, ([Gamble, para. 
0066] “Graph generator 126 can use links to create meaningful connections between events [organize threat events] and security alert unit 124 can map graph data structures representing chains of events between the security events [into at least one threat chain]”) wherein related first threat events are associated with a common set of related first attributes; and ([para. 0066] “The connections between events can occur when the graph generator 126 detects two (or more) events with something in common [a common set of related first attributes]”; [para. 0090] “Events are grouped together ... based upon common elements [attributes]”) compare the first threat chain to a policy related to the first attributes. ([Gamble, para. 0106] “Security platform 100 attempts to identify intrusions by constructing potential attack chains and then applies rules [compare the first threat chain to a policy] to determine the likelihood/severity of the incident”; [para. 0090] “common elements [first attributes] ... which may be codified as a rule [policies related to the first attributes] ... and flagged by security platform 100”) It would have been obvious before the effective filing date of the claimed invention for one of ordinary skill in the art to have modified the elements disclosed by Pike in view of Petersen and Xiaoning with the teachings of Gamble to include convert the detected first changes into associated first threat events; organize related first threat events into at least one first threat chain, wherein related first threat events are associated with a common set of related first attributes; and compare the first threat chain to a policy related to the first attributes. One of ordinary skill in the art would have been motivated to make this modification because this can reduce false positives to use resources more efficiently to target actual threats and more complex chains of events. (Gamble, para. 0047)

Claims 10 and 19 are rejected under 35 U.S.C. 
103 as being unpatentable over Pike in view of Petersen, Xiaoning and Gamble as applied to claims 1 and 11 above, and further in view of Senn et al. (US Pub. 2023/0359517) (hereinafter “Senn”). As per claim 10, Pike in view of Petersen, Xiaoning and Gamble teaches claim 1. Pike in view of Petersen, Xiaoning and Gamble does not teach wherein the processor is associated with a baseboard management controller of the information handling system. However, Senn teaches wherein the processor is associated with a baseboard management controller of the information handling system. ([Senn, para. 0014; Fig. 1] “Computing device 100 includes a central processing unit 102 that is connected to [associated with] … a baseboard management controller 106”; [para. 0018] “monitors … are executed by central processing unit 102 [the processor] … To obtain information about events, monitors … make requests to … baseboard management controller 106”) It would have been obvious before the effective filing date of the claimed invention for one of ordinary skill in the art to have modified the elements disclosed by Pike in view of Petersen, Xiaoning and Gamble with the teachings of Senn to include wherein the processor is associated with a baseboard management controller of the information handling system. One of ordinary skill in the art would have been motivated to make this modification because associating the processor with the baseboard management controller allows verification of whether an event received from the baseboard management controller is indicative of an actual malfunction of the device before an event message is placed on the network, which helps to improve the performance of the network. (Senn, para. 0013) As per claim 19, Pike in view of Petersen, Xiaoning and Gamble teaches claim 11. Pike also teaches wherein the processor is associated with one of a hosted environment instantiated on the information handling system. ([Pike, para. 
0124] “a computing machine [the processor] in the example form of a computer system 700 [instantiated on the information handling system] … in a networked deployment the machine may operate … as a peer machine in a [associated with/instantiated on] … distributed … network environment [a hosted environment]”) Pike in view of Petersen, Xiaoning and Gamble does not teach wherein the processor is associated with a baseboard management controller of the information handling system. However, Senn teaches wherein the processor is associated with a baseboard management controller of the information handling system. ([Senn, para. 0014; Fig. 1] “Computing device 100 includes a central processing unit 102 that is connected to [associated with] … a baseboard management controller 106”; [para. 0018] “monitors … are executed by central processing unit 102 [the processor] … To obtain information about events, monitors … make requests to … baseboard management controller 106”) It would have been obvious before the effective filing date of the claimed invention for one of ordinary skill in the art to have modified the elements disclosed by Pike in view of Petersen, Xiaoning and Gamble with the teachings of Senn to include wherein the processor is associated with a baseboard management controller of the information handling system. One of ordinary skill in the art would have been motivated to make this modification because associating the processor with the baseboard management controller allows verification of whether an event received from the baseboard management controller is indicative of an actual malfunction of the device before an event message is placed on the network, which helps to improve the performance of the network. (Senn, para. 0013)

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Oliver et al. (US Patent No. 
8,805,995) discloses capturing data relating to a threat where the sequential chain of one or more events is associated with the threat and a second chain of events comprises the sequential chain of one or more events. Ray et al. (US Pub. 2021/0400071) discloses event data that includes changes to computing objects which is evaluated to detect threats and track causal chains of events back to a root cause. McLean et al. (US Pub. 2021/0273958) discloses detection of anomalous process chains where the chains are compared against specific policies to protect against threats in the environment. The chains track changing device behaviors and computer network structures.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZHE LIU whose telephone number is (571) 272-3634. The examiner can normally be reached on Monday - Friday: 8:30 AM to 5:30 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. 
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Carl Colin can be reached on (571) 272-3862. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at (866) 217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call (800) 786-9199 (IN USA OR CANADA) or (571) 272-1000. /Z.L./Examiner, Art Unit 2493 /CARL G COLIN/Supervisory Patent Examiner, Art Unit 2493
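Stripped of the citation mapping, the claim elements the examiner assembles from Pike, Petersen, Xiaoning, and Gamble describe a single pipeline: read a secure event log, convert the detected changes into threat events, organize events that share attributes into chains, compare each chain to a policy that relates attributes to preferred, important, or critical settings, and remediate changes to critical settings. A minimal illustrative sketch of that flow follows; every name and the policy table are hypothetical, drawn from the claim language rather than from any cited reference:

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical sketch of the claimed flow; names and policy values are
# illustrative only, not taken from Pike, Petersen, Xiaoning, or Gamble.

@dataclass
class Change:
    attribute: str  # e.g. a BIOS/firmware setting name
    old: str        # associated first state
    new: str        # associated second state

# Policy relating attributes to the claim's setting classes:
# preferred (event logging level), important (performance), critical (safety/security).
POLICY = {
    "secure_boot": "critical",
    "fan_profile": "important",
    "audit_level": "preferred",
}

def read_secure_log(log):
    """Detect first changes by reading a (notionally tamper-evident) event log."""
    return [Change(*entry) for entry in log]

def to_threat_events(changes):
    """Convert detected changes into associated threat events."""
    return [{"attribute": c.attribute, "from": c.old, "to": c.new} for c in changes]

def build_chains(events):
    """Organize related threat events into chains keyed by their common attribute."""
    chains = defaultdict(list)
    for event in events:
        chains[event["attribute"]].append(event)
    return chains

def remediate(chains):
    """Compare each chain to the policy; revert changes to critical settings."""
    reverted = []
    for attribute, events in chains.items():
        if POLICY.get(attribute) == "critical":
            # Restore the first state recorded for this attribute.
            reverted.append((attribute, events[0]["from"]))
    return reverted

log = [("secure_boot", "enabled", "disabled"),
       ("audit_level", "verbose", "minimal")]
chains = build_chains(to_threat_events(read_secure_log(log)))
print(remediate(chains))  # [('secure_boot', 'enabled')]
```

The point of the sketch is structural: each limitation the examiner attributes to a different reference (secure log reading to Xiaoning, setting classification to Petersen, chaining and policy comparison to Gamble, detection and remediation to Pike) corresponds to one function above, which is why the rejection stands or falls on whether the combination of all four is proper.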

Prosecution Timeline

Mar 08, 2023
Application Filed
Feb 07, 2025
Non-Final Rejection — §103
May 14, 2025
Response Filed
Aug 07, 2025
Final Rejection — §103
Aug 29, 2025
Response after Non-Final Action
Sep 08, 2025
Request for Continued Examination
Sep 15, 2025
Response after Non-Final Action
Sep 19, 2025
Non-Final Rejection — §103
Dec 17, 2025
Response Filed
Mar 26, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602469
FUSE BASED REPLAY PROTECTION WITH AGGRESSIVE FUSE USAGE AND COUNTERMEASURES FOR FUSE VOLTAGE CUT ATTACKS
2y 5m to grant Granted Apr 14, 2026
Patent 12585764
MALICIOUS BEHAVIOR DETECTION AND MITIGATION IN A DOCUMENT EXECUTION ENVIRONMENT
2y 5m to grant Granted Mar 24, 2026
Patent 12572644
MICRO-ENCLAVES FOR INSTRUCTION-SLICE-GRAINED CONTAINED EXECUTION OUTSIDE SUPERVISORY RUNTIME
2y 5m to grant Granted Mar 10, 2026
Patent 12572649
METHOD FOR PROTECTION FROM CYBER ATTACKS TO A VEHICLE BASED UPON TIME ANALYSIS, AND CORRESPONDING DEVICE
2y 5m to grant Granted Mar 10, 2026
Patent 12566851
DETECTING AND ASSESSING EVIDENCE OF MALWARE INTRUSION
2y 5m to grant Granted Mar 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
71%
Grant Probability
99%
With Interview (+59.0%)
3y 2m
Median Time to Grant
High
PTA Risk
Based on 136 resolved cases by this examiner. Grant probability derived from career allow rate.
