Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. The present Final Office action is responsive to the communication received 9/30/2025. Claims 1-20 are pending.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on 8/7/2025, 9/2/2025, 9/15/2025, 11/17/2025, 11/21/2025, and 1/8/2025 were filed after the filing date of application No. 18/309,249 on 4/28/2023. The submissions are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements have been considered by the examiner.
Response to Arguments
Applicant’s arguments with respect to claims 1, 12, and 17 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. The examiner relies on the newly applied Haeberle reference to disclose “the operational report comprising a number of computations performed by the data processing system, and the operational report being obtained based on a schedule specifying a time for obtaining the operational report that is based, at least in part, on one selected from a group consisting of: a need of a downstream consumer of services provided by the data processing system, and a frequency of historical security issues that impacted the distributed environment over time”.
Regarding Applicant’s arguments with respect to claims 4 and 5, the examiner finds the arguments unpersuasive: Farooq (page 2, lines 5-9) discloses the logging of network measurements, which can be interpreted as the quantification of sub-routines over a duration or period of time.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-8 and 11-20 are rejected under 35 U.S.C. 103 as being unpatentable over Farooq et al. (WO 2022101656) in view of Haeberle et al. (US 20140081925).
Regarding claim 1,
Farooq teaches a method of monitoring security of data processing systems throughout a distributed environment by a security manager, the method comprising:
[the method further comprises correlating the MDT report with historical MDT reports from one or more wireless devices in proximity to the wireless device to determine a second correlation value. (Farooq et al., page 6 and 7, lines 29-30 and 1-2)]
obtaining an operational report associated with a data processing system of the data processing systems;
[receiving a MDT report generated by a wireless device (Farooq et al., page 6, line 17)]
obtaining a simulated operational report, the simulated operational report being intended to match the operational report when no unauthorized computations are performed by the data processing system;
[The network node may receive the MDT report directly from the wireless device, via another wireless device or network node, or retrieve the MDT report from a database of MDT reports. (Farooq et al., page 25, lines 20-23)]
making a determination regarding whether the operational report matches the simulated operational report within a threshold;
[correlating the MDT report with MDT output from a network planning tool or network simulator to determine a third correlation value. (Farooq et al., page 7, lines 11-13, MDT output from network simulator being interpreted as the simulated operational report)]
[the network node determines a trust score for the MDT report based on one or more correlation values.(Farooq et al., page 26, lines 28-29)]
[the network node determines whether the trust score is below a validation threshold. For example, the network node may perform threshold comparison as described with respect to step 7 of FIG. 1. (Farooq et al., page 27, lines 8-9)]
in a first instance of the determination in which the operational report does not match the simulated operational report within the threshold: adding the data processing system to a list of potentially compromised data processing systems;
[when the trust score is less than threshold, then the network can challenge the suspected UE(s) by probing/requesting for additional measurements (for which the network has an idea of the expected response) and based on that can make a final decision. In some embodiments the UE may be blacklisted and/or authorities may be notified of the identification of the suspicious UE. (Farooq et al., pages 12 and 13, lines 27-30 and line 1)]
and performing a first action set based on the list of the potentially compromised data processing systems to identify each data processing system of the list that is compromised.
[the UE may be blacklisted and/or authorities may be notified of the identification of the suspicious UE. (Farooq et al., pages 12 and 13, line 30 and line 1)]
Farooq fails to explicitly disclose the operational report comprising a number of computations performed by the data processing system, and the operational report being obtained based on a schedule specifying a time for obtaining the operational report that is based, at least in part, on one selected from a group consisting of: a need of a downstream consumer of services provided by the data processing system, and a frequency of historical security issues that impacted the distributed environment over time;
However in an analogous art Haeberle discloses the operational report comprising a number of computations performed by the data processing system,
[The incident viewer 127 can enable users to interact with a summarized incident report generated from the aggregation engine 129. In some instances, the aggregation engine 129 may handle the incoming messages (e.g., the individual alert messages 146) and perform a single step aggregation for the incident viewer 127 (i.e., aggregation process is not performed at the server 101, resulting in less data manipulation/reduction) (Haeberle et al., paragraph 28, aggregation of alerts and correlation to remove duplicates can be interpreted to be the computations)]
[the aggregation engine 129 may perform a second level aggregation to the bulk alert messages 145 (Haeberle et al., paragraph 28)]
[the correlation techniques can include correlation of inbound incident reports of bulk alert messages 145 and/or individual alert messages 146 with the tenant information 117, the incident database 119, and the message database 143 (Haeberle et al., paragraph 38)]
and the operational report being obtained based on a schedule specifying a time for obtaining the operational report that is based, at least in part, on one selected from a group consisting of: a need of a downstream consumer of services provided by the data processing system, and a frequency of historical security issues that impacted the distributed environment over time;
[The summarized incident report can then be provided to the incident viewer 260 upon request to enable users to view and interact with the aggregated alerts and/or incidents of the tenants 210 and 212. (Haeberle et al., paragraph 58, the request can be interpreted as a schedule or specifying time)]
Farooq and Haeberle are considered to be analogous to the claimed invention because they are in the same field of vulnerability detection. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to have modified the teachings of Farooq to incorporate the teachings of Haeberle et al. to include the operational report comprising a number of computations performed by the data processing system, and the operational report being obtained based on a schedule specifying a time for obtaining the operational report that is based, at least in part, on one selected from a group consisting of: a need of a downstream consumer of services provided by the data processing system, and a frequency of historical security issues that impacted the distributed environment over time, in order to generate and provide incident reports to describe any malfunction or unexpected behavior of the software (Haeberle et al., paragraph 2).
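For illustration only, the comparison scheme mapped above for claim 1 (obtain a reported computation count, compare it to a simulated value within a threshold, and list non-matching systems as potentially compromised) can be sketched as follows. All names, counts, and the threshold are hypothetical and are not drawn from the record or from any cited reference:

```python
# Hypothetical sketch of the claim 1 comparison; all values are illustrative.

def check_system(reported_computations, simulated_computations, threshold):
    """Return True if the operational report matches the simulated report
    within the threshold (i.e., no unauthorized computations suspected)."""
    return abs(reported_computations - simulated_computations) <= threshold

# Example reported vs. simulated computation counts per data processing system.
reports = {
    "dps-1": (1000, 1003),  # small deviation: matches within threshold
    "dps-2": (1000, 1500),  # large deviation: flagged
}

potentially_compromised = []  # list of potentially compromised systems
for system_id, (reported, simulated) in reports.items():
    if not check_system(reported, simulated, threshold=10):
        potentially_compromised.append(system_id)

print(potentially_compromised)  # -> ['dps-2']
```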
Regarding claim 12,
Farooq teaches A non-transitory machine-readable medium having instructions stored therein, which when executed by a processor, cause the processor to perform operations for monitoring security of data processing systems throughout a distributed environment by a security manager,
[and/or computer-executable memory devices that store information, data, and/or instructions that may be used by processing circuitry 170. Device readable medium 180 may store any suitable instructions, data or information, including a computer program, software, an application including one or more of logic, rules, code, tables, etc. and/or other instructions capable of being executed by processing circuitry 170 and, utilized by network node 160. (Farooq et al., page 17, lines 6-11)]
The claim recites substantially the same content as claim 1 and is rejected with the rationales set forth for claim 1.
Regarding claim 17,
Farooq teaches A data processing system, comprising:
a processor; and a memory coupled to the processor to store instructions, which when executed by the processor, cause the processor to perform operations for monitoring security of data processing systems throughout a distributed environment by a security manager,
[processing circuitry 120 may execute instructions stored in device readable medium 130 or in memory within processing circuitry 120 to provide the functionality disclosed herein. (Farooq et al., page 22, lines 4-6)]
The claim recites substantially the same content as claim 1 and is rejected with the rationales set forth for claim 1.
Regarding claims 2, 13, and 18,
Farooq in view of Haeberle discloses the method of claim 1, the non-transitory machine-readable medium of claim 12, and the data processing system of claim 17, further comprising:
in a second instance of the determination in which the operational report matches the simulated operational report within the threshold: concluding that the data processing system is not compromised.
[If Υ_n is greater than the threshold, then the report can be identified as a true report (Farooq et al., page 12, lines 13-14)]
Regarding claims 3, 14, and 19,
Farooq in view of Haeberle discloses the method of claim 1, the non-transitory machine-readable medium of claim 12, and the data processing system of claim 17, further comprising:
prior to obtaining the operational report:
identifying operational report criteria indicating data to be provided by the data processing system; and instantiating a reporting agent in the data processing system using the operational report criteria.
[MDT reporting enables the network to instruct user equipment (UEs) to log network measurements, such as reference signal received power (RSRP) and reference signal received quality (RSRQ) of serving and neighboring cells and send them back to the core through radio resource control (RRC) signaling messages thereby avoiding manual and time consuming physical drive tests as shown in Table I. (Farooq et al., page 2, lines 5-9)]
Regarding claims 4, 15, and 20,
Farooq in view of Haeberle discloses the method of claim 1, the non-transitory machine-readable medium of claim 12, and the data processing system of claim 17,
wherein the operational report further comprises a quantification of sub-routines performed by the data processing system at a point in time.
[MDT reporting enables the network to instruct user equipment (UEs) to log network measurements, such as reference signal received power (RSRP) and reference signal received quality (RSRQ) of serving and neighboring cells and send them back to the core through radio resource control (RRC) signaling messages thereby avoiding manual and time consuming physical drive tests as shown in Table I. (Farooq et al., page 2, lines 5-9, the logging of network measurements being the quantification of routines at a point in time)]; see also Haeberle et al., paragraph 28, and the motivation to combine as explained in claim 1.
Regarding claims 5 and 16,
Farooq in view of Haeberle discloses the method of claim 1 and the non-transitory machine-readable medium of claim 12, wherein the operational report further comprises a quantification of sub-routines performed by the data processing system over a duration of time.
[MDT reporting enables the network to instruct user equipment (UEs) to log network measurements, such as reference signal received power (RSRP) and reference signal received quality (RSRQ) of serving and neighboring cells and send them back to the core through radio resource control (RRC) signaling messages thereby avoiding manual and time consuming physical drive tests as shown in Table I. (Farooq et al., page 2, lines 5-9, the logging of network measurements being the quantification of routines over a duration of time)]; see also Haeberle et al., paragraph 28, and the motivation to combine as explained in claim 1.
Regarding claim 6,
Farooq in view of Haeberle discloses the method of claim 1,
wherein obtaining the simulated operational report comprises:
obtaining a digital twin of the data processing system;
and performing a simulation of operation of the data processing system using the digital twin to obtain an expected number of computations performed by the data processing system.
[the network node calculates a factor that accounts for a similarity to target information retrieved from a radio network planning tool or simulator. For example, the planning tool or simulator can be used to emulate conditions reported in the location as reported by the UE. If the difference between the MDT measurements received from UE and the target information from the planner tool or simulator differ by certain threshold, then it may be a sign of an MDT attack. (Farooq et al., page 10, lines 18-23)]
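The claim 6 limitations mapped above (simulate the data processing system with a digital twin to obtain an expected computation count) can be sketched, purely for illustration, as follows. The workload model, parameter names, and values are hypothetical assumptions, not taken from Farooq or the record:

```python
# Hypothetical "digital twin" sketch for claim 6; the workload model and all
# values are illustrative assumptions.

def digital_twin_expected_computations(workload_items, ops_per_item):
    """Simulate operation of the data processing system to predict the
    expected number of computations it should perform."""
    return workload_items * ops_per_item

# The twin predicts the count the real system's operational report should show.
expected = digital_twin_expected_computations(workload_items=250, ops_per_item=4)
print(expected)  # -> 1000
```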
Regarding claim 7,
Farooq in view of Haeberle discloses the method of claim 6, wherein the digital twin of the data processing system simulates operation of the data processing system.
[comparing the report with nearby reporting user equipment (UEs), target information obtained from a planning tool or simulator, and/or historical reports reported from the same location. (Farooq et al., page 8, lines 24-26, simulation can take place)]
Regarding claim 8,
Farooq in view of Haeberle discloses the method of claim 6,
wherein the simulated operational report and the operational report comprise a same quantity.
[correlating the MDT report with MDT reports from one or more wireless devices in proximity to the wireless device includes comparing similarity of hardware components (e.g., same manufacturer) between the wireless device and the one or more wireless devices in proximity to the wireless device. (Farooq et al., page 6, lines 25-28)]
Regarding claim 11,
Farooq in view of Haeberle discloses the method of claim 1,
wherein the data processing system is an internet of things device associated with at least one sensor positioned to collect data representative of an aspect of an environment.
[in an Internet of Things (IoT) scenario, a WD may represent a machine or other device that performs monitoring and/or measurements and transmits the results of such monitoring and/or measurements to another WD and/or a network node. The WD may in this case be a machine-to-machine (M2M) device, which may in a 3GPP context be referred to as an MTC device. As one example, the WD may be a UE implementing the 3GPP narrow band internet of things (NB-IoT) standard. Examples of such machines or devices are sensors (Farooq et al., page 20, lines 11-17)]
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Farooq et al. (WO 2022101656) in view of Haeberle et al. (US 20140081925) and in further view of Naito et al. (US 20220137611).
Regarding claim 9,
Farooq in view of Haeberle discloses the method of claim 8, but fails to explicitly disclose wherein making the determination comprises: obtaining a time series representing the computations performed by the data processing system over a duration of time using the operational report; comparing the time series to a simulated time series based on the simulated operational report; obtaining a difference between the time series and the simulated time series; and comparing the difference to the threshold.
However in an analogous art Naito discloses
wherein making the determination comprises: obtaining a time series representing the computations performed by the data processing system over a duration of time using the operational report; comparing the time series to a simulated time series based on the simulated operational report; obtaining a difference between the time series and the simulated time series; and comparing the difference to the threshold
[The monitoring standard is a threshold value to be compared with the difference between the input time-series data and the restoration data. Where the difference is larger than the threshold value, it is determined that there is an abnormality or a sign of abnormality. The difference between the input time-series data and the restoration data will be hereinafter referred to as an error. (Naito et al., paragraph 61)]
Farooq, Haeberle, and Naito are considered to be analogous to the claimed invention because they are in the same field of vulnerability detection. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to have modified the teachings of Farooq and Haeberle to incorporate the teachings of Naito et al. to include wherein making the determination comprises: obtaining a time series representing the computations performed by the data processing system over a duration of time using the operational report; comparing the time series to a simulated time series based on the simulated operational report; obtaining a difference between the time series and the simulated time series; and comparing the difference to the threshold, in order to determine if there is an abnormality or a sign of abnormality (Naito et al., paragraph 61).
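As an illustrative sketch of the claim 9 determination discussed above (difference between a reported time series and a simulated time series compared to a threshold), with all series values, the difference metric, and the threshold being hypothetical assumptions rather than content of Naito or the record:

```python
# Hypothetical sketch of the claim 9 time-series comparison; values and the
# choice of difference metric are illustrative only.

def series_difference(reported, simulated):
    """Sum of absolute per-sample differences between the two time series."""
    return sum(abs(r - s) for r, s in zip(reported, simulated))

reported_series  = [100, 102, 98, 340]   # computations observed over time
simulated_series = [100, 101, 99, 100]   # expected values from the simulation

diff = series_difference(reported_series, simulated_series)
print(diff > 50)  # -> True: difference exceeds the threshold, flag an anomaly
```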
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Farooq et al. (WO 2022101656) in view of Haeberle et al. (US 20140081925) in further view of Cain et al. (US 20060080656).
Regarding claim 10,
Farooq in view of Haeberle discloses the method of claim 1, but fails to explicitly disclose wherein performing the first action set comprises: for each compromised data processing system:
performing a second action set to remediate a compromised state of the compromised data processing system, the second action set comprising an action selected from a group of actions consisting of: re-imaging software of the data processing system, and powering off the data processing system for a duration of time.
However, in an analogous art, Cain discloses performing a second action set to remediate a compromised state of the compromised data processing system, the second action set comprising an action selected from a group of actions consisting of: re-imaging software of the data processing system, and powering off the data processing system for a duration of time
[Countermeasures may be of lower risk and can be applied more quickly and with less testing than the software update itself. It may be significantly easier, for example, to disable network ports or to shut down services or systems that are exposed to a particular security vulnerability and apply the software update later. (Cain et al., paragraph 517)]
[The post implementation review may also include various actions designed to make sure the computer system is updated, such as ensuring that build images include the software updates so that computers receiving the image are up-to-date with respect to the latest software updates (Cain et al., paragraph 72)]
Farooq, Haeberle, and Cain are considered to be analogous to the claimed invention because they are in the same field of vulnerability detection. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to have modified the teachings of Farooq and Haeberle to incorporate the teachings of Cain et al. to include performing a second action set to remediate a compromised state of the compromised data processing system, the second action set comprising an action selected from a group of actions consisting of: re-imaging software of the data processing system, and powering off the data processing system for a duration of time, in order to deploy a short-term countermeasure to resolve the issues with the vulnerability (Cain et al., paragraph 515).
Conclusion
Applicant’s amendment necessitated the new ground(s) of rejection presented in this Office action. THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANIEL ELAHIAN whose telephone number is (703) 756-1284. The examiner can normally be reached on Monday – Friday from 7:30am to 5pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Catherine Thiaw can be reached at telephone number 571-270-1138. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from Patent Center and the Private Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from Patent Center or Private PAIR. Status information for unpublished applications is available through Patent Center and Private PAIR for authorized users only. Should you have questions about access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
/D.E./DANIEL ELAHIAN, Examiner, Art Unit 2407
/Catherine Thiaw/Supervisory Patent Examiner, Art Unit 2407 2/7/2026