Prosecution Insights
Last updated: April 19, 2026
Application No. 18/607,189

METHOD AND SYSTEM FOR ANALYZING EMBEDDED SYSTEMS

Status: Final Rejection (§103)
Filed: Mar 15, 2024
Examiner: MUNGUIA, DUILIO
Art Unit: 2497
Tech Center: 2400 — Computer Networks
Assignee: Objectsecurity LLC
OA Round: 2 (Final)

Grant Probability: 100% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 3m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 100% (5 granted / 5 resolved) — above average, +42.0% vs TC avg
Interview Lift: +0.0% (minimal lift; based on resolved cases with interview)
Avg Prosecution (typical timeline): 3y 3m
Currently Pending: 25
Total Applications (career, across all art units): 30

Statute-Specific Performance

§101:  6.0% (-34.0% vs TC avg)
§103: 69.3% (+29.3% vs TC avg)
§102: 16.7% (-23.3% vs TC avg)
§112:  8.0% (-32.0% vs TC avg)

Black line = Tech Center average estimate • Based on career data from 5 resolved cases
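The statute-specific lifts are simple differences between the examiner's per-statute rates and the Tech Center average. A minimal sketch (rates and lifts copied from the panel above; the TC averages are back-derived from the reported lifts, not independently sourced) shows the implied TC average works out to roughly 40% for every statute:

```python
# Per-statute rates and lifts, copied from the panel above.
examiner_rate = {"101": 6.0, "103": 69.3, "102": 16.7, "112": 8.0}
reported_lift = {"101": -34.0, "103": +29.3, "102": -23.3, "112": -32.0}

# Back-derive the Tech Center average: lift = rate - tc_avg.
tc_avg = {s: examiner_rate[s] - reported_lift[s] for s in examiner_rate}

for statute, rate in examiner_rate.items():
    lift = rate - tc_avg[statute]
    print(f"§{statute}: {rate:.1f}% ({lift:+.1f}% vs TC avg of {tc_avg[statute]:.1f}%)")
```

Every implied TC average lands at 40.0%, which suggests the panel measures each statute against one common baseline estimate rather than per-statute averages.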

Office Action

§103
Detailed Action

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendments

This Final Office Action is in response to the amendments filed on 11/13/2025, in which claims 1 and 14 were amended, no claims were cancelled, and claims 1-26 remain pending in the application.

The amendment filed 2/25/26 has been entered; see the response to amendments below. Claims 1 and 14 are amended but do not carry the proper markings (e.g., strike-through and underlining) required by MPEP 714 (Amendments, Applicant's Action). Applicant did not mark the amended limitations, but the Examiner understands which limitations in claims 1 and 14 are amended.

Response to Arguments

Regarding the remarks on the rejections under 35 U.S.C. § 103 filed 2/25/26: Applicant's amendments to independent claims 1 and 14, and the accompanying arguments, have been carefully considered and are persuasive. However, upon further consideration, the arguments are moot in view of newly found prior art. With respect to Applicant's arguments for the remaining dependent claims 2-13 and 15-26 on page 21 of the remarks, Applicant relies on the newly added amendments of independent claims 1 and 14; please see the Examiner's response above and the detailed rejection below.

Applicant argues (remarks, pages 20-21): "The Examiner alleges that Inokuchi discloses many of the features of claim 1 but admits that Inokuchi is silent about some of the features of claim 1. The Examiner then alleges that Becht discloses loading the digital twin configuration data and storing the predicate device model data and/or the digital twin configuration data in the memory. However, Inokuchi discloses choosing a duplication level before running tests, based on time or budget constraints, with a penetration-test module only.
Becht discloses loading an FPGA configuration and capturing signal states for debugging, which is hardware replication. That is, Inokuchi and Becht as combined are directed to a static or one-time simulation system that does not provide for dynamic, continuous analysis and reconfiguration. Inokuchi and Becht also do not use a model to analyze a system. In other words, neither Inokuchi nor Becht performs the runtime, model-based feedback process central to our platform. The distinctions are operational (static vs. adaptive), architectural (pre-built vs. model-driven), and domain-specific (debug vs. security analysis). As such, Applicant respectfully submits that Inokuchi and Becht, either individually or in combination, fail to teach or suggest each and every feature of claim 1. Accordingly, claim 1 is patentably distinct from Inokuchi and Becht."

The Examiner respectfully disagrees; the argument is not persuasive because the claims contain no recitation of limitations that suggest adaptive, model-driven operation, and, furthermore, debugging is a type of security analysis used to ensure the quality, performance, and reliability of software.

Claim Objections

Claims 1-13 are objected to because of the following informalities: claim 1 is a method claim whose limitations include "if" conditions, i.e., contingent limitations (MPEP § 2111.04, subsection II). The Examiner submits that if the conditional limitation step is not reached, the remaining limitation steps do not have to be performed, and it is therefore not required to show anticipation or obviousness for all paths of the conditional limitation. The Examiner suggests replacing "if" with proper terminology to avoid contingent limitations. Appropriate correction is required. Dependent claims 2-13 are objected to due to their dependency on the objected parent claim.
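The contingent-limitation point (MPEP § 2111.04, subsection II) is essentially about execution paths: when a method claim's later steps hang off an "if", there is a path through the method in which those steps are never performed, so the Examiner need not show prior art for that path. A minimal sketch, using hypothetical step names rather than the actual claim language:

```python
def claimed_method(result_ok: bool) -> list[str]:
    """Hypothetical method claim with a contingent ("if") limitation."""
    performed = ["simulate"]             # unconditional step
    if not result_ok:                    # the "if" condition in the claim
        performed.append("reconfigure")  # contingent steps: only reached
        performed.append("re-simulate")  # when the condition is met
    return performed

# When the result already satisfies the condition, the contingent
# steps are never reached -- yet the method has been fully performed.
print(claimed_method(result_ok=True))   # ['simulate']
print(claimed_method(result_ok=False))  # ['simulate', 'reconfigure', 're-simulate']
```

Under the broadest reasonable interpretation, anticipating the `result_ok=True` path alone can suffice for a method claim, which is why the Examiner suggests rewording the "if" conditions.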
Claim 1 is further objected to because of the following informalities: claim 1 recites "data relevant to the analysis," for which there is no antecedent basis; it should read "an analysis." Claim 14 is objected to because of the following informalities: claim 14 recites "data relevant to the analysis," for which there is no antecedent basis; it should read "an analysis."

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-9, 11-12, 14-22, and 24-25 are rejected under 35 U.S.C. 103 as being unpatentable over Inokuchi et al. (US-20210042423-A1, hereafter Inokuchi), in view of Becht et al. (US-20210232471-A1, hereafter Becht), in further view of Agrawal et al. (US-20240039945-A1, hereafter Agrawal).
Regarding claim 1, Inokuchi teaches a computer-implemented method for analyzing software or firmware of one or more computing systems to assess security properties related to the one or more computing systems, the method comprising (see Inokuchi par. 0018: "The security assessment system 10 is configured to provide a duplicated environment 200 which duplicates an assessment target system (not shown) comprising a plurality of physical components. The duplicated environment 200 serves as the above-mentioned digital twin. It is noted that the digital twin, on the one hand, allows running required security/pen tests, and on the other hand, all the results that are obtained from executing the tests on the digital twin reflect the expected results of the tests if they would have executed on the real environment."):

loading, via a processor, from a data storage, a memory, or via a communication, or via a user entry through a user interface, at least one predicate device input data comprising characteristics about at least one predicate device (see Inokuchi par. 0020: "in the simulation sub-module 210, the specification 302 of the physical component comprises a list of hardware/software and version which the physical component has. The specification 302 of the physical component typically may, for example, be a specification which is provided by a vender or a specification in which information is integrated in accordance with CPE (Common Product Enumeration).");

translating, via a processor, the at least one predicate device input data into at least one predicate device model data comprising data structures that describe characteristics or dependencies of the at least one predicate device input data relevant to the analysis (see Inokuchi par. 0031: "which physical component in the assessment target system should be assessed by the simulation sub-module 210, the emulation sub-module 220, or the physical sub-module 230 to produce a designed result 330 indicative of a duplicated environment design. With the defined constraints, a target function that is desired to be maximized may be defined. This function represents the benefit from the selected digital twin.");

determining, via the processor, from the data storage, the memory, or via the communication, or via the user entry through the user interface, at least one digital twin configuration data used to configure at least one digital twin environment to behave as similar as possible to the predicate device with respect to processing the at least one predicate device model data (see Inokuchi par. 0031: "the duplicated environment design module 110 is configured to select a duplication level for each physical component based on the constrains 310 and the above-mentioned effects associated with the physical components in order to design the duplicated environment 200 to produce the designed result 330 indicative of the duplicated environment design. The output interface 420 is configured to output the designed result 330. The duplication level is indicative of any one of the simulation sub-module 210, the emulation sub-module 220, and the physical sub-module 230 which are for reproducing the physical components of the duplicated environment 200.");

instructing, via the processor, the at least one digital twin environment to configure itself to implement the loaded at least one digital twin configuration data (see Inokuchi par. 0031: "the duplicated environment design module 110 is configured to select a duplication level for each physical component based on the constrains 310 and the above-mentioned effects associated with the physical components in order to design the duplicated environment 200 to produce the designed result 330 indicative of the duplicated environment design.");

determining, via a processor, from the data storage, the memory, or via the communication, or via the user entry through the user interface, at least one security analysis to be carried out on the at least one digital twin environment (see Inokuchi par. 0035: "The vulnerability assessment module 170 extracts vulnerabilities of each component in the duplicated environment 200 to generate a list of the extracted vulnerabilities. The attack graph generation module 180 generates an attack graph in which components in the duplicated environment 200 used in an attack can be distinguished. The duplicated environment design module 110 designs the duplicated environment 200 based on at least one of the list and the attack graph. In other words, the duplicated environment design module 110 is configured to select the duplication level based on at least one of the list and the attack graph.");

simulating, on the at least one digital twin environment, the at least one predicate device (see Inokuchi par. 0066-0067: "The refined potential calculation sub-module 522 may calculate the above-mentioned refined potential based on the above mentioned classification of the security diagnosis actions which may be carried out on the duplicated environment 200 or a sum of effects thereof. In this event, the security assessment system further comprises a database for storing, each classification of the apparatus, information indicating which security diagnosis action can be carried out in a case where a real apparatus is used as the apparatus in question or in a case where a virtual machine is used as the apparatus in question. Alternatively, the input interface 410 may input information indicative of the security diagnosis actions which can be carried out by a user.");

storing, via the processor, the output data pertaining to the at least one result in a memory (see Inokuchi par. 0068: "the security assessment system 10 further comprises another database for storing, each of the security diagnosis actions, information indicative of degrees of the effects which are preliminarily set. Such information includes a numerical value in a specific range.").

Inokuchi does not appear to explicitly teach the following limitations; however, Becht teaches:

loading, via the processor, from the data storage, the memory, or via the communication, or via the user entry through the user interface, the at least one digital twin configuration data onto the at least one digital twin environment (see Becht par. 0028: "DUT 208 is implemented as a software simulation of the ASIC RTL code, e.g., Verilog or Very High Speed Integrated Circuit-Hardware Description Language (VHDL).", par. 0030: "Emulation capture program 112 creates a copy of the DUT (step 302). In an embodiment, emulation capture program 112 creates copy DUT 210 to duplicate the functionality of DUT 208. In an embodiment, emulation capture program 112 creates copy DUT 210 by loading the identical configuration into one or more FPGAs in copy DUT 210 as are loaded into the one or more FPGAs in DUT 208. In an embodiment, emulation capture program 112 loads the test software and the test cases for the test software into DUT 208 and copy DUT 210."). The Examiner interprets the DUT as the digital twin data, and its loading into emulation capture program 112 is construed as loading into the digital twin environment, which is consistent with Applicant's instant application [par. 0050]: "The invention may include programmable hardware (e.g., FPGA) where enhanced visibility and/or observational strategies may be enabled, and/or may suggest and/or propose a system where physical ES may be modeled and/or characterized for their simulation in hardware and/or software.";

storing, via the processor, the at least one predicate device model data and/or the at least one digital twin configuration data in the memory (see Becht par. 0028: "DUT 208 is implemented as a software simulation of the ASIC RTL code, e.g., Verilog or Very High Speed Integrated Circuit-Hardware Description Language (VHDL).", par. 0031: "Emulation capture program 112 creates a delayed buffer (step 304). In an embodiment, emulation capture program 112 creates delayed buffer 204 in the block RAM of the one or more FPGAs in copy DUT 210. In another embodiment, emulation capture program 112 creates delayed buffer 204 in external RAM, for example, DDR memory."). The Examiner construes copy DUT 210 as the digital twin being stored in delayed buffer 204 in the block RAM of the FPGA, which is consistent with Applicant's instant application [par. 0050]: "The invention may include programmable hardware (e.g., FPGA) where enhanced visibility and/or observational strategies may be enabled, and/or may suggest and/or propose a system where physical ES may be modeled and/or characterized for their simulation in hardware and/or software.".

It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have combined Inokuchi, teaching "a security assessment system is configured to provide a duplicated environment which duplicates an assessment target system comprising a plurality of physical components. The security assessment system comprises a duplicated environment design circuitry configured to select a duplication level for each physical component based on constraints specified by a user and effects associated with the physical components in order to design the duplicated environment to produce a designed result indicative of duplicated environment design" (Inokuchi par.), with Becht, teaching a "computer program product, and a system for simulating an electronic device. In one embodiment, a copy of a design under test is created. A delayed buffer for the copy is created, where the inputs to the design under test are stored in the delayed buffer." (see Becht par. 0003).

Inokuchi in view of Becht do not explicitly teach the following limitations; however, Agrawal teaches:

generating, via the processor, an output data describing (see Agrawal par. 0051: "a result of the evaluation is output. In one approach, the result of the evaluation includes a recommendation for security modifications to make to the security characteristics of the physical environment. In another approach, the result of the evaluation includes an identification of which test conditions or equivalently which set of test conditions cause a failure of at least one of the security conditions.");

executing, via the processor, the at least one security analysis on the digital twin environment (see Agrawal par. 0049: "a set of test conditions within the digital twin of the physical environment is simulated to test the security characteristics. For example, the set of test conditions may include a set of simulated agents (e.g., simulated people) that attempt to breach the security characteristics of the digital twin."); and

determining, via the processor, if the at least one result satisfies a predetermined condition, and, if not, repeating to reconfigure the at least one digital twin configuration data to reconfigure the at least one digital twin environment and to simulate the at least one predicate device using the reconfigured at least one digital twin environment until the at least one result satisfies the predetermined condition (see Agrawal par. 0051-0052: "a result of the evaluation is output. In one approach, the result of the evaluation includes a recommendation for security modifications to make to the security characteristics of the physical environment. In another approach, the result of the evaluation includes an identification of which test conditions or equivalently which set of test conditions cause a failure of at least one of the security conditions. the method 201 may further include modifying the security characteristics of the digital twin of the physical environment based on the analysis of the simulation of test conditions, and performing the method 201 with the modified security characteristics in the digital twin of the physical environment. The effectiveness of the modified security characteristics relative to the security characteristics existing in the physical environment can then be evaluated.").
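As mapped above, the final limitation of claim 1 is a feedback loop: check whether the result satisfies a predetermined condition and, if not, reconfigure the digital twin environment and re-simulate until it does. A minimal sketch of that loop, with all names hypothetical (this paraphrases the claim language, not any party's actual implementation):

```python
from typing import Callable

def analyze_until_satisfied(
    config: dict,
    reconfigure: Callable[[dict], dict],
    simulate_and_analyze: Callable[[dict], float],
    threshold: float,
    max_rounds: int = 10,
) -> tuple[dict, float]:
    """Repeat: configure digital twin -> simulate predicate device ->
    run security analysis -> check the predetermined condition."""
    for _ in range(max_rounds):
        result = simulate_and_analyze(config)  # simulate + security analysis
        if result >= threshold:                # predetermined condition met
            return config, result
        config = reconfigure(config)           # else reconfigure the twin
    return config, result

# Toy usage: each reconfiguration raises a fidelity level until the condition holds.
cfg, score = analyze_until_satisfied(
    config={"level": 1},
    reconfigure=lambda c: {"level": c["level"] + 1},
    simulate_and_analyze=lambda c: c["level"],
    threshold=3,
)
print(cfg, score)  # {'level': 3} 3
```

Applicant's "static vs. adaptive" argument and the Examiner's reliance on Agrawal both turn on whether the prior art performs this repeat-until-satisfied loop.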
It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have combined Inokuchi in view of Becht, teaching as described above, with Agrawal, teaching "if a security scenario fails, a notification can be output, and in some approaches, a recommendation of how the security devices or protocols are to be arranged and/or changed to ensure required security. For example, a recommendation may be to add a camera to view a particular location. Another recommendation may refer to capacity management, e.g., such as to limit an area such as a lobby to a certain number of people. Iterative changes and/or updates to the model can be incorporated into subsequent renderings and the process re-run with the changes and/or updates to determine whether they eliminate risk elements." (see Agrawal par. 0094).

Regarding claim 14: claim 14 is a computer-implemented system claim that recites similar limitations as method claim 1 and is rejected based on the same rationale as claim 1.

Regarding claim 2: Inokuchi in view of Becht and Agrawal disclose the method according to claim 1. Inokuchi further discloses wherein the at least one characteristics of the at least one predicate device input data comprises at least one of digital characteristics, physical characteristics, electrical power consumption, electromagnetic radiation, temperature, acoustics, emanations, firmware, software, binary, code, communications, network traffic, vibration patterns, hardware configurations, sensor data, system logs, user interactions, environmental conditions, GPS data, timing information, power cycles, error codes, device states, signal integrity, memory usage, processor activity, interface interactions, cryptographic operations, protocol specifics, storage contents, peripheral status, execution patterns, energy efficiency metrics, thermal profiles, electromagnetic compatibility, and/or wireless signal characteristics (see Inokuchi par. 0018: "which duplicates an assessment target system (not shown) comprising a plurality of physical components.").

Regarding claim 15: claim 15 is a computer-implemented system claim that recites similar limitations as method claim 2 and is rejected based on the same rationale as claim 2.

Regarding claim 3: Inokuchi in view of Becht and Agrawal disclose the method according to claim 1. Becht further teaches wherein the at least one predicate device comprises at least one of an embedded system, an industrial control system, a programmable logic controller, or a computing device (see Becht par. 0016-0017: "Computing device 110 can be a standalone computing device, a management server, a web server, a mobile computing device, or any other electronic device or computing system capable of receiving, sending, and processing data."). It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have combined Inokuchi in view of Becht and Agrawal, teaching of claim 1, with Becht, teaching "The design inputs 202 represent the various possible inputs received by DUT 208. In an embodiment, the design inputs 202 are signals received by DUT 208, such as hardware signals. In another embodiment, the design inputs 202 are instructions received by testing device 130 from computing device 110" (see Becht par. 0021).

Regarding claim 16: claim 16 is a computer-implemented system claim that recites similar limitations as method claim 3 and is rejected based on the same rationale as claim 3.
Regarding claim 4: Inokuchi in view of Becht and Agrawal disclose the method according to claim 1. Inokuchi further teaches wherein translating the at least one predicate device input data into the at least one predicate device model data comprises at least one of normalizing, filtering, pre-processing, un-biasing, balancing, selecting, correcting, auto-completing, inferring, cleaning, cleansing, converting, aggregating, smoothing, enriching, deduplicating, validating, segmenting, classifying, clustering, feature extraction, dimensionality reduction, scaling, discretizing, encoding, hashing, anonymizing, tokenizing, parsing, segmenting, interpolating, extrapolating, normalizing, standardizing, categorizing, summarizing, visualizing, correlating, integrating, aligning, optimizing, decomposing, reconstructing, compressing, decompressing, encrypting, decrypting, modeling, simulating, predicting, projecting, forecasting, estimating, detecting anomalies, identifying patterns, understanding context, generating insights, deriving metrics, benchmarking, evaluating, validating, backtesting, cross-validating, deploying, monitoring, updating, iterating, refining, evolving, contextualizing, localizing, personalizing, customizing, or optimizing (see Inokuchi par. 0031: "which physical component in the assessment target system should be assessed by the simulation sub-module 210, the emulation sub-module 220, or the physical sub-module 230 to produce a designed result 330 indicative of a duplicated environment design. With the defined constraints, a target function that is desired to be maximized may be defined. This function represents the benefit from the selected digital twin.").

Regarding claim 17: claim 17 is a computer-implemented system claim that recites similar limitations as method claim 4 and is rejected based on the same rationale as claim 4.
Regarding claim 5: Inokuchi in view of Becht and Agrawal disclose the method according to claim 1. Inokuchi further teaches wherein the at least one predicate device model data comprises at least one of datasets of characteristics, binary data, assembly data, source code data, firmware, firmware images, logs, sensor readings, configuration files, diagnostic data, network packets, system metrics, user commands, environmental data, performance counters, hardware states, software versions, update histories, error messages, authentication records, encryption keys, communication protocols, user interfaces, API calls, memory dumps, registry settings, device specifications, GPS data, time stamps, power consumption patterns, electromagnetic emissions, acoustic signals, temperature readings, vibration data, pressure measurements, light intensity data, chemical composition data, material properties, structural integrity data, flow rates, energy usage data, bandwidth utilization, signal strength, latency measurements, throughput data, load profiles, capacity metrics, efficiency ratings, reliability indicators, maintenance records, operational statuses, geometric data, kinematic data, dynamic system models, control algorithms, optimization parameters, simulation results, test results, calibration data, audit trails, incident reports, vulnerability assessments, threat intelligence, security breaches, forensic analysis data, recovery plans, usage patterns, or digital footprints (see Inokuchi par. 0019: "simulation sub-module 210, an emulation sub-module 220, and a physical sub-module 230. The simulation sub-module 210 is a specification 302 of the physical component or a behavior model 304 of a function of the physical component. The emulation sub-module 220 comprises software which reproduces the physical component using a VM (virtual machine). The physical sub-module 230 comprises a component which is physically similar to the physical component in the assessment target system.").

Regarding claim 18: claim 18 is a computer-implemented system claim that recites similar limitations as method claim 5 and is rejected based on the same rationale as claim 5.

Regarding claim 6: Inokuchi in view of Becht and Agrawal disclose the method according to claim 1. Inokuchi further teaches wherein determining the at least one digital twin configuration data comprises at least one of manually or semi-automatically determining by a user, automatically determining, determining based on the kind of predicate device model data, determining based on which digital twin configuration data is compatible with each other, or modeling of the predicate device (see Inokuchi par. 0031: "the duplicated environment design module 110 is configured to select a duplication level for each physical component based on the constrains 310 and the above-mentioned effects associated with the physical components in order to design the duplicated environment 200 to produce the designed result 330 indicative of the duplicated environment design. The output interface 420 is configured to output the designed result 330. The duplication level is indicative of any one of the simulation sub-module 210, the emulation sub-module 220, and the physical sub-module 230 which are for reproducing the physical components of the duplicated environment 200.").

Regarding claim 19: claim 19 is a computer-implemented system claim that recites similar limitations as method claim 6 and is rejected based on the same rationale as claim 6.

Regarding claim 7: Inokuchi in view of Becht and Agrawal disclose the method according to claim 1. Becht further teaches wherein the at least one digital twin configuration data comprises at least one of FPGA IP, intermediate representation (IP), system model data, data sheet, binary data, script, code, pinout table (see Becht par. 0028: "DUT 208 is implemented as a software simulation of the ASIC RTL code, e.g., Verilog or Very High Speed Integrated Circuit-Hardware Description Language (VHDL)."). It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have combined Inokuchi in view of Becht and Agrawal, teaching the method of claim 1, with Becht, teaching "event 206 is an error condition generated by DUT 208. The error condition may be, for example, an output signal from DUT 208. In another example, event 206 occurs when a specific combination of output signals, or pattern of output signals, is detected from DUT 208. In another embodiment, event 206 is triggered when a particular memory location in DUT 208 becomes a particular value. For example, when a particular memory location in DUT 208 is written with the value 088H, generate event 206. In an embodiment, event 206 is an event detected from hardware, for example, a start of packet indication on a connected network. In yet another embodiment, event 206 can be any appropriate signal that aids in the testing and debugging of DUT 208." (see Becht par. 0033).

Regarding claim 20: claim 20 is a computer-implemented system claim that recites similar limitations as method claim 7 and is rejected based on the same rationale as claim 7.

Regarding claim 8: Inokuchi in view of Becht and Agrawal disclose the method according to claim 1. Inokuchi further teaches wherein the at least one digital twin environment comprises at least one of FPGA, CPU, SOC, emulator, simulator, virtualization environment, LLVM, QEMU, computing device, embedded device, peripheral, power supply, sensor, actuator, communications module, printed circuit board, cable harness (see Inokuchi par. 0019: "the duplicated environment 200 has a simulation sub-module 210, an emulation sub-module 220,").
Regarding claim 21: claim 21 is a computer-implemented system claim that recites similar limitations as method claim 8 and is rejected based on the same rationale as claim 8.

Regarding claim 9: Inokuchi in view of Becht and Agrawal disclose the method according to claim 1. Inokuchi further teaches wherein the at least one security analysis comprises at least one of power analysis, EM analysis, acoustic analysis, temperature analysis, processor execution analysis, memory analysis, control flow graph analysis, capturing emanations via DSO/SDR, network traffic analysis, firmware reverse engineering, software vulnerability analysis, cryptographic analysis, side-channel attacks, fault injection analysis, timing analysis, protocol analysis, authentication mechanism analysis, data integrity analysis, anomaly detection, intrusion detection, malware analysis, root cause analysis, code static analysis, dynamic analysis, penetration testing, fuzz testing, threat modeling, risk assessment, compliance testing, physical security analysis, incident response analysis, recovery strategies, resilience testing, security benchmarking, security auditing, configuration management analysis, dependency analysis, patch management analysis, access control analysis, authorization analysis, session management analysis, encryption implementation analysis, key management analysis, secure boot analysis, secure update analysis, API security analysis, IoT security analysis, automotive security analysis, industrial control system security analysis, smart grid security analysis, healthcare device security analysis, wearable device security analysis, mobile security analysis, cloud security analysis, virtualization security analysis, container security analysis, blockchain security analysis, artificial intelligence security analysis, machine learning model security analysis, quantum computing security analysis, operational technology security analysis, SCADA system security analysis, network segmentation analysis, firewall rule analysis, intrusion prevention system analysis, anti-malware solutions analysis, data loss prevention analysis, endpoint protection analysis, secure communication protocols analysis, DNS security analysis, email security analysis, web application security analysis, database security analysis, storage security analysis, backup security analysis, disaster recovery planning analysis, adversarial simulation, cyber threat intelligence analysis, third-party security assessment, vulnerability scanning, file upload security analysis, session hijacking prevention analysis, directory traversal prevention analysis, remote code execution prevention analysis, denial of service attack prevention analysis, distributed denial of service attack prevention analysis, man-in-the-middle attack prevention analysis, phishing attack prevention analysis, spear-phishing attack prevention analysis, ransomware defense analysis, botnet detection analysis, cryptojacking defense analysis, insider threat detection analysis, data anonymization techniques analysis, secure deletion techniques analysis, digital footprint analysis, online tracking prevention analysis, privacy by design strategies analysis, cybersecurity insurance analysis, or cybersecurity metrics (see Inokuchi par. 0039: "The active scan/pen-test module 190 comprises an interface for other software and other users which can perform the active scan and penetration test to the components reproduced in the emulation sub-module 220 and the physical sub-module 230 on the duplicated environment 200.").

Regarding claim 22: claim 22 is a computer-implemented system claim that recites similar limitations as method claim 9 and is rejected based on the same rationale as claim 9.
Regarding claim 11: Inokuchi in view of Becht and Agrawal discloses the method according to claim 1. Inokuchi further teaches wherein executing the at least one security analysis comprises at least one of capturing emanations, capturing data, capturing communications, probing, injecting, sniffing, attacking, monitoring, logging, decrypting, encrypting, bypassing, replaying, emulating, simulating, fuzzing, scanning, auditing, testing, validating, verifying, reverse engineering, disassembling, decompiling, sandboxing, patching, hardening, securing, configuring, optimizing, tuning, benchmarking, stress testing, load testing, performance testing, resilience testing, fault injection, tampering, cloning, spoofing, eavesdropping, intercepting, blocking, filtering, analyzing, decoding, demodulating, modulating, synthesizing, emulating, virtualizing, containerizing, orchestrating, automating, scripting, deploying, updating, upgrading, backing up, restoring, recovering, erasing, wiping, disabling, enabling, restarting, rebooting, shutting down, isolating, quarantining, containing, deterring, detouring, alerting, reporting, visualizing, documenting, training, quantizing, adjusting, balancing, aligning, scaling, synchronizing, rewarding, or reinforcing. (See Inokuchi par. 0039: "The active scan/pen-test module 190 may collect vulnerability attack codes published to carry out the penetration test using the vulnerability attack codes corresponding to the assessment target components.").

Regarding claim 24: claim 24 is a computer-implemented system claim that recites similar limitations as method claim 11 and is rejected based on the same rationale as claim 11.
Regarding claim 12: Inokuchi in view of Becht and Agrawal discloses the method according to claim 1. Agrawal further teaches wherein the output data comprises at least one of robustness compromise, availability compromise, partial or full system crash, confidentiality breach, exfiltration, unauthorized disclosure, unauthorized modification, compromise of peripheral, compromise of communications, physical damage, denial of service, man-in-the-middle attack, replay attack, phishing, spear-phishing, SQL injection, cross-site scripting, buffer overflow, firmware tampering, side-channel attack, cryptographic attack, code injection, session hijacking, zero-day exploit, vulnerability exploit, password cracking, phishing, supply chain attack, insider threat, data leakage, spoofing, tampering with data in transit, unauthorized access to sensitive data, data integrity breach, espionage, sabotage, command and control, botnet involvement, advanced persistent threat attack, remote code execution, keylogging, credential stuffing, dictionary attack, brute force attack, cross-site request forgery, clickjacking, drive-by download, malware infection, ransomware attack, privilege escalation, rootkit installation, backdoor access, IoT botnet participation, DDoS amplification, infrastructure hijacking, API abuse, cloud breach, virtual machine escape, container breakout, network segmentation bypass, wireless network compromise, GPS spoofing, sensor spoofing, SCADA system manipulation, industrial espionage, operational disruption, software supply chain attack, safety system disablement, environmental control system manipulation, vehicle control system hacking, medical device compromise, critical infrastructure sabotage, IP theft, unauthorized network access, unauthorized system configuration changes, unauthorized application installation, unauthorized data extraction, unauthorized system control, manipulation of device functionality, manipulation of physical processes, manipulation of sensor data, bypassing security controls, evasion of detection systems, analysis report, user-readable analysis report, visualizations, suggestions, recommendations, scorecard, machine-readable analysis report, or API call. (See Agrawal par. 0051: "a result of the evaluation is output. In one approach, the result of the evaluation includes a recommendation for security modifications to make to the security characteristics of the physical environment. In another approach, the result of the evaluation includes an identification of which test conditions or equivalently which set of test conditions cause a failure of at least one of the security conditions.").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined the Inokuchi, Becht, and Agrawal teaching of method claim 1 with Agrawal's teaching "Using this simulated environment, a determination can be made as to whether any pre-defined security norms are violated within the environment, as described in more detail below. For each use case where the security compliance/norms were violated, an action that should be taken in order to address the security concern may be determined and output. Alternatively, a user may determine an action that should be taken. The assessment can be re-run with the result of the action included therein." (see Agrawal par. 0030).

Regarding claim 25: claim 25 is a computer-implemented system claim that recites similar limitations as method claim 12 and is rejected based on the same rationale as claim 12.

Claims 10, 13, 23, and 26 are rejected under 35 U.S.C. 103 as being unpatentable over Inokuchi et al. (US-20210042423-A1, hereafter Inokuchi), in view of Becht et al. (US-20210232471-A1, hereafter Becht), in view of Agrawal et al. (US-20240039945-A1, hereafter Agrawal), and in further view of Sha et al. (US-11463322-B1, hereafter Sha).
Regarding claim 10: Inokuchi in view of Becht and Agrawal discloses the method according to claim 1. Inokuchi, Becht, and Agrawal do not explicitly teach, but Sha teaches, wherein simulating the at least one predicate device comprises at least one of executing the predicate device code on the digital twin environment. (See Sha Col. 5, lines 30-57: "FIG. 1 illustrates a functional block diagram of a digital twin architecture 100 according to an illustrative embodiment. More particularly, digital twin architecture 100 models the entire IoT/edge/cloud environment so as to determine ways to reduce latency and otherwise optimize the system design…. Each digital twin virtual component in FIG. 1 models one or more functionalities of the actual devices that the virtual components represent. For example, as shown, IoT digital twin virtual component 110 provides functionalities such as, but not limited to, visualization, online optimization and anomaly detection.").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined the Inokuchi, Becht, and Agrawal teaching of method claim 1 with Sha's teaching "an IoT digital twin virtual component 110 represents the entire physical IoT platform which includes digital twin virtual components representing the physical IoT devices (end devices). IoT digital twin virtual component 110 is operatively coupled to an edge digital twin virtual component 112-1, which is operatively coupled to an edge digital twin virtual component" (see Sha Col. 5, lines 36-42).

Regarding claim 23: claim 23 is a computer-implemented system claim that recites similar limitations as method claim 10 and is rejected based on the same rationale as claim 10.
Regarding claim 13: Inokuchi in view of Becht and Agrawal discloses the method according to claim 1. Inokuchi, Becht, and Agrawal do not explicitly teach, but Sha teaches, wherein the at least one action comprises at least one of presenting output data to a user, communicating output data to another machine, storing output data, triggering one or more notifications or alarms, blocking the functioning of the predicate device, or triggering automated remediation/hardening. (See Sha Col. 6, line 55 - Col. 7, line 3: "Digital twin management engine 210 receives the set of inputs 220 and executes one or more of the set of functions 214 including, but not limited to: real-time data retrieval; simulation; performance prediction; data analytics; optimization; and system design. Results of the execution of one or more of the set of functions 214 can be displayed or otherwise be presented via user interface 212. User interface 212, in illustrative embodiments, comprises a 3D model of the physical IoT system,").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined the Inokuchi, Becht, and Agrawal teaching of method claim 1 with Sha's teaching "manual optimization based on simulation can be performed. For example, a user can input different parameters into the digital twin architecture 330 to run simulation and output a simulated performance index such as resource allocation in edge servers, latency, service migration cost, energy consumption, etc. Based on simulated results, the user can manually select and set up best options for IoT offloading to edge servers. Automatic optimization based on machine learning can also be performed. The machine learning based intelligence integrated with digital twins allows operators to bring together previously unconnected systems to gain new insights, optimize offloading process, provide intelligent offloading decisions and monitor processes remotely." (see Sha Col. 9, lines 3-16).

Regarding claim 26: claim 26 is a computer-implemented system claim that recites similar limitations as method claim 13 and is rejected based on the same rationale as claim 13.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Larzul et al. (US-20160098505-A1): an emulation environment performs bandwidth-efficient analysis of a digital system. One embodiment of the emulation environment includes a host system and an emulator. The host system configures the emulator to load a design under test (DUT) and the emulator emulates the DUT accordingly. In one aspect, the emulator includes one or more design field-programmable gate arrays (FPGAs) that emulate the DUT. Klein et al. (US-11973790-B2): directed to a connected vehicle cyber-security platform that leverages digital twins across multiple layers of the connected vehicle ecosystem and generates AAGs based on digital twins to evaluate vulnerabilities and remedies within the connected vehicle ecosystem. For example, and as described in further detail herein, to defend against cyber-attacks, AAGs can be generated, which represent potential lateral movements of adversaries within and across layers of the connected vehicle ecosystem.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DUILIO MUNGUIA, whose telephone number is (571) 270-5277. The examiner can normally be reached M-F, 7:30-5:00 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Eleni A Shiferaw, can be reached at (571) 272-3867. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DUILIO MUNGUIA/
Examiner, Art Unit 2497

/ELENI A SHIFERAW/
Supervisory Patent Examiner, Art Unit 2497

Prosecution Timeline

Mar 15, 2024
Application Filed
Aug 08, 2025
Non-Final Rejection — §103
Nov 13, 2025
Response Filed
Feb 26, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12470541
IMAGE FORMING APPARATUS, DISPLAY METHOD, AND RECORDING MEDIUM FOR DISPLAYING AUTHENTICATION METHOD USING EXTERNAL SERVER OR UNIQUE TO IMAGE FORMING APPARATUS
2y 5m to grant Granted Nov 11, 2025
Study what changed to get past this examiner. Based on the 1 most recent grant.


Prosecution Projections

3-4
Expected OA Rounds
100%
Grant Probability
99%
With Interview (+0.0%)
3y 3m
Median Time to Grant
Moderate
PTA Risk
Based on 5 resolved cases by this examiner. Grant probability derived from career allow rate.
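The footnote above states that the grant probability is derived from the examiner's career allow rate. The arithmetic implied by the figures on this page can be sketched as follows; this is a minimal illustration only, and the function names and the assumption that the "vs TC avg" deltas are plain subtraction against a Tech Center average are assumptions, not the tool's documented methodology:

```python
# Hypothetical reconstruction of the dashboard's headline metrics.
# Assumes grant probability simply mirrors the examiner's career allow
# rate, and statute-specific "lift" is examiner rate minus TC average.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def tc_delta(examiner_rate: float, tc_avg: float) -> float:
    """Signed lift of a statute-specific rate over the Tech Center average."""
    return examiner_rate - tc_avg

if __name__ == "__main__":
    # 5 granted out of 5 resolved cases -> the 100% grant probability shown
    print(allow_rate(5, 5))
    # 69.3% on §103 against an assumed 40.0% TC average -> +29.3% lift
    print(round(tc_delta(69.3, 40.0), 1))
```

With only five resolved cases, a 100% rate carries a wide confidence interval, which is presumably why the page also reports the range of expected OA rounds rather than the allow rate alone.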
