Prosecution Insights
Last updated: April 19, 2026
Application No. 18/908,203

ARTIFICIAL INTELLIGENCE BASED CYBERSECURITY SYSTEM FOR MONITORING AUTOMOTIVE ECOSYSTEMS

Non-Final OA: §101, §102, §103
Filed: Oct 07, 2024
Examiner: POLTORAK, PIOTR
Art Unit: 2433
Tech Center: 2400 — Computer Networks
Assignee: Darktrace Holdings Limited
OA Round: 1 (Non-Final)

Grant Probability: 75% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 5m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 75% (443 granted / 594 resolved); +16.6% vs TC avg (above average)
Interview Lift: +30.5% (strong), among resolved cases with interview
Typical Timeline: 3y 5m average prosecution; 21 applications currently pending
Career History: 615 total applications across all art units
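For readers who want to check the headline figures, here is a minimal sketch (not the product's actual pipeline) of how the career allow rate and an implied Tech Center baseline can be recomputed from the counts above. The TC-average value is only an assumption back-derived from the reported +16.6% delta.

```python
# Counts taken directly from the report above.
granted = 443          # career grants
resolved = 594         # career resolved cases
tc_avg_delta = 0.166   # reported gap vs Tech Center average

# Allow rate = grants as a fraction of resolved cases.
allow_rate = granted / resolved

# Back-derive the TC-average allow rate implied by the reported delta
# (an assumption; the report does not state the baseline directly).
tc_avg_estimate = allow_rate - tc_avg_delta

print(f"Career allow rate: {allow_rate:.1%}")            # ~74.6%, displayed as 75%
print(f"Implied TC average allow rate: {tc_avg_estimate:.1%}")
```

The displayed "75%" is simply this ratio rounded to the nearest whole percent.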

Statute-Specific Performance

§101: 12.4% (-27.6% vs TC avg)
§103: 41.4% (+1.4% vs TC avg)
§102: 16.9% (-23.1% vs TC avg)
§112: 19.2% (-20.8% vs TC avg)

Tech Center averages are estimates. Based on career data from 594 resolved cases.
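Working backward from the reported numbers, all four per-statute deltas are consistent with a single flat Tech Center baseline of roughly 40%. The sketch below (an illustration, not the product's model) makes that assumed baseline explicit and reproduces the displayed deltas.

```python
# Assumed flat baseline, back-derived from the four reported deltas
# (e.g., 12.4% - (-27.6%) = 40.0%); the report itself calls the
# Tech Center average an estimate.
TC_BASELINE = 0.40

# Examiner's per-statute rates, taken from the report above.
examiner_rates = {"§101": 0.124, "§103": 0.414, "§102": 0.169, "§112": 0.192}

for statute, rate in examiner_rates.items():
    delta = rate - TC_BASELINE
    print(f"{statute}: {rate:.1%} ({delta:+.1%} vs TC avg)")
```

Each printed delta matches the figure shown in the table, which suggests the "Tech Center average estimate" is applied uniformly across statutes.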

Office Action

Rejections under §101, §102, and §103
DETAILED ACTION

Claims 1-27 have been examined.

Priority

Acknowledgment is made of applicant's claim for priority based on provisional application No. 63/542,708, dated 10/05/23.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-27 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. In the instant case, claims 1-25 are directed to a system, claim 26 is directed to a method, and claim 27 is directed to a non-transitory computer readable medium, which is an article of manufacture. These claims therefore fall within the four statutory categories of invention.

The question under step 2A, prong one, is whether the claims recite a judicial exception (an abstract idea enumerated in the 2019 PEG, a law of nature, or a natural phenomenon). The claims are essentially directed towards comparing the received data. This judicial exception is not integrated into a practical application because, when analyzed under prong two of step 2A of the 2019 PEG, the additional elements of the claims, such as the vehicle (module), the non-transitory computer readable medium, etc., are merely used as tools to perform the abstract idea. Specifically, these additional elements (vehicle (module), software, medium, etc.) perform the steps or functions cited above to carry out the abstract idea. The use of these elements as tools to implement the abstract idea does not integrate it into a practical application because it requires no more than a computer performing functions that correspond to acts required to carry out the abstract idea.
The additional elements do not involve improvements to the functioning of a computer or to any other technology or technical field (MPEP 2106.05(a)); the claims do not apply or use the abstract idea to effect a particular defense; the claims do not apply the abstract idea with, or by use of, a particular machine (MPEP 2106.05(b)); the claims do not effect a transformation or reduction of a particular article to a different state or thing (MPEP 2106.05(c)); and the claims do not apply or use the abstract idea in some other meaningful way beyond generally linking it to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception (MPEP 2106.05(e) and the Vanda Memo). The claims therefore do not, for example, purport to improve the functioning of a computer, nor do they effect an improvement in any other technology or technical field. Accordingly, the additional elements do not impose any meaningful limits on practicing the abstract idea, and the claims are directed to an abstract idea.

The claims do not include additional elements sufficient to amount to significantly more than the judicial exception when analyzed under step 2B of the 2019 PEG. As discussed above with respect to integration of the abstract idea into a practical application, using particular vehicles, modules, and software (e.g., ML) to perform the receiving and comparing steps amounts to no more than mere instructions to apply the exception using a generic computer component. The use of these additional elements therefore does no more than employ the computer/vehicle as a tool to automate and/or implement the abstract idea, and the use of a computer/vehicle to merely automate and/or implement the abstract idea cannot provide significantly more than the abstract idea itself (MPEP 2106.05(I)(A)(f) & (h)). Therefore, the claims are not patent eligible.
Note that although some claims offer additional, specific steps related to threat detection, these claims merely provide context (the specific data elements to manipulate, e.g., claims 11, 17, etc., or the tools that can be involved in the manipulation, e.g., claims 4, 16, etc.), additional manipulation (updating data, e.g., claim 2), some additional action (e.g., claim 12), etc. None of the claims provide manipulation of the data specific to detection of anomalies and a particular action based on such detection. (Also note the various uses of "intended use" language, such as "to determine", "to analyze", etc.)

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

NOTE: the claim language is full of different names whose meaning is not well known in the art and which are not defined by applicant. For the purpose of the initial prosecution, the examiner evaluated the named components based on their functionalities, equating similarly functioning elements of the prior art to the named subjects cited in the claims. Furthermore, various claims lack a very specific connection between elements of the claimed subject matter and instead use terms such as "associated", "intended use" language (e.g., "to ..."), etc.

Claim Rejections - 35 USC § 102 or 103

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-9, 12, 15-23, and 25-27 are rejected under 35 U.S.C. 102(a) as being anticipated by, or in the alternative, under 35 U.S.C. 103 as being unpatentable over, Fellows (Simon David Lincoln Fellows: USPUB 20200358810).

As per claims 1 and 26-27, Fellows teaches a method for detecting a cyber threat by a cyber threat defense system (see Abstract, Figs. 1 and 2 with the associated text, e.g., para 151-116), the method comprising: receiving data from a first vehicle (para 55); receiving data from a second vehicle; referencing one or more machine-learning models using machine-learning and artificial intelligence (AI) algorithms, the one or more machine-learning models including a first machine-learning model trained on a normal pattern of life associated with the first vehicle and the second vehicle; and comparing data received from the first vehicle and the second vehicle to the normal pattern of life associated with the first vehicle and the second vehicle to detect anomalies representing a cyber threat within the first vehicle or the second vehicle (para 41).

Note that, in light of Fellows' disclosure clearly articulating that the invention relates to more than one vehicle/car, the examiner asserts that Fellows' teaching expressly relates to the first and second vehicles. However, even if Fellows did not contemplate such a solution, Official Notice is taken that in the world of computing, and machine learning in particular, using data pertaining to more than one object, especially similar objects (note that Fellows expressly suggests that the invention may extend to more than one type of vehicle, e.g., cars, ships, etc.), would have been old and well known at the time the application was filed, given the predictable benefit of obtaining more accurate results specific to a particular type of object.

The limitations of claims 2-4, 7-8, 12, 23, and 25 are addressed by Fellows in para 19, 22, 28, 36, 44, 47, 55, 69, 90, and 106, for example. As per claim 9, the software module and/or hardware device within the vehicles meets the limitation of electronic control units, and, as per claim 5, one skilled in the art would readily appreciate that vehicles such as cars or ships are operated independently of one another.
Similarly, in light of Fellows teaching an enterprise network dealing with vehicles such as cars, where one skilled in the art would appreciate that at least at some point they must be designed (including the design of their components) and updated, the limitation of claim 22 is inherent.

As per claim 15, given no specific differentiation between the first and second model (or third or fourth model), one could argue that the first module of claim 1 could also read on claim 15. However, it is also noted that Fellows provides a specific recitation of utilizing a set (or subset) of instructions (module(s)) to classify information (e.g., para 113-14). (Note that not only could abnormal behavior in a computing environment, under the broadest reasonable interpretation, read on cyber threats, but including specific malicious data in the classification would have been at least implicit, given that Fellows attempts to detect a cyber threat, as clearly indicated in the Abstract.)

Additionally, as per claims 16-21, as clearly noted by Fellows, the set of instructions enabling the receipt of data from the Fellows network and using the ML and AI described above meets the limitations of the various other labels used in the claim language, e.g., "an enterprise module", "operational technology module", etc. Note that Fellows' invention evaluates different vehicles (e.g., cars, ships, airplanes, etc.) using normal patterns of life; these devices not only have different designs and parameters but could also utilize different evaluation variables, e.g., specific protocols (e.g., para 69-71). Thus, Fellows would clearly need more than one (one or more) source and model in order to accommodate these various types of devices. By virtue of being on the same network and utilized by the same server, the specific models meet the limitation of being associated with each other.
Any particular component of these devices controls a particular aspect of their activities and, as a result, can reasonably meet the limitations of "controllers". Lastly, any particular name, e.g., "a factory network", clearly would not affect the functionality of the invention and, at most, would have been an obvious variant offering the predictable benefit of customization (especially since the network is a network of manufactured components of elements such as vehicle components; by virtue of having the manufactured components on such a network, the network is associated with the factory that manufactured the components).

As per claim 6, given that the term "similar" is not clearly defined, not only could one argue that the broadest reasonable interpretation of such a limitation is inherent in light of Fellows' disclosure, but in fact even a more limited interpretation (see para 90) would read on the claim language.

Claims 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Fellows (Simon David Lincoln Fellows: USPUB 20200358810) in view of Berntorp (USPUB 20180284785).

Fellows teaches determining an autonomous response in order to remediate a cyber threat in response to information received from vehicles having the detected abnormalities, as discussed above, and providing a warning to the vehicle (e.g., para 44-51). Although the prior art fails to discuss the warning being displayed graphically to a user of the vehicle, Official Notice is taken that such a solution would have been old and well known in the art at the time the application was filed, given the benefits of safety and usability. Fellows, however, does not teach the received information being the vehicle motion state information. In the related art, Berntorp suggests such a solution (para 18 and 60, for example).
It would have been obvious to one of ordinary skill in the art at the time the application was filed to include information such as vehicle motion state information, given the benefits of customization and security.

Claims 10-11 are rejected under 35 U.S.C. 103 as being unpatentable over Fellows (Simon David Lincoln Fellows: USPUB 20200358810) in view of Smith (USPUB 20190349426).

Fellows teaches the probes installed on the vehicles monitoring traffic (communicating data to the vehicle module over a network), as discussed above. As per claim 10, Fellows does not teach the probes monitoring all network traffic inbound to and outbound from the vehicles. However, in the related art, Smith suggests such a solution (para 2148, 2167, etc.). It would have been obvious to one of ordinary skill in the art at the time the application was filed to include monitoring traffic as taught by Smith, given the predictable benefit of data management.

As per claim 11, one skilled in the art would readily appreciate that any operation on data includes obtaining data types (outgoing data from the vehicle would meet the limitation of the vehicle data and, as a result, the type of this data would meet the limitation of the vehicle data type). Smith teaches several entities operating on the client (a watchdog agent, a monitoring agent, a data manager, etc.), each responsible for monitoring device operation, including monitoring inbound/outbound traffic to/from the device, and a particular one, or a set, of these entities could be equated to the claimed probes, wherein the set could be interpreted as an entity also transmitting and receiving data over the network. As such, one skilled in the art would readily appreciate that, in order to operate on the received network data, a probe entity would have to be able to perform protocol parsing of the data of at least one of i) a data link layer or ii) a physical layer.
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Peter Poltorak, whose telephone number is (571) 272-3840. The examiner can normally be reached Monday through Thursday from 9:00 a.m. to 5:00 p.m.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jeffrey Pwu, can be reached on (571) 272-6798. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

/PIOTR POLTORAK/
Primary Examiner, Art Unit 2433

Prosecution Timeline

Oct 07, 2024
Application Filed
Feb 21, 2026
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603883: ESTABLISHING AUTHENTICATION PERSISTENCE
Granted Apr 14, 2026 (2y 5m to grant)

Patent 12574728: MITIGATING RISK FOR HANDS-FREE INTERACTIONS
Granted Mar 10, 2026 (2y 5m to grant)

Patent 12563095: A method that adequately protects the authentic identity and personal data of a natural person and remotely confirms the authentic identity of this natural person through a trusted entity to a beneficiary part
Granted Feb 24, 2026 (2y 5m to grant)

Patent 12526277: METHOD FOR MANAGING USER WHO USES FINGERPRINT AUTHENTICATION AND FINGERPRINT AUTHENTICATION SYSTEM THEREFOR
Granted Jan 13, 2026 (2y 5m to grant)

Patent 12518278: SYSTEMS, APPARATUS AND METHODS FOR SECURE ELECTRICAL COMMUNICATION OF BIOMETRIC PERSONAL IDENTIFICATION INFORMATION TO VALIDATE THE IDENTITY OF AN INDIVIDUAL
Granted Jan 06, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 75%
With Interview: 99% (+30.5%)
Median Time to Grant: 3y 5m
PTA Risk: Low

Based on 594 resolved cases by this examiner. Grant probability derived from career allow rate.
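The report does not state how the 99% with-interview figure is derived from the 75% base rate and the +30.5% lift. The snippet below is a purely hypothetical additive-and-capped model that happens to reproduce the displayed numbers; both the additive form and the 99% cap are assumptions, not the product's actual formula.

```python
def with_interview(base: float, lift: float, cap: float = 0.99) -> float:
    """Hypothetical projection: add the interview lift to the base grant
    probability, capped so the result never reaches certainty.
    The additive model and the 0.99 cap are assumptions for illustration."""
    return min(base + lift, cap)

# Reported base rate (75%) plus reported lift (+30.5%) hits the cap.
print(f"{with_interview(0.75, 0.305):.0%}")  # matches the displayed 99%
```

A multiplicative lift (0.75 × 1.305 ≈ 98%) comes close but does not match the displayed figure exactly, which is why the capped additive form is shown here.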
