Prosecution Insights
Last updated: April 19, 2026
Application No. 18/407,542

IDENTIFICATION AND MITIGATION OF SPOOFING ATTACKS ON AUTONOMOUS VEHICLES

Non-Final OA §103
Filed: Jan 09, 2024
Examiner: ABYANEH, ALI S
Art Unit: 2437
Tech Center: 2400 — Computer Networks
Assignee: Kyndryl Inc.
OA Round: 3 (Non-Final)
Grant Probability: 78% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 3m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 78% (above average; 485 granted / 623 resolved; +19.8% vs TC avg)
Interview Lift: +55.6% in resolved cases with interview
Typical Timeline: 3y 3m average prosecution; 23 currently pending
Career History: 646 total applications across all art units
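The headline allow rate above is just the ratio of the raw career counts; a quick sanity check (variable names are mine, not from the dashboard):

```python
# Career counts reported above for this examiner.
granted = 485
resolved = 623

# Allow rate as displayed: granted share of resolved cases, rounded.
allow_rate_pct = granted / resolved * 100
assert round(allow_rate_pct) == 78  # the 78% headline figure
```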

Statute-Specific Performance

§101: 17.2% (-22.8% vs TC avg)
§103: 49.1% (+9.1% vs TC avg)
§102: 9.5% (-30.5% vs TC avg)
§112: 13.9% (-26.1% vs TC avg)
TC avg = Tech Center average estimate. Based on career data from 623 resolved cases.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claims 1-20 are pending. Claims 1, 8, 9 and 15 have been amended. In light of applicant's amendment, the rejection of claim 9 under 35 U.S.C. 112(b) has been withdrawn.

Response to Arguments

Applicant's amendments/arguments filed on 12-22-2025 have been fully considered but are moot in view of the new ground(s) of rejection.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 2, 4, 6-9, 11, 13-16, 18 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Liu et al. (US Publication No. 2021/0112094), hereinafter Liu, in view of Scarbrough et al. (US Publication No. 2023/0386268), hereinafter Scarbrough.

Regarding claims 1, 8 and 15, Liu teaches a computer-implemented method (FIG. 8) comprising:

detecting, using a sensor of a vehicle, that an object appears (para. 0020, the method may determine if another sensor is able to correctly identify the static object; para. 0057, perception and planning system 110 may evaluate if the sensors are able to detect and identify dynamic objects (e.g., vehicles, pedestrians) or static objects not pre-defined by the HD map within a sensor coverage area; FIG. 8 and para. 0072, at block 752, processing logic evaluates if each of the sensors of the ADV is able to identify the pre-defined static objects) in a path of the vehicle (FIG. 8 and para. 0072, at block 751, processing logic obtains a set of pre-defined static objects along a planned route from a map);

preventing the vehicle from motion in response to detecting the object (para. 0021, if the real-time sensor coverage does not allow for continuous operation, the method may activate degraded operation to reduce the speed of the vehicle or to allow a driver, if there is one, to take over the operation of the vehicle; in one embodiment, the method may activate the fail operation to stop the vehicle at the nearest safe spot and may inform the passenger or the service provider; paras. 0040-0043);

determining that the object is associated with a deception of the sensor of the vehicle (paras. 0021 and 0057, the method may perform a cross-check on dynamic objects detected by multiple sensors; for example, the method may determine if all the sensors are able to identify dynamic objects or static objects within a sensor coverage area, and if only one sensor in the sensor system is not able to identify an object, and the impairment of the one sensor occurs over an extended period of time or over many objects, and vice versa, there is a greater likelihood that the impairment is due to spoofing attacks; FIG. 8 and para. 0072, at block 754, processing logic identifies one or more impaired sensors based on the evaluations of the impaired sensors not being able to identify the static and dynamic objects); and

performing security actions in response to determining that the object is associated with the deception of the sensor of the vehicle (para. 0021, dynamically adjust the sensor system coverage in real-time for perception by excluding an impaired sensor; FIG. 8 and para. 0072, at block 755, the impaired sensors are excluded from being used by the perception function of the ADV).

While Liu discloses performing security actions in response to determining that the object is associated with the deception of the sensor of the vehicle (paras. 0021, 0072, 0077), Liu does not explicitly disclose that the security actions comprise at least one of locking doors of the vehicle, contacting security services, or performing a livestream from a camera of the vehicle to a designated contact. However, in an analogous art, Scarbrough discloses that the security actions comprise at least one of locking doors of the vehicle, contacting security services, or performing a livestream from a camera of the vehicle to a designated contact (para. 0014, "the vehicle device may include a camera to capture video data of a scene associated with the vehicle on demand (e.g., based on the event detection). The vehicle device may activate the camera to capture video data of the scene associated with the vehicle based on detecting the event. The scene may be associated with an exterior of the vehicle or an interior of the vehicle. The vehicle device may transmit, to a server and/or a mobile device of a user associated with the vehicle, an indication that indicates the event and the video data").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Liu with Scarbrough. This would have been obvious because one of ordinary skill in the art would have been motivated to do so in order to achieve the predictable result of enabling the user to see and learn about an event that occurred with the vehicle.

Regarding claims 2, 9 and 16, Liu as modified teaches all of the limitations of claims 1, 8 and 15, respectively, as shown above. Liu further teaches wherein detecting using the sensor of the vehicle that the object appears in the path of the vehicle is in response to an event causing the vehicle to stop or nearly stop (para. 0021, if the real-time sensor coverage does not allow for continuous operation, the method may activate degraded operation to reduce the speed of the vehicle or to allow a driver, if there is one, to take over the operation of the vehicle; in one embodiment, the method may activate the fail operation to stop the vehicle at the nearest safe spot and may inform the passenger or the service provider; paras. 0040-0043).

Regarding claims 3, 10 and 17, Liu as modified teaches all of the limitations of claims 1, 8 and 15, respectively, as shown above. Liu further teaches causing the vehicle to stop or nearly stop for an event, where the vehicle is configured to move in response to completion of the event (paras. 0040-0042).

Regarding claims 4, 11 and 18, Liu as modified teaches all of the limitations of claims 1, 8 and 15, respectively, as shown above. Liu further teaches wherein determining that the object is associated with the deception of the sensor of the vehicle comprises confirming that the object is not detected by one or more other sensors of the vehicle (paras. 0021 and 0057, the method may perform a cross-check on dynamic objects detected by multiple sensors; for example, the method may determine if all the sensors are able to identify dynamic objects or static objects within a sensor coverage area, and if only one sensor in the sensor system is not able to identify an object, or vice versa, there is a greater likelihood that the impairment is due to spoofing attacks; FIG. 8 and para. 0072, at block 754, processing logic identifies one or more impaired sensors based on the evaluations of the impaired sensors not being able to identify the static and dynamic objects; it would have been obvious that identifying an impaired sensor may also be based on the sensor being able to identify objects that are not identified by other sensors).

Regarding claims 6, 13 and 20, Liu as modified teaches all of the limitations of claims 1, 8 and 15, respectively, as shown above. Liu further teaches wherein the security actions comprise at least one of alerting a user of a potential attack, or causing a predesignated type of movement of the vehicle to avoid theft (FIG. 9 and paras. 0077 and 0081, at operation 721, if the sensor system coverage does not allow continuous operation of the perception function, i.e., sensors cannot correctly detect or identify static and dynamic objects, the method activates degraded operation to reduce the speed of the vehicle or to allow a driver, if there is one, to take over the operation of the vehicle; in one embodiment, the method may activate the fail operation to stop the vehicle at the nearest safe spot and may inform the passenger or the service provider of the fail operation).

Regarding claims 7 and 14, Liu as modified teaches all of the limitations of claims 1 and 8, respectively, as shown above. Liu further teaches wherein the deception of the sensor of the vehicle comprises receiving laser pulses by the sensor such that the laser pulses cause algorithms to detect an appearance of the object when the object is not physically present in a proximity of the vehicle (paras. 0021 and 0057, the method may perform a cross-check on dynamic objects detected by multiple sensors; for example, the method may determine if all the sensors are able to identify dynamic objects or static objects within a sensor coverage area, and if only one sensor in the sensor system is not able to identify an object, or vice versa, there is a greater likelihood that the impairment is due to spoofing attacks; FIG. 8 and para. 0072, at block 754, processing logic identifies one or more impaired sensors based on the evaluations of the impaired sensors not being able to identify the static and dynamic objects; it would have been obvious that identifying an impaired sensor may also be based on the sensor being able to identify objects that are not identified by other sensors; paras. 0056, 0039 and 0026).

Claims 5, 12 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Liu in view of Scarbrough, as applied to claims 1, 8 and 15 above, and further in view of Passot et al. (WO 2020033808 A1), hereinafter Passot.

Regarding claims 5, 12 and 19, Liu as modified teaches all of the limitations of claims 1, 8 and 15, respectively, as shown above. Liu further teaches wherein determining that the object is associated with the deception of the sensor of the vehicle (paras. 0021 and 0057, the method may perform a cross-check on dynamic objects detected by multiple sensors; FIG. 8 and para. 0072, at block 754, processing logic identifies one or more impaired sensors based on the evaluations of the impaired sensors not being able to identify the static and dynamic objects) indicates that the object is not present (paras. 0021 and 0057, determine if all the sensors are able to identify dynamic objects or static objects within a sensor coverage area, and if only one sensor in the sensor system is not able to identify an object, or vice versa, there is a greater likelihood that the impairment is due to spoofing attacks; FIG. 8 and para. 0072, at block 754; it would have been obvious that identifying an impaired sensor may also be based on the sensor being able to identify objects that are not identified by other sensors). Liu as modified does not explicitly teach confirming a user response. However, in an analogous art, Passot teaches confirming a user response (paras. 00103, 00149 and 00152). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the modified Liu with Passot because existing security apparatuses are prone to error and fail adequately to distinguish between genuine threats and harmless objects; by confirming with a user's response, the error of distinguishing may be reduced.

References Cited, Not Used

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Renaud (US Publication No. 2018/0072270) discloses a security system for a vehicle including a plurality of cameras disposed at the vehicle so as to have respective fields of view exterior of the vehicle. A controller is operable to activate at least one of the cameras and to record image data captured by the activated camera responsive to a triggering event indicative of a potential break-in at the vehicle by an intruder. Responsive to the triggering event, the controller activates at least one of the cameras that has a field of view encompassing a region where the intruder is determined to be present, and does not activate at least one of the cameras whose field of view does not encompass that region. The triggering event includes touch or movement of a door handle of the vehicle and/or breakage of a window of the vehicle.

Monteuuis et al. (US Publication No. 2025/0095373) discloses methods for identifying inconsistencies in images that could be due to malicious attacks. Various embodiments may include receiving a plurality of camera images from one or more cameras of an apparatus (e.g., a vehicle), performing a plurality of different processes on the plurality of images to detect different types of image inconsistencies, using results of the plurality of different processes to recognize a vision attack, and performing one or more mitigation actions in response to recognizing a vision attack.

Tafti et al. (US Publication No. 2024/0182071) discloses a method for mitigating an adversarial attack that includes receiving input data comprising sensor data from a plurality of sensors and map data. The method further includes monitoring, in real time, an environment around an autonomous vehicle to identify a region that is possibly subject to an adversarial attack and determining a probability of the adversarial attack in the region. The method further includes determining whether the probability of the adversarial attack in that region is greater than a predetermined threshold and, in response, planning a motion of the autonomous vehicle by taking the adversarial attack into account to generate a planned motion. The method further includes controlling a host vehicle to move in accordance with the planned motion.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Ali Abyaneh, whose telephone number is (571) 272-7961. The examiner can normally be reached Monday-Friday, 8:00-5:00. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Alexander Lagor, can be reached at (571) 270-5143. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

/ALI S ABYANEH/
Primary Examiner, Art Unit 2437
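The multi-sensor cross-check the office action attributes to Liu (a sensor becomes a spoofing/impairment suspect when it alone reports an object, or alone fails to report one) can be sketched as follows. This is a minimal illustration under my own assumptions, not code from any cited reference; the names `Detection` and `find_suspect_sensors` are hypothetical.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Detection:
    sensor_id: str
    object_id: str


def find_suspect_sensors(detections: list[Detection],
                         sensor_ids: list[str]) -> set[str]:
    """Flag sensors whose reports are uncorroborated by every other sensor.

    Illustrative cross-check:
    - an object reported by exactly one sensor suggests an injected
      "ghost" (possible spoofing of that sensor);
    - an object missed by exactly one sensor suggests jamming or
      impairment of that sensor.
    """
    all_sensors = set(sensor_ids)

    # Which sensors reported each object?
    seen_by: dict[str, set[str]] = {}
    for det in detections:
        seen_by.setdefault(det.object_id, set()).add(det.sensor_id)

    suspects: set[str] = set()
    for sensors in seen_by.values():
        if len(sensors) == 1 and len(all_sensors) > 1:
            suspects |= sensors   # lone reporter: possible ghost object
        missing = all_sensors - sensors
        if len(missing) == 1:
            suspects |= missing   # lone non-reporter: possible jamming
    return suspects
```

For example, if only the lidar reports a "ghost" object that the camera and radar never see, the function returns `{"lidar"}`. Note that Liu is also cited for weighing persistence (impairment over an extended period or over many objects) before concluding spoofing; that temporal filtering is omitted from this sketch.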

Prosecution Timeline

Jan 09, 2024 · Application Filed
Jul 21, 2025 · Non-Final Rejection — §103
Sep 26, 2025 · Interview Requested
Oct 07, 2025 · Examiner Interview Summary
Oct 07, 2025 · Applicant Interview (Telephonic)
Oct 16, 2025 · Response Filed
Nov 05, 2025 · Final Rejection — §103
Dec 05, 2025 · Interview Requested
Dec 16, 2025 · Applicant Interview (Telephonic)
Dec 18, 2025 · Examiner Interview Summary
Dec 22, 2025 · Response after Non-Final Action
Jan 20, 2026 · Request for Continued Examination
Jan 28, 2026 · Response after Non-Final Action
Feb 21, 2026 · Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603868
Endpoint Data Loss Prevention
2y 5m to grant · Granted Apr 14, 2026
Patent 12579259
SYSTEMS AND METHODS FOR INTELLIGENT CYBERSECURITY ALERT SIMILARITY DETECTION AND CYBERSECURITY ALERT HANDLING
2y 5m to grant · Granted Mar 17, 2026
Patent 12574374
PROVIDING ACCESS CONTROL AND IDENTITY VERIFICATION FOR COMMUNICATIONS WHEN INITIATING A COMMUNICATION TO AN ENTITY TO BE VERIFIED
2y 5m to grant · Granted Mar 10, 2026
Patent 12561465
VIRTUAL REPRESENTATION OF INDIVIDUAL IN COMPUTING ENVIRONMENT
2y 5m to grant · Granted Feb 24, 2026
Patent 12556553
NETWORK SECURITY AND RELATED APPARATUSES, METHODS, AND SECURITY SYSTEMS
2y 5m to grant · Granted Feb 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 78%
With Interview: 99% (+55.6% lift)
Median Time to Grant: 3y 3m
PTA Risk: High
Based on 623 resolved cases by this examiner. Grant probability derived from career allow rate.
