Prosecution Insights
Last updated: April 19, 2026
Application No. 18/632,371

IDENTIFICATION AND MITIGATION OF JAMMING ATTACKS ON AUTONOMOUS VEHICLES

Final Rejection — §101, §103, §112
Filed: Apr 11, 2024
Examiner: MCANDREWS, TAWRI MATSUSHIGE
Art Unit: 3668
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Kyndryl Inc.
OA Round: 2 (Final)
Grant Probability: 67% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 0m
With Interview: 93%

Examiner Intelligence

Career Allow Rate: 67% (69 granted / 103 resolved), above average (+15.0% vs TC avg)
Interview Lift: +26.1% across resolved cases with interview
Typical Timeline: 3y 0m average prosecution; 21 applications currently pending
Career History: 124 total applications across all art units
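The headline percentages in this panel are simple ratios over the counts shown. As an illustrative sanity check (the 69/103 counts and the +26.1-point lift are taken from the figures above; the arithmetic below is only the definition of each metric):

```python
# Reproduce the dashboard's headline rates from the raw counts shown above.
granted = 69
resolved = 103

allow_rate = granted / resolved              # career allow rate
interview_lift = 26.1                        # percentage points, per the page
with_interview = allow_rate * 100 + interview_lift

print(f"Career allow rate: {allow_rate:.0%}")                     # → 67%
print(f"Grant probability w/ interview: {with_interview:.0f}%")   # → 93%
```

This also makes explicit that the 93% with-interview figure is simply the 67% base rate plus the 26.1-point interview lift.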

Statute-Specific Performance

§101: 10.9% (-29.1% vs TC avg)
§103: 50.8% (+10.8% vs TC avg)
§102: 11.3% (-28.7% vs TC avg)
§112: 23.7% (-16.3% vs TC avg)
Based on career data from 103 resolved cases; Tech Center averages are estimates.
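Each "vs TC avg" delta implies the Tech Center baseline it was measured against (examiner rate minus delta). A short consistency check, using only the four rate/delta pairs shown above:

```python
# Back out the Tech Center average implied by each statute's rate and delta.
# All values are percentages taken from the panel above.
rates = {
    "§101": (10.9, -29.1),
    "§103": (50.8, +10.8),
    "§102": (11.3, -28.7),
    "§112": (23.7, -16.3),
}
for statute, (rate, delta) in rates.items():
    tc_avg = rate - delta  # examiner rate minus delta recovers the baseline
    print(f"{statute}: implied TC average = {tc_avg:.1f}%")
```

All four pairs back out the same 40.0% baseline, which suggests the dashboard compares every statute against a single Tech Center average estimate.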

Office Action

§101 §103 §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

This Office Action is in response to the applicant’s amendments and remarks filed on 10/28/2025. This action is made FINAL. Claims 1-20 are pending for examination. Regarding the objection(s) to the specification (title), the examiner finds applicant’s amendment(s) to the specification filed 10/28/2025 acceptable and withdraws the objection(s) to the specification (title). Regarding the rejection of claims 1-20 under 35 U.S.C. §103, applicant’s arguments have been considered but are deemed moot in view of the new grounds of rejection necessitated by applicant’s amendment, outlined below.

Specification

The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant’s cooperation is requested in correcting any errors of which applicant may become aware in the specification.

Claim Rejections - 35 USC § 112(b)

The following is a quotation of 35 U.S.C. 112(b): (b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention. The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph: The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention. Claims 3-7, 10-14, and 17-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. 
Claims 3-7, 10-14, and 17-20 recite “…wherein the safety actions further comprise…”, rendering the claims indefinite as the term “further comprise” can mean further performing the stated action in addition to the actions recited in the parent claim, or merely expanding the group of possible alternative actions listed in the parent claim without requiring performance of both an action listed in the parent claim and the action recited in the dependent claim. For the purposes of examination, the examiner will take “wherein the safety actions further comprise” as —wherein the safety actions further [[comprise]]perform— in claims 3-7, 10-14, and 17-20.

Note on Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title. For informational purposes only, no action required. Claims 15-20 will not be rejected under 35 U.S.C. §101 as reciting non-statutory subject matter, though they recite a “computer program product comprising a computer readable storage medium”. ¶[0095] of the specification recites “A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire” where the computer readable storage medium is explicitly defined as non-transitory.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-5 and 7-14 are rejected under 35 U.S.C. 103 as being obvious over Liu et al. (US 20210112094 A1) in view of Lull et al. (US 20200233060 A1) and Balakrishnan et al. (US 20230406356 A1), henceforth known as Liu, Lull, and Balakrishnan, respectively. Liu was first cited in IDS filed 4/11/2024. Lull was first cited in a previous office action.

Regarding claim 1, the claim limitations recite a method having limitations similar to those of claim 8 and is therefore rejected on the same basis, as outlined below. Regarding claim 8, Liu discloses: A system comprising: (Liu, FIG. 1; FIG. 2; ¶[0022]-¶[0024];) a memory having computer readable instructions; and (Liu, FIG. 3A; ¶[0035]-¶[0036]) one or more processors for executing the computer readable instructions, the computer readable instructions controlling the one or more processors to perform operations comprising: (Liu, FIG. 
3A; ¶[0035]-¶[0036]) determining that an object… is no longer detected by a sensor of a vehicle; (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the autonomous driving vehicle, ADV, determines that a static object or dynamic object is no longer detected by at least one sensor on the ADV) in response to the object… no longer being detected by the sensor, determining that another sensor of the vehicle detects the object; and (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV cross-checks static objects and dynamic objects across multiple sensors) in response to the object… no longer being detected by the sensor, performing safety actions based on the another sensor detecting the object… (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0023]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines the sensor that cannot detect the object is impaired and in response to the impaired sensor, performs at least one safety action of a degraded operation with reduced speed, prompting the driver to take over manual control of the vehicle, or stopping at the nearest safe spot). Liu is silent on the following limitations, bolded for emphasis. However, in the same field of endeavor, Lull teaches: determining that an object previously detected is no longer detected by a sensor of a vehicle … the object previously detected no longer being detected by the sensor (Lull, FIG. 1; FIG. 2; FIG. 4; FIG. 5; ¶[0008]-¶[0009]; ¶[0020]; ¶[0023]; ¶[0025]; ¶[0031]; ¶[0034]-¶[0035]; Where correction module 56, implemented by one or more processors onboard an autonomous vehicle, determines a sensor anomaly when an object previously detected by a vehicle sensor disappears). 
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull because “…information from sensors regarding the environment in which the computer-controlled device, such as an autonomous vehicle, can be modified or changed for nefarious reasons. This nefarious modification or changing of information is sometimes referred to as “hacking.”” (Lull, ¶[0004]) and “…the control system 40 will be provided with a corrected signal to not utilize signals that contain anomalies that may cause an inappropriate movement of the vehicle 20. This inappropriate movement of the vehicle could result in a safety issue to the occupants of the vehicle 20 or others near the vehicle 20” (Lull, ¶[0034]). That is, the ability to identify anomalous data taught by Lull allows better protection against hacking of autonomous vehicles. Liu and Lull are silent on the following limitations, bolded for emphasis. However, in the same field of endeavor, Balakrishnan teaches: wherein the safety actions comprise performing a livestream, turning on emergency lights, honking a horn, or playing an audible warning external to the vehicle. (Balakrishnan, FIG. 3A; ¶[0050]; Where the autonomous vehicle performs turning on hazard lights as a safety action when the vehicle’s sensors are degraded). It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu and Lull with the features taught by Balakrishnan because “…This application describes techniques to monitor for, and take fail-safe actions in response to, reduced visibility of image sensors during autonomous or semi-autonomous driving of an autonomous vehicle (collectively referred to herein as autonomous driving)” (Balakrishnan, ¶[0016]), increasing autonomous vehicle operational safety. 
Regarding claim 2, the claim limitations recite a method having limitations similar to those of claim 9 and is therefore rejected on the same basis, as outlined below. Regarding claim 9, Liu, Lull, and Balakrishnan teach the system of claim 8. Liu further discloses: wherein determining that the another sensor of the vehicle detects the object comprises checking whether any sensors of the vehicle detect the object and determining that the object is detected by the another sensor. (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines another sensor detects the static object or dynamic object by cross-checking multiple sensors across the ADV to see whether any of them detect the static or dynamic object and determines the impaired sensor cannot detect the object when the other sensor(s) detect the object). It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull and Balakrishnan for at least the same reasons outlined in claim 8. Regarding claim 3, the claim limitations recite a method having limitations similar to those of claim 10 and is therefore rejected on the same basis, as outlined below. Regarding claim 10, Liu, Lull, and Balakrishnan teach the system of claim 8. Liu further discloses: wherein the safety actions further [[comprise]]perform: causing the vehicle to stop in order to avoid a location of the object previously detected by the sensor. (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines the sensor that cannot detect the object is impaired and stops at the nearest safe spot; the impaired sensor data is excluded therefore the object that isn’t detected by the impaired sensor is avoided when the vehicle stops at the nearest safe spot). 
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull and Balakrishnan for at least the same reasons outlined in claim 8. Regarding claim 4, the claim limitations recite a method having limitations similar to those of claim 11 and is therefore rejected on the same basis, as outlined below. Regarding claim 11, Liu, Lull, and Balakrishnan teach the system of claim 8. Liu further discloses: wherein the safety actions further [[comprise]]perform: alerting an occupant of the vehicle. (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0023]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines the sensor that cannot detect the object is impaired and, in response to the impaired sensor, prompts the driver to take over manual control of the vehicle, i.e. alerts the driver of the need to take over). It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull and Balakrishnan for at least the same reasons outlined in claim 8. Regarding claim 5, the claim limitations recite a method having limitations similar to those of claim 12 and is therefore rejected on the same basis, as outlined below. Regarding claim 12, Liu, Lull, and Balakrishnan teach the system of claim 8. Liu further discloses: wherein the safety actions further [[comprise]]perform: presenting an occupant of the vehicle with an available control of the vehicle. (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0023]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines the sensor that cannot detect the object is impaired and in response to the impaired sensor, prompts the driver to take over manual control of the vehicle, i.e. allows the driver to take over manual control). 
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull and Balakrishnan for at least the same reasons outlined in claim 8. Regarding claim 13, Liu, Lull, and Balakrishnan teach the system of claim 8. Liu further discloses: wherein the safety actions further [[comprise]]perform: receiving a command from an occupant of the vehicle in response to presenting the occupant of the vehicle with control of the vehicle. (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0023]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines the sensor that cannot detect the object is impaired and in response to the impaired sensor, prompts the driver to take over manual control of the vehicle, i.e. alerts the driver of the need to take over and thereby receives manual commands for the vehicle). It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull and Balakrishnan for at least the same reasons outlined in claim 8. Regarding claim 7, the claim limitations recite a method having limitations similar to those of claim 14 and is therefore rejected on the same basis, as outlined below. Regarding claim 14, Liu, Lull, and Balakrishnan teach the system of claim 8. Liu further discloses: wherein the safety actions further [[comprise]]perform: switching from autonomous mode to manual control of the vehicle. (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0023]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines the sensor that cannot detect the object is impaired and in response to the impaired sensor, prompts the driver to take over manual control of the vehicle). 
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull and Balakrishnan for at least the same reasons outlined in claim 8. Claim 6 is rejected under 35 U.S.C. 103 as being obvious over Liu et al. (US 20210112094 A1), Lull et al. (US 20200233060 A1), and Balakrishnan et al. (US 20230406356 A1) as applied to claim 1, above, and in further view of Kim et al. (US 20240124031 A1), henceforth known as Kim. Liu was first cited in IDS filed 4/11/2024. Lull was first cited in a previous office action. Regarding claim 6, Liu, Lull, and Balakrishnan teach the computer-implemented method of claim 1. Liu further discloses: wherein the safety actions further comprise receiving a command from an occupant of the vehicle in response to presenting the occupant of the vehicle with control of the vehicle; and (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0023]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines the sensor that cannot detect the object is impaired and in response to the impaired sensor, prompts the driver to take over manual control of the vehicle, i.e. alerts the driver of the need to take over and thereby receives manual commands for the vehicle). It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull and Balakrishnan for at least the same reasons outlined in claim 8. The combination of Liu, Lull, and Balakrishnan is silent on the following limitations, bolded for emphasis. However, in the same field of endeavor, Kim teaches: wherein at least one of: the command is received from a device of the occupant of the vehicle, the device being communicatively connected to the vehicle; or the command is a verbal command from the occupant of the vehicle. (Kim, FIG. 1; FIG. 
2; ¶[0038]: smartphone connected to autonomous driving controller; ¶[0044]: smartphone or audio input device; Where the driver requests transition to the manual driving mode using the smartphone or audio input device). It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu, Lull, and Balakrishnan with the features taught by Kim because “…when there is a request of a driver for transition to manual driving mode, the vehicle may be set to a safe state with no risk of colliding with surrounding vehicles and then the driver may be provided with a warning signal notifying that safe transition to the manual driving mode is possible, providing the driver with stability and convenience in the transition to the manual driving mode” (Kim, ¶[0029]). That is, the use of the smartphone or audio input device provides driver convenience. Claims 15-20 are rejected under 35 U.S.C. 103 as being obvious over Liu et al. (US 20210112094 A1) in view of Lull et al. (US 20200233060 A1), Monteuuis et al. (US 20250095373 A1), and Balakrishnan et al. (US 20230406356 A1), henceforth known as Liu, Lull, Monteuuis, and Balakrishnan, respectively. Liu was first cited in IDS filed 4/11/2024. Lull was first cited in a previous office action. Monteuuis was first cited in a previous office action as prior art not relied upon. Regarding claim 15, Liu discloses: A computer program product comprising a computer readable storage medium having program instructions embodied therewith, the program instructions executable by one or more processors to cause the one or more processors to perform operations comprising: (Liu, FIG. 1; FIG. 2; FIG. 3A; ¶[0022]-¶[0024]: autonomous vehicle; ¶[0035]-¶[0036]: autonomous vehicle system modules implemented by hardware, software; ¶[0088]-¶[0089]: non-transitory computer readable medium) determining that an object… is no longer detected by a sensor of a vehicle; (Liu, FIG. 8; FIG. 
9; ¶[0021]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the autonomous driving vehicle, ADV, determines that a static object or dynamic object is no longer detected by at least one sensor on the ADV) …in response to the object… no longer being detected by the sensor, determining that another sensor of the vehicle detects the object; and (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV cross-checks static objects and dynamic objects across multiple sensors) in response to the object… no longer being detected by the sensor, performing safety actions based on the another sensor detecting the object... (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0023]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines the sensor that cannot detect the object is impaired and in response to the impaired sensor, performs at least one safety action of a degraded operation with reduced speed, prompting the driver to take over manual control of the vehicle, or stopping at the nearest safe spot). Liu is silent on the following limitations, bolded for emphasis. However, in the same field of endeavor, Lull teaches: determining that an object previously detected is no longer detected by a sensor of a vehicle … the object previously detected no longer being detected by the sensor (Lull, FIG. 1; FIG. 2; FIG. 4; FIG. 5; ¶[0008]-¶[0009]; ¶[0020]; ¶[0023]; ¶[0025]; ¶[0031]; ¶[0034]-¶[0035]; Where correction module 56, implemented by one or more processors onboard an autonomous vehicle, determines a sensor anomaly when an object previously detected by a vehicle sensor disappears). It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull because “…information from sensors regarding the environment in which the computer-controlled device, such as an autonomous vehicle, can be modified or changed for nefarious reasons. 
This nefarious modification or changing of information is sometimes referred to as “hacking.”” (Lull, ¶[0004]) and “…the control system 40 will be provided with a corrected signal to not utilize signals that contain anomalies that may cause an inappropriate movement of the vehicle 20. This inappropriate movement of the vehicle could result in a safety issue to the occupants of the vehicle 20 or others near the vehicle 20” (Lull, ¶[0034]). That is, the ability to identify anomalous data taught by Lull allows better protection against hacking of autonomous vehicles. Liu and Lull are silent on the following limitations, bolded for emphasis. However, in the same field of endeavor, Monteuuis teaches: determining a predicted trajectory for the object previously detected by the sensor; (Monteuuis, FIG. 2: autonomous vehicle with sensors; FIG. 5: temporal consistencies; ¶[0033], ¶[0053]: predicts future trajectories of objects; Where the autonomous vehicle system predicts future trajectories of detected (previously detected) objects) in response to the object previously detected by the sensor failing to have the predicted trajectory and in response to the object previously detected no longer being detected by the sensor… and in response to the object previously detected no longer being detected by the sensor, performing safety actions… (Monteuuis, FIG. 2; FIG. 5; ¶[0033], ¶[0053]: predicted future object trajectories; ¶[0095]: objects suddenly move, change, or disappear; ¶[0098]: performs mitigation actions based on conclusion of vision attack; Where the autonomous vehicle, in response to an object with a predicted future trajectory suddenly moving, changing, or disappearing, performs mitigation actions). 
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu and Lull with the features taught by Monteuuis because “Various embodiments may improve the operational safety of autonomous and semi-autonomous apparatuses (e.g., vehicles) by providing effective methods and systems for detecting malicious attacks on camera systems, and taking mitigating actions such as to reduce risks to the vehicle, output an indication, and/or report attacks to appropriate authorities” (Monteuuis, ¶[0021]). The combination of Liu, Lull, and Monteuuis is silent on the following limitations, bolded for emphasis. However, in the same field of endeavor, Balakrishnan teaches: wherein the safety actions comprise performing a livestream, turning on emergency lights, honking a horn, or playing an audible warning external to the vehicle. (Balakrishnan, FIG. 3A; ¶[0050]; Where the autonomous vehicle performs turning on hazard lights as a safety action when the vehicle’s sensors are degraded). It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu, Lull, and Monteuuis with the features taught by Balakrishnan because “…This application describes techniques to monitor for, and take fail-safe actions in response to, reduced visibility of image sensors during autonomous or semi-autonomous driving of an autonomous vehicle (collectively referred to herein as autonomous driving)” (Balakrishnan, ¶[0016]), increasing autonomous vehicle operational safety. Regarding claim 16, Liu, Lull, Monteuuis, and Balakrishnan teach the computer program product of claim 15. Liu further discloses: wherein determining that the another sensor of the vehicle detects the object comprises checking whether any sensors of the vehicle detect the object and determining that the object is detected by the another sensor. (Liu, FIG. 8; FIG. 
9; ¶[0021]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines another sensor detects the static object or dynamic object by cross-checking multiple sensors across the ADV to see whether any of them detect the static or dynamic object and determines the impaired sensor cannot detect the object when the other sensor(s) detect the object). It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull, Monteuuis, and Balakrishnan for at least the same reasons outlined in claim 15. Regarding claim 17, Liu, Lull, Monteuuis, and Balakrishnan teach the computer program product of claim 15. Liu further discloses: wherein the safety actions further [[comprise]]perform: causing the vehicle to stop in order to avoid a location of the object previously detected by the sensor. (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines the sensor that cannot detect the object is impaired and stops at the nearest safe spot; the impaired sensor data is excluded therefore the object that isn’t detected by the impaired sensor is avoided when the vehicle stops at the nearest safe spot). It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull, Monteuuis, and Balakrishnan for at least the same reasons outlined in claim 15. Regarding claim 18, Liu, Lull, Monteuuis, and Balakrishnan teach the computer program product of claim 15. Liu further discloses: wherein the safety actions further [[comprise]]perform: alerting an occupant of the vehicle. (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0023]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines the sensor that cannot detect the object is impaired and, in response to the impaired sensor, prompts the driver to take over manual control of the vehicle, i.e. 
alerts the driver of the need to take over). It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull, Monteuuis, and Balakrishnan for at least the same reasons outlined in claim 15. Regarding claim 19, Liu, Lull, Monteuuis, and Balakrishnan teach the computer program product of claim 15. Liu further discloses: wherein the safety actions further [[comprise]]perform: presenting an occupant of the vehicle with an available control of the vehicle. (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0023]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines the sensor that cannot detect the object is impaired and in response to the impaired sensor, prompts the driver to take over manual control of the vehicle, i.e. allows the driver to take over manual control). It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull, Monteuuis, and Balakrishnan for at least the same reasons outlined in claim 15. Regarding claim 20, Liu, Lull, Monteuuis, and Balakrishnan teach the computer program product of claim 15. Liu further discloses: wherein the safety actions further [[comprise]]perform: receiving a command from an occupant of the vehicle in response to presenting the occupant of the vehicle with control of the vehicle. (Liu, FIG. 8; FIG. 9; ¶[0021]; ¶[0023]; ¶[0056]-¶[0057]; ¶[0072]-¶[0082]; Where the ADV determines the sensor that cannot detect the object is impaired and in response to the impaired sensor, prompts the driver to take over manual control of the vehicle, i.e. alerts the driver of the need to take over and thereby receives manual commands for the vehicle). 
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Liu with the features taught by Lull, Monteuuis, and Balakrishnan for at least the same reasons outlined in claim 15.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Harper et al. (US 20200156538 A1) discloses that a vehicle computing system may implement techniques to dynamically adjust a volume and/or frequency of a sound emitted from a vehicle to warn an object (e.g., dynamic object) of a potential conflict with the vehicle. The techniques may include determining a baseline noise level and/or frequencies proximate to the object. The baseline noise level and/or frequencies may be determined based on an identification of one or more noise generating objects in the environment. The vehicle computing system may determine the volume and/or a frequency of the sound based in part on the baseline noise level and/or frequencies, an urgency of the warning, a probability of conflict between the vehicle and the object, a speed of the object, etc. Oba (US 20210155268 A1) discloses a configuration that outputs, to a wearable terminal, section information of an automatic driving section and a manual driving section, together with display data by which the time to reach each section is confirmable. A mobile device acquires the section information of an automatic driving section and a manual driving section on a traveling route, and estimates the time to reach each section and transmits the estimated time to an information terminal. The information terminal receives the transmission data from the mobile device, and outputs the section information of the automatic driving section and the manual driving section and the display data by which a time to reach each section is confirmable. 
Moreover, the mobile device determines notification timing of a manual driving recovery request notification on the basis of an arousal level, a position, or the like of a driver and transmits the determined notification timing to the information terminal, and the information terminal executes the manual driving recovery request notification at appropriate notification timing.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Tawri M McAndrews whose telephone number is (571)272-3715. The examiner can normally be reached M-W (0800-1000). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, James Lee can be reached at (571)270-5965. 
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /T.M.M./Examiner, Art Unit 3668

Prosecution Timeline

Apr 11, 2024
Application Filed
Aug 22, 2025
Non-Final Rejection — §101, §103, §112
Oct 15, 2025
Interview Requested
Oct 22, 2025
Applicant Interview (Telephonic)
Oct 22, 2025
Examiner Interview Summary
Oct 28, 2025
Response Filed
Feb 06, 2026
Final Rejection — §101, §103, §112
Apr 02, 2026
Interview Requested
Apr 07, 2026
Examiner Interview Summary
Apr 07, 2026
Applicant Interview (Telephonic)
Apr 10, 2026
Response after Non-Final Action
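The docket intervals behind the pendency projections can be computed directly from the dates above. A minimal sketch using Python's standard datetime module (dates taken from the timeline; the labels in comments are from this page):

```python
from datetime import date

# Key docket dates from the prosecution timeline above.
filed = date(2024, 4, 11)      # Application Filed
non_final = date(2025, 8, 22)  # Non-Final Rejection
final = date(2026, 2, 6)       # Final Rejection

print((non_final - filed).days, "days from filing to first Office Action")  # → 498
print((final - non_final).days, "days between OA rounds")                   # → 168
```

At roughly a five-to-six-month turn per round after the first action, a 3-4 round prosecution is broadly consistent with the 3y 0m median time to grant projected below.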

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597299: SYSTEM AND METHOD FOR REPOSITIONING VEHICLES IN A GEOGRAPHIC AREA BASED ON UTILIZATION METRIC (granted Apr 07, 2026; 2y 5m to grant)
Patent 12594969: VEHICLE CONTROLLER, METHOD, AND PROGRAM FOR STEERING REACTION DURING MANUAL DRIVING FOR RETURNING TO A PRESET ROUTE (granted Apr 07, 2026; 2y 5m to grant)
Patent 12572809: Generating Labeled Training Instances for Autonomous Vehicles Using Temporally Correlated Timestamps (granted Mar 10, 2026; 2y 5m to grant)
Patent 12573091: SYSTEM AND METHOD OF CALIBRATING AN OPTICAL SENSOR MOUNTED ON BOARD OF A VEHICLE USING A GRADUATED MOUNTING BAR (granted Mar 10, 2026; 2y 5m to grant)
Patent 12540455: WORKING MACHINE CONTROL METHOD USING TARGET POSITION CURVE AND REWARD MODEL, WORKING MACHINE CONTROL DEVICE AND WORKING MACHINE (granted Feb 03, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 67%
With Interview: 93% (+26.1%)
Median Time to Grant: 3y 0m
PTA Risk: Moderate
Based on 103 resolved cases by this examiner. Grant probability derived from career allow rate.
