Prosecution Insights
Last updated: April 19, 2026
Application No. 18/218,760

APPARATUS FOR DRIVER ASSISTANCE AND METHOD OF CONTROLLING THE SAME

Non-Final OA — §103, §112
Filed
Jul 06, 2023
Examiner
SARWAR, BABAR
Art Unit
3667
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
HL Klemove Corp.
OA Round
3 (Non-Final)
Grant Probability: 86% — Favorable
Expected OA Rounds: 3-4
Time to Grant: 2y 7m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 86% — above average (893 granted / 1043 resolved; +33.6% vs TC avg)
Interview Lift: +20.0% — strong, measured across resolved cases with an interview
Typical Timeline: 2y 7m average prosecution; 27 applications currently pending
Career History: 1070 total applications across all art units
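
For context, the headline numbers in this panel are simple ratios over the examiner's resolved cases. A minimal sketch of the arithmetic (variable names are illustrative, not from any real data feed):

    # Career allow rate: granted / resolved, displayed rounded to the nearest percent.
    granted, resolved = 893, 1043
    print(f"Career allow rate: {granted / resolved:.1%}")   # 85.6% -> shown as 86%

    # Interview lift: allowance rate with an interview minus the rate without one.
    # The two input rates below are hypothetical; only the +20.0% delta is from the page.
    rate_with, rate_without = 0.95, 0.75
    print(f"Interview lift: {rate_with - rate_without:+.1%}")   # +20.0%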

Statute-Specific Performance

§101: 10.8% (-29.2% vs TC avg)
§103: 40.3% (+0.3% vs TC avg)
§102: 27.1% (-12.9% vs TC avg)
§112: 12.1% (-27.9% vs TC avg)
Tech Center averages are estimates • Based on career data from 1043 resolved cases
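The "vs TC avg" deltas are just the examiner's statute-specific rate minus the Tech Center baseline; working backwards from the four deltas above, that baseline comes out to 40.0% in every case. A small sketch of the arithmetic (dictionary layout assumed):

    # Per-statute rate minus the Tech Center average estimate, reconstructed as 40.0%
    # from the deltas shown above (e.g., 10.8% - 40.0% = -29.2%).
    examiner_rate = {"§101": 0.108, "§103": 0.403, "§102": 0.271, "§112": 0.121}
    tc_average = 0.400
    for statute, rate in examiner_rate.items():
        print(f"{statute}: {rate:.1%} ({rate - tc_average:+.1%} vs TC avg)")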

Office Action

§103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-20 are presented for examination. Claims 1-8, 12-17 are rejected. Claims 9, 18 are canceled. Claims 10-11, 19-20 are objected to.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/29/2025 has been entered.

Response to Arguments

Applicant's arguments filed 11/04/2025 have been fully considered but they are not persuasive. The applicants argued that the prior art on record, i.e., Dantrey in view of Moosaei, does not teach or suggest “and control the autonomous driving along a traveling route changed based on at least one of output data of the camera module or output data of the radar module when it is determined that the traveling route of the vehicle needs to be changed, wherein the processor is configured to acquire position information of a lane to be changed based on the acquired road information, the position information of the traveling lane of the vehicle, the position information of the traveling lane of the emergency vehicle, and obstacle information acquired by the radar module when it is determined that the traveling lane of the vehicle needs to be changed, and control the steering and the traveling speed to move to the acquired changed lane”. The examiner would like to steer the applicants’ attention to the following fact that “One cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., Inc., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).” In this case, Dantrey teaches “…when approaching a four-way intersection, an estimated location and distance may allow the vehicle 500 to determine that the emergency response vehicle(s) is on the road entering the intersection from the left, and the travel direction 114—described in more detail herein—may be used to determine the direction of travel of the emergency response vehicle(s) on the road. As such, where the emergency response vehicle(s) is traveling away from the intersection, the vehicle 500 may determine to continue through the intersection without accounting for the emergency response vehicle(s), while if the emergency response vehicle is traveling toward the intersection, the vehicle 500 may determine to pull to the side of the road and stop until the emergency response vehicle(s) has cleared the intersection (in an example where pulling to the side of the road and stopping is the local rule or convention). As a result, in some embodiments, knowledge of the road layout (e.g., determined using a GNSS map, a high definition (HD) map, vehicle perception, etc.) may additionally be used to determine the location 112 and/or travel direction 114… a perception layer of the drive stack 124 may use the process 100 to identify and locate emergency response vehicles for updating a world model (e.g., using a world model manager) to localize the emergency response vehicle to the world model.
A planning layer of the drive stack 124 may use the location 112, travel direction 114, and/or alert type to determine a route or path plan that accounts for the emergency response vehicle—such as to slow down, pull over, come to a stop, and/or perform another operation. A control layer of the drive stack 124 may then use a route or path plan to control the vehicle 500 according to the path…”, as disclosed in ¶ [0025]-¶ [0030], ¶ [0037], ¶ [0040]-¶ [0050], and exhibited in Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594. On the other hand, Moosaei teaches “…determining a yield strategy for the vehicle to yield to the emergency vehicle based on the additional sensor data…determine a yield strategy for vehicle 201 to yield to emergency vehicle 222 based on sensor data 237. Vehicle control systems 254 can use sensor data 237 to determine if other vehicles are in adjacent lanes (e.g., lane 261), speed and position of other vehicles, paths of other vehicles, other obstacles…A yield strategy can include one or more of: changing lanes (e.g., left or right), slowing down, and stopping…determine a yield strategy to pull into shoulder 263 and stop vehicle 201 until emergency vehicle 222 passes…If vehicle 401 is in the same lane as an emergency vehicle (YES at 412), vehicle 401 can determine if there is an empty lane to the right of vehicle 401 (414). If there is an empty lane to the right (YES at 414), vehicle 401 can pull into the right lane (415) and stop (416) (or pull into the right lane and slow down). If there is not an empty lane to the right of vehicle 401 (NO at 414) (e.g., other traffic is in the lane to the right), vehicle 401 can determine if there is an empty lane to the left of vehicle 401 (417). If there is an empty lane to the left (YES at 417), vehicle 401 can pull into the left lane (418) and stop (419) (or pull into the left lane and slow down)… If there is not an empty lane to the left (NO at 417), vehicle 401 can again determine if vehicle 401 is in the same lane as an emergency vehicle (412). As the emergency vehicle and other vehicles in the highway roadway environment travel, vehicle positions and lane availability can change. For example, the emergency vehicle can change lanes (or pull into a median or onto a shoulder) and/or lanes to the right of vehicle 401 and/or to the left of vehicle 401 can free up. Vehicle 401 can continual re-check for appropriate ways to automatically yield to the emergency vehicle…”, as disclosed in ¶ [0014]-¶ [0015], ¶ [0025], ¶ [0031], ¶ [0040]-¶ [0056], and exhibited in Fig. 2 elements 200-254, Fig. 3 steps 300-305, Fig. 4 elements 400-429, Figs. 5A-B elements 502-532. Therefore, the previous rejection is maintained with some elucidations to clarify the examiner’s position.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-8, 12-17 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for pre-AIA applications, the applicant) regards as the invention. Claims 1-8, 12-17 are rejected for lack of antecedent basis, as follows:

1. An apparatus for driver assistance provided in a vehicle, the apparatus comprising: a camera module; a radar module; and a processor configured to: determine whether a traveling route of the vehicle needs to be changed based on received traveling route information of an emergency vehicle and traveling route information of the vehicle upon receiving the traveling route information of the emergency vehicle through a communicator of the vehicle during autonomous driving; determine whether a same route, where traveling routes of the emergency vehicle is overlapped with traveling routes of the vehicle, is present based on the received traveling route information of the emergency vehicle and the traveling route information of the vehicle and control the autonomous driving along a traveling route changed based on at least one of output data of the camera module or output data of the radar module when it is determined that the traveling route of the vehicle needs to be changed, wherein the processor is configured to acquire position information of a lane to be changed based on an acquired road information, the position information of a traveling lane of the vehicle, the position information of traveling lane of the emergency vehicle, and obstacle information acquired by the radar module when it is determined that the traveling lane of the vehicle needs to be changed, and control steering and traveling speed to move to an acquired changed lane.

Claim 12 is rejected for the same reasons as well. Appropriate correction is required.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-8, 12-17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Dantrey in view of Moosaei.
Consider claims 1, 12: Dantrey teaches an apparatus for driver assistance provided in a vehicle (See Dantrey, e.g., “…audio alerts of emergency response vehicles may be detected and classified using audio captured by microphones of an autonomous or semi-autonomous machine in order to identify travel directions, locations, and/or types of emergency response vehicles in the environment…the audio signals may be used to generate representations of a frequency spectrum that may be processed using a deep neural network (DNN) that outputs probabilities of alert types being represented by the audio data. The locations, direction of travel, and/or siren type may allow an ego-vehicle or ego-machine to identify an emergency response vehicle and to make planning and/or control decisions in response…” of Abstract, ¶ [0003]-¶ [0005], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594), the apparatus comprising: a camera module (Figs. 5B-C element 560); a radar module (Figs. 5B-C elements 568-598); and a processor (Fig. 5C element 510) configured to: determine whether a traveling route of the vehicle needs to be changed based on received traveling route information of an emergency vehicle (See Dantrey, e.g., “…the location determiner 108 may use one or more (e.g., passive) acoustic location algorithms—e.g., acoustic triangulation—to determine a location 112 and/or travel direction 114 of an emergency response vehicle(s)…if the emergency response vehicle is traveling toward the intersection, the vehicle 500 may determine to pull to the side of the road and stop until the emergency response vehicle(s) has cleared the intersection (in an example where pulling to the side of the road and stopping is the local rule or convention)…the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594) and traveling route information of the vehicle upon receiving the traveling route information of the emergency vehicle through a communicator (Fig. 1A elements 100-124) of the vehicle during autonomous driving (See Dantrey, e.g., “…performing one or more operations based at least in part on the probabilities. For example, the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 
5A-D elements 500-594); determine whether a same route, where traveling routes of the emergency vehicle is overlapped with traveling routes of the vehicle, is present based on the received traveling route information of the emergency vehicle and the traveling route information of the vehicle (See Dantrey, e.g., “…while if the emergency response vehicle is traveling toward the intersection, the vehicle 500 may determine to pull to the side of the road and stop until the emergency response vehicle(s) has cleared the intersection (in an example where pulling to the side of the road and stopping is the local rule or convention)…knowledge of the road layout (e.g., determined using a GNSS map, a high definition (HD) map, vehicle perception, etc.) may additionally be used to determine the location 112 and/or travel direction 114…” of ¶ [0025]-¶ [0030], ¶ [0037], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594). Dantrey further teaches performing one or more operations based at least in part on the probabilities. For example, the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…”, as disclosed in ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and exhibited in Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594. However, Dantrey does not explicitly teach and control the autonomous driving along a traveling route changed based on at least one of output data of the camera module or output data of the radar module when it is determined that the traveling route of the vehicle needs to be changed, wherein the processor is configured to acquire position information of a lane to be changed based on the acquired road information, the position information of the traveling lane of the vehicle, the position information of the traveling lane of the emergency vehicle, and obstacle information acquired by the radar module when it is determined that the traveling lane of the vehicle needs to be changed, and control the steering and the traveling speed to move to the acquired changed lane. In an analogous field of endeavor, Moosaei teaches and control the autonomous driving (e.g., “…Vehicle 501 and emergency vehicle 502 are traveling in lane 512. Vehicle 501 can detect the approach of emergency vehicle 502. Emergency vehicle 502 can also transmit data indicating an intent to travel path 503 to vehicle 501. Vehicle 501 can determine that vehicle 501 and emergency vehicle 502 are both in lane 512. Vehicle 501 can determine that lane 511 (a lane to the right) is occupied by vehicle 504. As such, vehicle 501 formulates a strategy to yield 506 to emergency vehicle 502 by moving into lane 513 and possibly slowing down or even stopping…” of Figs. 5A-B elements 502-532) along a traveling route changed based on at least one of output data of the camera module or output data of the radar module (See Moosaei, e.g., “…The plurality of cameras are used to detect spinning lights and also to detect if an emergency vehicle is in the same lane as the vehicle…sensor data from microphone(s) 203 and camera(s) 204 can be fused into sensor data 236. Microphone(s) 203 can detect sounds of siren 219. 
Camera(s) 204 can detect lights 223…External sensors 202 include one or more of: microphones 203, camera(s) 204, LIDAR sensor(s) 206, and ultrasonic sensor(s) 207…radar sensors, acoustic sensors, and electromagnetic sensors…” of ¶ [0014], ¶ [0025], ¶ [0031], ¶ [0040]-¶ [0056], and Fig. 2 elements 200-254, Fig. 3 steps 300-305, Fig. 4 elements 400-429, Figs. 5A-B elements 502-532) when it is determined that the traveling route of the vehicle needs to be changed (See Moosaei, e.g., “…an autonomous vehicle automatically yields to one or more detected emergency vehicles. Based on map data, the autonomous vehicle can determine a roadway configuration...the autonomous vehicle can use one or more cameras and one or more microphones to automatically (and safely) yield to the emergency vehicle(s). Automatically yielding can include one or more of: slowing down, changing lanes, stopping, etc. depending on the roadway configuration. The autonomous vehicle can use LIDAR sensors, ultrasound sensors, radar sensors, and cameras for planning a path that includes one or more of: safely changing lanes, slowing down, or stopping…” of ¶ [0014]-¶ [0015], ¶ [0025], ¶ [0031], ¶ [0040]-¶ [0056], and Fig. 2 elements 200-254, Fig. 3 steps 300-305, Fig. 4 elements 400-429, Figs. 5A-B elements 502-532), wherein the processor is configured to acquire position information of a lane to be changed based on the acquired road information (See Moosaei, e.g., “…determining a yield strategy for the vehicle to yield to the emergency vehicle based on the additional sensor data…determine a yield strategy for vehicle 201 to yield to emergency vehicle 222 based on sensor data 237. Vehicle control systems 254 can use sensor data 237 to determine if other vehicles are in adjacent lanes (e.g., lane 261), speed and position of other vehicles, paths of other vehicles, other obstacles…A yield strategy can include one or more of: changing lanes (e.g., left or right), slowing down, and stopping…determine a yield strategy to pull into shoulder 263 and stop vehicle 201 until emergency vehicle 222 passes…” of ¶ [0014]-¶ [0015], ¶ [0025], ¶ [0031], ¶ [0040]-¶ [0056], and Fig. 2 elements 200-254, Fig. 3 steps 300-305, Fig. 4 elements 400-429, Figs. 5A-B elements 502-532), the position information of the traveling lane of the vehicle, the position information of the traveling lane of the emergency vehicle (See Moosaei, e.g., “…If vehicle 401 is in the same lane as an emergency vehicle (YES at 412), vehicle 401 can determine if there is an empty lane to the right of vehicle 401 (414). If there is an empty lane to the right (YES at 414), vehicle 401 can pull into the right lane (415) and stop (416) (or pull into the right lane and slow down). If there is not an empty lane to the right of vehicle 401 (NO at 414) (e.g., other traffic is in the lane to the right), vehicle 401 can determine if there is an empty lane to the left of vehicle 401 (417). If there is an empty lane to the left (YES at 417), vehicle 401 can pull into the left lane (418) and stop (419) (or pull into the left lane and slow down)…” of ¶ [0014]-¶ [0015], ¶ [0025], ¶ [0031], ¶ [0040]-¶ [0056], and Fig. 2 elements 200-254, Fig. 3 steps 300-305, Fig. 4 elements 400-429, Figs. 
5A-B elements 502-532), and obstacle information acquired by the radar module when it is determined that the traveling lane of the vehicle needs to be changed (See Moosaei, e.g., “…If there is not an empty lane to the left (NO at 417), vehicle 401 can again determine if vehicle 401 is in the same lane as an emergency vehicle (412). As the emergency vehicle and other vehicles in the highway roadway environment travel, vehicle positions and lane availability can change. For example, the emergency vehicle can change lanes (or pull into a median or onto a shoulder) and/or lanes to the right of vehicle 401 and/or to the left of vehicle 401 can free up. Vehicle 401 can continual re-check for appropriate ways to automatically yield to the emergency vehicle…” of ¶ [0014]-¶ [0015], ¶ [0025], ¶ [0031], ¶ [0040]-¶ [0056], and Fig. 2 elements 200-254, Fig. 3 steps 300-305, Fig. 4 elements 400-429, Figs. 5A-B elements 502-532), and control the steering and the traveling speed to move to the acquired changed lane (See Moosaei, e.g., “…the emergency vehicle can change lanes (or pull into a median or onto a shoulder) and/or lanes to the right of vehicle 401 and/or to the left of vehicle 401 can free up. Vehicle 401 can continual re-check for an appropriate strategy to automatically yield to the emergency vehicle…” of ¶ [0014]-¶ [0015], ¶ [0025], ¶ [0031], ¶ [0040]-¶ [0056], and Fig. 2 elements 200-254, Fig. 3 steps 300-305, Fig. 4 elements 400-429, Figs. 5A-B elements 502-532). It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine “…audio alerts of emergency response vehicles may be detected and classified using audio captured by microphones of an autonomous or semi-autonomous machine in order to identify travel directions, locations, and/or types of emergency response vehicles in the environment…the audio signals may be used to generate representations of a frequency spectrum that may be processed using a deep neural network (DNN) that outputs probabilities of alert types being represented by the audio data. The locations, direction of travel, and/or siren type may allow an ego-vehicle or ego-machine to identify an emergency response vehicle and to make planning and/or control decisions in response…”, as disclosed in Dantrey with “and control the autonomous driving along a traveling route changed based on at least one of output data of the camera module or output data of the radar module when it is determined that the traveling route of the vehicle needs to be changed, wherein the processor is configured to acquire position information of a lane to be changed based on the acquired road information, the position information of the traveling lane of the vehicle, the position information of the traveling lane of the emergency vehicle, and obstacle information acquired by the radar module when it is determined that the traveling lane of the vehicle needs to be changed, and control the steering and the traveling speed to move to the acquired changed lane”, as taught in Moosaei with a reasonable expectation of success to yield an intelligent vehicle driving system for detecting emergency vehicles and yielding to emergency vehicles in an appropriate fashion based on the surrounding of vehicles. Consider claims 2, 13: The combination of Dantrey, Moosaei teaches everything claimed as implemented above in the rejection of claims 1, 12. 
In addition, Dantrey teaches wherein the processor is configured to: determine whether an avoidance route is present based on current position information, destination information of the vehicle and the same route when it is determined that the same route is present (See Dantrey, e.g., “…knowledge of the road layout (e.g., determined using a GNSS map, a high definition (HD) map, vehicle perception, etc.) may additionally be used to determine the location 112 and/or travel direction 114…determined by the location determiner 108 by tracking locations 112 of the emergency response vehicle(s) over time…a change (e.g., increase or decrease) in the sound pressure, particle velocity, sound or audio frequencies, and/or other physical quantities of the sound field as represented by the audio data 104 may indicate that the emergency response vehicle(s) is approaching (e.g., increase in sound pressure) or moving away (e.g., decrease in sound pressure)…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594). Moosaei also teaches, e.g., “…Vehicle 501 and emergency vehicle 502 are traveling in lane 512. Vehicle 501 can detect the approach of emergency vehicle 502. Emergency vehicle 502 can also transmit data indicating an intent to travel path 503 to vehicle 501. Vehicle 501 can determine that vehicle 501 and emergency vehicle 502 are both in lane 512. Vehicle 501 can determine that lane 511 (a lane to the right) is occupied by vehicle 504. As such, vehicle 501 formulates a strategy to yield 506 to emergency vehicle 502 by moving into lane 513 and possibly slowing down or even stopping…”, as exhibited in Figs. 5A-B elements 502-532. The motivation is to expedite the emergency services, thereby, preserving precious time, and lives. Consider claim 3: The combination of Dantrey, Moosaei teaches everything claimed as implemented above in the rejection of claim 1. In addition, Dantrey teaches wherein the processor is configured to: determine whether the emergency vehicle is adjacent to the vehicle based on received position information of the emergency vehicle (See Dantrey, e.g., “…the location determiner 108 may use one or more (e.g., passive) acoustic location algorithms—e.g., acoustic triangulation—to determine a location 112 and/or travel direction 114 of an emergency response vehicle(s)…if the emergency response vehicle is traveling toward the intersection, the vehicle 500 may determine to pull to the side of the road and stop until the emergency response vehicle(s) has cleared the intersection (in an example where pulling to the side of the road and stopping is the local rule or convention)…the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 
5A-D elements 500-594) and position information of the vehicle upon receiving the position information of the emergency vehicle through the communicator (See Dantrey, e.g., “…the location determiner 108 may use one or more (e.g., passive) acoustic location algorithms—e.g., acoustic triangulation—to determine a location 112 and/or travel direction 114 of an emergency response vehicle(s)…if the emergency response vehicle is traveling toward the intersection, the vehicle 500 may determine to pull to the side of the road and stop until the emergency response vehicle(s) has cleared the intersection (in an example where pulling to the side of the road and stopping is the local rule or convention)…the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594), and control the vehicle to change a traveling lane when it is determined that the emergency vehicle is adjacent to the vehicle (See Dantrey, e.g., “…performing one or more operations based at least in part on the probabilities. For example, the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594). Moosaei also teaches, e.g., “…Vehicle 501 and emergency vehicle 502 are traveling in lane 512. Vehicle 501 can detect the approach of emergency vehicle 502. Emergency vehicle 502 can also transmit data indicating an intent to travel path 503 to vehicle 501. Vehicle 501 can determine that vehicle 501 and emergency vehicle 502 are both in lane 512. Vehicle 501 can determine that lane 511 (a lane to the right) is occupied by vehicle 504. As such, vehicle 501 formulates a strategy to yield 506 to emergency vehicle 502 by moving into lane 513 and possibly slowing down or even stopping…”, as exhibited in Figs. 5A-B elements 502-532. The motivation is to expedite the emergency services, thereby, preserving precious time, and lives. Consider claims 4, 14: The combination of Dantrey, Moosaei teaches everything claimed as implemented above in the rejection of claims 3, 12. In addition, Dantrey teaches wherein the processor is configured to: analyze a frequency pattern of a sound collected by a sound collector of the vehicle (See Dantrey, e.g., “…to identify siren types—and thus emergency response vehicle types corresponding thereto—the audio signals may be converted to a frequency domain by extracting Mel frequency coefficients to generate Mel-spectrograms…The alert pattern may undergo a wide set of transformations prior to being captured by the microphones 102, such as Doppler, attenuation, echoes, reverberations, and/or the like. For the DNN 120 to accurately predict the alert types 122, the DNN 120 may be trained to identify the alert types 122 after these transformation have taken place…” of ¶ [0004], ¶ [0025]-¶ [0033], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 
5A-D elements 500-594); and determine that the emergency vehicle is adjacent to the vehicle when it is determined that the collected sound is a sound having a siren sound frequency pattern based on the analysis (See Dantrey, e.g., “…to identify siren types—and thus emergency response vehicle types corresponding thereto—the audio signals may be converted to a frequency domain by extracting Mel frequency coefficients to generate Mel-spectrograms…The alert pattern may undergo a wide set of transformations prior to being captured by the microphones 102, such as Doppler, attenuation, echoes, reverberations, and/or the like. For the DNN 120 to accurately predict the alert types 122, the DNN 120 may be trained to identify the alert types 122 after these transformation have taken place…” of ¶ [0004], ¶ [0025]-¶ [0033], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594). Moosaei also teaches, e.g., “…detecting that the emergency vehicle is approaching the vehicle on a roadway comprises detecting that the emergency vehicle is approaching the vehicle based on the sensor data representing the sound of the siren…”, as exhibited in Figs. 5A-B elements 502-532. The motivation is to expedite the emergency services, thereby, preserving precious time, and lives. Consider claims 5-6, 15-16: The combination of Dantrey, Moosaei teaches everything claimed as implemented above in the rejection of claims 1, 12. In addition, Dantrey teaches wherein the processor is configured to: determine whether the emergency vehicle is adjacent to the vehicle based on at least one of position information of the emergency vehicle received through the communicator or a sound collected by a sound collector of the vehicle (See Dantrey, e.g., “…the location determiner 108 may use one or more (e.g., passive) acoustic location algorithms—e.g., acoustic triangulation—to determine a location 112 and/or travel direction 114 of an emergency response vehicle(s)…if the emergency response vehicle is traveling toward the intersection, the vehicle 500 may determine to pull to the side of the road and stop until the emergency response vehicle(s) has cleared the intersection (in an example where pulling to the side of the road and stopping is the local rule or convention)…the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594); determine whether the emergency vehicle is adjacent to the vehicle based on sensor information acquired by the sensor when it is determined that the emergency vehicle is adjacent to the vehicle (See Dantrey, e.g., “…The location determiner 108 may analyze the audio data 104 from each of a plurality—e.g., three or more—of microphone arrays 202 using acoustic triangulation to determine a distance (e.g., from the vehicle 200) and/or direction (e.g., a range of angles defining a region of the environment that an emergency response vehicle(s) is located) in order to determine the location 112 of the emergency response vehicle(s). 
For example, acoustic triangulation may be used to determine an estimated distance of the emergency response vehicle(s) from the vehicle 200 and an estimated source direction of the alert(s) of the emergency response vehicle(s)…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594); acquire distance information and traveling speed information of the emergency vehicle based on the information acquired by the sensor when it is determined that the emergency vehicle is adjacent to the vehicle (See Dantrey, e.g., “…The location determiner 108 may analyze the audio data 104 from each of a plurality—e.g., three or more—of microphone arrays 202 using acoustic triangulation to determine a distance (e.g., from the vehicle 200) and/or direction (e.g., a range of angles defining a region of the environment that an emergency response vehicle(s) is located) in order to determine the location 112 of the emergency response vehicle(s)…the CNN running on the DLA is trained to identify the relative closing speed of the emergency response vehicle (e.g., by using the Doppler Effect). The CNN may also be trained to identify emergency response vehicles specific to the local area in which the vehicle is operating, as identified by GNSS sensor(s) 558.…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], ¶ [0119], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594); and control at least one of steering or a traveling speed based on the distance information, the traveling speed information of the emergency vehicle and traveling speed information of the vehicle (See Dantrey, e.g., “…performing one or more operations based at least in part on the probabilities. For example, the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594). Dantrey further teaches performing one or more operations based at least in part on the probabilities. For example, the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…”, as disclosed in ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and exhibited in Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594. Moosaei teaches determine whether the emergency vehicle is adjacent to the vehicle based on image/radar information acquired by the camera / radar module when it is determined that the emergency vehicle is adjacent to the vehicle (See Moosaei, e.g., “…The plurality of cameras are used to detect spinning lights and also to detect if an emergency vehicle is in the same lane as the vehicle…sensor data from microphone(s) 203 and camera(s) 204 can be fused into sensor data 236. Microphone(s) 203 can detect sounds of siren 219. Camera(s) 204 can detect lights 223…External sensors 202 include one or more of: microphones 203, camera(s) 204, LIDAR sensor(s) 206, and ultrasonic sensor(s) 207…radar sensors, acoustic sensors, and electromagnetic sensors…” of ¶ [0014], ¶ [0025], ¶ [0031], ¶ [0040]-¶ [0056], and Fig. 2 elements 200-254, Fig. 
3 steps 300-305, Fig. 4 elements 400-429, Figs. 5A-B elements 502-532). The motivation is to detect the emergency vehicles’ routes, and comply with the rules expeditiously, thereby, preserving precious time, and lives. Consider claim 7: The combination of Dantrey, Moosaei teaches everything claimed as implemented above in the rejection of claim 1. In addition, Dantrey teaches determine whether the emergency vehicle is adjacent to the vehicle based on at least one of position information of the emergency vehicle received through the communicator or a sound collected by a sound collector of the vehicle (See Dantrey, e.g., “…the location determiner 108 may use one or more (e.g., passive) acoustic location algorithms—e.g., acoustic triangulation—to determine a location 112 and/or travel direction 114 of an emergency response vehicle(s)…if the emergency response vehicle is traveling toward the intersection, the vehicle 500 may determine to pull to the side of the road and stop until the emergency response vehicle(s) has cleared the intersection (in an example where pulling to the side of the road and stopping is the local rule or convention)…the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594); determine whether an object of the emergency vehicle is present in sensor information acquired by the sensor module when it is determined that the emergency vehicle is adjacent to the vehicle (See Dantrey, e.g., “…the location determiner 108 may use one or more (e.g., passive) acoustic location algorithms—e.g., acoustic triangulation—to determine a location 112 and/or travel direction 114 of an emergency response vehicle(s)…if the emergency response vehicle is traveling toward the intersection, the vehicle 500 may determine to pull to the side of the road and stop until the emergency response vehicle(s) has cleared the intersection (in an example where pulling to the side of the road and stopping is the local rule or convention)…the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594); acquire distance information and traveling speed information of the emergency vehicle based on radar information of the radar module when it is determined that the object of the emergency vehicle is present in the radar information (See Dantrey, e.g., “…The location determiner 108 may analyze the audio data 104 from each of a plurality—e.g., three or more—of microphone arrays 202 using acoustic triangulation to determine a distance (e.g., from the vehicle 200) and/or direction (e.g., a range of angles defining a region of the environment that an emergency response vehicle(s) is located) in order to determine the location 112 of the emergency response vehicle(s)…the CNN running on the DLA is trained to identify the relative closing speed of the emergency response vehicle (e.g., by using the Doppler Effect). 
The CNN may also be trained to identify emergency response vehicles specific to the local area in which the vehicle is operating, as identified by GNSS sensor(s) 558.…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], ¶ [0119], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594); and control at least one of steering or a traveling speed based on the distance information, the traveling speed information of the emergency vehicle and traveling speed information of the vehicle (See Dantrey, e.g., “…performing one or more operations based at least in part on the probabilities. For example, the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594). Dantrey further teaches performing one or more operations based at least in part on the probabilities. For example, the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…”, as disclosed in ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and exhibited in Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594. Moosaei teaches determine whether the emergency vehicle is adjacent to the vehicle based on image/radar information acquired by the camera / radar module when it is determined that the emergency vehicle is adjacent to the vehicle (See Moosaei, e.g., “…The plurality of cameras are used to detect spinning lights and also to detect if an emergency vehicle is in the same lane as the vehicle…sensor data from microphone(s) 203 and camera(s) 204 can be fused into sensor data 236. Microphone(s) 203 can detect sounds of siren 219. Camera(s) 204 can detect lights 223…External sensors 202 include one or more of: microphones 203, camera(s) 204, LIDAR sensor(s) 206, and ultrasonic sensor(s) 207…radar sensors, acoustic sensors, and electromagnetic sensors…” of ¶ [0014], ¶ [0025], ¶ [0031], ¶ [0040]-¶ [0056], and Fig. 2 elements 200-254, Fig. 3 steps 300-305, Fig. 4 elements 400-429, Figs. 5A-B elements 502-532). The motivation is to detect the emergency vehicles’ routes, and comply with the rules expeditiously, thereby, preserving precious time, and lives. Consider claims 8, 17: The combination of Dantrey, Moosaei teaches everything claimed as implemented above in the rejection of claims 7, 12. In addition, Dantrey teaches wherein the processor is configured to: acquire position information of the traveling lane in which the vehicle travels (See Dantrey, e.g., “…when approaching a four-way intersection, an estimated location and distance may allow the vehicle 500 to determine that the emergency response vehicle(s) is on the road entering the intersection from the left, and the travel direction 114…determine the direction of travel of the emergency response vehicle(s) on the road…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 
5A-D elements 500-594) and position information of a traveling lane of the emergency vehicle based on the sensor information acquired by the sensor (See Dantrey, e.g., “…the location determiner 108 may use one or more (e.g., passive) acoustic location algorithms—e.g., acoustic triangulation—to determine a location 112 and/or travel direction 114 of an emergency response vehicle(s)…if the emergency response vehicle is traveling toward the intersection, the vehicle 500 may determine to pull to the side of the road and stop until the emergency response vehicle(s) has cleared the intersection (in an example where pulling to the side of the road and stopping is the local rule or convention)…the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594); acquire road information based on pre-stored map information and the position information of the vehicle (See Dantrey, e.g., “…when approaching a four-way intersection, an estimated location and distance may allow the vehicle 500 to determine that the emergency response vehicle(s) is on the road entering the intersection from the left, and the travel direction 114…determine the direction of travel of the emergency response vehicle(s) on the road…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594); and determine whether the traveling lane of the vehicle needs to be changed based on the acquired road information, position information of the traveling lane of the vehicle, and position information of the traveling lane of the emergency vehicle (See Dantrey, e.g., “…the location determiner 108 may use one or more (e.g., passive) acoustic location algorithms—e.g., acoustic triangulation—to determine a location 112 and/or travel direction 114 of an emergency response vehicle(s)…if the emergency response vehicle is traveling toward the intersection, the vehicle 500 may determine to pull to the side of the road and stop until the emergency response vehicle(s) has cleared the intersection (in an example where pulling to the side of the road and stopping is the local rule or convention)…the type of emergency response vehicle (and/or the alert type 122), the location 112, and/or the travel direction 114 may be used to perform one or more operations—such as to comply with rules or conventions of the locale with respect to emergency response vehicles…” of ¶ [0025]-¶ [0030], ¶ [0040]-¶ [0050], and Fig. 1A elements 100-124, Figs. 3-4 steps 300-B414, Figs. 5A-D elements 500-594). Moosaei teaches determine whether the emergency vehicle is adjacent to the vehicle based on image/radar information acquired by the camera / radar module when it is determined that the emergency vehicle is adjacent to the vehicle (See Moosaei, e.g., “…The plurality of cameras are used to detect spinning lights and also to detect if an emergency vehicle is in the same lane as the vehicle…sensor data from microphone(s) 203 and camera(s) 204 can be fused into sensor data 236. Microphone(s) 203 can detect sounds of siren 219. 
Camera(s) 204 can detect lights 223…External sensors 202 include one or more of: microphones 203, camera(s) 204, LIDAR sensor(s) 206, and ultrasonic sensor(s) 207…radar sensors, acoustic sensors, and electromagnetic sensors…” of ¶ [0014], ¶ [0025], ¶ [0031], ¶ [0040]-¶ [0056], and Fig. 2 elements 200-254, Fig. 3 steps 300-305, Fig. 4 elements 400-429, Figs. 5A-B elements 502-532). The motivation is to detect emergency vehicles’ routes and comply with the rules expeditiously, thereby preserving precious time and lives.

Allowable Subject Matter

Claims 10-11, 19-20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. Further, the prior art on record does not teach or suggest, either in singularity or in combination, the claimed subject matter of claims 10-11, 19-20. Therefore, claims 10-11, 19-20 are objected to.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Buck et al. (US Pub. No.: 2022/0363261 A1) teaches “[a] method detects presence of a multi-tone siren type in an acoustic signal. The multi-tone siren type is associated with one or more siren patterns, where each siren pattern includes a number of time patterns at corresponding frequencies. The method includes processing a number of frequency components of a frequency domain representation of the acoustic signal over time to determine a corresponding plurality of values. That processing includes determining, for each frequency component, a value characterizing a presence of a time pattern associated with at least one siren pattern. The method also includes processing the values according to the siren patterns to determine a detection result indicating whether the multi-tone siren type is present in the acoustic signal.”

Watkins et al. (US Pub. No.: 2022/0122620 A1) teaches “Systems and methods for siren detection in a vehicle are provided. A method includes recording an audio segment, using a first audio recording device coupled to an autonomous vehicle, separating, using a computing device coupled to the autonomous vehicle, the audio segment into one or more audio clips, generating a spectrogram of the one or more audio clips, and inputting each spectrogram into a Convolutional Neural Network (CNN) run on the computing device. The CNN may be pretrained to detect one or more sirens present in spectrographic data. The method further includes determining, using the CNN, whether a siren is present in the audio segment, and if the siren is determined to be present in the audio segment, determining a course of action of the autonomous vehicle.”

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BABAR SARWAR whose telephone number is (571)270-5584. The examiner can normally be reached on Mon-Fri 9:00 AM-5:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Faris S. Almatrahi, can be reached at (313)446-4821. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/BABAR SARWAR/
Primary Examiner, Art Unit 3667
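
The Moosaei yield flow the examiner quotes above (Fig. 4, steps 412-419) is the closest thing in this Office action to pseudocode. A minimal Python sketch of that decision logic, purely illustrative — the LaneState fields and function names are assumed stand-ins for the reference's sensor-fusion checks, not anything from Moosaei's actual disclosure:

    # Illustrative restatement of the quoted yield strategy (Moosaei Fig. 4, 412-419).
    from dataclasses import dataclass

    @dataclass
    class LaneState:
        same_lane_as_ev: bool   # step 412: sharing a lane with the emergency vehicle?
        right_empty: bool       # step 414: is the lane to the right empty?
        left_empty: bool        # step 417: is the lane to the left empty?

    def plan_yield(state: LaneState) -> str:
        if not state.same_lane_as_ev:
            return "continue (no lane change required)"
        if state.right_empty:
            return "pull into the right lane and stop (or slow down)"   # steps 415-416
        if state.left_empty:
            return "pull into the left lane and stop (or slow down)"    # steps 418-419
        # NO at both 414 and 417: lane availability changes as traffic moves,
        # so the vehicle loops back to step 412 and re-checks.
        return "re-check"

    # Example: emergency vehicle behind in our lane, right lane occupied, left lane open.
    print(plan_yield(LaneState(same_lane_as_ev=True, right_empty=False, left_empty=True)))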

Prosecution Timeline

Jul 06, 2023
Application Filed
Mar 21, 2025
Non-Final Rejection — §103, §112
Jun 24, 2025
Response Filed
Sep 03, 2025
Final Rejection — §103, §112
Nov 04, 2025
Response after Non-Final Action
Nov 29, 2025
Request for Continued Examination
Dec 11, 2025
Response after Non-Final Action
Jan 24, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600370
VEHICULAR CONTROL SYSTEM
2y 5m to grant Granted Apr 14, 2026
Patent 12602800
TIRE STATE ESTIMATION METHOD
2y 5m to grant Granted Apr 14, 2026
Patent 12602933
VEHICULAR SENSING SYSTEM WITH OCCLUSION ESTIMATION FOR USE IN CONTROL OF VEHICLE
2y 5m to grant Granted Apr 14, 2026
Patent 12594947
DISPLAY CONTROL DEVICE, DISPLAY CONTROL METHOD, AND NON-TRANSITORY COMPUTER-READABLE MEDIUM
2y 5m to grant Granted Apr 07, 2026
Patent 12586465
METHOD AND APPARATUS FOR ASSISTING RIGHT TURN OF AUTONOMOUS VEHICLE BASED ON UWB COMMUNICATION AND V2X COMMUNICATION AT INTERSECTION
2y 5m to grant Granted Mar 24, 2026
Study what changed to get these applications past this examiner. Based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 86%
With Interview: 99% (+20.0%)
Median Time to Grant: 2y 7m
PTA Risk: High
Based on 1043 resolved cases by this examiner. Grant probability derived from career allow rate.
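
The page does not say how the interview-adjusted figure is derived from the base probability and the lift; one deliberately hypothetical model that reproduces the displayed numbers is a capped additive combination:

    # Hypothetical capped-additive model only; the product's actual formula is not stated.
    base_probability, interview_lift, cap = 0.86, 0.20, 0.99
    with_interview = min(base_probability + interview_lift, cap)   # 1.06 -> capped at 0.99
    print(f"Grant probability with interview: {with_interview:.0%}")   # 99%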
