Prosecution Insights
Last updated: April 19, 2026
Application No. 18/814,708

VEHICULAR DRIVING ASSIST SYSTEM

Non-Final OA (§103)

Filed: Aug 26, 2024
Examiner: KINGSLAND, KYLE J
Art Unit: 3663
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Magna Electronics Inc.
OA Round: 1 (Non-Final)

Grant Probability: 77% (Favorable)
OA Rounds: 1-2
To Grant: 2y 10m
With Interview: 84%

Examiner Intelligence

Career Allow Rate: 77% (164 granted / 212 resolved), above average (+25.4% vs TC avg)
Interview Lift: +6.5% (moderate), comparing resolved cases with vs. without an interview
Typical Timeline: 2y 10m average prosecution; 38 applications currently pending
Career History: 250 total applications across all art units

Statute-Specific Performance

§101: 7.5% (-32.5% vs TC avg)
§103: 45.0% (+5.0% vs TC avg)
§102: 24.5% (-15.5% vs TC avg)
§112: 18.3% (-21.7% vs TC avg)
Tech Center average is an estimate • Based on career data from 212 resolved cases
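One detail the original chart would have shown: the "vs TC avg" deltas above are all consistent with a single Tech Center baseline of roughly 40% for every statute. A quick check (variable names are illustrative):

```python
# Recover the implied Tech Center average from the displayed rates and deltas.
# rate - delta = baseline; all four statutes work out to 40.0.
rates  = {"101": 7.5, "103": 45.0, "102": 24.5, "112": 18.3}
deltas = {"101": -32.5, "103": +5.0, "102": -15.5, "112": -21.7}

tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(tc_avg)  # {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}
```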

Office Action

§103
DETAILED ACTION Notice of Pre-AIA or AIA Status The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA . Status of the Claims This Office Action is in response to the Application filed on August 26, 2024. Claims 1-26 are presently pending and are presented for examination. Information Disclosure Statement The information disclosure statement (IDS) submitted on August 26, 2024 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner. Claim Rejections - 35 USC § 103 In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA ) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention. Claim(s) 1, 2, 4-6, 10, and 13-14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Huang et al. (US 20200247407; hereinafter Huang) in view of Winden (US 20200039447; already of record from IDS). In regards to claim 1, Huang discloses of a vehicular driving assistance system (“A lane trace control system for maintaining a vehicle in its intended lane of travel. 
The system includes a lane trace control module configured to: compare the road information obtained by a lane recognition camera with map data obtained from a map module; determine a confidence level of the lane recognition camera based on a magnitude of any differences identified between the road information obtained from the lane recognition camera and the obtained map data; generate a target wheel angle for keeping the vehicle in its intended lane based entirely on the road information obtained from the lane recognition camera when the confidence level is above a predetermined threshold; and generate the target wheel angle based on a combination of the road information obtained from the lane recognition camera and the obtained map data of the road when the confidence level is below the predetermined threshold.” (Abstract)), the vehicular driving assistance system comprising: a forward-viewing camera disposed [in] a vehicle equipped with the vehicular driving assistance system (“The system 10 includes at least one camera 20 suitable for obtaining road information regarding the road that the vehicle 12 is traveling on. The road information includes, but is not limited to, the following: road curvature; offset distance of the vehicle from a center of the road and/or from a center of a lane of the road; lane marker location; radius of the lane; radius of the road; lane width; and yaw offset between the road and the vehicle. The camera 20 may be any suitable camera such as any suitable visual light camera. The camera 20 may also include or be combined with any other suitable sensors, such as Lidar, radar, sonar, etc.” (Para 0014), and “The lane trace control module 30 is configured to compare the road information obtained by the lane recognition camera 20 with map data obtained by the map module 42 (and the position of the vehicle 12 relative to the map data of the map module 42 as indicated by the position sensor 40), and identify any differences therebetween. The lane trace control module 30 further determines a confidence level of the lane recognition camera 20 based on the magnitude of any differences identified between the road information obtained from the lane recognition camera 20 and the map data from the map module 42. The confidence level may be a level on a scale of 0 to 100, for example. A confidence level of 100 represents full confidence in the road information obtained by the lane recognition camera 20 as being completely accurate and reliable for being solely relied on to maintain the vehicle 12 in its intended lane.” (Para 0018)) … … wherein the forward-viewing camera is operable to capture image data (“The system 10 includes at least one camera 20 suitable for obtaining road information regarding the road that the vehicle 12 is traveling on. The road information includes, but is not limited to, the following: road curvature; offset distance of the vehicle from a center of the road and/or from a center of a lane of the road; lane marker location; radius of the lane; radius of the road; lane width; and yaw offset between the road and the vehicle. The camera 20 may be any suitable camera such as any suitable visual light camera. The camera 20 may also include or be combined with any other suitable sensors, such as Lidar, radar, sonar, etc.” (Para 0014)); an image processor for processing image data captured by the forward-viewing camera (“The camera 20 is controlled by a camera control module 22. 
The camera control module 22 is in communication with a lane trace control module 30. The camera control module 22 inputs the road information obtained by the camera 20 to the lane trace control module 30 for processing thereby, as described herein.” (Para 0015)); wherein the vehicular driving assistance system, via processing by the image processor of image data captured by the forward-viewing camera, determines traffic lane marking information ahead of the vehicle on a road along which the vehicle is traveling (“The system 10 includes at least one camera 20 suitable for obtaining road information regarding the road that the vehicle 12 is traveling on. The road information includes, but is not limited to, the following: road curvature; offset distance of the vehicle from a center of the road and/or from a center of a lane of the road; lane marker location; radius of the lane; radius of the road; lane width; and yaw offset between the road and the vehicle. The camera 20 may be any suitable camera such as any suitable visual light camera. The camera 20 may also include or be combined with any other suitable sensors, such as Lidar, radar, sonar, etc.” (Para 0014), and “The lane trace control module 30 is configured to compare the road information obtained by the lane recognition camera 20 with map data obtained by the map module 42 (and the position of the vehicle 12 relative to the map data of the map module 42 as indicated by the position sensor 40), and identify any differences therebetween. The lane trace control module 30 further determines a confidence level of the lane recognition camera 20 based on the magnitude of any differences identified between the road information obtained from the lane recognition camera 20 and the map data from the map module 42. The confidence level may be a level on a scale of 0 to 100, for example. A confidence level of 100 represents full confidence in the road information obtained by the lane recognition camera 20 as being completely accurate and reliable for being solely relied on to maintain the vehicle 12 in its intended lane.” (Para 0018)); wherein the vehicular driving assistance system determines reliability of the determined traffic lane marking information (“The lane trace control module 30 is configured to compare the road information obtained by the lane recognition camera 20 with map data obtained by the map module 42 (and the position of the vehicle 12 relative to the map data of the map module 42 as indicated by the position sensor 40), and identify any differences therebetween. The lane trace control module 30 further determines a confidence level of the lane recognition camera 20 based on the magnitude of any differences identified between the road information obtained from the lane recognition camera 20 and the map data from the map module 42. The confidence level may be a level on a scale of 0 to 100, for example. A confidence level of 100 represents full confidence in the road information obtained by the lane recognition camera 20 as being completely accurate and reliable for being solely relied on to maintain the vehicle 12 in its intended lane. A confidence level of 0 represents no confidence in road information collected by the lane recognition camera 20, and/or an inoperable lane recognition camera 20. A confidence level of 50 indicates an intermediate or average confidence level in the road information obtained by the lane recognition camera 20. 
A confidence level between 50 and 100 indicates greater than an intermediate or average level of confidence in data from the lane recognition camera 20. A confidence level of less than 50 indicates less than an intermediate or average level of confidence.” (Para 0018), see also Para 0019-0021); a predictive sensor that uses map data to predict the road ahead of the vehicle (“The system 10 further includes any suitable position sensor for sensing the position of the vehicle 12 relative to the road that the vehicle 12 is traveling on. Any suitable sensors may be used, such as sensors associated with a GPS system including a GPS sensor/receiver 40. The location of the vehicle 12 obtained by the GPS receiver 40 and associated GPS system is input to a map module 42.” (Para 0016), “The map module 42 includes map data. Alternatively, the map module 42 may be in communication with map data stored remotely. The map data includes any suitable information regarding the road that the vehicle 12 is travelling upon. Exemplary map data included with, or accessible by, the map module 42 includes one or more of the following: road curvature; location of a center point of the road; locations of road lane markers and/or road boundary lines; radius of the lane; radius of the road; lane width; and yaw offset between the road and the vehicle. Based on the location of the vehicle 12 determined by the position sensor 40 and the map data of the map module 42, the location of the vehicle 12 relative to a center of the road can be determined. The map module 42 is in communication with the lane trace control module 30, and inputs the map data and the location of the vehicle 12 relative to the various data points of the road stored by (or accessible by) the map module 42 to the lane trace control module 30.” (para 0017)); a plurality of radar sensors disposed at the vehicle, wherein the plurality of radar sensors is operable to capture radar data (“The system 10 includes at least one camera 20 suitable for obtaining road information regarding the road that the vehicle 12 is traveling on. The road information includes, but is not limited to, the following: road curvature; offset distance of the vehicle from a center of the road and/or from a center of a lane of the road; lane marker location; radius of the lane; radius of the road; lane width; and yaw offset between the road and the vehicle. The camera 20 may be any suitable camera such as any suitable visual light camera. The camera 20 may also include or be combined with any other suitable sensors, such as Lidar, radar, sonar, etc.” (Para 0014)); an advanced driving-assistance system (ADAS) controller (“The lane trace control module 30 is configured to compare the road information obtained by the lane recognition camera 20 with map data obtained by the map module 42 (and the position of the vehicle 12 relative to the map data of the map module 42 as indicated by the position sensor 40), and identify any differences therebetween. The lane trace control module 30 further determines a confidence level of the lane recognition camera 20 based on the magnitude of any differences identified between the road information obtained from the lane recognition camera 20 and the map data from the map module 42. The confidence level may be a level on a scale of 0 to 100, for example. 
A confidence level of 100 represents full confidence in the road information obtained by the lane recognition camera 20 as being completely accurate and reliable for being solely relied on to maintain the vehicle 12 in its intended lane. A confidence level of 0 represents no confidence in road information collected by the lane recognition camera 20, and/or an inoperable lane recognition camera 20. A confidence level of 50 indicates an intermediate or average confidence level in the road information obtained by the lane recognition camera 20. A confidence level between 50 and 100 indicates greater than an intermediate or average level of confidence in data from the lane recognition camera 20. A confidence level of less than 50 indicates less than an intermediate or average level of confidence.” (Para 0018); wherein, responsive to the determined reliability of the determined traffic lane marking information being greater than or equal to a threshold reliability level, the ADAS controller generates ADAS control signals based at least in part on (i) processing of image data captured by the forward-viewing camera and (ii) processing of radar data captured by the plurality of radar sensors (“The lane trace control module 30 is configured to compare the road information obtained by the lane recognition camera 20 with map data obtained by the map module 42 (and the position of the vehicle 12 relative to the map data of the map module 42 as indicated by the position sensor 40), and identify any differences therebetween. The lane trace control module 30 further determines a confidence level of the lane recognition camera 20 based on the magnitude of any differences identified between the road information obtained from the lane recognition camera 20 and the map data from the map module 42. The confidence level may be a level on a scale of 0 to 100, for example. A confidence level of 100 represents full confidence in the road information obtained by the lane recognition camera 20 as being completely accurate and reliable for being solely relied on to maintain the vehicle 12 in its intended lane. A confidence level of 0 represents no confidence in road information collected by the lane recognition camera 20, and/or an inoperable lane recognition camera 20. A confidence level of 50 indicates an intermediate or average confidence level in the road information obtained by the lane recognition camera 20. A confidence level between 50 and 100 indicates greater than an intermediate or average level of confidence in data from the lane recognition camera 20. A confidence level of less than 50 indicates less than an intermediate or average level of confidence.” (Para 0018), “When the confidence level is at 100, or within a predetermined range from 100 (e.g., within a range of 90-100), the lane trace control module 30 generates a target wheel angle of the wheels (e.g., the front wheels) of the vehicle 12 suitable for maintaining the vehicle 12 in its intended road lane based entirely on the road information obtained from the lane recognition camera 20. Conversely, when the confidence level is below the predetermined threshold range, such as below 90 and above 10 for example, the lane trace control module 30 generates the target wheel angle based on a combination of the road information obtained by the lane recognition camera 20 and the obtained map data (and position of the vehicle 12 obtained by the position sensor 40 overlaid on the map data). 
The weight given to the road information obtained by the lane recognition camera 20 is generally proportional to the confidence level determined by the lane trace control module 30. Thus the higher the confidence level, the greater the weight assigned to the road information obtained by the lane recognition camera 20, and the less weight assigned to the map data from the map module 42.” (Para 0019), “The system 10 includes at least one camera 20 suitable for obtaining road information regarding the road that the vehicle 12 is traveling on. The road information includes, but is not limited to, the following: road curvature; offset distance of the vehicle from a center of the road and/or from a center of a lane of the road; lane marker location; radius of the lane; radius of the road; lane width; and yaw offset between the road and the vehicle. The camera 20 may be any suitable camera such as any suitable visual light camera. The camera 20 may also include or be combined with any other suitable sensors, such as Lidar, radar, sonar, etc.” (Para 0014)); wherein, responsive to the determined reliability of the determined traffic lane marking information being less than the threshold reliability level, the ADAS controller generates the ADAS control signals based at least in part on an output of the predictive sensor (“When the confidence level is at 100, or within a predetermined range from 100 (e.g., within a range of 90-100), the lane trace control module 30 generates a target wheel angle of the wheels (e.g., the front wheels) of the vehicle 12 suitable for maintaining the vehicle 12 in its intended road lane based entirely on the road information obtained from the lane recognition camera 20. Conversely, when the confidence level is below the predetermined threshold range, such as below 90 and above 10 for example, the lane trace control module 30 generates the target wheel angle based on a combination of the road information obtained by the lane recognition camera 20 and the obtained map data (and position of the vehicle 12 obtained by the position sensor 40 overlaid on the map data). The weight given to the road information obtained by the lane recognition camera 20 is generally proportional to the confidence level determined by the lane trace control module 30. Thus the higher the confidence level, the greater the weight assigned to the road information obtained by the lane recognition camera 20, and the less weight assigned to the map data from the map module 42.” (Para 0019), “Based on the lane trace control module's 30 fusion of the road information obtained by the lane recognition camera 20 and the map data from the map module 42, which is fused at relative weights based on the confidence level, the lane trace control module 30 generates a target wheel angle for keeping the vehicle 12 in its intended lane of travel. If the confidence level is below or above predetermined thresholds, as described above, the lane trace control module will generate the target wheel angle based solely on the map data from the map module 42, or entirely on the road information from the camera 20. The lane trace control module 30 inputs the target wheel angle to a wheel control module 50. 
The wheel control module 50 sets the wheels of the vehicle 12 at the target wheel angle to maintain the vehicle 12 in its intended lane of travel.” (Para 0021)); and wherein the ADAS control signals comprise at least a steering command to control steering of the vehicle (“At block 160, the lane trace control module 30 determines a target wheel angle of the wheels of the vehicle 12 based on the road information from the camera 20 and/or the map data of the map module 42. From block 160 the method 110 proceeds to block 170, at which the lane trace control module sends the target steering wheel angle to the wheel control module 50. The wheel control module 50 is in cooperation with the wheels of the vehicle 12, and rotates the wheels to the target steering wheel angle to maintain the vehicle 12 in its intended lane of travel. At block 170, the lane trace control module 30 also sends the confidence level to the driver for the driver to monitor. Based on the confidence level, the driver will advantageously know when the lane recognition camera 20 is unable to obtain accurate road information and/or is not functioning optimally, so that the driver can be prepared to take manual control of the vehicle 12 from an autonomous drive system including the lane trace control system 10.” (Para 0028), see also Para 0022). However, Huang does not specifically disclose of a forward-viewing camera disposed at an in-cabin side of a windshield of a vehicle equipped with the vehicular driving assistance system, wherein the forward-viewing camera is disposed behind the windshield of the vehicle and views forward of the vehicle through the windshield; wherein the forward-viewing camera comprises a two-dimensional imaging array having at least one million photosensor elements arranged in rows and columns. Winden, in the same field of endeavor, teaches of a forward-viewing camera disposed at an in-cabin side of a windshield of a vehicle equipped with the vehicular driving assistance system, wherein the forward-viewing camera is disposed behind the windshield of the vehicle and views forward of the vehicle through the windshield (“Referring now to the drawings and the illustrative embodiments depicted therein, vision system 10 for a vehicle 12 includes at least one exterior viewing imaging sensor or camera, such as a forward viewing imaging sensor or camera, which may be disposed at and behind the windshield 14 of the vehicle and viewing forward through the windshield so as to capture image data representative of the scene occurring forward of the vehicle (FIG. 1). Optionally, the system may include multiple exterior viewing imaging sensors or cameras, such as a forward viewing camera at the front of the vehicle, and a sideward/rearward viewing camera at respective sides of the vehicle, and a rearward viewing camera at the rear of the vehicle, which capture images exterior of the vehicle. The camera or cameras each include a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera. Optionally, the forward viewing camera may be disposed at the windshield of the vehicle and view through the windshield and forward of the vehicle, such as for a machine vision system (such as for traffic sign recognition, headlamp control, pedestrian detection, collision avoidance, lane marker detection and/or the like). 
The vision system 10 includes a control or electronic control unit (ECU) or processor that is operable to process image data captured by the camera or cameras and may detect objects or the like and/or provide displayed images at a display device for viewing by the driver of the vehicle. The data transfer or signal communication from the camera to the ECU may comprise any suitable data or communication link, such as a vehicle network bus or the like of the equipped vehicle.” (Para 0045)); wherein the forward-viewing camera comprises a two-dimensional imaging array having at least one million photosensor elements arranged in rows and columns (“The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.” (Para 0058)). It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the forward-viewing camera, as taught by Huang, to include being behind a windshield and having an imaging array of at least a million elements, as taught by Winden, with a reasonable expectation of success in order to capture image data for imaging processing and detect objects at or near a vehicle path (Winden Para 0044 and 0058). In regards to claim 2, Huang in view of Winden teaches of the vehicular driving assistance system of claim 1, wherein the ADAS controller generates the ADAS control signals based at least in part on a determined target path for the vehicle to follow based on the radar data (“Based on the lane trace control module's 30 fusion of the road information obtained by the lane recognition camera 20 and the map data from the map module 42, which is fused at relative weights based on the confidence level, the lane trace control module 30 generates a target wheel angle for keeping the vehicle 12 in its intended lane of travel. If the confidence level is below or above predetermined thresholds, as described above, the lane trace control module will generate the target wheel angle based solely on the map data from the map module 42, or entirely on the road information from the camera 20. The lane trace control module 30 inputs the target wheel angle to a wheel control module 50. 
The wheel control module 50 sets the wheels of the vehicle 12 at the target wheel angle to maintain the vehicle 12 in its intended lane of travel.” (Huang Para 0021), “The system 10 includes at least one camera 20 suitable for obtaining road information regarding the road that the vehicle 12 is traveling on. The road information includes, but is not limited to, the following: road curvature; offset distance of the vehicle from a center of the road and/or from a center of a lane of the road; lane marker location; radius of the lane; radius of the road; lane width; and yaw offset between the road and the vehicle. The camera 20 may be any suitable camera such as any suitable visual light camera. The camera 20 may also include or be combined with any other suitable sensors, such as Lidar, radar, sonar, etc.” (Huang Para 0014); see also Huang Para 0028). In regards to claim 4, Huang in view of Winden teaches of the vehicular driving assistance system of claim 1, wherein the ADAS controller determines reliability of the generated ADAS control signals (“When the confidence level is at 100, or within a predetermined range from 100 (e.g., within a range of 90-100), the lane trace control module 30 generates a target wheel angle of the wheels (e.g., the front wheels) of the vehicle 12 suitable for maintaining the vehicle 12 in its intended road lane based entirely on the road information obtained from the lane recognition camera 20. Conversely, when the confidence level is below the predetermined threshold range, such as below 90 and above 10 for example, the lane trace control module 30 generates the target wheel angle based on a combination of the road information obtained by the lane recognition camera 20 and the obtained map data (and position of the vehicle 12 obtained by the position sensor 40 overlaid on the map data). The weight given to the road information obtained by the lane recognition camera 20 is generally proportional to the confidence level determined by the lane trace control module 30. Thus the higher the confidence level, the greater the weight assigned to the road information obtained by the lane recognition camera 20, and the less weight assigned to the map data from the map module 42.” (Huang Para 0019), “Based on the lane trace control module's 30 fusion of the road information obtained by the lane recognition camera 20 and the map data from the map module 42, which is fused at relative weights based on the confidence level, the lane trace control module 30 generates a target wheel angle for keeping the vehicle 12 in its intended lane of travel. If the confidence level is below or above predetermined thresholds, as described above, the lane trace control module will generate the target wheel angle based solely on the map data from the map module 42, or entirely on the road information from the camera 20. The lane trace control module 30 inputs the target wheel angle to a wheel control module 50. The wheel control module 50 sets the wheels of the vehicle 12 at the target wheel angle to maintain the vehicle 12 in its intended lane of travel.” (Huang Para 0021), see also Para Huang 0018 and 0020). 
In regards to claim 5, Huang in view of Winden teaches of the vehicular driving assistance system of claim 4, wherein the vehicular driving assistance system, responsive to the determined reliability of the generated ADAS control signals being below a threshold ADAS reliability level, notifies an occupant of the vehicle (“The lane trace control module 30 generates any suitable notification to a driver of the vehicle 12 notifying the driver of the confidence level in the road information obtained by the lane recognition camera 20. This advantageously allows the driver to monitor the effectiveness of the lane recognition camera 20. The driver can be thus prepared to possibly have to take control of the vehicle 12 from an autonomous drive system including the lane trace control system 10 in instances where the confidence level is low and possibly decreasing.” (Huang Para 0023), “At block 160, the lane trace control module 30 determines a target wheel angle of the wheels of the vehicle 12 based on the road information from the camera 20 and/or the map data of the map module 42. From block 160 the method 110 proceeds to block 170, at which the lane trace control module sends the target steering wheel angle to the wheel control module 50. The wheel control module 50 is in cooperation with the wheels of the vehicle 12, and rotates the wheels to the target steering wheel angle to maintain the vehicle 12 in its intended lane of travel. At block 170, the lane trace control module 30 also sends the confidence level to the driver for the driver to monitor. Based on the confidence level, the driver will advantageously know when the lane recognition camera 20 is unable to obtain accurate road information and/or is not functioning optimally, so that the driver can be prepared to take manual control of the vehicle 12 from an autonomous drive system including the lane trace control system 10.” (Huang Para 0028)). In regards to claim 6, Huang in view of Winden teaches of the vehicular driving assistance system of claim 5, wherein the vehicular driving assistance system notifies the occupant via a request for the occupant to take control of the vehicle (“The lane trace control module 30 generates any suitable notification to a driver of the vehicle 12 notifying the driver of the confidence level in the road information obtained by the lane recognition camera 20. This advantageously allows the driver to monitor the effectiveness of the lane recognition camera 20. The driver can be thus prepared to possibly have to take control of the vehicle 12 from an autonomous drive system including the lane trace control system 10 in instances where the confidence level is low and possibly decreasing.” (Huang Para 0023), “At block 160, the lane trace control module 30 determines a target wheel angle of the wheels of the vehicle 12 based on the road information from the camera 20 and/or the map data of the map module 42. From block 160 the method 110 proceeds to block 170, at which the lane trace control module sends the target steering wheel angle to the wheel control module 50. The wheel control module 50 is in cooperation with the wheels of the vehicle 12, and rotates the wheels to the target steering wheel angle to maintain the vehicle 12 in its intended lane of travel. At block 170, the lane trace control module 30 also sends the confidence level to the driver for the driver to monitor. 
Based on the confidence level, the driver will advantageously know when the lane recognition camera 20 is unable to obtain accurate road information and/or is not functioning optimally, so that the driver can be prepared to take manual control of the vehicle 12 from an autonomous drive system including the lane trace control system 10.” (Huang Para 0028)). In regards to claim 10, Huang in view of Winden teaches of the vehicular driving assistance system of claim 1, wherein the vehicular driving assistance system, when switching from controlling the vehicle based in part on processing of image data captured by the forward-viewing camera to controlling the vehicle based in part on the output of the predictive sensor, limits a rate of change between the respective control signals (“When the confidence level is at 100, or within a predetermined range from 100 (e.g., within a range of 90-100), the lane trace control module 30 generates a target wheel angle of the wheels (e.g., the front wheels) of the vehicle 12 suitable for maintaining the vehicle 12 in its intended road lane based entirely on the road information obtained from the lane recognition camera 20. Conversely, when the confidence level is below the predetermined threshold range, such as below 90 and above 10 for example, the lane trace control module 30 generates the target wheel angle based on a combination of the road information obtained by the lane recognition camera 20 and the obtained map data (and position of the vehicle 12 obtained by the position sensor 40 overlaid on the map data). The weight given to the road information obtained by the lane recognition camera 20 is generally proportional to the confidence level determined by the lane trace control module 30. Thus the higher the confidence level, the greater the weight assigned to the road information obtained by the lane recognition camera 20, and the less weight assigned to the map data from the map module 42.” (Huang Para 0019), “At block 150, the lane trace control module 30 fuses the road information obtained from the lane recognition camera 20 with the map data of the map module 42 based on the confidence level. Thus as described above, the weight of the road information from the camera 20 versus the weight of the map data of the map module 42 varies based on (and is proportionate to) the confidence level of the camera 20 and the confidence level of the map data of the map module 42.” (Huang Para 0027)). In regards to claim 13, Huang in view of Winden teaches of the vehicular driving assistance system of claim 1, wherein the vehicular driving assistance system determines reliability of the predicted road ahead of the vehicle (“At lower confidence levels, such as below 50 for example, generally more weight is given to the map data from the map module 42 than to the road information obtained by the lane recognition camera 20. However, the map data of the map module 42 is also assigned a confidence level as well. Some map data may have a lower confidence level than other map data. For example, map data associated with a curvature of a lightly traveled backroad will typically have a lower confidence level than map data associated with an interstate highway, in particular a straight stretch of interstate highway that is heavily traveled. 
Thus, for example, when the lane trace control module 30 assigns a confidence level of 40 to road information obtained by the lane recognition camera 20, the lane trace control module 30 will place a less than fifty percent weight on the road information from the lane recognition camera 20 when the corresponding map data of the map module 42 has been assigned a relatively high reliability level. Conversely, if the map data of the map module 42 has been assigned a relatively low reliability level, the lane trace control module 30 will place a greater weight on the road information obtained by the lane recognition camera 20 as opposed to the map data of the map module 42 even when the road information from the camera 20 has a confidence level of less than 50, such as 40 for example.” (Huang Para 0020), see also Huang Para 0021). In regards to claim 14, Huang in view of Winden teaches of the vehicular driving assistance system of claim 13, wherein, responsive to the determined reliability of the determined traffic lane marking information being greater than or equal to the threshold reliability level, and responsive to the determined reliability of the predicted road ahead of the vehicle being greater than or equal to a threshold predicted road reliability level, the ADAS controller generates ADAS control signals based at least in part on (i) processing of image data captured by the forward-viewing camera, (ii) the output of the predictive sensor and (iii) processing of radar data captured by the plurality of radar sensors (“When the confidence level is at 100, or within a predetermined range from 100 (e.g., within a range of 90-100), the lane trace control module 30 generates a target wheel angle of the wheels (e.g., the front wheels) of the vehicle 12 suitable for maintaining the vehicle 12 in its intended road lane based entirely on the road information obtained from the lane recognition camera 20. Conversely, when the confidence level is below the predetermined threshold range, such as below 90 and above 10 for example, the lane trace control module 30 generates the target wheel angle based on a combination of the road information obtained by the lane recognition camera 20 and the obtained map data (and position of the vehicle 12 obtained by the position sensor 40 overlaid on the map data). The weight given to the road information obtained by the lane recognition camera 20 is generally proportional to the confidence level determined by the lane trace control module 30. Thus the higher the confidence level, the greater the weight assigned to the road information obtained by the lane recognition camera 20, and the less weight assigned to the map data from the map module 42.” (Huang Para 0019), “Based on the lane trace control module's 30 fusion of the road information obtained by the lane recognition camera 20 and the map data from the map module 42, which is fused at relative weights based on the confidence level, the lane trace control module 30 generates a target wheel angle for keeping the vehicle 12 in its intended lane of travel. If the confidence level is below or above predetermined thresholds, as described above, the lane trace control module will generate the target wheel angle based solely on the map data from the map module 42, or entirely on the road information from the camera 20. The lane trace control module 30 inputs the target wheel angle to a wheel control module 50. 
The wheel control module 50 sets the wheels of the vehicle 12 at the target wheel angle to maintain the vehicle 12 in its intended lane of travel.” (Huang Para 0021), “At lower confidence levels, such as below 50 for example, generally more weight is given to the map data from the map module 42 than to the road information obtained by the lane recognition camera 20. However, the map data of the map module 42 is also assigned a confidence level as well. Some map data may have a lower confidence level than other map data. For example, map data associated with a curvature of a lightly traveled backroad will typically have a lower confidence level than map data associated with an interstate highway, in particular a straight stretch of interstate highway that is heavily traveled. Thus, for example, when the lane trace control module 30 assigns a confidence level of 40 to road information obtained by the lane recognition camera 20, the lane trace control module 30 will place a less than fifty percent weight on the road information from the lane recognition camera 20 when the corresponding map data of the map module 42 has been assigned a relatively high reliability level. Conversely, if the map data of the map module 42 has been assigned a relatively low reliability level, the lane trace control module 30 will place a greater weight on the road information obtained by the lane recognition camera 20 as opposed to the map data of the map module 42 even when the road information from the camera 20 has a confidence level of less than 50, such as 40 for example.” (Huang Para 0020), “The system 10 includes at least one camera 20 suitable for obtaining road information regarding the road that the vehicle 12 is traveling on. The road information includes, but is not limited to, the following: road curvature; offset distance of the vehicle from a center of the road and/or from a center of a lane of the road; lane marker location; radius of the lane; radius of the road; lane width; and yaw offset between the road and the vehicle. The camera 20 may be any suitable camera such as any suitable visual light camera. The camera 20 may also include or be combined with any other suitable sensors, such as Lidar, radar, sonar, etc.” (Huang Para 0014)). Claim(s) 8 and 12 is/are rejected under 35 U.S.C. 103 as being unpatentable over Huang in view of Winden as applied to claim 1 above, and further in view of Koravadi (US 20190073541; already of record from IDS). In regards to claim 8, Huang in view of Winden teaches of the vehicular driving assistance system of claim 4. However, Huang in view of Winden do not specifically teach of wherein the image processor is part of a camera controller, and wherein the ADAS controller is remote from the camera controller Koravadi, in the same field of endeavor, teaches of wherein the image processor is part of a camera controller, and wherein the ADAS controller is remote from the camera controller (“The system may provide an output indicative of the determined lane or lanes (as determined based on processing of the captured image data or processing of the captured sensor data). The output is provided to a driving assist system of the vehicle, such as a lane departure warning system or such as a lane keep assist system or the like. Optionally, the output may be provided to an autonomous vehicle control system, whereby the vehicle is autonomously controlled to follow the lane in which the vehicle is traveling.” (Para 0035)). 
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the forward-viewing camera, as taught by Huang in view of Winden, to include an image processor within a camera controller that is separate from the ADAS controller, as taught by Koravadi, with a reasonable expectation of success in order to provide data indicative to the determined lanes to the driving assist system that is used to control the vehicle (Koravadi Para 0035). In regards to claim 12, Huang in view of Winden in view of Koravadi teaches of The vehicular driving assistance system of claim 1, wherein the forward-viewing camera is part of a camera module, and wherein the camera module comprises the image processor (“The system may provide an output indicative of the determined lane or lanes (as determined based on processing of the captured image data or processing of the captured sensor data). The output is provided to a driving assist system of the vehicle, such as a lane departure warning system or such as a lane keep assist system or the like. Optionally, the output may be provided to an autonomous vehicle control system, whereby the vehicle is autonomously controlled to follow the lane in which the vehicle is traveling.” (Para 0035)). The motivation for combining Huang, Winden, and Koravadi is the same as that recited for claim 8 above. Claim(s) 11, 15, 17, 19-21, and 25-26 is/are rejected under 35 U.S.C. 103 as being unpatentable over Huang in view of Winden as applied to claim 1 above, and further in view of Aurand et al. (US 20230001924; hereinafter Aurand) In regards to claim 11, Huang in view of Winden teaches of the vehicular driving assistance system of claim 1. However, Huang in view of Winden do not specifically teach of wherein the ADAS control signals further comprise an acceleration command to control speed of the vehicle. Aurand, in the same field of endeavor, teaches of wherein the ADAS control signals further comprise an acceleration command to control speed of the vehicle (“FIG. 2 shows the behavior of the vehicle 1 at a constant steering angle. A target trajectory ST is shown, which the vehicle 1 should actually follow in order to properly navigate the curve K, and an actual driving course TF of the vehicle 1. Within the second curve segment KS2, i.e., within the constant curvature of the curve K, the vehicle 1 follows the intended target trajectory ST. However, as soon as the vehicle 1 enters the exit clothoid, i.e., the third curve segment KS3, a transverse acceleration occurs due to the changing curvature of the curve K, and the vehicle 1 leaves the course of the lane FS. This leads to a road accident.” (Para 0046), “FIG. 13 shows a device 4 for carrying out the method. 
It comprises a control unit 5, configured to control at least one steering device 6 of the vehicle 1 for lane-keeping control of the vehicle 1 along the course of the lane FS travelled by the vehicle 1, advantageously in addition to controlling a drive train 7 and/or a braking device 8 of the vehicle 1, the at least one detection unit 3, configured to detect the lane markings, the digital map 2, a reception unit 9, configured to receive signals from the global navigation satellite system, a sensor system 10 and a memory unit 11, which are configured to detect and record the course of the path portion WA travelled by the vehicle 1, and at least one processing unit 12, configured to determine the course of the lane on the basis of the detected lane markings when the lane markings are detected and in a map-based manner on the basis of the data of the digital map 2 when the lane markings are not detected.” (Para 0061), see also Para 0042). It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the ADAS control signal, as taught by Huang in view of Winden, to include an acceleration command, as taught by Aurand, with a reasonable expectation of success in order to brake the vehicle to prevent an accident when staying in a land during a curve (Aurand Para 0046 and 0061). In regards to claim 15, the claim recites analogous limitations to the combination of claims 1, 2, and 11, and is therefore rejected on the same premise. In regards to claim 17, the claim recites analogous limitations to the combination of claims 4, 5, and 6, and is therefore rejected on the same premise. In regards to claim 19, the claim recites analogous limitations to claim 10, and is therefore rejected on the same premise. In regards to claim 20, the claim recites analogous limitations to the combination of claims 13 and 14, and is therefore rejected on the same premise. In regards to claim 21, the claim recites analogous limitations to the combination of claims 1, 4, 5, 6, and 11, and is therefore rejected on the same premise. In regards to claim 25, the claim recites analogous limitations to claim 10, and is therefore rejected on the same premise. In regards to claim 26, the claim recites analogous limitations to the combination of claims 13 and 14, and is therefore rejected on the same premise. Allowable Subject Matter Claims 3, 7, 9, 16, 18, and 22-24 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. The following is a statement of reasons for the indication of allowable subject matter: In regards to claim 3, the closest prior art of record is Huang et al. (US 20200247407; hereinafter Huang) in view of Winden (US 20200039447; already of record from IDS) in view of Krishna et al. (EP 3257729; hereinafter Krishna; already of record from IDS). Huang in view of Winden further in view of Krishna teaches of the vehicular driving assistance system of claim 1. 
However, Huang in view of Winden further in view of Krishna does not fully teach of wherein the determined reliability of the determined traffic lane marking information is indicated by a watchdog timer, and wherein the determined reliability of the determined traffic lane marking information is (i) above the threshold reliability level while the watchdog timer is operating and (ii) below the threshold reliability level when the watchdog timer is not operating. It is noted that the prior art teaches of a reliability of the determined traffic lane markings and of using a timer for determining if a reliability amount of a sensor is low. However, the prior art does not teach of reliability level being above the threshold while the watchdog timer is operating and below the threshold when the watchdog timer is not operating, in combination with the remaining claim limitations. Therefore the claim recites allowable subject matter. In regards to claim 7, the closest prior art of record is Huang et al. (US 20200247407; hereinafter Huang) in view of Winden (US 20200039447; already of record from IDS) in view of Krishna et al. (EP 3257729; hereinafter Krishna; already of record from IDS). Huang in view of Winden further in view of Krishna teaches of the vehicular driving assistance system of claim 5. However, Huang in view of Winden further in view of Krishna does not fully teach of wherein the determined reliability of the generated ADAS control signals is indicated by a watchdog timer, and wherein the determined reliability of the generated ADAS control signals is (i) above the threshold ADAS reliability level while the watchdog timer is operating and (ii) below the threshold ADAS reliability level when the watchdog timer is not operating. It is noted that the prior art teaches of a reliability of the ADAS control signals and of using a timer for determining if a reliability amount of a control signal is low. However, the prior art does not teach of reliability level being above the threshold while the watchdog timer is operating and below the threshold when the watchdog timer is not operating, in combination with the remaining claim limitations. Therefore the claim recites allowable subject matter. In regards to claim 9, the closest prior art of record is Huang et al. (US 20200247407; hereinafter Huang) in view of Winden (US 20200039447; already of record from IDS) in view of Koravadi (US 20190073541; already of record from IDS) further in view of Ono et al. (US 20200114933; hereinafter Ono). Huang in view of Winden in view of Koravadi further in view of Ono teaches of the vehicular driving assistance system of claim 8. However, Huang in view of Winden in view of Koravadi further in view of Ono do not teach of wherein the camera controller, responsive to the determined reliability of the ADAS control signals being below a threshold ADAS reliability level, generates control commands to control steering of the vehicle. The prior art does teach of an ADAS controller and a camera controller that can each send control signals, and a reliability of an ADAS control signal can be determined. However, there is no teaching of responsive to determining that the camera controller responds to a determination that the ADAS control signals are below a threshold reliability level by controlling the steering of the vehicle, in combination with the previous limitations. Therefore the claim contains allowable subject matter. 
In regards to claims 16 and 22, the claims recite analogous claim limitations to claim 3 and are therefore allowable subject matter on the same premise. In regards to claim 23, the claim recites analogous claim limitations to claim 7 and are therefore allowable subject matter on the same premise. In regards to claim 18, the claim recites analogous limitations to the combination of claims 8 and 9, and is therefore contains allowable subject matter on the same premise. In regards to claim 24, the claim recites analogous limitations to the combination of claims 8 and 9, and is therefore contains allowable subject matter on the same premise. Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Sun (US 20220398854) discloses of capturing a road image for a current location and determining an accuracy of the captured image. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Kyle J Kingsland whose telephone number is (571)272-3268. The examiner can normally be reached Mon-Fri 8:00-4:30. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Flynn can be reached at (571) 272-9855. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /KYLE J KINGSLAND/ Primary Patent Examiner, Art Unit 3663
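For quick reference, the §103 rejection of claim 1 rests on Huang's confidence-weighted fusion of camera road information and map data (Paras 0018-0021 quoted above). The sketch below restates that logic in code: the function and variable names are hypothetical, the 90/10 thresholds are the examples given in Huang's Para 0019, and the linear weighting is an assumed reading of "generally proportional to the confidence level."

```python
# Minimal sketch of the camera/map fusion described in Huang (US 20200247407,
# Paras 0018-0021), as characterized in this Office Action. Names and the
# exact weighting formula are illustrative assumptions.

HIGH_CONFIDENCE = 90   # at or above: rely entirely on camera road information
LOW_CONFIDENCE = 10    # at or below: rely entirely on map data

def target_wheel_angle(camera_angle: float, map_angle: float, confidence: float) -> float:
    """Blend camera-derived and map-derived target wheel angles.

    `confidence` is Huang's 0-100 confidence level in the lane recognition
    camera, determined from the disagreement between camera road information
    and map data.
    """
    if confidence >= HIGH_CONFIDENCE:
        return camera_angle                      # camera-only control
    if confidence <= LOW_CONFIDENCE:
        return map_angle                         # map-only control
    w = confidence / 100.0                       # weight proportional to confidence
    return w * camera_angle + (1.0 - w) * map_angle


if __name__ == "__main__":
    # Example: moderate confidence (40) weights the map data more heavily.
    print(round(target_wheel_angle(camera_angle=2.0, map_angle=5.0, confidence=40), 2))  # 3.8
```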

Prosecution Timeline

Aug 26, 2024: Application Filed
Jan 28, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this examiner involving similar technology

Patent 12600240
METHOD FOR OPERATING A BRAKE CONTROL SYSTEM, BRAKE CONTROL SYSTEM, COMPUTER PROGRAM, AND COMPUTER-READABLE STORAGE MEDIUM
2y 5m to grant • Granted Apr 14, 2026
Patent 12595699
VEHICLE INCLUDING A CAP THAT IS AUTOMATICALLY SEPARATED FROM A VEHICLE BODY
2y 5m to grant • Granted Apr 07, 2026
Patent 12589784
SYSTEM AND METHOD FOR A VIRTUAL APPROACH SIGNAL
2y 5m to grant • Granted Mar 31, 2026
Patent 12576727
DIFFERENTIAL ELECTRICAL DRIVE ARRANGEMENT FOR HEAVY DUTY VEHICLES
2y 5m to grant • Granted Mar 17, 2026
Patent 12570246
MULTI-STANCE AERIAL DEVICE CONTROL AND DISPLAY
2y 5m to grant • Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 77%
With Interview: 84% (+6.5%)
Median Time to Grant: 2y 10m
PTA Risk: Low
Based on 212 resolved cases by this examiner. Grant probability derived from career allow rate.
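Assuming the projections are simple arithmetic on the career figures shown above (an inference from the note that grant probability is derived from the career allow rate, not a documented model), the headline numbers reproduce as follows:

```python
# Assumed derivation of the headline projections from the examiner's career
# stats shown above; the formulas are an inference, not a documented model.
granted, resolved = 164, 212
interview_lift = 6.5  # percentage points

allow_rate = 100 * granted / resolved          # 77.36, displayed as 77%
with_interview = allow_rate + interview_lift   # 83.86, displayed as 84%

print(round(allow_rate), round(with_interview))  # 77 84
```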
