Prosecution Insights
Last updated: April 19, 2026
Application No. 17/908,608

System for Monitoring the Surround of a Motor Vehicle

Final Rejection — §103, §112
Filed: Sep 01, 2022
Examiner: ORANGE, DAVID BENJAMIN
Art Unit: 2663
Tech Center: 2600 — Communications
Assignee: ZKW Group GmbH
OA Round: 4 (Final)
Grant Probability: 34% (At Risk)
Expected OA Rounds: 5-6
Time to Grant: 3y 7m
Grant Probability With Interview: 63%

Examiner Intelligence

Grants only 34% of cases
Career Allow Rate: 34% (51 granted / 151 resolved; -28.2% vs TC avg)

Strong +29% interview lift
Interview Lift: +29.4% among resolved cases with interview

Typical timeline
Avg Prosecution: 3y 7m (51 currently pending)

Career history
Total Applications: 202, across all art units
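
These headline figures follow from simple arithmetic on the career counts above. Below is a minimal sketch of the presumed derivation, assuming (per the projections footnote) that grant probability equals the career allow rate and that the TC delta and interview lift are additive percentage points:

```python
# Career counts reported on the examiner card above.
granted = 51
resolved = 151

# Career allow rate: 51 / 151 = 33.8%, displayed as 34%.
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 33.8%

# The card reports -28.2% vs the Tech Center average, implying a
# TC-average career allow rate of about 33.8% + 28.2% = 62.0%.
tc_avg_career = allow_rate + 0.282
print(f"Implied TC-average allow rate: {tc_avg_career:.1%}")  # 62.0%

# Interview lift is reported as +29.4 percentage points, which yields
# the "with interview" figure: 33.8% + 29.4% = 63.2%, displayed as 63%.
with_interview = allow_rate + 0.294
print(f"Allow rate with interview: {with_interview:.1%}")  # 63.2%
```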

Statute-Specific Performance

§101: 13.1% (-26.9% vs TC avg)
§103: 29.0% (-11.0% vs TC avg)
§102: 20.2% (-19.8% vs TC avg)
§112: 32.0% (-8.0% vs TC avg)

Tech Center average is an estimate • Based on career data from 151 resolved cases
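
The chart behind these figures is not reproduced here, but the Tech Center baseline can be recovered from the deltas. A quick check (assuming each delta is simply the examiner's rate minus the TC average) shows that every statute implies the same TC-average estimate of 40.0%:

```python
# Examiner rate and reported delta vs Tech Center average, per statute.
statute_stats = {
    "§101": (0.131, -0.269),
    "§103": (0.290, -0.110),
    "§102": (0.202, -0.198),
    "§112": (0.320, -0.080),
}

for statute, (rate, delta) in statute_stats.items():
    tc_avg = rate - delta  # delta = examiner rate - TC average
    print(f"{statute}: examiner {rate:.1%}, implied TC average {tc_avg:.1%}")
# Each line prints an implied TC average of 40.0%.
```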

Office Action

§103, §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments and amendment have persuasively overcome the 112(a) rejections and one of the two 112(b) rejections. For the 112(b), Applicant argues "One skilled in the art understands that it refers to a threshold value that is defined/set for the system and that the 'defined threshold value' is a configurable system parameter, not a subjective term." However, the claim does not recite this, and the broadest reasonable interpretation is not so limited. Similar reasoning applies to Applicant's other arguments for this limitation. The updated rejection provides options to overcome the rejection.

For the 103, Applicant argues "Kim fails to disclose adjusting illumination based on the outcomes of two different sensing devices as recited in claim 1." Applicant also argues "Bauer does not teach adjusting illumination based on a comparison of detection/classification capabilities between the two sensors or based on a confidence value falling below a threshold." However, the claims were rejected over the combination of Kim and Bauer. In other words, what Kim lacks is taught by Bauer and vice versa. MPEP 2145(IV).

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-19 (all claims) are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claims 1 and 3 recite "corresponding to," but this is subjective (i.e., different people can have different opinions about whether or not something corresponds). MPEP 2173.05(b)(IV). One way to overcome this rejection is to recite an objective standard, such as "is" or "being."

Claim 1 recites "a defined threshold value," but this is subjective because there is no objective definition of what the threshold is (i.e., there is not a reference value). MPEP 2173.05(b)(IV). One way to overcome this rejection is to recite an objective standard, such as that the value is stored in memory (because then there is no disagreement as to what the threshold value is). Alternatively, reciting "the threshold is the confidence level below which reliable object classification is no longer possible" overcomes the rejection. Dependent claims are likewise rejected.

Examiner Note

The claims recite "KO" and "NKO." While their usage appears as though these are terms of art, the search has not identified these terms as such. Therefore, they are best understood as labels that do not convey patentable weight.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 2, 7-11, 13, and 15-18 are rejected under 35 U.S.C. 103 as being unpatentable over Kim (U.S. 2020/0001774 A1) in view of Bauer (DE 102009034026 A1).

Regarding claim 1, Kim discloses a system (1) for monitoring an environment of a motor vehicle (100) (Figure 5, Paragraph [0141], "The vehicle 10 includes a car, a train and a motorcycle"), the system (1) comprising:

at least one image capture device (2) (Paragraph [0148], "The object detection device 210 may include at least one of a camera"), wherein the image capture device (2) is designed to sense a sensing region (E1) of the environment to sense objects within the sensing region (E1) (Paragraph [0293], "A camera 1310 of FIG. 13, which is a component functionally combined with the processor 1350, can generate image data around the vehicle 1200... Further, the camera 1310 of FIG. 13 may perform the same function as the object detection device 210 of FIGS. 6 and 8 and may be configured as a part of the imaging device 320 of FIG. 11");

at least one illuminating device (3), which is configured to at least partially illuminate the sensing region (E1) of the at least one image capture device (2) (headlights 1320); and

at least one environment sensing device (7), which comprises RADAR or LIDAR and is configured to sense at least a part of the sensing region (E1) of the image capture device (2) and to detect objects (Paragraph [0148], "The object detection device 210 may include at least one of a camera, a radar, a lidar … ." The phrase "at least one of" teaches that this can be a second device, e.g., both a camera and a radar.);

wherein the image capture device (2) is configured: to classify, in terms of object type, an object situated in a sensing region of the at least one image capture device (2), when said object has been detected (Figure 14, Paragraph [0298], "Referring to FIG. 14, the vehicle 1200 may include... a camera 1310, ..., a processor 1350...
the processor 1350 may include an object recognition module 1355"; Paragraph [0302], "the object recognition module 1355 can recognize objects (e.g., another vehicle, a pedestrian, a traffic light, a sign, an obstacle) positioned ahead of the vehicle 1200 from image data acquired from the camera 1310"), and in each case to determine a confidence value, a so-called KO confidence value, "KO", said KO corresponding to a probability that the object type of a detected object can be established correctly by the image capture device (2) (Paragraph [0303], "object recognition module 1355 can determine a recognition error (or recognition accuracy) of each of objects positioned ahead of the vehicle 1200 together with the recognition information of the objects. For example, in the objects positioned ahead of the vehicle 1200, it is possible to have 0.1 as a recognition error (or recognition accuracy 0.9) for a forwarding vehicle, 0.2 as a recognition error (or recognition accuracy 0.8) for a pedestrian, or 0.7 as a recognition error (or recognition accuracy 0.3) for a sign"), and wherein the image capture device (2) and the environment sensing device (7) are configured to …

(ii) the environment sensing device (7) detects the object, and this object is detected by the image capture device (2) but cannot be classified by the image capture device (2) (Paragraph [0387], "The vehicle control apparatus 1300 checks whether an object with low object recognition accuracy is detected during driving, and keeps driving when an object with low object recognition accuracy is not detected (S3225)"), or

(iii) the environment sensing device (7) detects this object, and this object is detected by the image capture device (2) but the KO which is determined by the image capture device (2) during classification falls below a defined threshold value (Paragraph [0315], "the object recognition module 1355 can detect an object having recognition accuracy lower than a reference value from objects that are recognized during driving while outputting light corresponding to the first brightness value"),

the system actuates the illuminating device (3) such that an illumination intensity is increased or decreased in the sensing region (E1) of the object (OBJ) (Paragraph [0303], "When an object having a recognition error larger than a predetermined range is found out from objects recognized by the object recognition module 1355, a procedure of changing the setting of the headlights by the headlight control module 1360 can be performed." See also [0361], "That is, backlight is generated in the camera 1310 of the vehicle 1200 by the light output through the headlights of the surrounding vehicle 1260." The camera's field of view teaches the claimed sensing region (as previously mapped), and [0361] states that the headlights (the source of the illumination) are detected, thus teaching the claimed "in the sensing region.").

Kim is not relied on for the below claim language. However, Bauer teaches wherein the image capture device (2) and the environment sensing device (7) are configured to sense the sensing region (E1) and (Fig. 1 and Paragraph [0027]) the system (1) is configured such that when there is an object (OBJ) in the sensing region (E1) and (i) the environment sensing device (7) detects the object, and this object is not detected by the image capture device (2) (Paragraph [0024].)
Bauer teaches that far-infrared (as well as radar and lidar) works during twilight, but the camera (i.e., "a sensor operating in the visible wavelength range") does not. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kim to incorporate the vehicle monitoring system of Bauer by including Bauer's parallelism and near-infrared camera to sense in the non-visible range. Both references are from the same field of endeavor, object detection. The motivation for doing so would be to enhance the automatic detection of the different objects in front of the vehicle based on the infrared radiation reflected from the objects, as suggested by Bauer, Paragraph [0003]. See also Bauer, Paragraph [0029], stating that performing the stereo method using infrared improves detection. Additionally, Bauer provides implementation details, such as camera placement. See, e.g., Bauer, Paragraph [0027]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Kim with Bauer.
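
To make the disputed limitation concrete, the decision logic that claim 1 recites (alternative conditions (i)-(iii) triggering an illumination change) can be sketched as follows. This is illustrative pseudocode of the claim language only; the names (Detection, should_adjust_illumination, KO_MIN) are hypothetical, and this is not an implementation from Kim, Bauer, or the application:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    detected: bool              # did the device register the object?
    object_type: Optional[str]  # classification result, if any
    ko: Optional[float] = None  # KO confidence value (camera only)

KO_MIN = 0.8  # "defined threshold value" -- hypothetical figure

def should_adjust_illumination(camera: Detection, env_sensor: Detection) -> bool:
    """Mirror of claim 1's three alternative trigger conditions."""
    if not env_sensor.detected:
        return False
    # (i) environment sensing device detects; image capture device does not
    if not camera.detected:
        return True
    # (ii) image capture device detects but cannot classify the object
    if camera.object_type is None:
        return True
    # (iii) classification succeeds, but KO falls below the defined threshold
    if camera.ko is not None and camera.ko < KO_MIN:
        return True
    return False

# Example: radar sees an object the camera classifies only weakly (KO = 0.3),
# so the system would increase or decrease illumination in the object's region.
print(should_adjust_illumination(
    Detection(detected=True, object_type="pedestrian", ko=0.3),
    Detection(detected=True, object_type=None),
))  # True
```

On this reading, the examiner's combination maps condition (i) to Bauer and conditions (ii) and (iii) to Kim.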
Regarding claim 2, Kim discloses the system according to claim 1, which is configured to increase or decrease the illumination intensity in the region of the object (OBJ) (Paragraph [0316], "It is possible to improve recognition accuracy of an object by adjusting the brightness of light output by the headlights in correspondence to detection of an object having a large error") when the KO falls below a defined threshold value KOmin for the KO confidence value (Paragraph [0315], "the object recognition module 1355 can detect an object having recognition accuracy lower than a reference value from objects that are recognized during driving while outputting light corresponding to the first brightness value").

Regarding claim 7, Kim discloses the system according to claim 1, wherein the illuminating device (3) is designed to generate a motor vehicle beam pattern or a part of a motor vehicle beam pattern, wherein the illuminating device (3) comprises a dimmed beam module (3) for generating a dimmed beam pattern or a full beam module (4) for generating a full beam pattern or a combined module for generating a dimmed beam pattern and a full beam pattern (Paragraph [0172], "The autonomous driving device 260 can implement at least one ADAS… The ADAS can implement at least one of… HBA (High Beam Assist)"; where the examiner will interpret that, when the high beam is activated, a full beam pattern is generated to maximize illumination for improved visibility, and when the high beam is deactivated, a dimmed beam pattern is generated to reduce glare).

Regarding claim 8, Kim discloses the system according to claim 1, wherein the environment sensing device (7) comprises an ultrasound-based sensor or an IR camera or a TOF ("time of flight") camera or an MS ("multispectral") camera and/or a thermal imaging camera (Paragraph [0148], "The object detection device 210 may include at least one of a camera, a radar, a lidar, an ultrasonic sensor and an infrared sensor"; Paragraph [0156], "The lidar may be realized according to TOF or phase shift").

Regarding claim 9, Kim discloses the system according to claim 1, wherein the image capture device (2) comprises one or more cameras or one or more camera systems (Paragraph [0151], "The camera may be at least one of a mono camera, a stereo camera and an around view monitoring (AVM) camera").

Regarding claim 10, Kim discloses the system according to claim 1, wherein the image capture device (2) operates in a visible wavelength range or in a non-visible wavelength range (Paragraph [0256], "The image sensor can acquire a user image using light of the visible band or infrared band.").

Regarding claim 11, Kim discloses the system according to claim 1, wherein the illuminating device (3) is designed to illuminate the object (OBJ) continuously (Paragraph [0011], "headlights that are combined with the processor and outputs light to the surrounding vehicle"; and Paragraph [0300], "The illumination sensor 1315 can acquire information about illumination outside the vehicle 1200 and can provide the acquired external illumination to the processor 1350"; when the car lights are turned on, they provide continuous illumination to the surroundings, ensuring consistent visibility as processed by the processor), or wherein the illuminating device (3) can be operated cyclically and synchronised with the image capture device (2) such that the object is illuminated only when the image capture device (2) is active, or wherein the illuminating device (3) is designed to emit light flashes at the object.

Regarding claim 13, Kim discloses a motor vehicle headlight for a motor vehicle (100), wherein the motor vehicle headlight (10) comprises a system (1) according to claim 1 (Figures 5-6, Paragraph [0141], "The vehicle 10 includes a car, a train and a motorcycle"; Paragraph [0011], "An apparatus for controlling a vehicle in an autonomous driving system… includes: … headlights that are combined with the processor and outputs light to the surrounding vehicle").

Regarding claim 15, Kim discloses a method for monitoring the environment of a motor vehicle (100), wherein a system (1) according to claim 1 is used to carry out the method (Figures 5-6 and 14, Paragraph [0141], "The vehicle 10 includes a car, a train and a motorcycle"; Paragraph [0026], "FIG. 14 is a flowchart showing a method for controlling a vehicle in an autonomous driving system").

Regarding claim 16, Kim discloses wherein the non-visible wavelength range is the infrared (IR) range (Paragraph [0256], "The image sensor can acquire a user image using light of the visible band or infrared band.").

Regarding claim 17, the combination of Kim and Bauer discloses the motor vehicle headlight according to claim 13, wherein the optical image capture device (2) is arranged in a lateral edge region of the headlight (10) (Paragraph [0026], "In the vehicle of Fig. 1, a lighting unit 4 is also provided, which is arranged analogously to the cameras 2 and 3 in the front area of the vehicle. In the embodiment described here, the lighting unit is a lighting device provided in addition to the headlights").

Regarding claim 18, Kim discloses the system according to claim 1, wherein the image capture device (2) is configured to fully illuminate the sensing region (E1) of the environment (the sensing region is chosen as within the area that is illuminated) or wherein the environment sensing device (7) is configured to sense the entire sensing region (E1) (the sensing region is chosen as within the area that is sensed).
Claims 3-6 are rejected under 35 U.S.C. 103 as being unpatentable over Kim (U.S. 2020/0001774 A1) in view of Bauer (DE 102009034026 A1), further in view of WO 2019225349 A1 to Yamamoto.

Regarding claim 3, the combination of Kim, Bauer and Yamamoto discloses the system according to claim 1, [wherein the environment sensing device (7) is designed to classify, in terms of type, an object sensed by the environment sensing device (7) and situated and detected in the sensing region (E1) of the image capture device (2) and to determine a further confidence value, the so-called NKO confidence value, "NKO", said NKO corresponding to the probability that the object type of the detected object has been established correctly, depending on the NKO for the object (OBJ), or depending on the KO and the NKO for the object (OBJ), or when KO < NKO]. However, Kim fails to disclose wherein the environment sensing device (7) is designed to classify, in terms of type, an object sensed by the environment sensing device (7) and situated and detected in the sensing region (E1) of the illuminating device (2) and to determine a further confidence value, the so-called NKO confidence value, "NKO", said NKO indicating the probability that the object type of the detected object has been established, in particular correctly, depending on the NKO for the object (OBJ), or depending on the KO and the NKO for the object (OBJ), or when KO < NKO.

Yamamoto teaches wherein the environment sensing device (7) is designed to classify, in terms of type, an object sensed by the environment sensing device (7) and situated and detected in the sensing region (E1) of the illuminating device (2) (Paragraph [0080], "Specifically, the recognition unit 311 performs semantic segmentation on the infrared image, detects the type and position of each object in the infrared image at the pixel level, and divides the infrared image into multiple recognition regions based on the recognition results. In addition, the recognition unit 311 calculates a recognition score for each recognition area in the infrared image"; and Paragraph [0085], "The recognition unit 311 supplies recognition information including the position of each recognition area, and recognition results such as the type of object in each recognition area and a recognition score to an irradiation control unit 313 and an operation control unit 314") and to determine a further confidence value, the so-called NKO confidence value, "NKO", said NKO indicating the probability that the object type of the detected object has been established, in particular correctly, depending on the NKO for the object (OBJ), or depending on the KO and the NKO for the object (OBJ), or when KO < NKO (Paragraph [0092], "the irradiation control unit 313 sets the amount of visible light for a recognition area (hereinafter referred to as a low reliability area) in an infrared image whose recognition score is less than the threshold T1"; and Paragraph [0093], "the irradiation control unit 313 sets the amount of visible light for a recognition area in the infrared image whose recognition score is equal to or greater than the threshold value T1, based on, for example, the type of object in the recognition area").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kim and Bauer to incorporate the vehicle monitoring system of Yamamoto by incorporating both the recognition unit 311 and the irradiation control unit 313 to classify objects and determine confidence values.
Both references are from the same field of endeavor, object detection. The motivation for doing so would be to improve the image recognition processing performance and the result of image recognition processing on the infrared image, as suggested by Yamamoto, Paragraphs [0079] and [0091]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Kim and Bauer with Yamamoto.

Regarding claim 4, the combination of Kim, Bauer and Yamamoto discloses the system according to claim 3, [which is configured to increase or decrease the illumination intensity in the region of the object (OBJ) when the KO is less than the NKO]. However, Kim fails to disclose which is configured to increase or decrease the illumination intensity in the region of the object (OBJ) when the KO is less than the NKO. Yamamoto teaches which is configured to increase or decrease the illumination intensity in the region of the object (OBJ) when the KO is less than the NKO (Paragraph [0092], "the irradiation control unit 313 sets the amount of visible light for a recognition area (hereinafter referred to as a low reliability area) in an infrared image whose recognition score is less than the threshold T1"; Paragraph [0093], "the irradiation control unit 313 sets the amount of visible light for a recognition area in the infrared image whose recognition score is equal to or greater than the threshold value T1, based on, for example, the type of object in the recognition area"; and Paragraph [0099], "when the average luminance value of the low reliability region is less than the threshold value Th2, the irradiation control unit 313 increases (brightens) the amount of visible light for the low reliability region. On the other hand, when the average luminance value of the low reliability region is equal to or greater than the threshold value Th2, the illumination control unit 313 reduces (darkens) the amount of visible light for the low reliability region").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kim and Bauer to incorporate the vehicle monitoring system of Yamamoto by utilizing the irradiation control unit 313 to adjust the illumination intensity in the low reliability region. Both references are from the same field of endeavor, object detection. The motivation for doing so would be to control the irradiation pattern based on the result of image recognition processing on the visible image, as suggested by Yamamoto, Paragraph [0098]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Kim and Bauer with Yamamoto.
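
Claims 3 and 4 add a second confidence value, NKO, from the environment sensing device and key the illumination change to a comparison of the two. A short illustrative continuation of the sketch above (again hypothetical pseudocode of the claim language, not code from any reference):

```python
def should_adjust_for_nko(ko: float, nko: float) -> bool:
    """Claim 4: adjust illumination in the object's region when KO < NKO,
    i.e., when the environment sensing device classifies the object more
    confidently than the image capture device does."""
    return ko < nko

# Example: camera KO = 0.4, environment sensor NKO = 0.7 -> adjust illumination.
print(should_adjust_for_nko(0.4, 0.7))  # True
```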
Regarding claim 5, the combination of Kim, Bauer and Yamamoto discloses the system according to claim 1, [comprising confidence value determining means (A1, A2) for determining the KO or the NKO]. Yamamoto teaches comprising confidence value determining means (A1, A2) for determining the KO or the NKO (Paragraph [0099], "when a recognition area (low reliability area) in which the recognition score is less than a threshold value T1 exists in the visible image, even though the irradiation control unit 313 has controlled the irradiation pattern based on the results of image recognition processing on the infrared image, the irradiation control unit 313 sets the amount of visible light to be irradiated to the low reliability area based on the average brightness value of the low reliability area in the visible image"). Yamamoto is combined as per claims 3 and 4.

Regarding claim 6, the combination of Kim, Bauer and Yamamoto discloses the system according to claim 1, comprising at least one controller (9) for actuating the at least one illuminating device (3) depending on KO, or on NKO, or on KO and NKO. Yamamoto teaches comprising at least one controller (9) for actuating the at least one illuminating device (3) depending on KO, or on NKO, or on KO and NKO (Paragraph [0080], "the recognition unit 311 calculates a recognition score for each recognition area in the infrared image"; and Paragraph [0085], "The recognition unit 311 supplies recognition information including the position of each recognition area, and recognition results such as the type of object in each recognition area and a recognition score to an irradiation control unit 313 and an operation control unit 314"). Yamamoto is combined as per claims 3 and 4.

Claims 12, 14 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Kim (U.S. 2020/0001774 A1) in view of Bauer (DE 102009034026 A1), further in view of U.S. 2019/0070997 A1 to Mouri et al. (hereinafter, "Mouri").

Regarding claim 12, the combination of Kim, Bauer and Mouri discloses the system according to claim 1, [wherein the illuminating device (3) is part of a motor vehicle headlight (10) of the motor vehicle (100)]. However, Kim fails to disclose wherein the illuminating device (3) is part of a motor vehicle headlight (10), in particular of the motor vehicle (100). Mouri teaches wherein the illuminating device (3) is part of a motor vehicle headlight (10), in particular of the motor vehicle (100) (Paragraph [0050], "As shown in FIG. 1, a pair of left and right headlamp units 14… are provided at the vehicle 12"; Paragraph [0051], "Each of the headlamp unit 14R and the headlamp unit 14L is structured to include a low beam unit 16... and a high beam unit 18"; and Paragraph [0111], "in the illumination devices 10 for a vehicle relating to the present embodiments, the low beam unit 16 and the high beam unit 18 may be made integral"). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kim and Bauer to incorporate the vehicle monitoring system of Mouri by including a pair of headlamp units 14. Both references are from the same field of endeavor, object detection. The motivation for doing so would be to ensure the field of view at the front side of the vehicle, with the illumination device 10 positioned on the outer side of the low beam in the vehicle transverse direction, as suggested by Mouri, Paragraphs [0050] and [0052]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results.
Therefore, it would have been obvious to combine Kim and Bauer with Mouri.

Regarding claim 14, the combination of Kim, Bauer and Mouri discloses a motor vehicle having a motor vehicle headlight (10) according to claim 13, [wherein at least the illuminating device (3) is part of a motor vehicle headlight (10) of the motor vehicle (100)]. Kim further discloses a motor vehicle having a motor vehicle headlight (10) (Figures 5-6, Paragraph [0141], "The vehicle 10 includes a car, a train and a motorcycle"; and Paragraph [0011], "An apparatus for controlling a vehicle in an autonomous driving system… includes: … headlights that are combined with the processor and outputs light to the surrounding vehicle"). However, Kim fails to disclose wherein at least the illuminating device (3) is part of a motor vehicle headlight (10) of the motor vehicle (100). Mouri teaches preferably two motor vehicle headlights, a left one and a right one (Figure 1, Paragraph [0050], "a headlamp unit 14R is disposed at the right side front end portion of the vehicle 12, and a headlamp unit 14L is disposed at the light side front end portion of the vehicle 12"), according to claim 13, wherein at least the illuminating device (3) is part of a motor vehicle headlight (10) of the motor vehicle (100) (Paragraph [0050], "As shown in FIG. 1, a pair of left and right headlamp units 14 for ensuring the field of view at the front side of a vehicle 12 are provided at the vehicle 12"; Paragraph [0051], "Each of the headlamp unit 14R and the headlamp unit 14L is structured to include a low beam unit 16... and a high beam unit 18"; and Paragraph [0111], "in the illumination devices 10 for a vehicle relating to the present embodiments, the low beam unit 16 and the high beam unit 18 may be made integral"). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kim and Bauer to incorporate the vehicle monitoring system of Mouri by installing the right headlamp unit 14R and the left headlamp unit 14L with left-right symmetry in the vehicle transverse direction. Both references are from the same field of endeavor, object detection. The motivation for doing so would be to incorporate a low beam unit 16 positioned on the outer side portion of the vehicle's transverse direction, and a high beam unit 18 positioned on the inner side portion of the vehicle's transverse direction in the illumination devices 10, as suggested by Mouri, Paragraph [0051]. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Kim and Bauer with Mouri.

Regarding claim 19, Kim, Bauer and Mouri disclose wherein the motor vehicle has a left motor vehicle headlight and a right motor vehicle headlight (Figure 1, Paragraph [0050], "a headlamp unit 14R is disposed at the right side front end portion of the vehicle 12, and a headlamp unit 14L is disposed at the light side front end portion of the vehicle 12").

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

US11852746B2 – abstract: "multi-sensor fusion platform for use in autonomous vehicles"; claim 1 describes how a higher resolution camera and lidar can be used to train the lower resolution radar.
This reference also describes changing sensors based on object distance (see, e.g., Fig. 12).

US11675068B2 – claim 7 is a method of multi-sensor fusion with either a monochromatic or RGB camera (and the background states that this technology is widely used in ADAS).

US10404261B1 – abstract: "A system for detecting the surrounding environment of vehicle comprising a RADAR unit … ." The abstract describes changing the frequency of the PLL, and Fig. 16 is explicit that this is related to power consumption.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID ORANGE, whose telephone number is (571) 270-1799. The examiner can normally be reached Mon-Fri, 9-5. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Gregory Morse, can be reached at (571) 272-3838. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DAVID ORANGE/
Primary Examiner, Art Unit 2663
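
For reference, the reply windows recited in the final paragraphs above can be laid out against this action's Mar 06, 2026 mailing date (from the Prosecution Timeline below). A minimal sketch, assuming plain calendar-month arithmetic:

```python
from datetime import date
from dateutil.relativedelta import relativedelta  # pip install python-dateutil

mailing_date = date(2026, 3, 6)  # mailing date of this final action

# File a first reply within two months to preserve the advisory-action
# safe harbor described in the action.
two_month_reply = mailing_date + relativedelta(months=2)  # 2026-05-06

# Shortened statutory period: three months from mailing.
ssp_expires = mailing_date + relativedelta(months=3)      # 2026-06-06

# Absolute statutory maximum, with extensions under 37 CFR 1.136(a).
statutory_max = mailing_date + relativedelta(months=6)    # 2026-09-06

print(two_month_reply, ssp_expires, statutory_max)
```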

Prosecution Timeline

Sep 01, 2022
Application Filed
Dec 23, 2024
Non-Final Rejection — §103, §112
Apr 25, 2025
Response Filed
May 22, 2025
Final Rejection — §103, §112
Aug 21, 2025
Response after Non-Final Action
Sep 19, 2025
Request for Continued Examination
Oct 08, 2025
Response after Non-Final Action
Nov 18, 2025
Non-Final Rejection — §103, §112
Feb 23, 2026
Response Filed
Mar 06, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12567126
INFRASTRUCTURE-SUPPORTED PERCEPTION SYSTEM FOR CONNECTED VEHICLE APPLICATIONS
2y 5m to grant · Granted Mar 03, 2026

Patent 11300964
METHOD AND SYSTEM FOR UPDATING OCCUPANCY MAP FOR A ROBOTIC SYSTEM
2y 5m to grant · Granted Apr 12, 2022

Patent 10816794
METHOD FOR DESIGNING ILLUMINATION SYSTEM WITH FREEFORM SURFACE
2y 5m to grant · Granted Oct 27, 2020

Patent 10433126
METHOD AND APPARATUS FOR SUPPORTING PUBLIC TRANSPORTATION BY USING V2X SERVICES IN A WIRELESS ACCESS SYSTEM
2y 5m to grant · Granted Oct 01, 2019

Patent 10285010
ADAPTIVE TRIGGERING OF RTT RANGING FOR ENHANCED POSITION ACCURACY
2y 5m to grant · Granted May 07, 2019
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 34%
With Interview (+29.4%): 63%
Median Time to Grant: 3y 7m
PTA Risk: High

Based on 151 resolved cases by this examiner. Grant probability derived from career allow rate.
