Prosecution Insights
Last updated: April 18, 2026
Application No. 18/798,333

Camera monitoring (CMS) system for a vehicle, method of controlling the camera monitoring (CMS) system, and vehicle

Status: Final Rejection (§103)
Filed: Aug 08, 2024
Examiner: HUANG, FRANK F
Art Unit: 2485
Tech Center: 2400 — Computer Networks
Assignee: Motherson Innovations Company Limited
OA Round: 2 (Final)
Grant Probability: 75% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 7m
Grant Probability With Interview: 92%

Examiner Intelligence

Career Allow Rate: 75% (519 granted / 691 resolved; +17.1% vs TC avg, above average)
Interview Lift: +17.3% among resolved cases with interview (a strong lift)
Typical Timeline: 2y 7m avg prosecution; 33 currently pending
Career History: 724 total applications across all art units

Statute-Specific Performance

§101: 5.0% (-35.0% vs TC avg)
§103: 72.0% (+32.0% vs TC avg)
§102: 3.6% (-36.4% vs TC avg)
§112: 9.3% (-30.7% vs TC avg)
Deltas are relative to the Tech Center average estimate. Based on career data from 691 resolved cases.
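A quick consistency check on the table above: assuming each "vs TC avg" delta is simply the statute's rate minus the Tech Center average (an assumption — the page does not state the formula), every row implies the same baseline, which suggests a single TC-wide estimate is used for all four statutes.

```python
# Consistency check (assumption: delta = rate - TC average).
# Rates and deltas are the figures shown in the table above.
rates  = {"§101": 5.0, "§103": 72.0, "§102": 3.6, "§112": 9.3}
deltas = {"§101": -35.0, "§103": 32.0, "§102": -36.4, "§112": -30.7}

# Recover the implied TC average for each statute.
implied_tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(implied_tc_avg)  # every statute implies the same 40.0% baseline
```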

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

Applicant's response received has been fully considered and entered.

Response to Arguments

Applicant's arguments have been considered but are moot because the arguments do not apply to any of the references being used in the current rejection.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 1-17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kurtz et al. US 2021/0291739 A1 "Kasarla", in view of Mochizuki, EP4032751A1 "Mochizuki" (IDS), further in view of Kurtz et al. US 8,159,519 B2 "Kurtz".

Regarding claim 1, KASARLA discloses a camera monitoring (as cited below) system for a vehicle (MOCHIZUKI, Abstract), an electronic control unit (ECU) disposed at an interior rearview mirror assembly of a vehicle equipped with the vehicular vision system, wherein the ECU comprises electronic circuitry and associated software (see KASARLA, claim 1), wherein the object is at least one of a person, a tree, a sign, or an animal (KASARLA, para. 22). It is noted that KASARLA is silent about an image processing device configured to identify an object in the scene recorded by the at least one camera; and a control device configured to provide information to be used for control of an infrared lighting device configured to emit infrared light to light up the scene in the field of view of the at least one camera in such a way that infrared light illuminating an object upon identification of the object is reduced to at least one predetermined permissible emission power or is deflected away from the object as claimed.

However, MOCHIZUKI discloses the camera monitoring system comprising: at least one camera (see MOCHIZUKI ¶ [0006], A vehicle infrared lamp system according to a first aspect of the present invention is a vehicle infrared lamp system mounted on a vehicle equipped with an infrared camera, including: an infrared light source configured to emit infrared light; an optical member configured to transmit the infrared light emitted from the infrared light source to a lamp front side; an other-vehicle position acquisition unit configured to acquire position information of another vehicle such as an oncoming vehicle or a preceding vehicle; and a control unit configured to control a lighting state of the infrared light source based on the position information of the oncoming vehicle or the preceding vehicle acquired by the other-vehicle position acquisition unit such that a dimming region where radiant intensity of infrared light is lower than radiant intensity of any other region is formed on at least a part of the oncoming vehicle or the preceding vehicle) configured to record a scene (as cited below) in a field of view (as cited below) of the at least one camera (as cited above, para. [0006], an other-vehicle position acquisition; see also MOCHIZUKI, the flowchart in Fig. 6); an image processing device configured to identify an object (¶ [0175], In addition, in the case where the region where the other vehicle CA is detected is set as the normal region and the region other than the normal region is set as the emphasized region, for example, as shown in FIG. 21, an object having low infrared reflection intensity such as a pedestrian HU in the vicinity of the other vehicle CA is easily detected by the infrared camera 2135) in the scene recorded by the at least one camera (¶ 147, the control unit 2101 controls the infrared unit 2030 to sense an object such as another vehicle by the infrared light emitted from the infrared light source 2031); and a control device configured to provide information to be used for control of an infrared lighting device configured to emit infrared light to light up the scene in the field of view of the at least one camera (as cited above, i.e. MOCHIZUKI ¶ [0006], i.e. a control unit configured to control a lighting state of the infrared light source based on the position information of the oncoming vehicle or the preceding vehicle acquired by the other-vehicle position acquisition unit, and MOCHIZUKI, the flowchart in Fig. 6) in such a way that infrared light illuminating an object upon identification of the object is reduced to at least one predetermined permissible emission power or is deflected away from the object (see MOCHIZUKI para. [0006]: a control unit configured to control a lighting state of the infrared light source based on the position information of the oncoming vehicle or the preceding vehicle acquired by the other-vehicle position acquisition unit… and MOCHIZUKI, the flowchart in Fig. 6).

Both KASARLA and MOCHIZUKI teach systems with camera system used for vehicle operation, and those systems are comparable to that of the instant application.
Because the two cited references are analogous to the instant application, it would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains, to include in the KASARLA disclosure, dimming the lights according to the distance, as taught by MOCHIZUKI. Such inclusion would have increased the usefulness of the system by providing the vehicle infrared lamp system capable of detecting an object having low infrared reflection intensity while preventing occurrence of halation in an image captured by an infrared camera, and would have been consistent with the rationale of combining prior art elements according to known methods to yield predictable results to show a prima facie case of obviousness (MPEP 2143(I)(A)) under KSR International Co. v. Teleflex Inc., 127 S. Ct. 1727, 82 USPQ2d 1385, 1395-97 (2007). Wherein the object is at least one of a person, a tree, a sign, or an animal (the amendment has similar features as claim 15 and is rejected under the same ground; see KURTZ, para. 22).

Regarding claim 2, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the camera monitoring system of claim 1, further comprising a coupling device configured to be coupled to at least one further camera which is configured to record a further scene in a further field of view (as cited below; see also Fig. 23, ¶ 183); wherein the at least one camera and the at least one further camera are arranged such that their fields of view are aligned on a common point (as cited below) or area of interest (MOCHIZUKI, ¶ [0033], the other-vehicle position acquisition unit 102 is an acquisition unit that acquires position information of another vehicle (including, for example, a preceding vehicle, an oncoming vehicle, and the like). The other-vehicle position acquisition unit 102 acquires the position information of the other vehicle based on an image captured by an infrared camera, a visible light camera, or the like (an example of the in-vehicle camera 6) mounted on the vehicle 1, for example. In addition, the other-vehicle position acquisition unit 102 acquires the position information of the other vehicle based on, for example, information acquired by the LiDAR (an example of the radar 7)).

Regarding claim 3, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the camera monitoring system of claim 2, wherein the image processing device is configured to identify an object (i.e. as cited below, i.e. object) in the further scene, in particular for verification (i.e. identifying the vehicle) of an identification (as cited below, i.e. identification of the various incoming vehicles) of an object (MOCHIZUKI, ¶ 6).

Regarding claim 4, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the camera monitoring system of claim 2, wherein the at least one further camera is configured to provide additional distance information (as cited below, dimming according to the distance) for the identified object; wherein the image processing device is configured to determine a distance (as cited below, i.e. distance detection) of the identified object from the infrared lighting device based on the distance information; and wherein the control device is configured to provide information to an infrared lighting device (i.e. controlling the lighting intensity by dimming) in such a way (i.e. dimming as cited below) that the intensity (as cited below, i.e., dimming) of infrared light emitted by the infrared lighting device illuminating the identified object is reduced if the determined distance is equal to or smaller than a predetermined distance (MOCHIZUKI, ¶ In addition, for example, the control unit 101 may set a dimming level of the infrared light source 32 according to the distance from the own vehicle 1 to the other vehicle A based on the distance information between the other vehicle A and the own vehicle 1 acquired from the distance acquisition unit 103. Specifically, as the distance from the own vehicle 1 to the other vehicle A becomes closer, the second current value supplied to the infrared light source 32 is reduced and the dimming level of the infrared light source 32 relative to the dimming region Q1 is increased. On the other hand, as the distance from the own vehicle 1 to the other vehicle A becomes farther, the second current value supplied to the infrared light source 32 is increased and the dimming level of the infrared light source 32 relative to the dimming region Q1 is reduced).

Regarding claim 5, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the camera monitoring system of claim 2, wherein the image processing device is configured to determine a select region of the identified object (as cited below, i.e. the detected vehicle), whether the identified object comprises a face or whether a direction of view (i.e. the distance from the other vehicle) of the object necessitates reduction or deflection of infrared light illuminating the object to avoid blinding of the object (MOCHIZUKI, ¶ 54).

Regarding claim 6, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the camera monitoring system of claim 1, further comprising an infrared lighting device coupled to the control device and comprising at least one infrared, IR, lamp, wherein the control device is configured to control the infrared lighting device in such a way that the intensity of infrared light is reduced to the predetermined intensity (as cited below, i.e. predetermined light level) by the at least one IR lamp being switched off or dimmed down (MOCHIZUKI, ¶ 54).

Regarding claim 7, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the camera monitoring system of claim 6, wherein the infrared lighting device comprises at least one optical system associated with the at least one infrared lamp (as cited below, i.e. one infrared light), wherein the control device is configured to control the infrared lighting device in such a way (as cited below, i.e. dimming) that the intensity of infrared light emitted by the infrared lighting device illuminating the identified object is reduced by adjusting the at least one optical system (MOCHIZUKI, ¶ 54).

Regarding claim 8, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the camera monitoring system of claim 7, wherein the optical system is configured to deflect infrared light away from an identified object and wherein the control device is configured to control the infrared lighting device in such a way that the intensity (as cited below, i.e. when the car approaches the other vehicle, the IR LED is dimmed) is reduced by deflecting infrared light emitted by the at least one infrared lamp away from the identified object by means of the optical system (MOCHIZUKI, ¶ 54).

Regarding claim 9, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the camera monitoring system of claim 1, wherein the at least one camera comprises an image sensor (as cited below, i.e. camera) having a plurality of pixels (as cited below, imaging mode) and the image processing device is configured to distinguish and process information provided by different pixels (MOCHIZUKI, ¶ [0084], The infrared light source 1032 is constituted by a plurality of light emitting diodes (LED) that emit infrared light. The infrared light source 1032 is mounted on a board 1039. On and off of the infrared light source 1032 mounted on the board 1039 is controlled by the control unit 1101. The infrared light source 1032 is controlled to be driven in, for example, an on-off state for an imaging mode (first mode) suitable for imaging by the infrared camera 6a and an on-off state for a sensing mode (second mode) suitable for sensing by the infrared sensor 1034).

Regarding claim 10, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the camera monitoring system of claim 9, wherein the infrared lighting device comprises a plurality of IR lamps arranged in a matrix (as cited below, i.e. multiple LEDs); wherein each IR lamp is associated with a pixel or a cluster of pixels of the image sensor (see Fig. 23); wherein the image processing device is configured to determine in which pixel or cluster of pixels the identified object is located (see the vehicle citation above); and wherein the control device is configured to switch off or dim down (see the dimming citation above) the respective IR lamp associated with the pixel or the cluster of pixels in which the identified object is located (MOCHIZUKI, ¶ 84).

Regarding claim 11, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the camera monitoring system of claim 9, wherein the at least one camera is configured to provide distance information for each pixel of the image sensor (see MOCHIZUKI, i.e. ¶ 52); wherein the image processing device is configured to determine a distance (as cited above, i.e. distance measuring unit) of the identified object from the infrared lighting device based on the distance information supplied for each pixel of the image sensor; and wherein the control device is configured to provide information to control the infrared lighting device in such a way (as cited below, i.e. dimming) that the intensity of infrared light illuminating the identified object is reduced if the determined distance is equal to or smaller than a predetermined distance (MOCHIZUKI, ¶ [0052], Based on the position information of the other vehicle A acquired by the other-vehicle position acquisition unit 102 and the distance information of the other vehicle A acquired by the distance acquisition unit 103, the control unit 101 sets at least a partial region of the other vehicle A as a dimming region where radiant intensity of infrared light is lower than radiant intensity of infrared light with which the other regions are irradiated (step S04). The dimming region of the present example means a region where radiant intensity of infrared light is lower than that in the normal region.
The control unit 101 supplies a current having a second current value smaller than the first current value to the infrared light source 32, and irradiates the dimming region with infrared light at illuminance lower than illuminance of the normal region).

Regarding claim 12, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the camera monitoring system of claim 10, wherein the infrared lighting device comprises: a first printed circuit board comprising at least one first IR lamp attached thereto (KASARLA, claim 8), wherein the at least one first IR lamp comprises a first beam angle (MOCHIZUKI, ¶ 108, In addition, according to the vehicle infrared sensor system 1100, in the imaging mode, the infrared LEDs 01a to 10p are simultaneously lighted to irradiate all regions within an angle of view of the infrared camera 6a with infrared light. In addition, in the sensing mode, only one infrared LED is lighted at each moment, and the infrared LED to be lighted is sequentially changed so as to detect presence or absence of reflected infrared light from any direction, thereby detecting presence or absence and a position of an object on the lamp front side. Therefore, imaging accuracy of the infrared camera 6a can be improved in the imaging mode, and the detection accuracy of the infrared sensor 1034 can be improved in the sensing mode) and a first emission power (see MOCHIZUKI, claim 16), a second printed circuit board comprising at least one second IR lamp attached thereto (as cited above), wherein the at least one second IR lamp comprises a second beam angle and a second emission power (see MOCHIZUKI, ¶ 188), wherein the first printed circuit board is arranged in such a way that infrared light emitted by the at least one first IR lamp illuminates a first partial area of the vehicle in a longitudinal direction of the vehicle (as cited above, i.e. MOCHIZUKI, Fig. 23), and wherein the second printed circuit board is arranged in such a way that infrared light emitted by the at least one second IR lamp illuminates a second partial area of the vehicle in the longitudinal direction of the vehicle (MOCHIZUKI, Fig. 5, ¶ [0057], Meanwhile, in a vehicle headlamp that irradiates the front of an own vehicle with visible light, it is known that a dimming region where illuminance of the visible light is reduced is set in a range where a position of another vehicle is detected in order to prevent a driver of the other vehicle from being dazzled. FIG. 5 shows an example of a dimming region Q2 set on the other vehicle A in a case where a light source emits visible light. As shown in FIG. 5, in order to prevent a driver of the other vehicle A from being dazzled when the visible light is emitted, a right boundary line 51 of the dimming region Q2 is set to be rightward of the right end portion 43 of the other vehicle A, and a left boundary line 52 is set to be leftward of the left end portion 44 of the other vehicle A. That is, the dimming region Q2 in the case of emitting visible light is set to be a region having a margin relative to the region of the other vehicle A. A width W3 of the dimming region Q2 is set to be wider than the width W2 of the other vehicle A. In this case, although it is possible to prevent the driver of the other vehicle A from being dazzled, it is difficult to acquire information on the region of the other vehicle A and information on the pedestrians B and C in the vicinity of the other vehicle A).
Regarding claim 13, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the camera monitoring system of claim 12, wherein at least one of the first printed circuit board or the second printed circuit board comprises at least one heat sink (MOCHIZUKI, ¶ 44) on its rear side for dissipating the heat generated by the respective IR lamp provided on the front side of the printed circuit board (MOCHIZUKI, ¶ [0045], Immediately after the vehicle 1 starts to travel, the normal region is set in all the regions of the irradiable range PI as described above, and the irradiable range PI is irradiated with infrared light with uniform illuminance. In this state, an image of the front of the own vehicle 1 is captured by the infrared camera 34. The image captured by the infrared camera 34 is transmitted to the control unit 101).

Regarding claim 14, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the camera monitoring system of claim 13, wherein the heat sink is a flat, plate-shaped heat sink and is made of a thermally conductive material or material combination or a thermally conductive metal or metal alloy (as cited above; see also MOCHIZUKI, claim 26).

Regarding claim 15, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, discloses a vehicle comprising a camera monitoring system and an infrared lighting device (see rejection of claim 1), wherein the camera monitoring system and the infrared lighting device are arranged on the vehicle in such a way that the field of view of the at least one camera or the further field of view of the at least one further camera comprises (see rejection of claim 1) at least one longitudinal side (as cited below, i.e. Fig. 23, ¶ 183) of the vehicle (see rejection of claim 1), and wherein the camera monitoring system, the at least one camera, the at least one further camera and the infrared lighting device are arranged independently of one another at different vehicle positions of the vehicle or are arranged in a common housing (MOCHIZUKI, Fig. 23, ¶ 183). KURTZ discloses wherein the camera monitoring system identifies a person as an object within the field of view (KURTZ, col. 41, ln 20-40) and within the further (KURTZ, see citation above) field of view (KURTZ, col. 26, ln 42 - col. 27, ln 5). Both D1/D2 and KURTZ teach systems with monitoring system using cameras, and those systems are comparable to that of the instant application. Because the two cited references are analogous to the instant application, it would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains, to include in the D1/D2 disclosure, object recognition with different FOV, as taught by KURTZ. Such inclusion would have increased the usefulness of the system by using zoom to create an artificial experience, which is equivalent to "better than being there", and would have been consistent with the rationale of combining prior art elements according to known methods to yield predictable results to show a prima facie case of obviousness (MPEP 2143(I)(A)) under KSR International Co. v. Teleflex Inc., 127 S. Ct. 1727, 82 USPQ2d 1385, 1395-97 (2007).

Regarding claim 16, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, further discloses the vehicle of claim 15, wherein the vehicle is a truck, a passenger car, a van or a bus (MOCHIZUKI, Fig. 6, abstract).
Regarding claim 17, KASARLA/MOCHIZUKI/KURTZ, for the same motivation of combination, discloses a method for controlling a camera monitoring system, the method comprising: lighting up at least one field of view of at least one camera by means of infrared radiation of an infrared lighting device (see rejection of claim 1); recording a scene in the field of view of the at least one camera (see rejection of claim 1); identifying an object in images of the scene recorded by the at least one camera (see rejection of claim 1); and reducing the infrared light illuminating the object to at least one predetermined permissible emission power or deflecting the infrared light illuminating the object away from the object (see rejection of claim 1); wherein the object is at least one of a person, a tree, a sign, or an animal (KASARLA, para. 22). The other amendment has similar features as claim 15 and is rejected under the same ground; wherein the object is at least one of a person, a tree, a sign, or an animal (the amendment has similar features as claim 15 and is rejected under the same ground; see KURTZ, para. 22).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action.
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
US 20140267757 A1: PARALLAX CORRECTION IN THERMAL IMAGING CAMERAS
US 8520970 B2: Infrared resolution and contrast enhancement with fusion
US 8045764 B2: Expedient encoding system
US 20110144462 A1: MINIATURIZED MULTI-SPECTRAL IMAGER FOR REAL-TIME TISSUE OXYGENATION MEASUREMENT
US 7915652 B2: Integrated infrared and color CMOS imager sensor

Any inquiry concerning this communication or earlier communications from the examiner should be directed to FRANK F HUANG whose telephone number is (571) 272-0701. The examiner can normally be reached Monday-Friday, 8:30 am - 6:00 pm (Eastern Time), Federal Alternative First Friday Off. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jay Patel, can be reached at (571) 272-2988. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/FRANK F HUANG/
Primary Examiner, Art Unit 2485

Prosecution Timeline

Aug 08, 2024: Application Filed
Oct 12, 2025: Non-Final Rejection — §103
Jan 14, 2026: Response Filed
Apr 03, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593052: LOCAL ILLUMINATION COMPENSATION FOR VIDEO ENCODING AND DECODING USING STORED PARAMETERS (granted Mar 31, 2026; 2y 5m to grant)
Patent 12587725: IMAGE CAPTURING DEVICE AND IMAGE CAPTURING METHOD THEREOF (granted Mar 24, 2026; 2y 5m to grant)
Patent 12579815: VIDEO SURVEILLANCE SYSTEM (granted Mar 17, 2026; 2y 5m to grant)
Patent 12574625: SYSTEM WITH LIGHTING CONTROL INCLUDING GROUPED CHANNELS (granted Mar 10, 2026; 2y 5m to grant)
Patent 12568248: METHOD AND APPARATUS FOR DECODING A VIDEO SIGNAL (granted Mar 03, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 75%
With Interview: 92% (+17.3%)
Median Time to Grant: 2y 7m
PTA Risk: Moderate
Based on 691 resolved cases by this examiner. Grant probability derived from career allow rate.
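A minimal sketch of how these projections could be reproduced from the examiner's career record shown on this page. The formulas are assumptions — the page only states that grant probability is derived from the career allow rate (519 granted of 691 resolved) and that an interview adds a +17.3% lift:

```python
# Sketch (assumed formulas): derive the page's headline projections
# from the examiner's career record.

def allow_rate_pct(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def with_interview_pct(base_pct: float, lift_pct: float) -> float:
    """Additive interview lift, capped at 100%."""
    return min(base_pct + lift_pct, 100.0)

base = allow_rate_pct(519, 691)  # 519 granted of 691 resolved
print(round(base))                                    # 75 ("Grant Probability: 75%")
print(round(with_interview_pct(round(base), 17.3)))   # 92 ("With Interview: 92%")
```

Whether the lift is truly additive (rather than, say, an odds-ratio adjustment) is not stated; this sketch simply reproduces the displayed figures.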
