DETAILED ACTION
This Office action is in response to an amendment filed 12/17/2025, wherein claims 1-20 are pending and being examined. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
The amendment to claim 20 directs the claim to statutory subject matter. Therefore the rejection of claim 20 under 35 U.S.C. 101 is withdrawn.
Applicant's arguments filed 12/17/2025 with respect to the rejections under 35 U.S.C. 102(a)(1) and 35 U.S.C. 103 have been fully considered but they are not persuasive.
Applicant argues that Li does not disclose claim 1. Specifically, Applicant argues on page 2 of the filed response: “Applicant respectfully disagrees with these teachings as follows. According to the limitation of claim 1 (quoted above), the camera cleaning unit cleans the camera unit. In contrast, in Li, the cleaning mechanism does not clean the camera 1125 of Fig. 11B of Li. In fact, the cleaning mechanism cleans the slanted surface 1170 in front of the camera 1125, not the camera 1125 itself (see Fig. 11B of Li).” The examiner respectfully disagrees.
The claim recites a “camera unit”, wherein a person of ordinary skill in the art would readily appreciate that the entire device (including all pictured components) shown in Fig.11B of Li may be considered the claimed “camera unit”. Applicant refers to the “slanted surface” of the device shown in Fig.11B, but a camera inside a housing is still considered a “camera unit”, and therefore the camera’s housing (and slanted surface) shown by Li is still part of a “camera unit”. As is common in the art, cameras have housings that are considered to be part of the camera. For example, well-known wearable cameras, trail cameras, dash cameras, etc. are all camera sensors mounted inside housings. Despite all of these products being camera sensors located inside housings, the term used in the art for such devices is still “camera”. That is, the final product of a camera inside its housing is still referred to as a “camera”. A person of ordinary skill would readily appreciate that a camera sensor inside a housing (like that shown in Fig.11B of Li) is considered a “camera unit” under the broadest reasonable interpretation.
Applicant further argues on page 3 of the filed response: “Applicant respectfully disagrees with these teachings as follows. According to the limitation of claim 1 (quoted above), the cleaning control unit detects a contamination on the camera unit. In contrast, in Li, the environment-proof sensor system controller 1300 does not detect contamination on the camera 1125 of Fig. 11B of Li. In fact, at best, the environment-proof sensor system controller 1300 detects contamination on the slanted surface 1170 in front of the camera 1125, not contamination on the camera 1125 itself.” The examiner respectfully disagrees.
Again, under the broadest reasonable interpretation, the entire device shown in Fig.11B may be considered the claimed “camera unit”. Nothing in the claim precludes the entire device (camera sensor included in a housing) shown in Fig.11B from being interpreted as a “camera unit”. Applicant appears to be alleging that the sensor itself is the only thing that can be considered a “camera unit”. However, consistent with usage in the art, an imaging sensor inside a housing, like that of Li, may be considered a “camera unit” under the broadest reasonable interpretation.
Applicant further argues on pages 3-4 of the filed response: “Applicant respectfully disagrees with this teaching as follows. According to this limitation of claim 1, the cleaning control unit is part of said camera arrangement which is arranged on the frontal section of the vehicle. In other words, the cleaning control unit of claim 1 is on said frontal section of the vehicle. In contrast, in Li, the environment-proof sensor system controller 1300 is not described as being on the frontal section of the vehicle 100 of Fig. 1.” The examiner respectfully disagrees.
There is nothing in claim 1 that requires the cleaning control unit that is part of the “camera arrangement” to be at a frontal section of the vehicle’s exterior. The limitation of “a camera arrangement and a frontal section appliance arranged on the frontal section of the exterior of the vehicle” can be interpreted as requiring only the frontal section appliance to be arranged on the front of the vehicle. In fact, if one were to interpret this limitation as requiring the cleaning control unit to be at a frontal exterior section of the vehicle, this would be inconsistent with dependent claim 8, which acknowledges that the cleaning control unit may be part of an electronic control unit (ECU) of the vehicle (and thus not “arranged on the frontal section of the exterior of the vehicle”). That is, claim 8 states “The frontal section camera arrangement of claim 1, wherein said cleaning control unit is integrated with said camera unit in a common housing element, or wherein said cleaning control unit is part of an electronic control unit of the vehicle that is connected via at least one communication channel to said camera unit and to said camera cleaning unit.” Therefore it is abundantly clear that requiring the cleaning control unit to be arranged on a frontal section of the vehicle’s exterior, as alleged by Applicant, is inconsistent with Applicant’s own claimed invention. It is additionally clear from claim 8 that a prior art reference in which a cleaning control unit is part of a control unit of the vehicle and communicates with the cleaning units of a camera (like that of Li) satisfies this limitation.
Furthermore, it is apparent from Li that the cleaning signals may originate from a central controller and then be sent to controllers of the cleaning devices, i.e., to controllers located locally to the sensors. For example, ¶0109 of Li states “Such control signals are then be sent, at 1499, from the cleaning control signal generator 1460 to corresponding controllers that may then control their respective cleaning tools to carry out the scheduled cleaning.” Fig.13A additionally shows the system controller in direct communication with the camera assembly. Therefore, even if one were to interpret the claim in the manner argued by Applicant (which, again, is inconsistent with the invention as claimed), it is clear that the controller(s) of Li may be located anywhere relative to the sensors, including an area on the front exterior of the vehicle.
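For illustration only, the signal path characterized above from Li (a central controller generating cleaning control signals and forwarding them to tool-local controllers, ¶0109, Fig.13A) can be sketched as follows. This is a minimal sketch under assumed interfaces; all class and method names are hypothetical and are not taken from Li.

```python
# Minimal sketch, not Li's implementation: a system controller dispatches
# cleaning control signals to tool-local controllers (cf. Li ¶0109, Fig.13A).
# All names here are hypothetical.
class CleaningToolController:
    """A controller local to one cleaning tool (e.g., air hose, fluid spray)."""
    def __init__(self, name: str):
        self.name = name

    def run(self, duration_s: float) -> None:
        # The local controller drives its own tool; its physical location
        # relative to the sensor does not change the signal path.
        print(f"{self.name}: cleaning for {duration_s} s")


class SystemController:
    """Central controller that generates and forwards cleaning signals."""
    def __init__(self):
        self.tools: dict[str, CleaningToolController] = {}

    def register(self, controller: CleaningToolController) -> None:
        self.tools[controller.name] = controller

    def send_cleaning_signal(self, tool: str, duration_s: float) -> None:
        # Signals originate centrally and are sent to the corresponding
        # local controller, which carries out the scheduled cleaning.
        self.tools[tool].run(duration_s)


# Usage: the central controller commands the air-blow tool's local controller.
system = SystemController()
system.register(CleaningToolController("air_blow"))
system.send_cleaning_signal("air_blow", duration_s=5.0)
```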
Applicant further argues on pages 4-5 of the filed response: “Firstly, this limitation of claim 13 includes detecting contamination on a camera unit. In contrast, Li does not detect contamination on the camera unit 1125 of Fig. 11B of Li. In fact, Li detects contamination on the slanted surface 1170, not contamination on the camera unit 1125 (see Fig. 11B). Secondly, this limitation of claim 13 includes an air-based cleaning process of the camera unit. In contrast, Li provides an air-based cleaning process of the slanted surface 1170, not the camera unit 1125 (see Fig. 11B). Thirdly, this limitation of claim 13 includes a liquid-based cleaning process of the camera unit. In contrast, Li provides a liquid-based cleaning process of the slanted surface 1170, not the camera unit 1125 (see Fig. 11B).” The examiner respectfully disagrees.
This argument is the same as the argument originally presented, namely that the camera assembly shown in Fig.11B cannot be considered the claimed “camera unit”. As explained above, nothing in the claim precludes the entire device (camera sensor included in a housing) shown in Fig.11B of Li from being interpreted as a “camera unit”. Although Applicant alleges that the sensor itself is the only thing that can be considered a “camera unit”, under the broadest reasonable interpretation an imaging sensor inside a housing, like that of Li, may be considered a “camera unit”. It is well known in the art to refer to imaging sensors inside protective housings as “cameras”.
Applicant further argues on pages 5-6 of the filed response: “Applicant respectfully disagrees with these teachings as follows. According to claim 5, the frontal section appliance includes one or more connection elements for attaching said camera unit to said chassis. In contrast, in Li (Fig. 2B), the sensor rack 260 is not described as including any connection elements that attach the camera 140 (Fig. 1) to the chassis of the vehicle 100.” The examiner respectfully disagrees.
It is clear from the various figures of Li that the sensor rack is attached to the chassis of the vehicle. If there were no connection elements, as alleged by Applicant, the sensor rack would simply fall off. Therefore “connection elements” are necessarily present in the Li prior art in order for the sensor rack to be properly attached to the vehicle. Claim 5 recites nothing further about the “connection elements” beyond their function of attaching (mounting) a camera unit to a chassis of the vehicle. As recited in ¶0047-¶0048 of Li, “different sensor racks are mounted at different parts of a vehicle… sensor racks are installed on a vehicle for their respective roles to collect needed information.” Because Li discloses a sensor rack “installed” or “mounted” to a vehicle, it is inherent that any such installation or mounting requires “connection elements” performing the claimed function of “attaching said camera unit to said chassis”.
Applicant further argues on page 6 of the filed response: “Applicant respectfully disagrees with this teaching. According to this limitation of claim 11, the camera unit is arranged on the chassis. In contrast, in Li, the camera 140 (Fig. 1) may be mounted to the sensor rack 260 (Fig. 2B), or possibly to the bumpers or body panels of the vehicle 100, but not necessarily to the chassis of the vehicle 100.” The examiner respectfully disagrees.
Applicant argues that Li’s disclosure of mounting the sensor rack to any part of the vehicle does “not necessarily” include the chassis of the vehicle. The various figures of Li show that the sensor rack may be attached to vehicle frames, bumpers, and panels, and thus the sensor rack may also be attached to a vehicle “chassis”. The claim does not delineate what parts of the vehicle are considered the chassis. In fact, in commercial vehicles, a “chassis cab” consists of an assembly of all the essential parts of a truck, including a frame and cab, without the towed body. Fig.1, Fig.2B, Fig.4A, and Fig.4B of Li show that sensors can be mounted on the frame of the vehicle, the cab of the vehicle, etc., and ¶0047 notes that the sensors may be “strategically installed at appropriate parts of the vehicle”. Therefore a person of ordinary skill in the art would readily appreciate that, under the broadest reasonable interpretation, the sensors of Li may be installed on a vehicle’s “exterior… chassis”.
Applicant further argues on page 7 of the filed response: “According to this limitation of claim 18, a first sensor data used to detect and select said object is captured before a second sensor data used for said probing is captured at the predicted location and at the predicted point in time. In contrast, in Herman, the first sensor data captured by the time-of-flight sensor and used as a reference and the second sensor data used for comparison with the reference are captured at the same time (see Abstract of Herman)… the core of the present invention consists in predicting a position and/or a point in time at which the object should be detected by a sensor unit or by the camera unit. This prediction is then validated. However, both Li and Herman do not reveal the core of the invention, namely the prediction of a position and/or a point in time at which the object should be detected by a sensor unit or by the camera unit.” The examiner respectfully disagrees.
Applicant argues that the combination of references does not “reveal the core of the invention”. Herman discloses determining light reflectivity maps and depth maps representing predicted locations of objects that should be detected in subsequent data captures. For example, Fig.4 of Herman shows a reflectivity map and a depth map representing a detected object area, wherein this data is captured at a time when no occluding debris is attached to the sensor. As shown in Fig.5 and described in ¶0047 and ¶0056-¶0059 of Herman, a subsequent data capture uses pixel comparison to determine that the vehicle object is not detected at certain image locations, which indicates occluding debris. Therefore Herman does disclose predicting a position… at which the object should be detected by a sensor/camera, because the maps shown in Fig.4 represent information indicating the pixel locations of an object that should be detected by the sensor/camera if no occlusion were to occur.
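As a purely illustrative aid (not Herman's actual implementation), the pixel comparison characterized above can be sketched as follows: an observed map is compared against a predicted map of where the object should appear, and a large count of "missing" pixels indicates occluding debris. The names and threshold values below are hypothetical.

```python
# Illustrative sketch only, not Herman's code: flag an occlusion when pixels
# that should show the object (per the predicted reflectivity map) do not
# show it in the observed map. Thresholds and names are hypothetical.
import numpy as np

def occlusion_detected(predicted_map: np.ndarray,
                       observed_map: np.ndarray,
                       reflectivity_threshold: float = 0.2,
                       missing_pixel_limit: int = 500) -> bool:
    # Pixels where the prediction expects a reflective return from the object.
    expected = predicted_map > reflectivity_threshold
    # Expected pixels whose observed reflectivity falls below the threshold.
    missing = expected & (observed_map < reflectivity_threshold)
    # Many missing pixels suggest debris occluding the sensor, not the scene.
    return int(missing.sum()) > missing_pixel_limit
```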
For the reasons set forth above, Applicant’s arguments are not persuasive and the claims are rejected as outlined below.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1, 3, 5-14, 17, and 20 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Li et al. (US 2021/0181502) (hereinafter Li). Li was cited in the information disclosure statement (IDS) filed 8/26/2024.
In regard to claim 1, this claim is drawn to a camera arrangement corresponding to the vehicle of claim 10, wherein claim 1 contains the same limitations as claim 10 and is therefore rejected upon the same basis. See below for the rejection of claim 10.
In regard to claim 3, Li discloses the frontal section camera arrangement of claim 1. Li further discloses,
wherein said cleaning control unit is further connected to one or more sensing units of the vehicle for receiving corresponding sensing data; and, said cleaning control unit is further configured to detect and/or identify a type of contamination using the sensing data [¶0046, ¶0074-¶0076, ¶0097-¶0101, ¶0105-¶0106].
In regard to claim 5, Li discloses the frontal section camera arrangement of claim 1. Li further discloses,
wherein said exterior of the vehicle includes a chassis [Fig.1, Fig.4A, Fig.4B, Fig.5]; and,
said frontal section appliance includes one or more connection elements for attaching said camera unit to said chassis of the vehicle [¶0050-¶0052, ¶0114-¶0115, Fig.1, Fig.4A, Fig.4B, Fig.15].
In regard to claim 6, Li discloses the frontal section camera arrangement of claim 5. Li further discloses,
wherein said camera cleaning unit is integrated into said frontal section appliance [Fig.11B, Fig.15, Fig.16A, Fig.16B, ¶0094-¶0096, ¶0114-¶0116].
In regard to claim 7, Li discloses the frontal section camera arrangement of claim 1. Li further discloses,
wherein said camera unit and said camera cleaning unit are integrated into a common housing element [Fig.11B, ¶0083, ¶0094-¶0096].
In regard to claim 8, Li discloses the frontal section camera arrangement of claim 1. Li further discloses,
wherein said cleaning control unit is integrated with said camera unit in a common housing element, or wherein said cleaning control unit is part of an electronic control unit of the vehicle that is connected via at least one communication channel to said camera unit and to said camera cleaning unit [Fig.13A, Fig.14B, ¶0096-¶0100, ¶0109-¶0111].
In regard to claim 9, Li discloses the frontal section camera arrangement of claim 1. Li further discloses,
wherein said exterior includes a chassis and/or a bumper [Fig.1, Fig.4B, Fig.5].
In regard to claim 10, Li discloses a vehicle [Fig.1; vehicle (100)] comprising:
an exterior defining a frontal section [Fig.1; vehicle with frontal section];
a frontal section camera arrangement [Fig.1; vehicle (100) with cameras mounted on the frontal section] including:
a camera arrangement and a frontal section appliance arranged on said frontal section of said exterior of the vehicle [¶0047-¶0050; different sensor racks are mounted at different parts of a vehicle to provide sensing information to facilitate autonomous driving… sensors deployed on an autonomous vehicle may also be mounted in a sensor rack. Fig.1, Fig.6; various sensors mounted on a sensor rack for attachment to the front side of the vehicle]; said camera arrangement including:
a camera unit arranged on said frontal section appliance for said frontal section of said exterior of the vehicle and configured to provide image data [Fig.2B, Fig.4B; cameras arranged on the front sensor rack. ¶0074; camera 1110 with lens 1102 having environment induced objects deposited thereon and the resultant images];
a camera cleaning unit configured to clean the camera unit [¶0094-¶0095; cleaning mechanisms may be provided alone or in combination on housing assembly… mechanisms are controlled to operate when cleaning is needed] and said camera cleaning unit including:
a compressed air provision unit for providing compressed air for use in an air-based cleaning process for said camera unit [¶0090-¶0091; additional mechanism to clean the slanted surface 1170 may be introduced… air blow controller 1320 … output through the delivery device 1245 (e.g., conduit) (if provided), and through the air hole 1150. ¶0080-¶0081]; and,
a liquid provision unit for providing a liquid for use in a liquid-based cleaning process for said camera unit [¶0080-¶0081; sensor is a camera with its lens facing the cross section of the protruded portion 1165 so that visual information is sensed by the camera through the transparent slanted surface 1170 of the housing assembly 1120… one or more mechanisms or devices may include a fluid cleaning device. ¶0082-¶0084]; and,
said camera cleaning unit being configured to receive operation instructions for driving the camera cleaning unit in the air-based cleaning process and/or the liquid-based cleaning process [¶0100-¶0103; Based on the configuration/schedule for applying the needed cleaning, the environment-proof system controller 1300 then generates, at 1375, control signals to implement the cleaning schedule and sends, at 1380, the control signals to the controller(s) responsible for controlling the cleaning means to carry out the scheduled cleaning… to actually control the cleaning tools to carry out the scheduled cleaning, appropriate control signals for different cleaning tools may be generated and used to activate the selected cleaning tools. ¶0091; An air blow controller 1320 (see FIG. 13A) may be implemented to initiate or drive the compressor 1250 in response to an input command signal];
said camera arrangement further including:
a cleaning control unit connected to said camera unit for receiving the image data and being connected to said camera cleaning unit for providing said operation instructions [¶0084; controller 1330 (see FIG. 13A) may be implemented to initiate or drive the pump 1210 (and/or fluid cleaning device) in response to an input command signal (e.g., based on sensed information (such as visual information observed by a camera residing in the environment-proof sensor housing assembly 1120). ¶0096-¶0097; a sensor (e.g., a camera) 1125 housed in an environment-proof sensor housing assembly 1120 is connected to environment-proof sensor system controller 1300… activate one or more controllers related to different cleaning means. ¶0075; spurious objects/events from the environment as appeared in an image of a scene acquired by the camera may be detected from the image using various known signal processing techniques];
said cleaning control unit being configured:
to detect a contamination on the camera unit [¶0097; raindrops are detected in images acquired by a camera residing in the assembly 1120… debris are observed from the images from the camera (e.g., objects consistently detected in different frames without any change or movement). ¶0074-¶0076; spurious objects identified by detected edges from the image… detected degradation information may also be used to estimate the cause of the degradation]; and,
upon detecting the contamination on the camera unit, to activate said air-based cleaning process by providing an operation instruction indicative thereof [¶0097; If debris are observed from the images from the camera (e.g., objects consistently detected in different frames without any change or movement), the sensing quality control unit 1310 may invoke the air blow controller 1320 to activate the air hose 1150 to blow off the debris deposited on the slanted surface 1170. ¶0111; first control signal includes instructions to the air blow controller 1320 to control the air hose to blow for 5 seconds];
upon determining that the contamination is still on the camera unit after having performed the air-based cleaning process during a predetermined process time, to additionally or alternatively activate the liquid-based cleaning process by providing an operation instruction indicative thereof [¶0097-¶0101; environment-proof system controller 1300 may first apply the air hose to blow off dust/debris when detected and then monitor whether it satisfactorily remove the degradation without more cleaning. If it further observed that the degradation still exists (even though it may have been reduced via air hose 1150), the environment-proof system controller 1300 may then invoke additional cleaning using either the wiper blade 1140 and/or the hydrophobic spray 1130… cleaning configuration or schedule, e.g., when to apply which cleaning means for how long. ¶0111; second control signal includes instructions to the hydrophobic spray controller 1330 to control the spray holes (and the associated valves and pressure) to release cleaning fluid for 3 seconds]; and,
said camera unit of the camera arrangement being arranged on the frontal section of the exterior of the vehicle [Fig.1, Fig.2B, Fig.4A].
Li discloses a vehicle system wherein multiple sensors, including camera sensors, are mounted within one or more sensor racks and/or directly attached to the front of a vehicle. Images of the vehicle’s environment are captured and analyzed by a controller, which, based on objects detected in the captured images, determines whether defects/debris are obstructing a window of the camera. The controller may differentiate between water droplets and dirt (i.e., determine the “type of contamination”), and depending on what substance is detected, the controller controls one or more cleaning elements. Specifically, the controller may generate signals to activate an air-based cleaning process and/or a liquid-based cleaning process depending on the sensed obstruction information, as sketched below.
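For clarity only, a minimal sketch of this staged logic (detect, air-clean for a predetermined process time, re-check, escalate to liquid cleaning) follows; the callables are hypothetical placeholders, not Li's code, and the 5-second and 3-second durations merely echo the examples in Li ¶0111.

```python
# Minimal sketch of the staged cleaning logic mapped above to Li ¶0097-¶0101:
# air-based cleaning first, then liquid-based cleaning if the contamination
# persists. All interfaces are hypothetical.
def clean_camera(detect_contamination, run_air_cleaning, run_liquid_cleaning):
    """detect_contamination() -> bool; the run_* callables drive the hardware."""
    if not detect_contamination():
        return  # nothing on the camera unit; no cleaning needed
    # Stage 1: air-based cleaning for a predetermined process time
    # (cf. Li ¶0111: blow air for 5 seconds).
    run_air_cleaning(duration_s=5.0)
    # Stage 2: if the contamination is still present after the air-based
    # process, additionally or alternatively activate the liquid-based process
    # (cf. Li ¶0111: release cleaning fluid for 3 seconds).
    if detect_contamination():
        run_liquid_cleaning(duration_s=3.0)
```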
In regard to claim 11, Li discloses the vehicle of claim 10. Li further discloses,
wherein said exterior of said vehicle includes a chassis [Fig.1, Fig.4A, Fig.4B, Fig.5]; and, said camera unit is arranged on said chassis [Fig.1, Fig.4A, Fig.4B, Fig.5].
In regard to claim 12, Li discloses the vehicle of claim 11. Li further discloses,
wherein said compressed air provision unit is pneumatically connected to an air compressor of the vehicle for receiving the compressed air [¶0090-¶0091].
In regard to claim 13, this claim is drawn to a method corresponding to the vehicle of claim 10, wherein every limitation in claim 13 is anticipated or rendered obvious by a corresponding limitation in claim 10. See the rejection of claim 10.
In regard to claim 14, this claim is drawn to a method corresponding to the vehicle of claim 11, wherein every limitation in claim 14 is anticipated or rendered obvious by a corresponding limitation in claim 11. See the rejection of claim 11.
In regard to claim 17, this claim is drawn to a method corresponding to the frontal section camera arrangement of claim 3, wherein every limitation in claim 17 is anticipated or rendered obvious by a corresponding limitation in claim 3. See the rejection of claim 3.
In regard to claim 20, this claim is drawn to a computer program corresponding to the vehicle of claim 10, wherein every limitation in claim 20 is anticipated or rendered obvious by a corresponding limitation in claim 10. See the rejection of claim 10. Furthermore, Li discloses that the system may be implemented as executable instructions stored on a computer-readable medium in at least ¶0125-¶0128.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim(s) 2 and 15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Li (US 2021/0181502) in view of Yamauchi et al. (US 2020/0391702) (hereinafter Yamauchi).
In regard to claim 2, Li discloses the frontal section camera arrangement of claim 1. Li further discloses,
wherein said cleaning control unit is further configured to identify a type of contamination… and to directly select a corresponding one of the air-based cleaning process and the liquid-based cleaning process in dependence on the identified type of contamination and a predetermined association rule between identifiable types of contaminations and cleaning processes [¶0097-¶0101].
Li does not explicitly disclose wherein said cleaning control unit is further configured to identify a type of contamination from a predetermined list of identifiable types of contaminations. However Yamauchi discloses,
wherein said cleaning control unit is further configured to identify a type of contamination from a predetermined list of identifiable types of contaminations [¶0075-¶0076; ECU 22 determines the type of dirt on the cleaning target corresponding to the ith onboard sensor based on the detection data… three types, these being “water droplets”, “muddy water”, and “Dried-on mud”. ¶0089; cleaning condition table 23 b stipulates respective cleaning conditions according to given values of the various input parameter values for the vehicle data, environmental data, dirtiness level, type of dirt] and to directly select a corresponding one of the air-based cleaning process and the liquid-based cleaning process in dependence on the identified type of contamination and a predetermined association rule between identifiable types of contaminations and cleaning processes [¶0089-¶0093; respective cleaning parameter values are then chosen according to the priority ranking for each cleaning parameter as stipulated in the cleaning condition table 23 b. ¶0114-¶0117. Fig.10].
Yamauchi, like Li, discloses a system for cleaning one or more windows of one or more camera sensors mounted on a vehicle. As shown in at least Fig.6, a table (“list”) of predetermined types of dirt is stored with corresponding cleaning conditions. As noted in ¶0113-¶0117 and as shown in Fig.10, an air-based cleaning process is selected for a first type of contamination and a liquid-based cleaning process is selected for a second type of contamination.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the frontal section camera disclosed by Li with the predetermined list of types of contamination as disclosed by Yamauchi in order to choose cleaning conditions in a simplified manner and prevent unnecessary cleaning processes [Yamauchi ¶0011-¶0014, ¶0117-¶0121]. As disclosed by Yamauchi, pre-storing types of contamination with cleaning conditions allows for reduced processing, as only a finite number of contamination types need to be considered.
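As an illustration of such a predetermined association rule (in the spirit of Yamauchi's cleaning condition table, Fig.6 and Fig.10, but with hypothetical entries), the mapping from a finite list of identifiable contamination types to cleaning processes can be sketched as a simple lookup:

```python
# Hypothetical sketch of a predetermined association rule between a finite
# list of identifiable contamination types and cleaning processes, in the
# spirit of Yamauchi's cleaning condition table. Entries are illustrative.
CLEANING_TABLE = {
    "water_droplets": "air",     # droplets can be blown off without fluid
    "muddy_water":    "liquid",  # wet dirt is flushed with cleaning fluid
    "dried_on_mud":   "liquid",  # dried deposits need fluid to loosen
}

def select_cleaning_process(contamination_type: str) -> str:
    """Directly select the cleaning process for an identified type.

    Raises KeyError for a type outside the predetermined list, mirroring
    the claim's finite list of identifiable contamination types.
    """
    return CLEANING_TABLE[contamination_type]
```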
In regard to claim 15, this claim is drawn to a method corresponding to the frontal section camera arrangement of claim 2, wherein every limitation in claim 15 is anticipated or rendered obvious by a corresponding limitation in claim 2. See the rejection of claim 2.
Claim(s) 4 and 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Li (US 2021/0181502) in view of Gassend et al. (US 2020/0142041) (hereinafter Gassend).
In regard to claim 4, Li discloses the frontal section camera arrangement of claim 3. Li further discloses,
a wiper status sensor;
a radar sensor [¶0045-¶0046];
a radar sensor in a frontward oriented mounting position close to said camera unit [¶0045-¶0046];
a LIDAR sensor [¶0045-¶0047];
a LIDAR sensor in a frontward oriented mounting position close to said camera unit [¶0045-¶0047];
an ultrasound sensor;
an ultrasound sensor in a frontward oriented mounting position close to said camera unit;
an infrared sensor;
an infrared sensor in a frontward oriented mounting position close to the camera unit; or
an auxiliary camera unit different than said camera unit [¶0045-¶0047].
However Li does not explicitly disclose wherein said sensing units connected to said cleaning control unit and configured to provide the sensing data used for detecting and/or identifying the contamination on the camera unit include one or more of: a wiper status sensor; a radar sensor; a radar sensor in a frontward oriented mounting position close to said camera unit; a LIDAR sensor; a LIDAR sensor in a frontward oriented mounting position close to said camera unit; an ultrasound sensor; an ultrasound sensor in a frontward oriented mounting position close to said camera unit; an infrared sensor; an infrared sensor in a frontward oriented mounting position close to the camera unit; or an auxiliary camera unit different than said camera unit. However Gassend discloses,
wherein said one or more sensing units connected to said cleaning control unit and configured to provide the sensing data used for detecting and/or identifying the contamination on the camera unit [¶0024. ¶0139-¶0141; likelihood that the occlusion is physically coupled to the LIDAR device (e.g., attached to the LIDAR device, or attached to another nearby structure, etc.), the extent of the occlusion… system can make these determinations by assessing various factors such as: returning light pulse intensities/numbers, prior information about the prevalence of a certain type of occlusion in a region of the environment where the LIDAR device is currently located, a speed of a vehicle on which the LIDAR device is mounted, and/or corroborating data from other sensors. ¶0178; use the obtained data to distinguish external occlusions (e.g., external object, dust, etc.) that obstruct at least a portion of the scanned FOV] include one or more of:
a wiper status sensor;
a radar sensor [¶0089; sensors (e.g., RADARs, SONARs, cameras, other active sensors, etc.)];
a radar sensor in a frontward oriented mounting position close to said camera unit;
a LIDAR sensor [¶0173; LIDAR device of method 400 can be included in a vehicle such as vehicle 300. In this example, the data indicating the location of the target can be from another sensor of the vehicle, such as one or more of the sensors in sensor system 304 (e.g., another LIDAR, a RADAR, a camera, etc.), which scans a second FOV that at least partially overlaps with the FOV of the LIDAR device];
a LIDAR sensor in a frontward oriented mounting position close to said camera unit;
an ultrasound sensor [¶0089; sensors (e.g., RADARs, SONARs, cameras, other active sensors, etc.) ¶0044; (e.g., infrasonic, ultrasonic, etc.)];
an ultrasound sensor in a frontward oriented mounting position close to said camera unit;
an infrared sensor [¶0089; sensors (e.g., RADARs, SONARs, cameras, other active sensors, etc.) ¶0044; (e.g., infrasonic, ultrasonic, etc.)];
an infrared sensor in a frontward oriented mounting position close to the camera unit; or
an auxiliary camera unit different than said camera unit [¶0028; non-exhaustive list of example sensors of the present disclosure includes LIDAR sensors, RADAR sensors, SONAR sensors, active IR cameras, and/or microwave cameras, among others].
Although Li discloses multiple sensors for detecting data, Li does not explicitly disclose detecting/identifying a contamination for a first camera/sensor using data of an additional camera/sensor; Gassend is therefore relied upon. Gassend discloses a vehicle system wherein multiple sensor devices, including cameras, LIDARs, etc., are mounted on a vehicle. The degree of occlusion of a first sensor can be determined by using corroborating data from a second sensor having a field of view overlapping that of the first sensor.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the frontal section camera disclosed by Li with the additional sensor data of Gassend in order to provide occlusion detection for a sensor array with a large detection area [Gassend Abstract, ¶0051-¶0052, ¶0173-¶0178]. As disclosed by Gassend, using multiple devices assists with occlusion detection for wide-angle scanning device by tracking occlusion in overlapping fields of view.
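For illustration only (a sketch under assumed data structures, not Gassend's implementation), the corroboration idea characterized above can be reduced to a set comparison over the overlapping field of view: regions where the second sensor reports a target but the first does not are candidate occlusions on the first sensor.

```python
# Hedged sketch of cross-sensor corroboration as characterized from Gassend:
# a miss by the primary sensor, in a region where a corroborating sensor with
# an overlapping field of view reports a target, is attributed to an occlusion
# on the primary sensor. Types and names are hypothetical.
from typing import Set, Tuple

Region = Tuple[int, int]  # coarse (row, col) cell of a shared detection grid

def candidate_occlusions(primary_detections: Set[Region],
                         corroborating_detections: Set[Region],
                         overlap: Set[Region]) -> Set[Region]:
    # Targets the corroborating sensor sees inside the shared field of view
    # that the primary sensor fails to see -> likely occluded on the primary.
    return (corroborating_detections & overlap) - primary_detections
```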
In regard to claim 16, this claim is drawn to a method corresponding to the frontal section camera arrangement of claim 4, wherein every limitation in claim 16 is anticipated or rendered obvious by a corresponding limitation in claim 4. See the rejection of claim 4.
Claim(s) 18 and 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Li (US 2021/0181502) in view of Herman et al. (US 2020/0398797) (hereinafter Herman).
In regard to claim 18, Li discloses the method of claim 13. Li does not explicitly disclose, wherein the step of detecting the contamination and/or identifying a type of contamination further comprises: selecting an object detected by the camera unit and/or one or more of the sensing units; predicting a location and/or point in time for detecting the selected object by another one of the camera unit or one or more of the sensing units; capturing image data by the camera unit and/or sensing data by one or more of the sensing units; probing whether the selected object has been detected at the predicted location and/or point in time by the camera unit and/or one or more of the sensing units; wherein, as follow up of the probing: a determination is made that the selected object has not been detected at the predetermined location and/or point in time by the camera unit and/or one or more of the sensing units based thereon it is decided and/or indicated by a signal that a contamination has been detected. However Herman discloses,
wherein the step of detecting the contamination and/or identifying a type of contamination [¶0060; identifying the type and amount of obstruction causing the occlusion… if the occlusion is heavy dirt on the camera 205, the computer 105 can actuate an air nozzle to blow air onto the camera 205 to remove the dirt. In another example, if the occlusion is water and dirt on the camera, the computer 105 can actuate a fluid nozzle to spray cleaning fluid onto the camera 205] further comprises:
selecting an object detected by the camera unit and/or one or more sensing units [¶0047-¶0048; “reflectivity map” is a map of an amount of light reflected from objects viewed by the camera 205];
predicting a location and/or point in time for detecting the selected object by another one of the camera unit or one or more of the sensing units [¶0047; depth map 405 based on data 115 collected from a time-of-flight sensor 200 having no occluding debris 300. ¶0052-¶0053; develop a three-dimensional cross section of the object detected by sensors 110 other than the time-of-flight sensor 200, e.g., with radar and/or lidar, and can interpolate a surface connecting the data 115 from the sensors 110 to generate the predicted depth map];
capturing image data by the camera unit and/or sensing data by one or more of the sensing units [¶0047-¶0048; To generate a reflectivity map 400, 500 and a depth map 405, 505, the computer 105 instructs the time-of-flight sensor 200 and one or more other sensors 110 to collect data 115 around the vehicle 101… image sensor 110 can collect image data 115 of a target vehicle 101 near the vehicle 101… additional sensors 110 can include, e.g., lidar, radar, infrared emitters, ultrasonic transducers, etc.];
probing whether the selected object has been detected at the predicted location and/or point in time by the camera unit and/or one or more of the sensing units [¶0052-¶0053; Based on the predicted amplitude and phase delay, the computer 105 can predict a reflectivity and a depth from the object for the pixel. Upon applying the ray-tracing technique for each pixel of the fused data 115, the computer 105 can collect the predicted reflectivity and depth into respective maps, i.e., a predicted reflectivity map and a predicted depth map. ¶0028-¶0032];
wherein, as follow up of the probing:
a determination is made that the selected object has not been detected at the predetermined location and/or point in time by at least one of the camera unit and one or more of the sensing units, and based thereon it is decided and/or indicated by a signal that a contamination has been detected [¶0057-¶0059; computer 105 can determine an occlusion of the time-of-flight sensor 200 based on the reflectivity map 400, 500, the depth map 405, 505, the predicted reflectivity map, and the predicted depth map… computer 105 can determine the occlusion of the time-of-flight sensor 200 based on a number of pixels of the reflectivity map 400, 500 and/or the depth map 405, 505 that have a reflectivity below a reflectivity threshold or a depth below a depth threshold. Fig.6].
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the frontal section camera disclosed by Li with the probing of Herman in order to allow for multiple data sources to be fused together, thereby improving accuracy of collected data [Herman ¶0036, ¶0046-¶0048, ¶0050-¶0053].
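For clarity, the probing steps mapped above (select an object, predict where/when it should reappear, capture new data, probe at the prediction, and signal contamination on a miss) can be sketched as follows; all interfaces are hypothetical and not drawn from Herman.

```python
# Illustrative walk-through of the probing steps mapped above to Herman;
# a sketch under assumed interfaces, not the reference's implementation.
from dataclasses import dataclass
from typing import Set, Tuple

@dataclass
class Prediction:
    pixel_region: Set[Tuple[int, int]]  # where the object should appear
    timestamp: float                    # when it should appear

def probe_for_contamination(select_object, predict, capture, detect_in) -> bool:
    """Return True when the selected object is NOT found at the predicted
    location/time, which the claim treats as indicating a contamination."""
    obj = select_object()                   # 1. select a detected object
    prediction: Prediction = predict(obj)   # 2. predict location / point in time
    frame = capture(prediction.timestamp)   # 3. capture new image/sensor data
    found = detect_in(frame, prediction.pixel_region)  # 4. probe at prediction
    return not found                        # 5. miss -> contamination signal
```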
In regard to claim 19, Li discloses the method of claim 13. Li does not explicitly disclose, wherein the step of detecting the contamination and/or identifying a type of contamination further comprises: selecting an object detected by the camera unit and/or one or more of the sensing units; predicting a location and/or point in time for detecting the selected object by another one of the camera unit or one or more of the sensing units; capturing image data by the camera unit and/or sensing data by one or more of the sensing units; probing whether the selected object has been detected at the predicted location and/or point in time by the camera unit and/or one or more of the sensing units; wherein, as follow up of the probing: a determination is made that the selected object has not been detected at the predetermined location and/or point in time by the camera unit and/or one or more of the sensing units. However Herman discloses,
wherein the step of detecting the contamination and/or identifying a type of contamination [¶0060; identifying the type and amount of obstruction causing the occlusion… if the occlusion is heavy dirt on the camera 205, the computer 105 can actuate an air nozzle to blow air onto the camera 205 to remove the dirt. In another example, if the occlusion is water and dirt on the camera, the computer 105 can actuate a fluid nozzle to spray cleaning fluid onto the camera 205] further comprises:
selecting an object detected by the camera unit and/or one or more sensing units [¶0047; “reflectivity map” is a map of an amount of light reflected from objects viewed by the camera 205];
predicting a location and/or point in time for detecting the selected object by another one of the camera unit or one or more of the sensing units [¶0047; depth map 405 based on data 115 collected from a time-of-flight sensor 200 having no occluding debris 300. ¶0052-¶0053; develop a three-dimensional cross section of the object detected by sensors 110 other than the time-of-flight sensor 200, e.g., with radar and/or lidar, and can interpolate a surface connecting the data 115 from the sensors 110 to generate the predicted depth map];
capturing image data by the camera unit and/or sensing data by one or more of the sensing units [¶0047-¶0048; To generate a reflectivity map 400, 500 and a depth map 405, 505, the computer 105 instructs the time-of-flight sensor 200 and one or more other sensors 110 to collect data 115 around the vehicle 101… image sensor 110 can collect image data 115 of a target vehicle 101 near the vehicle 101… additional sensors 110 can include, e.g., lidar, radar, infrared emitters, ultrasonic transducers, etc.];
probing whether the selected object has been detected at the predicted location and/or point in time by the camera unit and/or one or more of the sensing units [¶0052-¶0053; Based on the predicted amplitude and phase delay, the computer 105 can predict a reflectivity and a depth from the object for the pixel. Upon applying the ray-tracing technique for each pixel of the fused data 115, the computer 105 can collect the predicted reflectivity and depth into respective maps, i.e., a predicted reflectivity map and a predicted depth map. ¶0028-¶0032];
wherein, as follow up of the probing:
a determination is made that the selected object has not been detected at the predetermined location and/or point in time by the camera unit and/or one or more of the sensing units [¶0057-¶0059; computer 105 can determine an occlusion of the time-of-flight sensor 200 based on the reflectivity map 400, 500, the depth map 405, 505, the predicted reflectivity map, and the predicted depth map… computer 105 can determine the occlusion of the time-of-flight sensor 200 based on a number of pixels of the reflectivity map 400, 500 and/or the depth map 405, 505 that have a reflectivity below a reflectivity threshold or a depth below a depth threshold. Fig.6].
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the frontal section camera disclosed by Li with the probing of Herman in order to allow for multiple data sources to be fused together, thereby improving accuracy of collected data [Herman ¶0036, ¶0046-¶0048, ¶0050-¶0053].
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to REBECCA A VOLENTINE whose telephone number is (571)270-7261. The examiner can normally be reached Monday-Friday 9am - 5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Joe Ustaris can be reached at (571)272-7383. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/REBECCA A VOLENTINE/Primary Examiner, Art Unit 2483 March 31, 2026