DETAILED ACTION
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 7 and 17 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the enablement requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to enable one skilled in the art to which it pertains, or with which it is most nearly connected, to make and/or use the invention.
For clarification, claim 7 requires “the fifth way comprises first distance value and the sixth way comprises a second distance value different than the first distance value.” This means that the claim requires that the range sensor perceive the first and second sets of unpowered geometric objects at different distances. However, this does not appear to be disclosed in the specification. For example, the specification discloses:
[0006] In another example, a system includes a visual spectrum sensor, a thermal sensor, a range sensor, and an unpowered calibration tool. The calibration tool includes a plurality of geometric objects configured to be perceived in one of two different ways by the visual spectrum sensor, one of two different ways by the thermal sensor, and one of two different ways by the range sensor.
[0042] The following portion of this paragraph delineates example 14 of the subject matter, disclosed herein. According to example 14, a system includes a visual spectrum sensor, a thermal sensor, a range sensor, and an unpowered calibration tool. The calibration tool comprises a plurality of geometric objects configured to be perceived in one of two different ways by the visual spectrum sensor, one of two different ways for the thermal sensor, and one of two different ways for the range sensor.
Therefore, the specification does have support for the calibration tool being perceived in more than one way by the range sensor. However, with regard to different distance values for each set of unpowered objects, as required by “a fifth way” and “a sixth way” in claim 7, the Examiner does not see from the specification how this is performed. For example, with regard to distance values, the specification discloses:
[0018] The range sensor may include a three-dimensional (3D) scanning radar device 136 that is configured to generate digital data in a range spectrum (i.e., radar images with different distance values).
[0023] The LIDAR produces a distance value to the calibration target 144.
[0035] According to example 7, which encompasses any of examples 2-6, above, the another different way perceived by the third type of sensor comprises a distance value.
These portions merely disclose that the sensor can generate images with different distance values; they make no mention of different values corresponding to different sets of unpowered geometric objects as claimed. Furthermore, the specification appears to disclose only that the third sensor perceives “a distance value”. This is notably not plural and does not address any correspondence between this distance value and each different set of unpowered geometric objects. Essentially, the disclosure does not enable distinguishing a different distance value for each different set of unpowered geometric objects. For example, how are all the black squares distinguished from the white squares using a distance value of the range sensor? The specification discloses that the LIDAR produces “a distance value” to the entire calibration target 144, not to each set of unpowered geometric objects. Furthermore, example 7 discloses only “another different way” for the third sensor, comprising “a distance value”, but makes no mention of a second “another different way”, or of “a fifth way” and “a sixth way”, or of any type of correspondence to each set of unpowered objects.
Most notably, the specification at paragraph [0045] states:
[0045] The following portion of this paragraph delineates example 17 of the subject matter, disclosed herein. According to example 17, which encompasses any of examples 14-16, above, the two different ways that the geometric objects are perceived by the visual spectrum sensor comprise visible light of different intensities, the two different ways perceived by the visual spectrum sensor are different by greater than a threshold amount of light, the two different ways that the geometric objects are perceived by the thermal sensor comprise thermal images of different thermal values, the two different ways perceived by the thermal sensor are different by greater than a threshold amount, and the different way that the unpowered calibration tool is perceived by the range sensor comprises a distance value.
The Examiner notes that this portion of the disclosure specifically states that, for the first and second sensors, the geometric objects are perceived in two different ways. However, for the range sensor, it is the unpowered calibration tool that is perceived and that comprises a distance value. It is explicitly stated that the range sensor perceives the calibration tool, not the geometric objects, with regard to distance. Nowhere in the disclosure is it stated that the range sensor can perceive two different distance values corresponding to the unpowered geometric objects. The fact that the distances are required to be different means that the black and white squares are at different distances, and the disclosure does not appear to describe a calibration target where the white and black squares are at distinguishably different distances to the range sensor.
For example, Figures 5A-C and paragraph [0024] state “the 3D scanning radar device 136) will produce lighter and darker outputs that correspond to the light-colored spaces 152 and the dark-colored spaces 154, respectively.” However, it does not disclose that there are differences in distance values merely based on the different colored squares. This would appear to be based on the reflection of light off of white and black squares as indicated in paragraph [0020] where it states “The dark-colored spaces 154 reflect less than a first threshold amount of light wavelengths (i.e., black or other dark color) and the light-colored spaces 152 reflect greater than a second threshold amount of light wavelengths (i.e., white or other light color).” However, conceivably a black and white square next to each other could have the same distance value.
In regards to claim 17, it requires the features of “the fifth way comprises a distance greater than a threshold distance and the sixth way comprises a distance less than the threshold distance”; however, the specification does not include any reference to “a threshold distance” or to each set of unpowered objects being greater or less than a threshold distance. The specification includes thresholds for the amount of light ([0045]), but no reference to a threshold distance. Therefore, this limitation is not enabled for reasons similar to those given for claim 7, and additionally because of the added distance threshold feature.
For the purposes of examination, the Examiner will interpret a LIDAR sensor that scans a checkerboard target and receives distance values as equivalent to a fifth way (a first distance measurement from a first pulse in the scan) and a sixth way (a second distance measurement from a second pulse in the scan). The Examiner will also interpret the LIDAR's ability to determine that two measurements are different as requiring a threshold difference between them. Since the disclosure is not enabling, the Examiner interprets the ability to detect different distance values as the ability to perceive the fifth way and the sixth way, where the fifth way is a larger value than the sixth way.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1-3, 5-6, and 14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ramegowda et al., US 2012/0069193, hereinafter referred to as Ramegowda, in view of Xiang, US 2024/0383560, hereinafter referred to as Xiang.
In regards to claim 1, Ramegowda teaches:
“A calibration tool comprising: a first set of unpowered geometric objects; and a second set of unpowered geometric objects disposed adjacent to the first set of unpowered geometric objects, wherein: the first set of unpowered geometric objects is configured to be perceived in a first way for a digital camera and the second set of unpowered geometric objects is configured to be perceived in a second way for the digital camera”
Ramegowda paragraph [0013] and Figure 1 teach a checkerboard pattern 11 captured by an electro-optical camera. Ramegowda Figure 5 and paragraph [0016] teaches in a situation where camera 21 is an electro-optical camera, an infrared image of target 12 from camera 15 and a visible wavelength image of target 12 from camera 21 may go to processor 20 for comparison from a geometrical or other perspective, as long as target 12 has attributes visible to the electro-optical camera. Ramegowda claim 6 teaches the camera may be a digital camera. The Examiner interprets that the black and white squares may be equivalent to being perceived in a first way and a second way for a first type of sensor (electro-optical camera).
“the first set of unpowered geometric objects is configured to be perceived in a third way for an infrared camera and the second set of unpowered geometric objects is configured to be perceived in a fourth way for the infrared camera”
Ramegowda paragraph [0013] teaches an image captured by a thermal camera with the presently constructed checkerboard pattern 12. Ramegowda paragraph [0014] teaches that light geometrical symbols 17 may have a first emissivity value and darker geometrical symbols 18 may have a second emissivity value; the first and second emissivity values should be at least 20 percent different from each other for effortless viewing and use of calibration target 12 in infrared image 16. One may note that satisfactory emissivity value differences may instead be 30, 40, 50, 60, 70, 80, 90, or virtually 100 percent, and the emissivity values could be less than 20 percent different and yet be useful. The Examiner interprets the different emissivities of the geometric objects as being perceived in a third way and a fourth way for the second type of sensor (thermal camera). Ramegowda paragraph [0010] teaches the infrared image may be captured by the thermal camera. The Examiner interprets that if the camera captures infrared images then it is an infrared camera.
Ramegowda does not explicitly teach:
“and the first set of unpowered geometric objects is further configured to be perceived in a fifth way for a 3D scanning radar device and the second set of unpowered geometric objects is configured to be perceived in a sixth way for the 3D scanning radar device”
Xiang teaches in paragraph [0208] that there are two keys in the disclosure: the first is to use the geometrical features to find the dimension of the checkerboard in the 3D point cloud, and the second is to use the different LIDAR reflection of the white and black blocks on the checkerboard. Xiang paragraph [0280] teaches the result of detecting checkerboard corners, as shown in FIG. 46, where the different colors on the point cloud mean different reflection intensities. The Examiner interprets the different reflection intensities as a fifth way and a sixth way. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Ramegowda in view of Xiang to have included the features of “and the first set of unpowered geometric objects is further configured to be perceived in a fifth way for a 3D scanning radar device and the second set of unpowered geometric objects is configured to be perceived in a sixth way for the 3D scanning radar device” because the intelligence based on high-definition images helps locate moving or stationary targets while precise distance measurement and positioning are offered by LIDAR data, coherently (Xiang [0007]).
In regards to claim 2, Ramegowda/Xiang teach all the limitations of claim 1, but Ramegowda does not explicitly teach:
“wherein: each one of the first way, the second way, the third way, the fourth way, the fifth way, and the sixth way is different from any other one of the first way, the second way, the third way, the fourth way, ...”
The Examiner interprets that being perceived optically is different than being perceived thermally. In addition, being perceived as white is different than being perceived as black, and being perceived at 22 deg C is different than being perceived at 30 deg C. Therefore, all four ways are perceived differently.
Ramegowda does not explicitly teach:
“the fifth way, and the sixth way”
Xiang teaches in paragraph [0208] that there are two keys in the disclosure: the first is to use the geometrical features to find the dimension of the checkerboard in the 3D point cloud, and the second is to use the different LIDAR reflection of the white and black blocks on the checkerboard. Xiang paragraph [0280] teaches the result of detecting checkerboard corners, as shown in FIG. 46, where the different colors on the point cloud mean different reflection intensities. The Examiner interprets the different reflection intensities as a fifth way and a sixth way. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Ramegowda in view of Xiang to have included the features of “the fifth way, and the sixth way” because the intelligence based on high-definition images helps locate moving or stationary targets while precise distance measurement and positioning are offered by LIDAR data, coherently (Xiang [0007]).
In regards to claim 3, Ramegowda/Xiang teach all the limitations of claim 2 and further teach:
“wherein the first way comprises a first intensity of visible light and the second way comprises a second intensity of visible light that is different than the first intensity of visible light”
Ramegowda Figure 1 teaches that the calibration pattern 12 is perceived with both black and white light intensities.
In regards to claim 5, Ramegowda/Xiang teach all the limitations of claim 3 and further teach:
“wherein the third way comprises a first thermal value and the fourth way comprises a second thermal value different than the first thermal value”
Ramegowda Figure 3 illustrates different thermal values detected.
In regards to claim 6, Ramegowda/Xiang teach all the limitations of claim 5 and further teach:
“wherein the third way comprises a thermal value greater than a threshold thermal value and the fourth way comprise a thermal value less than the threshold thermal value”
Ramegowda paragraph [0014] teaches the first and second emissivity values should be at least 20 percent different from each other for effortless viewing and use of calibration target 12 in infrared image 16.
In regards to claim 14, Ramegowda/Xiang teach all the limitations of claim 2, and claim 14 contains limitations similar to those of claim 2 (including those of independent claim 1). Therefore, claim 14 is rejected for reasoning similar to that applied to claim 2.
Additionally, Ramegowda teaches:
“A system comprising: a visual spectrum sensor; a thermal sensor; … and an unpowered calibration tool comprising”
Ramegowda Figure 5 and paragraph [0016] teaches in a situation where camera 21 is an electro-optical camera, an infrared image of target 12 from camera 15 and a visible wavelength image of target 12 from camera 21 may go to processor 20 for comparison from a geometrical or other perspective, as long as target 12 has attributes visible to the electro-optical camera. From Figures 1 and 3-4 inter alia, the darker geometrical symbols 18 and light geometrical symbols 17 are able to be perceived by both sensors.
Ramegowda does not explicitly teach:
“a 3d scanning radar device”
Xiang paragraph [0169] teaches that a typical point cloud scan frame captured from LiDAR can be visualized in FIG. 22, where the points are colored by order of intensity. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Ramegowda in view of Xiang to have included the features of “a 3d scanning radar device” because the intelligence based on high-definition images helps locate moving or stationary targets while precise distance measurement and positioning are offered by LIDAR data, coherently (Xiang [0007]).
Claim(s) 4 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ramegowda in view of Xiang, further in view of Wu, US 2022/0284630, hereinafter referred to as Wu.
In regards to claim 4, Ramegowda/Xiang teach all the limitations of claim 3 but do not explicitly teach:
“wherein the first way comprises an amount of light greater than a threshold amount of light and the second way comprises an amount of light less than the threshold amount of light”
The ability to tell the difference between black and white implies the sensor can detect at least black and lighter-than-black colors. The Examiner interprets that the light squares are greater than whatever minimum detection amount results in lighter-than-black detection. This claimed feature does not appear to provide any unpredictable results. It has been held that “[t]he combination of familiar elements according to known methods is likely to be obvious when it does no more than yield predictable results.” KSR, 127 S. Ct. at 1739, 82 USPQ2d at 1395 (2007) (citing Graham, 383 U.S. at 12).
For example, Wu teaches in paragraph [0043] that specific colors of the checkerboard are not particularly limited, provided that the color contrast between a checkerboard cell and an adjacent cell is greater than a particular preset threshold; the preset threshold may be customized according to the intensity of the contrast provided that it meets a camera calibration requirement. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Ramegowda/Xiang to have included the features of “wherein the first way comprises an amount of light greater than a threshold amount of light and the second way comprises an amount of light less than the threshold amount of light” because an image of a calibration board with a fixed-pitch pattern array is captured by a camera, and through calculation using a calibration algorithm, a geometric model of the camera can be obtained, thereby obtaining a high-precision measurement and reconstruction result (Wu [0003]).
Claim(s) 7 and 15-16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ramegowda in view of Xiang, further in view of Diederichs et al., US 2021/0192788, hereinafter referred to as Diederichs.
In regards to claim 7, Ramegowda/Xiang teach all the limitations of claim 5 but do not explicitly teach:
“wherein: the fifth way comprises first distance value and the sixth way comprises a second distance value different than the first distance value”
Diederichs teaches in paragraph [0138] that the calibration targets can be simultaneously detected by the camera and LiDAR; in the examples that follow, the calibration targets 1305a . . . 1305d are checkerboards, as shown in FIG. 13B. The Examiner interprets that different parts of the checkerboard will be at different distances and can provide a fifth way and a sixth way of being perceived. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Ramegowda/Xiang in view of Diederichs to have included the features of “the fifth way comprises first distance value and the sixth way comprises a second distance value different than the first distance value” because checkerboards are commonly used to perform the extrinsic calibration of various cameras and LiDAR range sensors because they provide numerous geometrical constraints and are easily detected by both sensors (Diederichs [0139]).
In regards to claim 15, Ramegowda/Xiang teach all the limitations of claim 14 and further teach:
“further comprising a processor configured to perform calibration of the visual spectrum sensor, the thermal sensor …”
Ramegowda teaches in paragraph [0015] FIG. 5 is a diagram of thermal camera 15 and another camera 21 capturing images of calibration target 12. Camera 21 may be another thermal camera or an electro-optical camera. In a situation where camera 21 is a thermal camera, infrared images of target 12 from cameras 15 and 21 may go to processor 20 for comparison or analysis. Camera 15 may be calibrated to camera 21 or vice versa.
Ramegowda/Xiang do not explicitly teach:
“[calibration of the visual spectrum sensor] and the range sensor”
Diederichs teaches in paragraph [0138] that the calibration targets can be simultaneously detected by the camera and LiDAR; in the examples that follow, the calibration targets 1305a . . . 1305d are checkerboards, as shown in FIG. 13B. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Ramegowda/Xiang in view of Diederichs to have included the features of “[calibration of the visual spectrum sensor] and the range sensor” because this enables autonomous vehicles to generate a deeper understanding of the surrounding environment (Diederichs [0003]).
In regards to claim 16, Ramegowda/Xiang/Diederichs teach all the limitations of claim 15 and further teach:
“wherein the calibration of the visual spectrum sensor and the thermal sensor is a simultaneous and consistent calibration”
Ramegowda Figure 5 and paragraph [0016] teach that, in a situation where camera 21 is an electro-optical camera, an infrared image of target 12 from camera 15 and a visible wavelength image of target 12 from camera 21 may go to processor 20 for comparison from a geometrical or other perspective, as long as target 12 has attributes visible to the electro-optical camera. From Figures 1 and 3-4, inter alia, the darker geometrical symbols 18 and light geometrical symbols 17 are able to be perceived by both sensors. The Examiner interprets from Figure 5 that the target is being imaged at the same time and that the comparison would require both images; therefore, this would be equivalent to simultaneous. Furthermore, the Examiner interprets the calibration as consistent in that, as disclosed, it is repeatable over time in the same manner.
Claim(s) 8 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ramegowda in view of Xiang, further in view of Diederichs, further in view of Shotan et al., US 12,085,678, hereinafter referred to as Shotan.
In regards to claim 8, Ramegowda/Xiang/Diederichs teach all the limitations of claim 7 but do not explicitly teach:
“wherein the calibration tool has a known distance value from at least the third type of sensor”
Shotan column 4 lines 27-30 teaches that one technique for calibrating a lidar device may include positioning a lidar calibration target (e.g., having a predetermined reflectivity) at a predetermined distance relative to the lidar device. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Ramegowda/Xiang/Diederichs in view of Shotan to have included the features of “wherein the calibration tool has a known distance value from at least the third type of sensor” because calibration can correct for defects in the fabrication and/or assembly of a lidar device. For example, if a lens of a lidar device has one or more defects such that an intensity of a transmitted signal and/or reflected signal is reduced upon being transmitted through the lens, a calibration can allow this to be accounted for at run-time (e.g., using a processor that provides a correction to measurements such that they more accurately reflect a physical scene) (Shotan column 1 lines 30-38).
Claim(s) 9-10 and 18-19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ramegowda in view of Xiang, further in view of Ning et al., US 2023/0419541, hereinafter referred to as Ning.
In regards to claim 9, Ramegowda/Xiang teach all the limitations of claim 7 but do not explicitly teach:
“wherein the first set of geometric objects and the second set of geometric objects are located on a surface of a rigid material”
Ning paragraph [0036] teaches that the calibration target 400A comprises a substrate 402A with a checkerboard pattern 404A printed, stamped, engraved, imprinted, or otherwise marked thereon, and that the substrate 402A may be paper, cardboard, plastic, metal, foam, or some combination thereof. The Examiner interprets foam as a rigid material. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Ramegowda/Xiang in view of Ning to have included the features of “wherein the first set of geometric objects and the second set of geometric objects are located on a surface of a rigid material” because prior art solutions for intrinsic camera calibration for AVs are not entirely satisfactory (Ning paragraph [0005]).
In regards to claim 10, Ramegowda/Xiang/Ning teach all the limitations of claim 9 and further teach:
“wherein the rigid material comprises a rigid, lightweight sheet material comprising a foam layer …”
Ning paragraph [0036] teaches that the calibration target 400A comprises a substrate 402A with a checkerboard pattern 404A printed, stamped, engraved, imprinted, or otherwise marked thereon, and that the substrate 402A may be paper, cardboard, plastic, metal, foam, or some combination thereof. The Examiner interprets foam as a rigid material. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Ramegowda/Xiang in view of Ning to have included the features of “wherein the rigid material comprises a rigid, lightweight sheet material comprising a foam layer” because prior art solutions for intrinsic camera calibration for AVs are not entirely satisfactory (Ning paragraph [0005]).
Ramegowda/Xiang/Ning do not explicitly teach:
“… between wood pulp veneer layers”
However, based on Applicant’s specification, this feature refers to a known foam board material, Gatorboard©, that is available off-the-shelf. This feature appears to be nothing more than a combination of familiar elements (the foam board calibration target known from Ning and off-the-shelf foam board that is commercially available) using known methods (applying the calibration pattern to the foam board). It has been held that “[t]he combination of familiar elements according to known methods is likely to be obvious when it does no more than yield predictable results.” KSR, 127 S. Ct. at 1739, 82 USPQ2d at 1395 (2007) (citing Graham, 383 U.S. at 12).
In regards to claim 18, Ramegowda/Xiang teach all the limitations of claim 14, and claim 18 contains limitations similar to those of claim 9. Therefore, claim 18 is rejected for reasoning similar to that applied to claim 9.
In regards to claim 19, Ramegowda/Xiang/Ning teach all the limitations of claim 18, and claim 19 contains limitations similar to those of claim 10. Therefore, claim 19 is rejected for reasoning similar to that applied to claim 10.
Claim(s) 11-13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ramegowda in view of Xiang, further in view of Ning, further in view of Mitra, US 2020/0261302, hereinafter referred to as Mitra.
In regards to claim 11, Ramegowda/Xiang/Ning teach all the limitations of claim 10 and further teach:
“the rigid material”
Ning paragraph [0036] teaches that the calibration target 400A comprises a substrate 402A with a checkerboard pattern 404A printed, stamped, engraved, imprinted, or otherwise marked thereon, and that the substrate 402A may be paper, cardboard, plastic, metal, foam, or some combination thereof. The Examiner interprets foam as a rigid material. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Ramegowda/Xiang in view of Ning to have included the feature of “the rigid material” because prior art solutions for intrinsic camera calibration for AVs are not entirely satisfactory (Ning paragraph [0005]).
Ramegowda/Xiang/Ning do not explicitly teach:
“wherein the first set of geometric objects is formed by a dark colored tape applied to the surface of [white paper]; and the second set of geometric objects being the surface of [white paper] disposed adjacent to the first set of geometric objects”
This is a known way of creating a calibration pattern. For example, Mitra teaches in Figure 1 and paragraph [0016] calibrating the IR sensor by facing the IR sensor down on a blank white piece of paper with black electrical tape in the middle. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Ramegowda/Xiang/Ning in view of Mitra to have included the features of “wherein the first set of geometric objects is formed by a dark colored tape applied to the surface of [white paper]; and the second set of geometric objects being the surface of [white paper] disposed adjacent to the first set of geometric objects” in order to eliminate the use of expensive tactile paving and find alternative cost-effective solutions for blind people to navigate roads (Mitra [0068]).
In regards to claim 12, Ramegowda/Xiang/Ning/Mitra teach all the limitations of claim 11 and further teach:
“wherein: the first set of geometric objects reflects less than a first threshold amount of light wavelengths; and the second set of geometric objects reflects greater than a second threshold amount of light wavelengths”
Any black and white regions received at a camera are necessarily distinguished by the differences in reflectivity between the black and white regions on the calibration board. There is necessarily a value between these two extremes below which the black regions reflect and a value above which the white regions reflect; for example, any of the grey scale values between black and white. These values may be chosen as thresholds. Therefore, the ability to detect black and white regions necessitates at least two grey scale values, which may be interpreted as thresholds. Ramegowda teaches detection of the dark and lighter regions of the checkerboard pattern, as discussed with respect to claim 1 above.
In regards to claim 13, Ramegowda/Xiang/Ning/Mitra teach all the limitations of claim 12 and further teach:
“wherein the first set of geometric objects appears as black and the second set of geometric objects appears as white”
Ramegowda Figure 1.
Claim(s) 17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ramegowda in view of Xiang in view of Diederichs in view of Wu (US 2022/0284630), hereinafter referred to as Wu.
In regards to claim 17, Ramegowda/Xiang teach all the limitations of claim 14 and further teach:
“the two different ways that the geometric objects are perceived by the thermal sensor comprise thermal images of different thermal values”
Ramegowda Figure 3 illustrates different thermal values detected.
“the third way comprises a temperature greater than a threshold temperature and the fourth way comprises a temperature perceived by the thermal sensor less than the threshold temperature”
Ramegowda paragraph [0014] teaches the first and second emissivity values should be at least 20 percent different from each other for effortless viewing and use of calibration target 12 in infrared image 16.
Ramegowda/Diederichs further teach:
“and the fifth way comprises a distance greater than a threshold distance and the sixth way comprises a distance perceived by the range sensor less than the threshold distance”
Diederichs teaches in paragraph [0138] that the calibration targets can be simultaneously detected by the camera and LiDAR: “In the examples that follow, the calibration targets 1305a . . . 1305d are checkerboards, as shown in FIG. 13B.” The Examiner interprets that since not every point on the checkerboards is at the same distance, the targets will be perceived in a fifth way and a sixth way. It would have been obvious for a person with ordinary skill in the art before the invention was effectively filed to have modified Ramegowda in view of Diederichs to have included the features of “and the fifth way comprises a distance greater than a threshold distance and the sixth way comprises a distance perceived by the range sensor less than the threshold distance” because this enables autonomous vehicles to generate a deeper understanding of the surrounding environment (Diederichs [0003]).
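The Examiner's interpretation that a tilted checkerboard presents points at different ranges can be sketched geometrically (a minimal illustrative sketch with hypothetical coordinates, not taken from Diederichs): points on a board that is not parallel to the sensor lie at different distances, so some fall above and some below any mid-range threshold.

```python
import math

# Hypothetical 2-D geometry, illustrative only: a range sensor at the origin
# and a checkerboard tilted away from it, so depth increases across the board.
sensor = (0.0, 0.0)
corners = [(x, 5.0 + 0.2 * x) for x in range(0, 8)]  # board corner points

# Range from the sensor to each corner point.
ranges = [math.hypot(cx - sensor[0], cy - sensor[1]) for cx, cy in corners]

# Any threshold between the nearest and farthest points splits the corners.
threshold = (min(ranges) + max(ranges)) / 2.0

nearer = [r for r in ranges if r < threshold]   # points perceived one way
farther = [r for r in ranges if r > threshold]  # points perceived the other way
assert nearer and farther  # both sides of the threshold are populated
```

The sketch shows only that unequal distances plus a threshold between them yield two classes of perceived points; it takes no position on whether the specification enables such a threshold.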
Ramegowda/Diederichs do not explicitly teach:
“wherein: the first way comprises an amount of light greater than a threshold amount of light and the second way comprises an amount of light less than the threshold amount of light”
The ability to tell the difference between black and white implies the sensor can detect at least black and lighter-than-black colors. The Examiner interprets that the light squares are greater than whatever minimum detection amount results in lighter-than-black detection. This claimed feature does not appear to provide any unpredictable results. It has been held that “[t]he combination of familiar elements according to known methods is likely to be obvious when it does no more than yield predictable results.” KSR, 127 S. Ct. at 1739, 82 USPQ2d at 1395 (2007) (citing Graham, 383 U.S. at 12).
For example, Wu teaches in paragraph [0043] that the specific colors of the checkerboard are not particularly limited in this embodiment, provided that a color contrast between a checkerboard cell and an adjacent cell is greater than a particular preset threshold; the preset threshold may be customized according to the intensity of the contrast provided that it meets a camera calibration requirement. It would have been obvious for a person with ordinary skill in the art before the invention was effectively filed to have modified Ramegowda/Diederichs to have included the features of “wherein: the first way comprises an amount of light greater than a threshold amount of light and the second way comprises an amount of light less than the threshold amount of light” because an image of a calibration board with a fixed-pitch pattern array is captured by a camera, and through calculation using a calibration algorithm, a geometric model of the camera can be obtained, thereby obtaining a high-precision measurement and reconstruction result (Wu [0003]).
Claim(s) 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ramegowda in view of Diederichs.
In regards to claim 20, Ramegowda teaches:
“A method of calibrating comprising: placing an unpowered calibration tool at a position to be perceived by a visual spectrum sensor, a thermal sensor, …, the calibration tool comprising a first set of geometric objects and a second set of geometric objects; detecting intensity of visible light for each of the geometric objects of the first set and of the second set from at least one digital image sensed by the visual spectrum sensor to produce detected intensities; detecting a temperature value for each of the geometric objects of the first set and of the second set from at least one thermal image sensed by the thermal sensor to produce detected temperature values”
Ramegowda Figure 5 and paragraph [0016] teaches in a situation where camera 21 is an electro-optical camera, an infrared image of target 12 from camera 15 and a visible wavelength image of target 12 from camera 21 may go to processor 20 for comparison from a geometrical or other perspective, as long as target 12 has attributes visible to the electro-optical camera. From Figures 1 and 3-4 inter alia, the darker geometrical symbols 18 and light geometrical symbols 17 are able to be perceived by both sensors. Ramegowda claim 6 teaches the camera may be a digital camera. Ramegowda teaches in paragraph [0015] FIG. 5 is a diagram of thermal camera 15 and another camera 21 capturing images of calibration target 12. Camera 21 may be another thermal camera or an electro-optical camera. In a situation where camera 21 is a thermal camera, infrared images of target 12 from cameras 15 and 21 may go to processor 20 for comparison or analysis. Camera 15 may be calibrated to camera 21 or vice versa. The Examiner interprets dark and light regions as “two different ways”.
“calibrating the visual spectrum sensor using the detected intensity of visible light for each of the geometric objects of the first set and second set, calibrating the thermal sensor using the detected temperature value for each of the geometric objects of the first set and second set, …”
Ramegowda teaches in paragraph [0015] FIG. 5 is a diagram of thermal camera 15 and another camera 21 capturing images of calibration target 12. Camera 21 may be another thermal camera or an electro-optical camera. In a situation where camera 21 is a thermal camera, infrared images of target 12 from cameras 15 and 21 may go to processor 20 for comparison or analysis. Camera 15 may be calibrated to camera 21 or vice versa.
Ramegowda does not explicitly teach:
“and a range sensor of a vehicle” and “detecting a range value for each of the geometric objects of the first set and the second set from a plurality of radar scans taken by the range sensor to produce a detected range values” and “[calibrate the visual spectrum sensor] and calibrating the range sensor using the detected range value for each geometric objects of the first set and the second set”
Diederichs teaches in paragraph [0002] that the description relates generally to the operation of vehicles and specifically to camera-to-LiDAR calibration and validation. Diederichs teaches in paragraph [0139] that checkerboards are commonly used to perform the extrinsic calibration of various cameras and LiDAR range sensors because they provide numerous geometrical constraints and are easily detected by both sensors. Traditionally, checkerboard plane parameters are estimated in both the camera and LiDAR data. By establishing a plane-to-plane correspondence between the checkerboard observed by each sensor, the coordinate transformation between the LiDAR and the camera is derived. It would have been obvious for a person with ordinary skill in the art before the invention was effectively filed to have modified Ramegowda in view of Diederichs to have included the features of “and a range sensor of a vehicle” and “detecting a range value for each of the geometric objects with the range sensor to produce a detected range value” and “[calibrate the visual spectrum sensor] and the range sensor using the detected range value” because this enables autonomous vehicles to generate a deeper understanding of the surrounding environment (Diederichs [0003]).
Response to Arguments
Applicant’s arguments with respect to claim(s) 1 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Applicant's arguments filed 7/3/2025 have been fully considered but they are not persuasive. Applicant asserts that Ramegowda does not disclose a third and a fourth way for the second sensor to be perceived.
Applicant argues that the language enabling “one of two different ways” for the perception of the visual spectrum sensor and the thermal sensor enables perception of a first way, a second way, a third way, and a fourth way, and that this language should therefore also enable a fifth way and a sixth way for the range sensor. However, this argument is not persuasive. The Examiner is not asserting that the sensors cannot perceive more than one value. The Examiner freely admits on page 3 of the Office Action mailed 4/15/2025 that “the specification does have support for measuring more than one distance value.” However, the claims require that the unpowered geometric objects be perceived in a fifth way and a sixth way, and that each of those ways is a different distance value (claim 7). Claim 17 further introduces a threshold distance which is never mentioned anywhere in the specification. The only distinction that the range sensor appears able to make between the unpowered geometric objects is in the reflectivity of the objects, as in Figures 5A-C. The only mention of a threshold is associated with a threshold amount of light (e.g., [0020]). Therefore, the fifth way and sixth way have enablement for a different reflectivity of the unpowered geometric objects, but not for a different distance measurement.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MICHAEL E TEITELBAUM, Ph.D. whose telephone number is (571)270-5996. The examiner can normally be reached 8:30AM-5:00PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, John Miller can be reached at 571-272-7353. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MICHAEL E TEITELBAUM, Ph.D./Primary Examiner, Art Unit 2422