DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Response to Amendment
The following addresses applicant’s remarks/amendments filed March 10, 2026.
Claims 1, 9, 12, 15 and 18 were amended; claims 10 and 19 were cancelled; new claims 21 and 22 were added. Therefore, claims 1-9, 11-18, and 20-22 are pending in the current application and are addressed below.
Response to Arguments
Applicant’s arguments filed March 10, 2026 have been fully considered but they are not persuasive. Applicant’s arguments with respect to claim 1 have been considered but are moot because the arguments do not apply to the specific combination of the references being used in the current rejection.
In response to applicant’s argument that the references fail to show certain features of applicant’s invention, it is noted that the features upon which applicant relies (i.e., “detecting an apparent object … during the saturation recovery period” and “excluding the apparent object … during the saturation recovery period”) are not recited in the rejected claims. Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993). Here, applicant argues that Lovering does not teach “detecting an apparent object … during the saturation recovery period” or “excluding the apparent object … during the saturation recovery period.” However, these claim limitations were not present in the original independent claims and were presented by amendment on March 10, 2026. Therefore, whether Yeruhami, Seong, and Lovering address these limitations is not relevant. The amended claims containing the new limitations have been addressed by Banks in the present Office Action.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-5, 7-8, 12-17 and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Yeruhami et al. (US 20200249354 A1, hereinafter “Yeruhami”) in view of Seong (KR 101937777 B1, hereinafter “Seong”), and further in view of Banks (US 20100128109 A1, hereinafter “Banks”).
Regarding claim 1, Yeruhami teaches a method comprising:
repeatedly scanning a range of angles in a field-of-view (FOV) of a light detection and ranging (LIDAR) device (Yeruhami; Fig. 16, Fig. 17, [0348], [0349], region 1601 of 1st scan cycle and region 1601’ of 2nd scan cycle of FOV);
for each scan of the range of angles, detecting a plurality of light pulses intercepted at a light detector of the LIDAR device during a plurality of successive detection periods, wherein the light detector is configured to intercept light from a different angle in the range of angles during each of the plurality of successive detection periods of the scan (Yeruhami; [0123], scanning, detecting and comparing the environment around the LiDAR system in the field of view with light pulses and characterizing various aspects of the reflected light, so that an entire scan of the field of view can be achieved);
generating a three-dimensional (3D) representation of the FOV based on at least light intensity measurements indicated by outputs from the light detector for the detected plurality of light pulses (Yeruhami; Fig. 6D, [0198], line 7, using single rotatable LiDAR system to obtain 3D data representing FOV);
Yeruhami does not teach
comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan;
based on the comparison, detecting onset of a saturation recovery period of the light detector during the first scan or the second scan;
detecting an apparent object in a region of the FOV scanned during the saturation recovery period; and
excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period.
Seong teaches
comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan; and
based on the comparison, detecting onset of a saturation recovery period of the light detector during the first scan or the second scan (Seong; Fig. 20, Fig. 21, [0395], drawings for a method of determining and predicting whether a light receiving unit 5200 has entered a saturation region; [0396], the 1st electrical signal may be delayed by d while returning to the normal operating range as it reaches the saturation value (Vsat). The control unit 5300 may obtain an actual light reception time point t0 from the detected edge as described in Fig. 11. Additionally, the control unit 5300 may acquire a 2nd electrical signal from the second target. The size of the 1st electrical signal may be greater than the size of the 2nd electrical signal. Thus, the magnitude of the 1st slope may be greater than the magnitude of the 2nd slope, where the 1st slope and the 2nd slope can be calculated from edges detected from the same reference values (Vth1, Vth2). In this case, the control unit 5300 can compare the 1st slope with a predetermined slope to determine and predict whether the 1st electrical signal is saturated. Likewise, the control unit 5300 can compare the 2nd slope with a predetermined slope to determine and predict whether the 2nd electrical signal is saturated. It would have been obvious to one of ordinary skill in the art to recognize that, when saturation occurs, detection of the signal marks the onset of a saturation recovery period, because saturation recovery occurs immediately after saturation. Thus, Seong’s teaching satisfies the claim limitation).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the method taught by Yeruhami to include comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan and, based on the comparison, detecting onset of a saturation recovery period of the light detector during the first scan or the second scan, as taught by Seong, with a reasonable expectation of success. The reasoning for this is to provide a method of determining and predicting whether a light receiving unit has entered a saturation region and adjusting the gain value of the sensor unit of the light receiving unit. Further, the control unit 5300 can obtain the actual light receiving time point (t0) by extrapolating only the rising edge (Seong; Fig. 20, Fig. 21, [0395], [0401]-[0402], [0404]).
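For illustration only, the slope-comparison test Seong describes (comparing a rising-edge slope computed from fixed reference crossings against a predetermined slope) can be sketched as below. All numeric values, function names, and thresholds are assumptions chosen for this sketch; they are not taken from Seong.

```python
def edge_slope(v_low, v_high, t_low, t_high):
    """Slope of a rising edge between two reference-voltage crossings (Vth1, Vth2)."""
    return (v_high - v_low) / (t_high - t_low)

def is_saturated(slope, slope_threshold):
    """Seong-style test: a rising edge steeper than a predetermined slope
    indicates the electrical signal has reached (or will reach) saturation,
    i.e., the sample marks the onset of a saturation recovery period."""
    return slope >= slope_threshold

# Illustrative numbers (assumed, not from the reference):
VTH1, VTH2 = 1.0, 2.0            # fixed reference voltages (V)
SLOPE_THRESHOLD = 50.0           # predetermined slope (V per microsecond)

# 1st signal: crosses Vth1 at t=0.10 us and Vth2 at t=0.11 us -> steep edge
s1 = edge_slope(VTH1, VTH2, 0.10, 0.11)
# 2nd signal: crosses Vth1 at t=0.10 us and Vth2 at t=0.15 us -> gentle edge
s2 = edge_slope(VTH1, VTH2, 0.10, 0.15)

print(is_saturated(s1, SLOPE_THRESHOLD))  # True  (steep edge, saturated)
print(is_saturated(s2, SLOPE_THRESHOLD))  # False (gentle edge, not saturated)
```

Because both slopes are taken from the same reference values, the comparison needs no knowledge of the clipped peak itself, which is the point of Seong's approach.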
However, Yeruhami as modified in view of Seong still does not teach:
detecting an apparent object in a region of the FOV scanned during the saturation recovery period; and
excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period.
Banks discloses in paragraphs [0152]-[0153] that increasing the energy of the emitted light improves resolution such that objects that were insufficiently resolved by the lower-energy light pulses (1st 3D image) can be measured. However, with higher emitted light energy, some pixels may be saturated due to highly reflective objects in the 2nd 3D image (the data from the highly reflective object are equivalent to data obtained during the saturation recovery period). As such, the 3D images measured with the lower (1st 3D image) and higher (2nd 3D image) emitted light energy are combined to obtain a 3D image having increased resolution. Any suitable algorithm may be used to determine which portions of the 1st and 2nd images to use or to discard as corresponding to insufficiently resolved objects or saturated pixels (equivalent to excluding the apparent object from the generated 3D representation based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the method taught by Yeruhami to include comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan and, based on the comparison, detecting onset of a saturation recovery period of the light detector during the first scan or the second scan, as taught by Seong, and to include detecting an apparent object in a region of the FOV scanned during the saturation recovery period and excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period, as taught by Banks, with a reasonable expectation of success. The reasoning for this is that, during measurement, some pixels may be saturated due to highly reflective objects. Removing or discarding the data obtained from pixels that were saturated predictably removes the artifacts or errors in the distance measurement caused by pixel-saturation effects.
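For illustration only, the per-pixel fusion in the spirit of Banks [0152]-[0153] (keep the higher-energy measurement except where its pixel saturated, fall back to the lower-energy measurement, and exclude pixels unusable in both) can be sketched as below. The sentinel names, function names, and range values are assumptions for this sketch, not from Banks.

```python
SATURATED = object()   # marker: high-energy pixel clipped by a reflective object
UNRESOLVED = object()  # marker: low-energy pixel too dim to resolve

def fuse_pixel(low, high):
    """Return the range value to keep for one pixel, or None to discard it."""
    if high is not SATURATED:
        return high          # higher-energy scan resolved this pixel
    if low is not UNRESOLVED:
        return low           # fall back to the lower-energy scan
    return None              # saturated and unresolved: exclude from the 3D image

def fuse_images(low_img, high_img):
    """Combine a lower-energy and a higher-energy 3D image pixel by pixel."""
    return [fuse_pixel(lo, hi) for lo, hi in zip(low_img, high_img)]

low = [12.5, UNRESOLVED, 7.0]     # 1st 3D image (lower emitted energy)
high = [12.4, 30.2, SATURATED]    # 2nd 3D image (higher emitted energy)
print(fuse_images(low, high))     # [12.4, 30.2, 7.0]
```

A pixel that returns None here plays the role of the claimed "apparent object" excluded from the generated 3D representation.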
Regarding claim 2, Yeruhami as modified above teaches the method as recited in claim 1, further comprising: emitting one or more light pulses from the LIDAR device toward the FOV (Yeruhami; Fig. 1A, [0120], light source 112 emits light pulses toward FOV), wherein the detected plurality of light pulses comprise reflected portions of the one or more emitted light pulses that are reflected back to the LIDAR device from the FOV (Yeruhami; Fig. 1A, [0153], sensors 116 detecting light pulses reflected back from FOV).
Regarding claim 3, Yeruhami as modified above teaches the method as recited in claim 1.
Yeruhami does not teach, further comprising: identifying one or more scans of the range of angles that are obtained during the saturation recovery period of the light detector.
Seong teaches, further comprising: identifying one or more scans of the range of angles that are obtained during the saturation recovery period of the light detector (Seong; Fig. 20, Fig. 21, [0395], drawings for a method of determining and predicting whether a light receiving unit 5200 has entered a saturation region; [0396], the 1st electrical signal may be delayed by d while returning to the normal operating range as it reaches the saturation value (Vsat). The control unit 5300 may obtain an actual light reception time point t0 from the detected edge as described in Fig. 11. Additionally, the control unit 5300 may acquire a 2nd electrical signal from the second target. The size of the 1st electrical signal may be greater than the size of the 2nd electrical signal. Thus, the magnitude of the 1st slope may be greater than the magnitude of the 2nd slope, where the 1st slope and the 2nd slope can be calculated from edges detected from the same reference values (Vth1, Vth2). In this case, the control unit 5300 can compare the 1st slope with a predetermined slope to determine and predict whether the 1st electrical signal is saturated. Likewise, the control unit 5300 can compare the 2nd slope with a predetermined slope to determine and predict whether the 2nd electrical signal is saturated. It would have been obvious to one of ordinary skill in the art to recognize that, when saturation occurs, detection of the signal marks the onset of a saturation recovery period, because saturation recovery occurs immediately after saturation. Thus, Seong’s teaching satisfies the claim limitation).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the method taught by Yeruhami, as combined above, to include identifying one or more scans of the range of angles that are obtained during the saturation recovery period of the light detector, as taught by Seong, with a reasonable expectation of success. The reasoning for this is to provide a method of determining and predicting whether a light receiving unit has entered a saturation region. Further, the control unit 5300 can obtain the actual light receiving time point (t0) by extrapolating only the rising edge (Seong; Fig. 20, Fig. 21, [0395], [0404]).
Regarding claim 4, Yeruhami as modified above teaches the method as recited in claim 1, wherein comparing the first scan with the second scan comprises comparing first light intensity measurements indicated by first outputs from the light detector for first light pulses detected during the first scan with second light intensity measurements indicated by second outputs from the light detector for second light pulses detected during the second scan (Yeruhami; Fig. 16, Fig. 17, [0348], [0349], region 1601 of 1st scan cycle and region 1601’ of 2nd scan cycle of FOV).
Regarding claim 5, Yeruhami as modified above teaches the method as recited in claim 4, wherein comparing the first light intensity measurements with the second light intensity measurements comprises comparing respective maximum values of the first light intensity measurements and the second light intensity measurements (Yeruhami; Fig. 16, Fig. 17, [0348], [0349], region 1601 of 1st scan cycle and region 1601’ of 2nd scan cycle of FOV).
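For illustration only, the comparison recited in claims 4 and 5 (comparing the respective maximum intensity values of two scans of the same range of angles) might be sketched as below. The drop ratio and intensity values are assumptions for this sketch, not from any cited reference.

```python
def saturation_recovery_suspected(first_scan, second_scan, drop_ratio=0.5):
    """first_scan/second_scan: per-angle intensity measurements over the same
    range of angles. Compare the maximum value of each scan (claim 5 style);
    a large scan-to-scan drop in the maximum suggests the detector was
    saturated in the first scan and is recovering in the second."""
    return max(second_scan) < drop_ratio * max(first_scan)

scan1 = [0.2, 0.9, 8.0, 0.4]   # bright retroreflector saturates the detector
scan2 = [0.2, 0.8, 1.1, 0.3]   # response collapses while the detector recovers
print(saturation_recovery_suspected(scan1, scan2))  # True
```

The same comparison returns False when the two scans' maxima are comparable, so ordinary scene changes are not flagged.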
Regarding claim 7, Yeruhami as modified above teaches the method as recited in claim 1, further comprising: determining, based on at least a light intensity measurement indicated by output from the light detector for a detected light pulse of the detected plurality of light pulses, a time-of-flight of an emitted light pulse emitted from the LIDAR device toward the FOV and at least partially reflected back from the FOV toward the LIDAR device as the detected light pulse (Yeruhami; [0156], [0158], analyzing reflected light including determining a time of flight for reflected light based on output of individual detectors of FOV).
Regarding claim 8, Yeruhami as modified above teaches the method as recited in claim 7.
Yeruhami does not teach, further comprising: determining whether the detected light pulse is detected at the light detector during the saturation recovery period, wherein determining the time-of-flight is further based on the determination of whether the detected light pulse is detected during the saturation recovery period.
Seong teaches, further comprising: determining whether the detected light pulse is detected at the light detector during the saturation recovery period, wherein determining the time-of-flight is further based on the determination of whether the detected light pulse is detected during the saturation recovery period (Seong; Fig. 20, Fig. 21, [0395], drawings for a method of determining and predicting whether a light receiving unit 5200 has entered a saturation region; [0286], the control unit 5300 can obtain the actual light receiving time point (t0) by extrapolating multiple edges; Fig. 14, [0345], the distance measuring device 5000 can detect three rising edges using predetermined references (Vth1, Vth2, Vth3) even if the 1st electrical signal reaches the saturation value. Accordingly, the distance measuring device 5000 can obtain the first actual light receiving time point (t01) by extrapolating the three rising edges. That is, the distance measuring device 5000 can calculate the exact distance to the target object even if the electrical signal converted from the sensing unit enters the saturation region; [0404], the extrapolation data of the control unit 5300 may vary depending on whether the light receiving unit enters the saturation region).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the method taught by Yeruhami to include determining whether the detected light pulse is detected at the light detector during the saturation recovery period, wherein determining the time-of-flight is further based on the determination of whether the detected light pulse is detected during the saturation recovery period, as taught by Seong, with a reasonable expectation of success. The reasoning for this is first to determine whether the signal is detected while the sensing unit is in the saturation region, and then to determine the time-of-flight based on obtaining the actual light receiving time point (t0) by extrapolating multiple edges, as performed by the control unit 5300 (Seong; [0286], [0345], [0395], [0404]).
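For illustration only, the rising-edge extrapolation Seong describes (recovering the actual light-reception time t0 from threshold crossings even when the peak saturates, per [0286] and [0345]) can be sketched as below. The threshold voltages, crossing times, and baseline are assumptions for this sketch, not from Seong.

```python
def extrapolate_t0(crossings, baseline=0.0):
    """crossings: list of (time, voltage) pairs where the rising edge passes
    each reference (e.g., Vth1, Vth2, Vth3). Fit a line V = a*t + b through
    the points by least squares and solve for the time at the baseline
    voltage; the clipped peak never enters the calculation."""
    n = len(crossings)
    st = sum(t for t, _ in crossings)
    sv = sum(v for _, v in crossings)
    stt = sum(t * t for t, _ in crossings)
    stv = sum(t * v for t, v in crossings)
    a = (n * stv - st * sv) / (n * stt - st * st)   # slope of the edge
    b = (sv - a * st) / n                           # intercept
    return (baseline - b) / a

# Edge crosses Vth1=1 V at 10 ns, Vth2=2 V at 12 ns, Vth3=3 V at 14 ns;
# extrapolating back to 0 V gives t0 = 8 ns even if the peak later saturates.
print(extrapolate_t0([(10.0, 1.0), (12.0, 2.0), (14.0, 3.0)]))  # 8.0
```

Because only the unclipped rising edge is used, the time-of-flight (and hence the distance) stays recoverable whether or not the detector entered the saturation region.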
Regarding claim 12, Yeruhami teaches a light detection and ranging (LIDAR) device comprising:
a light detector (Yeruhami; Fig. 1A, [0153], sensors 116);
one or more optical elements configured to direct light received by the LIDAR device from a field-of-view (FOV) onto the light detector (Yeruhami; Fig. 2A, [0120], line 27, the scanning unit also includes a pivotable return deflector 114B that directs light reflected back from an object within the FOV toward the sensor);
a controller configured to cause the LIDAR device to perform operations comprising:
repeatedly scanning the light detector across a range of angles in the FOV (Yeruhami; [0017], line 2, the processor may be configured to control at least one LiDAR light source in a manner enabling light projected from the at least one light source to vary over a plurality of scans of a FOV);
for each scan of the range of angles, detecting a plurality of light pulses intercepted at the light detector during a plurality of detection periods, wherein the light detector is configured to intercept light from a different angle in the range of angles during each of the plurality of detection periods of the scan (Yeruhami; [0017], line 12, the processor may be further configured to determine a possible existence of an object in the background area based on different scanning cycles);
generating a three-dimensional (3D) representation of the FOV based on at least light intensity measurements indicated by outputs from the light detector for the detected plurality of light pulses (Yeruhami; Fig. 6D, [0198], line 7, using single rotatable LiDAR system to obtain 3D data representing FOV);
Yeruhami does not teach
comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan; and
based on the comparison, detecting onset of a saturation recovery period of the light detector;
detecting an apparent object in a region of the FOV scanned during the saturation recovery period; and
excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period.
Seong teaches
comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan; and
based on the comparison, detecting onset of a saturation recovery period of the light detector (Seong; Fig. 20, Fig. 21, [0395], drawings for a method of determining and predicting whether a light receiving unit 5200 has entered a saturation region; [0396], the 1st electrical signal may be delayed by d while returning to the normal operating range as it reaches the saturation value (Vsat). The control unit 5300 may obtain an actual light reception time point t0 from the detected edge as described in Fig. 11. Additionally, the control unit 5300 may acquire a 2nd electrical signal from the second target. The size of the 1st electrical signal may be greater than the size of the 2nd electrical signal. Thus, the magnitude of the 1st slope may be greater than the magnitude of the 2nd slope, where the 1st slope and the 2nd slope can be calculated from edges detected from the same reference values (Vth1, Vth2). In this case, the control unit 5300 can compare the 1st slope with a predetermined slope to determine and predict whether the 1st electrical signal is saturated. Likewise, the control unit 5300 can compare the 2nd slope with a predetermined slope to determine and predict whether the 2nd electrical signal is saturated. It would have been obvious to one of ordinary skill in the art to recognize that, when saturation occurs, detection of the signal marks the onset of a saturation recovery period, because saturation recovery occurs immediately after saturation. Thus, Seong’s teaching satisfies the claim limitation).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the device taught by Yeruhami to include comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan and, based on the comparison, detecting onset of a saturation recovery period of the light detector, as taught by Seong, with a reasonable expectation of success. The reasoning for this is to provide a method of determining and predicting whether a light receiving unit has entered a saturation region and adjusting the gain value of the sensor unit of the light receiving unit. Further, the control unit 5300 can obtain the actual light receiving time point (t0) by extrapolating only the rising edge (Seong; Fig. 20, Fig. 21, [0395], [0401]-[0402], [0404]).
However, Yeruhami as modified in view of Seong still does not teach:
detecting an apparent object in a region of the FOV scanned during the saturation recovery period; and
excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period.
Banks discloses in paragraphs [0152]-[0153] that increasing the energy of the emitted light improves resolution such that objects that were insufficiently resolved by the lower-energy light pulses (1st 3D image) can be measured. However, with higher emitted light energy, some pixels may be saturated due to highly reflective objects in the 2nd 3D image (the data from the highly reflective object are equivalent to data obtained during the saturation recovery period). As such, the 3D images measured with the lower (1st 3D image) and higher (2nd 3D image) emitted light energy are combined to obtain a 3D image having increased resolution. Any suitable algorithm may be used to determine which portions of the 1st and 2nd images to use or to discard as corresponding to insufficiently resolved objects or saturated pixels (equivalent to excluding the apparent object from the generated 3D representation based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the device taught by Yeruhami to include comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan and, based on the comparison, detecting onset of a saturation recovery period of the light detector, as taught by Seong, and to include detecting an apparent object in a region of the FOV scanned during the saturation recovery period and excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period, as taught by Banks, with a reasonable expectation of success. The reasoning for this is that, during measurement, some pixels may be saturated due to highly reflective objects. Removing or discarding the data obtained from pixels that were saturated predictably removes the artifacts or errors in the distance measurement caused by pixel-saturation effects.
Regarding claim 13, Yeruhami as modified above teaches the device as recited in claim 12, further comprising: a light emitter configured to emit one or more light pulses toward the FOV (Yeruhami; Fig. 1A, [0120], light source 112 emits light pulses toward FOV), wherein the detected plurality of light pulses correspond to reflected portions of the one or more emitted light pulses that are reflected back from the FOV toward the LIDAR device (Yeruhami; Fig. 1A, [0153], sensors 116 detecting light pulses reflected back from FOV).
Regarding claim 14, Yeruhami as modified above teaches the device as recited in claim 12, wherein the one or more optical elements comprise: a rotating mirror configured to direct the light received by the LIDAR device from different angles toward the light detector based on corresponding rotational positions of the rotating mirror (Yeruhami; Fig. 1A, [0106], [0112], [0144], light deflector 114 (mirror) may be movable to cause light to deviate to a differing degree and may be operable to change an angle of deflection within two non-parallel planes; the deflector may be operated to allow the sensor to collect reflected photons from substantially the same portion of the FOV).
Regarding claim 15, Yeruhami teaches a method comprising:
receiving, from a light detection and ranging (LIDAR) device, an indication of a plurality of scans of a range of angles in a field-of-view (FOV), wherein the LIDAR device is configured to repeatedly scan the range of angles using a light detector of the LIDAR device (Yeruhami; Fig. 16, Fig. 17, [0348], region 1601 of 1st scan cycle and region 1601’ of 2nd scan cycle of FOV);
for each scan of the range of angles, identifying a plurality of light pulses received at different angles in the range of angles, wherein the plurality of light pulses are intercepted at the light detector during different detection periods in the scan (Yeruhami; [0017], line 12, the processor may be further configured to determine a possible existence of an object in the background area based on different scanning cycles);
generating a three-dimensional (3D) representation of the FOV based on at least light intensity measurements indicated by outputs from the light detector for the detected plurality of light pulses (Yeruhami; Fig. 6D, [0198], line 7, using single rotatable LiDAR system to obtain 3D data representing FOV);
Yeruhami does not teach
comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan; and
based on the comparison, identifying one or more scans of the plurality of scans obtained during a saturation recovery period of the light detector;
detecting an apparent object in a region of the FOV scanned during the saturation recovery period; and
excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period.
Seong teaches
comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan; and
based on the comparison, identifying one or more scans of the plurality of scans obtained during a saturation recovery period of the light detector (Seong; Fig. 20, Fig. 21, [0395], drawings for a method of determining and predicting whether a light receiving unit 5200 has entered a saturation region; [0396], the 1st electrical signal may be delayed by d while returning to the normal operating range as it reaches the saturation value (Vsat). The control unit 5300 may obtain an actual light reception time point t0 from the detected edge as described in Fig. 11. Additionally, the control unit 5300 may acquire a 2nd electrical signal from the second target. The size of the 1st electrical signal may be greater than the size of the 2nd electrical signal. Thus, the magnitude of the 1st slope may be greater than the magnitude of the 2nd slope, where the 1st slope and the 2nd slope can be calculated from edges detected from the same reference values (Vth1, Vth2). In this case, the control unit 5300 can compare the 1st slope with a predetermined slope to determine and predict whether the 1st electrical signal is saturated. Likewise, the control unit 5300 can compare the 2nd slope with a predetermined slope to determine and predict whether the 2nd electrical signal is saturated. It would have been obvious to one of ordinary skill in the art to recognize that, when saturation occurs, detection of the signal marks the onset of a saturation recovery period, because saturation recovery occurs immediately after saturation. Thus, Seong’s teaching satisfies the claim limitation).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the method taught by Yeruhami to include comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan and, based on the comparison, identifying one or more scans of the plurality of scans obtained during a saturation recovery period of the light detector, as taught by Seong, with a reasonable expectation of success. The reasoning for this is to provide a method of determining and predicting whether a light receiving unit has entered a saturation region and adjusting the gain value of the sensor unit of the light receiving unit. Further, the control unit 5300 can obtain the actual light receiving time point (t0) by extrapolating only the rising edge (Seong; Fig. 20, Fig. 21, [0395], [0401]-[0402], [0404]).
However, Yeruhami as modified in view of Seong still does not teach:
detecting an apparent object in a region of the FOV scanned during the saturation recovery period; and
excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period.
Banks discloses in paragraphs [0152]-[0153] that increasing the energy of the emitted light improves resolution such that objects that were insufficiently resolved by the lower-energy light pulses (1st 3D image) can be measured. However, with higher emitted light energy, some pixels may be saturated due to highly reflective objects in the 2nd 3D image (the data from the highly reflective object are equivalent to data obtained during the saturation recovery period). As such, the 3D images measured with the lower (1st 3D image) and higher (2nd 3D image) emitted light energy are combined to obtain a 3D image having increased resolution. Any suitable algorithm may be used to determine which portions of the 1st and 2nd images to use or to discard as corresponding to insufficiently resolved objects or saturated pixels (equivalent to excluding the apparent object from the generated 3D representation based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the method taught by Yeruhami to include comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan; and based on the comparison, detecting onset of a saturation recovery period of the light detector during the first scan or the second scan, as taught by Seong; and to include detecting an apparent object in a region of the FOV scanned during the saturation recovery period, and excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period, as taught by Banks, with a reasonable expectation of success. The reasoning for this is that, during measurement, some pixels may be saturated due to highly reflective objects. Removing or discarding the data obtained from pixels that were saturated predictably removes the artifact or error in the distance measurement caused by the saturated-pixel effects.
Regarding claim 16, Yeruhami as modified above teaches the method as recited in claim 15, wherein comparing the first scan with the second scan comprises comparing first light intensity measurements indicated by first outputs from the light detector for first light pulses detected during the first scan with second light intensity measurements indicated by second outputs from the light detector for second light pulses detected during the second scan (Yeruhami; Fig. 16, Fig. 17, [0348], region 1601 of 1st scan cycle and region 1601’ of 2nd scan cycle of FOV).
Regarding claim 17, Yeruhami as modified above teaches the method as recited in claim 16, wherein comparing the first light intensity measurements with the second light intensity measurements comprises comparing respective maximum values of the first light intensity measurements and the second light intensity measurements (Yeruhami; Fig. 16, Fig. 17, [0348], region 1601 of 1st scan cycle and region 1601’ of 2nd scan cycle of FOV).
Regarding claim 21, Yeruhami as modified above teaches the method as recited in claim 1.
Yeruhami does not teach, wherein the apparent object is an artifact caused by degraded measurement accuracy of the light detector during the saturation recovery period.
Banks discloses in paragraphs [0152]-[0153] that increasing the energy of the emitted light improves the resolution such that objects that were insufficiently resolved by the lower-energy light pulses (1st 3D image) can be measured. However, with higher emitted light energy, some pixels may be saturated due to highly reflective objects in the 2nd 3D image (the data from the highly reflective object is equivalent to data obtained during the saturation recovery period). As such, the 3D images measured with lower (1st 3D image) and higher (2nd 3D image) emitted light energy are combined to obtain a 3D image having increased resolution. Any suitable algorithm may be used to determine which portions of the 1st and 2nd images to use or to discard as corresponding to insufficiently resolved objects or saturated pixels (equivalent to excluding the apparent object from the generated 3D representation based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period). This implies that the data taken by the pixels under the saturation condition (due to highly reflective objects) would become an artifact because of the degraded measurement accuracy of the light detector.
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the method taught by Yeruhami to include comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan; and based on the comparison, detecting onset of a saturation recovery period of the light detector during the first scan or the second scan, as taught by Seong; and to include detecting an apparent object in a region of the FOV scanned during the saturation recovery period; excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period; wherein the apparent object is an artifact caused by degraded measurement accuracy of the light detector during the saturation recovery period, as taught by Banks, with a reasonable expectation of success. The reasoning for this is that, during measurement, some pixels may be saturated due to highly reflective objects. The data taken by the pixels under the saturation condition (due to highly reflective objects) would become an artifact because of the degraded measurement accuracy of the light detector. Removing or discarding the data obtained from pixels that were saturated predictably removes the artifact or error in the distance measurement caused by the saturated-pixel effects.
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Yeruhami in view of Seong, in view of Banks, and further in view of Tachino (US 20210341592 A1, hereinafter “Tachino”).
Regarding claim 6, Yeruhami as modified above teaches the method as recited in claim 5.
Yeruhami does not teach, wherein detecting onset of the saturation recovery period is based on difference between the respective maximum values exceeding a threshold difference.
Tachino teaches, wherein detecting onset of the saturation recovery period is based on difference between the respective maximum values exceeding a threshold difference (Tachino; Fig. 4, step 60, 70, Fig. 5, [0038]-[0040], The determiner acquires signal values of signals output during a dead time in response to incidence of photons of reflected light on the SPADs; Dead time within the reflected light period is a period of time from when the signal value decreases to below the reference value after having increased and being clipped in response to incidence of photons of the reflected light on the SPAD).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the method taught by Yeruhami to include comparing the first scan with the second scan comprising comparing first light intensity measurements indicated by first outputs from the light detector for first light pulses detected during the first scan with second light intensity measurements indicated by second outputs from the light detector for second light pulses detected during the second scan, as taught by Seong; to include detecting an apparent object in a region of the FOV scanned during the saturation recovery period, and excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period, as taught by Banks; and to further include detecting onset of the saturation recovery period based on a difference between the respective maximum values exceeding a threshold difference, as taught by Tachino, with a reasonable expectation of success. The reasoning for introducing detecting onset of the saturation recovery period based on a difference between the respective maximum values exceeding a threshold difference is to determine the presence or absence of an abnormality in the light receiving unit using signals output during a dead time. Therefore, even in an environment where ambient light is not constant, occurrence of an abnormality in the light receiving unit can be accurately detected (Tachino; Fig. 4 step 60, 70, Fig. 5, [0038]-[0040]).
Claims 9 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Yeruhami in view of Seong, in view of Banks, and further in view of Lovering et al. (US 20200132841 A1, hereinafter “Lovering”).
Regarding claim 9, Yeruhami as modified above teaches the method as recited in claim 1, wherein excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period (Banks; [0152]-[0153], please see the mapping of claim 1 above) comprises:
Yeruhami does not teach,
excluding the apparent object from the generated 3D representation of the FOV based on an apparent size of the apparent object indicated by data from the LIDAR device being less than a foreign object debris (FOD) detection threshold.
Lovering teaches, excluding the apparent object from the generated 3D representation of the FOV based on an apparent size of the apparent object indicated by data from the LIDAR device being less than a foreign object debris (FOD) detection threshold (Lovering; [0037], the filter 256 may be configured to filter data from the sensors 20, 30 to remove data indicative of small particles (e.g., objects having a dimension below a predefined threshold) such as dust, vapor, small debris and pollutants, and provide the filtered data to the sense and avoid element 207).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the method taught by Yeruhami to include comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan; and based on the comparison, detecting onset of a saturation recovery period of the light detector during the first scan or the second scan, as taught by Seong; to include detecting an apparent object in a region of the FOV scanned during the saturation recovery period, and excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period, as taught by Banks; and to include excluding the apparent object from the generated 3D representation of the FOV based on an apparent size of the apparent object indicated by data from the LIDAR device being less than a foreign object debris (FOD) detection threshold, as taught by Lovering, with a reasonable expectation of success. The reasoning for this is to identify and remove data indicative of small particles (below a predefined threshold), which are unwanted, while retaining large objects such as other aircraft, birds …, which may pose a collision threat to the aircraft 10, in order to make control decisions that avoid such collision threats (Lovering; [0037]).
Regarding claim 18, Yeruhami as modified above teaches the method as recited in claim 15, wherein excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period (Banks; [0152]-[0153], please see the mapping of claim 1 above) comprises:
Yeruhami does not teach,
excluding the apparent object from the generated 3D representation of the FOV based on an apparent size of the apparent object indicated by data from the LIDAR device being less than a foreign object debris (FOD) detection threshold.
Lovering teaches, excluding the apparent object from the generated 3D representation of the FOV based on an apparent size of the apparent object indicated by data from the LIDAR device being less than a foreign object debris (FOD) detection threshold (Lovering; [0037], the filter 256 may be configured to filter data from the sensors 20, 30 to remove data indicative of small particles (e.g., objects having a dimension below a predefined threshold) such as dust, vapor, small debris and pollutants, and provide the filtered data to the sense and avoid element 207).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the method taught by Yeruhami to include comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan; and based on the comparison, detecting onset of a saturation recovery period of the light detector during the first scan or the second scan, as taught by Seong; to include detecting an apparent object in a region of the FOV scanned during the saturation recovery period, and excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period, as taught by Banks; and to include excluding the apparent object from the generated 3D representation of the FOV based on an apparent size of the apparent object indicated by data from the LIDAR device being less than a foreign object debris (FOD) detection threshold, as taught by Lovering, with a reasonable expectation of success. The reasoning for this is to identify and remove data indicative of small particles (below a predefined threshold), which are unwanted, while retaining large objects such as other aircraft, birds …, which may pose a collision threat to the aircraft 10, in order to make control decisions that avoid such collision threats (Lovering; [0037]).
Claim 22 is rejected under 35 U.S.C. 103 as being unpatentable over Yeruhami in view of Seong, in view of Banks, and further in view of Karadeniz et al. (US 20200158876 A1, hereinafter “Karadeniz”).
Regarding claim 22, Yeruhami as modified above teaches the method as recited in claim 1.
Yeruhami does not teach,
wherein the apparent object is a bump in a road.
Karadeniz discloses in paragraph [0021] that the sensor data (the intensity and depth information or the like, [0024]) of sensor systems 104 (TOF sensors, Lidar sensors, etc.) may be processed to identify and/or classify objects in the environment, e.g., trees, vehicles, pedestrians, buildings, road surfaces, road markings, or the like. Paragraph [0026] discloses that pixels that may be unreliable, e.g., because they may be oversaturated/underexposed, in either the 1st frame 110(1) or the 2nd frame 110(2) can be replaced with pixels from the other of the 1st frame 110(1) or the 2nd frame 110(2) to generate the resolved frame 120. For instance, pixels that are over 80% or 90% saturated may be disregarded in favor of corresponding pixels from another frame that are below that threshold. Since paragraph [0021] states that the sensor can be used to identify/classify objects on the road surface, and paragraph [0026] states that oversaturation/underexposure can happen during the measurement, it would have been obvious to one of ordinary skill in the art to recognize that, during the identification/classification of the road surface, if oversaturation happens (due to the degraded measurement accuracy of the light detector), the data would show an apparent object such as a bump in a road, as expected.
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the method taught by Yeruhami to include comparing a first scan of the range of angles obtained using the light detector with a second scan subsequent to the first scan; and based on the comparison, detecting onset of a saturation recovery period of the light detector during the first scan or the second scan, as taught by Seong; to include detecting an apparent object in a region of the FOV scanned during the saturation recovery period, and excluding the apparent object from the generated 3D representation of the FOV based on at least the apparent object being in the region of the FOV scanned during the saturation recovery period, as taught by Banks; and to include the apparent object being a bump in a road due to the degraded measurement accuracy of the light detector when saturation happens, as taught by Karadeniz, with a reasonable expectation of success. The reasoning for this is that, during the identification/classification of the road surface, if oversaturation happens (due to the degraded measurement accuracy of the light detector), the data would predictably show an apparent object such as a bump in a road, as expected.
Allowable Subject Matter
Claims 11 and 20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Regarding claim 11, the prior art of record does not explicitly teach or render obvious the following element, in combination with all other claimed features:
The method as recited in claim 9, further comprising:
in response to detecting the onset of the saturation recovery period, adjusting the FOD detection threshold for one or more scans of the range of angles obtained during the saturation recovery period of the light detector.
Regarding claim 20, the prior art of record does not explicitly teach or render obvious the following element, in combination with all other claimed features:
The method as recited in claim 18, further comprising:
adjusting the FOD detection threshold for the identified one or more scans of the range of angles obtained during the saturation recovery period of the light detector.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHIA-LING CHEN whose telephone number is (571)272-1047. The examiner can normally be reached Monday thru Friday 8-5 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Yuqing Xiao can be reached at (571)270-3630. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHIA-LING CHEN/Examiner, Art Unit 3645
/YUQING XIAO/Supervisory Patent Examiner, Art Unit 3645