Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Specification
The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant’s cooperation is requested in correcting any errors of which applicant may become aware in the specification.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-6, 10, 14, 16-25, and 29 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Hicks et al. (United States Patent Application Publication 20200018854 A1), hereinafter Hicks.
Regarding claim 1, Hicks teaches a LIDAR system ([0035] a light detection and ranging (lidar) system 100), comprising:
at least one light source configured to project laser light toward a field of view of the LIDAR system ([Fig. 1]; [0036] The lidar system 100 may include a light source 110);
at least one sensor configured to detect the laser light of the at least one light source reflected from objects in the field of view of the LIDAR system ([Fig. 1]; [0036] a receiver 140); and
at least one processor ([0036] a controller 150; [0040] Depending on the implementation, the controller 150 may include one or more processors) configured to:
control the at least one light source to scan at least a portion of the field of view of the LIDAR system ([0037] The output beam of light 125 is directed downrange toward a remote target 130 located a distance D from the lidar system 100 and at least partially contained within a field of regard of the system 100.);
receive, from the at least one sensor, reflection signals indicative of received laser light reflected from objects in the at least a portion of the field of view of the LIDAR system ([0040] The receiver 140 may receive or detect photons from the input beam 135 and generate one or more representative signals); and
use the reflection signals to generate a point-cloud representation of an environment of the LIDAR system within the at least a portion of the field of view of the LIDAR system ([0052] A collection of pixels captured in succession (which may be referred to as a depth map, a point cloud, or a frame) may be rendered as an image or may be analyzed to identify or detect objects or to determine a shape or distance of objects within the FOR.);
and wherein the at least one processor is further configured to: receive from the at least one sensor, a first output signal associated with at least a first laser light pulse maximally incident upon an object in the at least a portion of the field of view of the LIDAR system ([0079] This small target 610A partially scatters the emitted pulse, resulting in a return pulse 620. The large target 612 fully subtends the IFOV of the light source and scatters the remainder of the emitted light, resulting in another return pulse 622. The receiver of the lidar system detects the pulse 620 as well as the pulse 622.);
receive from the at least one sensor, a second output signal associated with a second laser light pulse partially incident upon the object ([0079] This small target 610A partially scatters the emitted pulse, resulting in a return pulse 620. The large target 612 fully subtends the IFOV of the light source and scatters the remainder of the emitted light, resulting in another return pulse 622. The receiver of the lidar system detects the pulse 620 as well as the pulse 622.);
use the first output signal and the second output signal to determine a value indicative of a portion of the second laser light pulse that was incident upon the object; use the determined value to determine a location associated with an edge of the object ([0082] Then, the machine vision system 10 can calculate the distance to the proximal object 610 (see FIG. 8A) based on the time delay tp=½ (t3+t4)−t0, and calculate the distance to the distant object 612 based on the time delay td=½ (t5+t6)−t0. In other cases, the time corresponding to the peak value of the returned pulse signal may be used.); and
generate a point cloud data point representative of the determined location associated with the edge of the object ([0085] In another implementation, each of the multiple returns from solid targets may be used to generate a separate pixel or point in a three-dimensional point cloud.).
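Examiner's note: for clarity regarding the mapping above, the time delays quoted from Hicks at [0082] convert to range through the standard round-trip time-of-flight relation D = c·T/2, where c is the speed of light (a general illustration; the subscripted distance symbols below are the examiner's own notation and do not appear in the quoted text). Thus the distance to the proximal target 610 is Dp = c·tp/2 = c·[½(t3+t4) − t0]/2, and the distance to the distant target 612 is Dd = c·td/2 = c·[½(t5+t6) − t0]/2.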
Regarding claim 2, Hicks teaches the LIDAR system of claim 1, wherein each point in the point cloud is associated with a spatial location in the field of view of the LIDAR system and a distance relative to at least a portion of the LIDAR system ([0072] In some implementations, the vehicle controller 372 receives point cloud data from the laser 352 or sensor heads 360 via the link 370 and analyzes the received point cloud data to sense or identify targets 130 and their respective locations, distances, speeds, shapes, sizes, type of target (e.g., vehicle, human, tree, animal), etc.; [0085] In another implementation, each of the multiple returns from solid targets may be used to generate a separate pixel or point in a three-dimensional point cloud.).
Regarding claim 3, Hicks teaches the LIDAR system of claim 1, wherein the at least one processor is further configured to selectively control a pulse rate of the at least one light source based on the determined location associated with the edge of the object ([0104] In addition to selecting a return for assigning a value to a lidar pixel, the lidar system can adjust one or more operational parameters in view of the data received from the camera... The lidar system also can modify the scan pattern, adjust the pulse energy, adjust the repetition rate, etc.).
Regarding claim 4, Hicks teaches the LIDAR system of claim 3, wherein a first pulse rate over a first portion of a scanning path associated with the location of the edge of the object is higher than a second pulse rate over a second portion of the scanning path not associated with the location of the edge of the object ([Figs. 8A-8C]).
Regarding claim 5, Hicks teaches the LIDAR system of claim 1, further including at least one deflector configured to rotate about at least one scanning axis to deflect the laser light from the at least one light source along a scanning pattern to scan the at least a portion of the field of view of the LIDAR system ([0045] The scanner 120 may include one or more scanning mirrors and one or more actuators driving the mirrors to rotate, tilt, pivot, or move the mirrors in an angular manner about one or more axes, for example.).
Regarding claim 6, Hicks teaches the LIDAR system of claim 5, wherein the at least one processor is further configured to selectively control an angular scanning rate of the at least one scanner based on the determined location associated with the edge of the object ([0046] the controller 150 which may control the scanning mirror(s) so as to guide the output beam 125 in a desired direction downrange or along a desired scan pattern; [0104] The lidar system also can modify the scan pattern, adjust the pulse energy, adjust the repetition rate, etc.).
Regarding claim 10, Hicks teaches the LIDAR system of claim 5, wherein the scanning pattern includes a series of horizontally oriented scan lines ([Fig. 3]).
Regarding claim 14, Hicks teaches the LIDAR system of claim 1, wherein the at least one processor is further configured to determine the value indicative of the portion of the second laser light pulse that was incident upon the object based on a known spatial energy distribution associated with the first laser light pulse and the second laser light pulse ([Fig. 8B]; [0079] This small target 610A partially scatters the emitted pulse, resulting in a return pulse 620. The large target 612 fully subtends the IFOV of the light source and scatters the remainder of the emitted light, resulting in another return pulse 622. The receiver of the lidar system detects the pulse 620 as well as the pulse 622.; [0080] Next, FIG. 8B illustrates a voltage signal as a function of time corresponding to the pulses in FIG. 8A).
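Examiner's note: as one illustration of how a known spatial energy distribution permits the determination recited in claim 14 (assuming, purely for illustration, a Gaussian spot profile; neither the claim nor the quoted portions of Hicks specify a particular profile), for a spot with Gaussian irradiance of 1/e² radius w whose center lies a distance d inside the object edge, the fraction of the pulse energy incident upon the object is f(d) = ½[1 + erf(√2·d/w)]; a measured return fraction therefore fixes the offset d of the edge relative to the spot center.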
Regarding claim 16, Hicks teaches the LIDAR system of claim 1, wherein the first laser light pulse is emitted before the second laser light pulse ([0126] While operations may be depicted in the drawings as occurring in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order,).
Regarding claim 17, Hicks teaches the LIDAR system of claim 1, wherein the second laser light pulse is emitted before the first laser light pulse ([0126] While operations may be depicted in the drawings as occurring in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order,).
Regarding claim 18, Hicks teaches the LIDAR system of claim 1, wherein the at least one processor is further configured to receive from the at least one sensor, a plurality of output signals, in addition to the first output signal, associated with laser light pulses maximally incident upon the object in the at least a portion of the field of view of the LIDAR system, and store in a memory at least one indicator of a maximal reflection characteristic associated with reflection pulses resulting from the laser light pulses maximally incident upon the object in the at least a portion of the field of view of the LIDAR system ([0040] the controller 150 may include one or more processors, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and/or other suitable circuitry configured to analyze one or more characteristics of the electrical signal 145 to determine one or more characteristics of the target 130; [0077] The machine vision system 10 of FIG. 1 (or another suitable machine vision system of this disclosure) can determine which of one or more returns corresponding to an emitted pulse of light are spurious; [0082] In other cases, the time corresponding to the peak value of the returned pulse signal may be used).
Regarding claim 19, Hicks teaches the LIDAR system of claim 18, wherein the average reflection characteristic includes an average total received energy associated with the reflection pulses ([0090] One factor may include imagery captured by the camera, and another factor may include the characteristics of the pulses, such as magnitude or shape of the return.).
Regarding claim 20, Hicks teaches the LIDAR system of claim 1, wherein the second laser light pulse is partially incident upon the object and partially incident upon at least one additional object ([0079] the target occupies only a portion of the cross-section of the angular cone illuminated by the light source, so that the remaining portion of the cross-section can be projected onto a plane behind the small target.).
Regarding claim 21, Hicks teaches the LIDAR system of claim 20, wherein the at least one processor further receives from the at least one sensor, a third output signal associated with the incidence of the second laser light pulse upon the at least one additional object ([Fig. 8B]; [0080] A reference pulse signal 626 corresponds to the emitted pulse 606, for example, and may be generated by detecting a small fraction of the power of the emitted pulse 606 scattered by the window of the housing or diverted by other optical means toward the receiver.).
Regarding claim 22, Hicks teaches the LIDAR system of claim 21, wherein the third output signal is associated with a time-of-flight value different from a time-of-flight value associated with the second output signal ([Fig. 8B]; [0080] A reference pulse signal 626 corresponds to the emitted pulse 606, for example, and may be generated by detecting a small fraction of the power of the emitted pulse 606 scattered by the window of the housing or diverted by other optical means toward the receiver.).
Regarding claim 23, Hicks teaches the LIDAR system of claim 21, wherein the third output signal is indicative of a reflectivity of the at least one additional object that is different than a reflectivity of the object ([Fig. 8B]; [0080] A reference pulse signal 626 corresponds to the emitted pulse 606, for example, and may be generated by detecting a small fraction of the power of the emitted pulse 606 scattered by the window of the housing or diverted by other optical means toward the receiver.).
Regarding claim 24, Hicks teaches the LIDAR system of claim 21, wherein receiving the third output signal includes differentiating the second output signal from the third output signal using a digital signal processing technique ([0080] The delay of the reference pulse 626 may be considered negligible or, alternatively, the reference pulse 626 may be used to set the reference time for the voltage pulse signals 630 and 632, which correspond to the return pulses 620 and 622, respectively.).
Regarding claim 25, Hicks teaches a LIDAR system ([0035] a light detection and ranging (lidar) system 100), comprising:
at least one light source configured to project laser light toward a field of view of the LIDAR system ([Fig. 1]; [0036] The lidar system 100 may include a light source 110);
at least one sensor configured to detect the laser light of the at least one light source reflected from objects in the field of view of the LIDAR system ([Fig. 1]; [0036] a receiver 140); and
at least one processor ([0036] a controller 150; [0040] Depending on the implementation, the controller 150 may include one or more processors) configured to:
control the at least one light source to scan at least a portion of the field of view of the LIDAR system ([0037] The output beam of light 125 is directed downrange toward a remote target 130 located a distance D from the lidar system 100 and at least partially contained within a field of regard of the system 100.);
receive, from the at least one sensor, reflection signals indicative of received laser light reflected from objects in the at least a portion of the field of view of the LIDAR system ([0040] The receiver 140 may receive or detect photons from the input beam 135 and generate one or more representative signals); and
use the reflection signals to generate a point-cloud representation of an environment of the LIDAR system within the at least a portion of the field of view of the LIDAR system ([0052] A collection of pixels captured in succession (which may be referred to as a depth map, a point cloud, or a frame) may be rendered as an image or may be analyzed to identify or detect objects or to determine a shape or distance of objects within the FOR.);
and wherein the at least one processor is further configured to: receive from the at least one sensor, a first output signal associated with a first laser light pulse maximally incident upon an object in the at least a portion of the field of view of the LIDAR system ([0079] This small target 610A partially scatters the emitted pulse, resulting in a return pulse 620. The large target 612 fully subtends the IFOV of the light source and scatters the remainder of the emitted light, resulting in another return pulse 622. The receiver of the lidar system detects the pulse 620 as well as the pulse 622.);
receive from the at least one sensor, a second output signal associated with a second laser light pulse not incident upon the object, wherein the second laser light pulse sequentially follows the first laser light pulse ([0079] This small target 610A partially scatters the emitted pulse, resulting in a return pulse 620. The large target 612 fully subtends the IFOV of the light source and scatters the remainder of the emitted light, resulting in another return pulse 622. The receiver of the lidar system detects the pulse 620 as well as the pulse 622.);
determine a location associated with an edge of the object based on a spatial relationship between the first laser light pulse and the second laser light pulse ([0082] Then, the machine vision system 10 can calculate the distance to the proximal object 610 (see FIG. 8A) based on the time delay tp=½ (t3+t4)−t0, and calculate the distance to the distant object 612 based on the time delay td=½ (t5+t6)−t0. In other cases, the time corresponding to the peak value of the returned pulse signal may be used.); and
generate a point cloud data point representative of the determined location associated with the edge of the object ([0085] In another implementation, each of the multiple returns from solid targets may be used to generate a separate pixel or point in a three-dimensional point cloud.).
Regarding claim 29, Hicks teaches a LIDAR system ([0035] a light detection and ranging (lidar) system 100), comprising:
at least one light source configured to project laser light toward a field of view of the LIDAR system ([Fig. 1]; [0036] The lidar system 100 may include a light source 110);
at least one sensor configured to detect the laser light of the at least one light source reflected from objects in the field of view of the LIDAR system ([Fig. 1]; [0036] a receiver 140); and
at least one processor ([0036] a controller 150; [0040] Depending on the implementation, the controller 150 may include one or more processors) configured to:
control the at least one light source to scan at least a portion of the field of view of the LIDAR system ([0037] The output beam of light 125 is directed downrange toward a remote target 130 located a distance D from the lidar system 100 and at least partially contained within a field of regard of the system 100.);
receive, from the at least one sensor, reflection signals indicative of received laser light reflected from objects in the at least a portion of the field of view of the LIDAR system ([0040] The receiver 140 may receive or detect photons from the input beam 135 and generate one or more representative signals); and
use the reflection signals to generate a point-cloud representation of an environment of the LIDAR system within the at least a portion of the field of view of the LIDAR system ([0052] A collection of pixels captured in succession (which may be referred to as a depth map, a point cloud, or a frame) may be rendered as an image or may be analyzed to identify or detect objects or to determine a shape or distance of objects within the FOR.);
and wherein the at least one processor is further configured to: receive from the at least one sensor, a first output signal associated with a first laser light pulse partially incident upon an object in the at least a portion of the field of view of the LIDAR system ([0079] This small target 610A partially scatters the emitted pulse, resulting in a return pulse 620. The large target 612 fully subtends the IFOV of the light source and scatters the remainder of the emitted light, resulting in another return pulse 622. The receiver of the lidar system detects the pulse 620 as well as the pulse 622.);
receive from the at least one sensor, a second output signal associated with a second laser light pulse partially incident upon the object ([0079] This small target 610A partially scatters the emitted pulse, resulting in a return pulse 620. The large target 612 fully subtends the IFOV of the light source and scatters the remainder of the emitted light, resulting in another return pulse 622. The receiver of the lidar system detects the pulse 620 as well as the pulse 622.),
wherein the second laser light pulse sequentially follows the first laser light pulse and wherein a spot associated with the first laser light pulse at least partially overlaps with a spot associated with the second laser light pulse ([0092] For example, a lidar system, e.g. such as one with multiple sensor heads, may emit multiple pulses that are intended for multiple corresponding detectors);
determine a first reflected portion associated with an amount of the first laser light pulse reflected from the object ([0079] This small target 610A partially scatters the emitted pulse, resulting in a return pulse 620. The large target 612 fully subtends the IFOV of the light source and scatters the remainder of the emitted light, resulting in another return pulse 622. The receiver of the lidar system detects the pulse 620 as well as the pulse 622.);
determine a second reflected portion associated with an amount of the second laser light pulse reflected from the object ([0079] This small target 610A partially scatters the emitted pulse, resulting in a return pulse 620. The large target 612 fully subtends the IFOV of the light source and scatters the remainder of the emitted light, resulting in another return pulse 622. The receiver of the lidar system detects the pulse 620 as well as the pulse 622.);
determine a location associated with an edge of the object based on a comparison of the first reflected portion and the second reflected portion and further based on a spatial relationship between the first laser light pulse and the second laser light pulse ([0082] Then, the machine vision system 10 can calculate the distance to the proximal object 610 (see FIG. 8A) based on the time delay tp=½ (t3+t4)−t0, and calculate the distance to the distant object 612 based on the time delay td=½ (t5+t6)−t0. In other cases, the time corresponding to the peak value of the returned pulse signal may be used.); and
generate a point cloud data point representative of the determined location associated with the edge of the object ([0085] In another implementation, each of the multiple returns from solid targets may be used to generate a separate pixel or point in a three-dimensional point cloud.).
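Examiner's note regarding the "determine a location associated with an edge of the object" limitation of claim 29 above: as an explanatory illustration only (the uniform-spot model below is the examiner's assumption and is not a teaching of Hicks), if each pulse produces a spot of angular width Δ centered at scan angle θ, and the object occupies angles up to θedge, then the fraction f of that pulse incident upon the object satisfies f = ½ + (θedge − θ)/Δ, so each measured reflected portion yields an edge estimate θedge = θ + Δ·(f − ½); comparing the two estimates obtained from the first and second pulses, whose spot centers are related by the known spatial relationship of the scan, locates the edge.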
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 7-9, 15, and 26-28 are rejected under 35 U.S.C. 103 as being unpatentable over Hicks in view of Vlaiko et al. (United States Patent Application Publication 20180143308 A1), hereinafter Vlaiko.
Regarding claim 7, Hicks teaches the LIDAR system of claim 6.
Hicks fails to teach the system wherein a first angular scanning rate over a first portion of a scanning path associated with the location of the edge of the object is lower than a second angular scanning rate over a second portion of the scanning path not associated with the location of the edge of the object.
However, Vlaiko teaches a system wherein a first angular scanning rate over a first portion of a scanning path associated with the location of the edge of the object is lower than a second angular scanning rate over a second portion of the scanning path not associated with the location of the edge of the object ([0363] For example, in some cases, it may be more important and useful to have well-defined information regarding the location of an edge of an object (such as the outer edge or envelope of the object, such as a vehicle). Thus, it may be desirable to allocate more light flux toward FOV regions where edges of the vehicle reside and less light flux toward FOV regions that include portions of the object residing within the external envelope. As just one illustrative example, as shown in FIG. 5C, the at least one processor 118 may be configured to cause emission of two light pulses of projected light for use in analyzing FOV regions including edges of an object).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Hicks to comprise the varied scan rate based on object detection, similar to Vlaiko, with a reasonable expectation of success. This would have the predictable result of ensuring a more detailed scan of a region identified to have an object edge in the field of view.
Regarding claim 8, Hicks teaches the LIDAR system of claim 5.
Hicks fails to teach the system wherein the scanning pattern includes a plurality of scan lines, and wherein the first laser light pulse and the second laser light pulse are emitted in a single scan line.
However, Vlaiko teaches a system wherein the scanning pattern includes a plurality of scan lines, and wherein the first laser light pulse and the second laser light pulse are emitted in a single scan line ([0258] Alternatively, deflector 114 may move continuously (or semi continuously) through a plurality of instantaneous positions during a scan of FOV 120. During such a continuous or semi-continuous scan, light may be projected to instantaneous portions 122 of FOV 120 in a continuous wave, a single pulse, multiple pulses, etc.; [0294] Of course, some regions may receive only one light emission (or even no light emission at all), while other regions may receive multiple light emissions. As a result, a particular scan of the LIDAR FOV may include objects detected based on first light emissions, second light emissions, and/or third light emissions, etc. depending on how many light emissions were projected toward a particular region.).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Hicks to comprise the arrangement of scan lines, wherein the pulses are emitted in a single scan line, similar to Vlaiko, with a reasonable expectation of success. This would have the predictable result of using a known method of scanning that would increase the quality and efficiency of a scan within a given sweep.
Regarding claim 9, Hicks teaches the LIDAR system of claim 5.
Hicks fails to teach the system wherein the scanning pattern includes a plurality of scan lines, and wherein the first laser light pulse and the second laser light pulse are emitted in different scan lines.
However, Vlaiko teaches a system wherein the scanning pattern includes a plurality of scan lines, and wherein the first laser light pulse and the second laser light pulse are emitted in different scan lines ([0258] Alternatively, deflector 114 may move continuously (or semi continuously) through a plurality of instantaneous positions during a scan of FOV 120. During such a continuous or semi-continuous scan, light may be projected to instantaneous portions 122 of FOV 120 in a continuous wave, a single pulse, multiple pulses, etc.; [0294] Of course, some regions may receive only one light emission (or even no light emission at all), while other regions may receive multiple light emissions. As a result, a particular scan of the LIDAR FOV may include objects detected based on first light emissions, second light emissions, and/or third light emissions, etc. depending on how many light emissions were projected toward a particular region.).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Hicks to comprise the arrangement of scan lines, wherein the pulses are emitted in different scan lines, similar to Vlaiko, with a reasonable expectation of success. This would have the predictable result of using a known method of scanning that would increase the quality and fidelity of a scan within a given sweep.
Regarding claim 15, Hicks teaches the LIDAR system of claim 1.
Hicks fails to teach the system wherein a point cloud resolution associated with the edge of the object is greater than the resolution associated with non-edges of the object.
However, Vlaiko teaches a system wherein a point cloud resolution associated with the edge of the object is greater than the resolution associated with non-edges of the object ([0196] In this scanning scheme, the edges of the vehicle and bus may be tracked with high power and the central mass of the vehicle and bus may be allocated with less light flux (or no light flux). Such light flux allocation enables concentration of more of the optical budget on the edges of the identified objects and less on their center which have less importance.).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Hicks to comprise the higher point cloud resolution associated with an object edge, similar to Vlaiko, with a reasonable expectation of success. This would have the predictable result of ensuring a higher quality scan is allocated to a region of higher interest in a field of view.
Regarding claim 26, Hicks teaches the LIDAR system of claim 25.
Hicks fails to teach the system wherein the determined location associated with the edge of the object coincides with a first spot edge associated with the first laser light pulse and a second spot edge associated with the second laser light pulse.
However, Vlaiko teaches a system wherein the determined location associated with the edge of the object coincides with a first spot edge associated with the first laser light pulse and a second spot edge associated with the second laser light pulse ([0197] sector I on the right side of field of view 120...and sector III on the left side of field of view 120.).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Hicks to comprise the first and second spot edges, similar to Vlaiko, with a reasonable expectation of success. This would have the predictable result of indicating spots of higher interest.
Regarding claim 27, Hicks teaches the LIDAR system of claim 25.
Hicks fails to teach the system wherein the determined location associated with the edge of the object is within a space between a first spot edge associated with the first laser light pulse and a second spot edge associated with the second laser light pulse.
However, Vlaiko teaches a system wherein the determined location associated with the edge of the object is within a space between a first spot edge associated with the first laser light pulse and a second spot edge associated with the second laser light pulse ([Fig. 5C]; [0369] sector II may represent an identified region of interest (e.g., because sector II is determined to have a high density of objects, objects of a particular type (pedestrians, etc.), objects at a particular distance range relative to the LIDAR system (e.g., within 50 m or within 100 m etc.), objects determined to be near to or in a path of a host vehicle, or in view of any other characteristics suggesting a region of higher interest than at least one other area within FOV 120)).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Hicks to comprise the separate spots associated with different light pulses, similar to Vlaiko, with a reasonable expectation of success. This would have the predictable result of more clearly discerning the areas of interest based on the different light pulses.
Regarding claim 28, Hicks teaches the LIDAR system of claim 27.
Hicks fails to teach the system wherein the determined location associated with the edge of the object is at a midpoint of the space between a first spot edge associated with the first laser light pulse and a second spot edge associated with the second laser light pulse.
However, Vlaiko teaches a system wherein the determined location associated with the edge of the object is at a midpoint of the space between a first spot edge associated with the first laser light pulse and a second spot edge associated with the second laser light pulse ([0197] As shown, field of view 120 is divided into three sectors: sector I on the right side of field of view 120, sector II in the middle of field of view 120, and sector III on the left side of field of view 120. In this exemplary scanning cycle, sector I was initially allocated with a single light pulse per portion; sector II, previously identified as a region of interest, was initially allocated with three light pulses per portion; and sector III was initially allocated with two light pulses per portion.).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Hicks to comprise the midpoint area of interest between spots, similar to Vlaiko, with a reasonable expectation of success. This would have the predictable result of isolating the area of highest interest in a field of view relative to other areas of interest to more clearly define object edges.
Claims 11-13 are rejected under 35 U.S.C. 103 as being unpatentable over Hicks in view of Yavid (United States Patent No. 12399279 B1), hereinafter Yavid.
Regarding claim 11, Hicks teaches the LIDAR system of claim 10.
Hicks fails to teach the system wherein a spot shape associated with the projected laser light has a dimension along a horizontal axis that is greater than a dimension along a vertical axis.
However, Yavid teaches a system wherein a spot shape associated with the projected laser light has a dimension along a horizontal axis that is greater than a dimension along a vertical axis ([Col. 11, Line 22-24] It should be noted that the desired shape of laser spot 25 may be achieved by a variety of optical methods; [Col. 11, Line 33-34] Likewise, the laser spot may also be oblong, or have some other desirable shape).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Hicks to comprise the oblong spot shape, similar to Yavid, with a reasonable expectation of success. This would have the predictable result of orienting the spot within the field of view for an optimized scan result.
Regarding claim 12, Hicks teaches the LIDAR system of claim 5.
Hicks fails to teach the system wherein the scanning pattern includes a series of vertically oriented scan lines.
However, Yavid teaches a system wherein the scanning pattern includes a series of vertically oriented scan lines ([Col. 15, Line 22] Fast Scan (Vertical); [Fig. 9c]).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Hicks to comprise the vertical scan lines similar to Yavid, with a reasonable expectation of success. This would have the predictable result of ensuring a more thorough scan over a given region.
Regarding claim 13, Hicks teaches the LIDAR system of claim 5.
Hicks fails to teach the system wherein the scanning pattern includes a series of non-linear scan lines.
However, Yavid teaches a system wherein the scanning pattern includes a series of non-linear scan lines ([Col. 21, line 61-65] In this arrangement the scanned laser beam 108 can be symmetrical about the normal of the mirror 104 without reflecting back into the laser 103. However, with this arrangement, the laser scan line on the target 107 is non-linear (a curved (“smiley”) shape shown as 107c).).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Hicks to comprise the non-linear scan lines similar to Yavid, with a reasonable expectation of success. This would have the predictable result of ensuring a more thorough scan over a given region.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERT WILLIAM VASQUEZ JR whose telephone number is (571)272-3745. The examiner can normally be reached Monday through Thursday, Flex Friday, 8:00-5:00 PST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, HELAL ALGAHAIM, can be reached at (571)270-5227. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ROBERT W VASQUEZ/Examiner, Art Unit 3645
/HELAL A ALGAHAIM/SPE, Art Unit 3645