Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This is the first Office action on the merits and is responsive to the papers filed 04/18/2023. Claims 1-28 are currently pending and are examined below.
Information Disclosure Statement
The information disclosure statements submitted by Applicant are in compliance with the provisions of 37 CFR 1.97 and 1.98 and MPEP § 609. They have been placed in the application file and the information referred to therein has been considered as to the merits.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention; or
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 23-24, 26, and 28 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Pacala et al. (US 20180167602 A1, “Pacala”).
Regarding claim 23, Pacala teaches a Light Detection and Ranging (LIDAR) system (Para 15), comprising:
a lidar detector comprising one or more detector pixels configured to detect light incident thereon (Figs. 1-3, para 36 “a set of pixels 170 that detect incident photons….”; para 62 “The system includes a set of pixels 170…” para 64 “…. a first pixel 171 in the set of pixels 170 includes an array of single-photon avalanche diode detectors (hereinafter “SPADs”) …”);
at least one optical element that is configured to direct the light to the lidar detector, wherein the at least one optical element comprises first and second optical characteristics that provide different first and second fields of view, respectively (Pacala discloses that the receiver optics include multiple lens elements aligned with respective apertures, each lens element having a different optical characteristic corresponding to a different angular region or field of view. For example, Pacala describes a lens array (lenses 150) in the receiver path, with each lens focusing received light from a different angular region onto the detector (Figs. 1-3, para 34, 43, 45). See also para 81-82, where lens tubes and aperture structures define a field of view by geometry, and para 91, where electrochromic or MEMS shutters 182 selectively permit or block light reaching the detector; switching which channel/shutter is active yields different optical characteristics (open/closed or channel-selected) and therefore different fields of view directed to the detector.); and
a lidar emitter comprising one or more emitter elements configured to emit optical signals (Fig. 1, para 34: “As shown in FIG. 1, a one-dimensional optical system 100 for collecting distance information within a field includes: a set of illumination sources 110 arranged along a first axis, each illumination source in the set of illumination sources 110 configured to output an illuminating beam…”) over first and second fields of illumination corresponding to the first and second fields of view provided by the at least one optical element, respectively (Para 47, 71-72: each illumination source emits a beam substantially coincident with the field of view of its corresponding aperture/channel, such that different channels provide different illumination fields aligned with different fields of view.).
Regarding claim 24, Pacala teaches the LIDAR system of claim 23, wherein the at least one optical element comprises first and second lens elements having the first and second optical characteristics, respectively (Para 34, 52-55: multiple lenses (a lens array / lens tubes) are aligned with apertures, each defining a different optical path and field of view; selection of different lens channels provides first and second lens elements with different optical characteristics).
Regarding claim 26, Pacala teaches the LIDAR system of claim 23, wherein a first angular resolution of the first field of view is less than a second angular resolution of the second field of view. Pacala discloses an aperture layer defining multiple apertures, each aperture defining a respective field of view (at least para 43: “the system includes an aperture layer 140 coincident the focal plane, defining a set of apertures 144 in a line array parallel to the axes of the illumination sources, and defining a stop region 146 around the set of apertures 144, wherein each aperture in the set of apertures 144 defines a field of view in the field coincident a discrete spot output by a corresponding illumination source in the set of illumination sources 110”; and para 46: “the aperture layer 140 therefore defines a column of apertures that define multiple discrete, non-overlapping fields of view of substantially infinite depth of field, as shown in FIG. 2”). Pacala further teaches that decreasing the aperture diameter results in a narrower field of view and a sharper angular response, while larger apertures define wider fields of view (para 45: “At increasingly smaller diameters …. an aperture defines a narrower field of view (i.e., a field of view of smaller diameter) and passes a sharper but lower-intensity (attenuated) signal….”). Accordingly, Pacala teaches first and second fields of view having different angular resolutions (different apertures define different fields of view; a narrower field of view corresponds to a higher angular resolution, and a wider field of view to a lower angular resolution), as recited in claim 26.
Regarding claim 28, Pacala teaches the LIDAR system of claim 23, wherein the LIDAR system is configured to be coupled to a vehicle such that the lidar emitter and lidar detector are oriented relative to an intended direction of travel of the vehicle (Pacala (para 15) describes embodiments directed to autonomous vehicles, obstacle detection, and navigation. Orientation of the emitter and detector relative to the vehicle's direction of travel is inherent in vehicle-mounted LiDAR; moreover, the recitation of coupling to a vehicle amounts to a statement of intended use.).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-4, 6, 8-9, 15-22 are rejected under 35 U.S.C. 103 as being unpatentable over Pacala in view of Han et al. (US 20180231640 A1, “Han”).
Regarding claim 1, Pacala teaches a Light Detection and Ranging (LIDAR) system (Para 15), comprising:
a lidar detector comprising one or more detector pixels configured to detect light incident thereon (Figs. 1-3, para 36 “a set of pixels 170 that detect incident photons….”; para 62 “The system includes a set of pixels 170…” para 64 “…. a first pixel 171 in the set of pixels 170 includes an array of single-photon avalanche diode detectors (hereinafter “SPADs”) …”); and
Pacala fails to explicitly teach, but Han teaches, at least one switchable optical element that is configured to direct the light to the lidar detector (Han discloses, with respect to the receiver path, a second MEMS mirror (622) that is an optical element configured to direct reflected light to a light detector. Specifically, referring to Fig. 6, Han explains that the RX unit includes a second MEMS mirror (622) and a light detector (621), and that light reflected from a target is received by the second MEMS mirror and reflected and redirected to the light detector (para 53-54). Thus, Han teaches a switchable optical element configured to direct light to a lidar detector.),
wherein the at least one switchable optical element is configured to be switched between first and second optical characteristics (Han further discloses that the second MEMS mirror (622) is actively controlled to swing or tilt in multiple directions. In particular, Han describes that a mirror controller controls the MEMS mirrors to swing or tilt in various directions, such as vertically, horizontally, or diagonally (Para 55). Different tilt orientations of the MEMS mirror correspond to different optical characteristics because each orientation changes the reflection geometry of the received light.)
that provide different first and second fields of view, respectively (Han teaches that switching the second MEMS mirror between different orientations results in the detector receiving reflected light from different angular directions, thereby providing different fields of view at the detector. As explained in Han, by rotating the MEMS mirrors, a single fixed light detector can obtain multiple angular resolutions of a target, and reflected light beams from different angular directions are captured and redirected to the detector (Para 55-56). Because the second MEMS mirror (622) is disposed in the receive path and controls which angular portions of reflected light are delivered to the detector, switching the mirror between different orientations changes the angular acceptance of the detector, i.e., the detector’s field of view.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Han’s switchable receive-path MEMS mirror into Pacala’s LiDAR system in order to selectively direct reflected light from different angular regions to the detector, because both references address LiDAR reception of reflected light and such a combination would predictably enable selectable receiver fields of view using a known optical switching technique.
Regarding claim 2, Pacala in view of Han teaches the LIDAR system of claim 1, wherein the at least one switchable optical element comprises first and second lens elements having the first and second optical characteristics, respectively. Pacala discloses that the receiver optics include multiple lens elements aligned with respective apertures, each lens element having a different optical characteristic corresponding to a different angular region or field of view. For example, Pacala describes a lens array (lenses 150) in the receiver path, with each lens focusing received light from a different angular region onto the detector (Figs. 1-3, para 34, 43, 45).
Pacala teaches that the LiDAR receiver includes multiple lens elements defining different optical characteristics, such as different acceptance angles and fields of view. In particular, Pacala discloses receiver optics including lens tubes and apertures, each lens tube being associated with a corresponding aperture that defines a distinct field of view and optical acceptance geometry (Para 43, 47, 81-82; Figs. 1–2). These structures constitute first and second lens elements, each having a different optical characteristic by virtue of their geometry and associated aperture size, which determines the field of view of light delivered to the detector.
Han teaches that the switchable optical element in the receiver path—specifically, the second MEMS mirror (622)—is actively controlled to redirect reflected light along different optical paths toward the detector (Para 53-55). When combined with Pacala’s multiple receiver lens elements, Han’s MEMS mirror functions to selectively direct reflected light through one or another of the lens elements, thereby switching between first and second optical characteristics corresponding to the different lens elements.
Accordingly, the combination of Pacala and Han teaches a LiDAR system in which the switchable optical element comprises first and second lens elements having different optical characteristics, as recited in claim 2.
Regarding claim 3, Pacala in view of Han teaches the LIDAR system of claim 2, wherein the at least one switchable optical element further comprises one or more reflective elements that are configured to be switched between first and second positions, wherein the first position directs the light from the first lens element to the lidar detector, and the second position directs the light from the second lens element to the lidar detector.
Pacala discloses a LiDAR receiver including multiple lens elements defining different receiver fields of view (See rejection of claim 2), with selection among the receiver optical paths performed using electrochromic shutters (Para 91). Han discloses movable reflective elements (e.g., MEMS mirrors) that are switchable between different positions to redirect received light along different optical paths toward receiver optics and a detector (See rejection of claim 1).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to employ a movable reflective element as a selector for Pacala’s multiple receiver lens paths, because both references address the same technical problem of selectively directing received LiDAR light from one of multiple receiver optical paths to a detector, and movable reflective elements were known to provide fast, precise, and repeatable optical path selection in compact optical systems.
A person of ordinary skill would have been motivated to use a movable reflective element in place of, or in addition to, Pacala’s electrochromic shutters to reduce optical loss, improve wavelength-independent performance, and enable precise angular redirection of received light, while still achieving the predictable result of directing light from a selected receiver lens element to the lidar detector.
Regarding claim 4, Pacala in view of Han teaches the LIDAR system of claim 2, wherein the at least one switchable optical element further comprises one or more electrochromic elements that are configured to be switched between first and second states, wherein the first state allows passage of the light from the first or second lens element to the lidar detector, and the second state blocks passage of the light from the first or second lens element to the lidar detector (Pacala, para 97: electrochromic shutter 182 allows light to pass in the first state and blocks light in the second state).
Regarding claim 6, Pacala in view of Han teaches the LIDAR system of claim 1, wherein a first angular resolution of the first field of view is less than a second angular resolution of the second field of view. Pacala discloses an aperture layer defining multiple apertures, each aperture defining a respective field of view (at least para 43: “the system includes an aperture layer 140 coincident the focal plane, defining a set of apertures 144 in a line array parallel to the axes of the illumination sources, and defining a stop region 146 around the set of apertures 144, wherein each aperture in the set of apertures 144 defines a field of view in the field coincident a discrete spot output by a corresponding illumination source in the set of illumination sources 110”; and para 46: “the aperture layer 140 therefore defines a column of apertures that define multiple discrete, non-overlapping fields of view of substantially infinite depth of field, as shown in FIG. 2”). Pacala further teaches that decreasing the aperture diameter results in a narrower field of view and a sharper angular response, while larger apertures define wider fields of view (para 45: “At increasingly smaller diameters …. an aperture defines a narrower field of view (i.e., a field of view of smaller diameter) and passes a sharper but lower-intensity (attenuated) signal….”). Accordingly, Pacala teaches first and second fields of view having different angular resolutions (different apertures define different fields of view; a narrower field of view corresponds to a higher angular resolution, and a wider field of view to a lower angular resolution), as recited in claim 6.
Regarding claim 8, Pacala in view of Han teaches the LIDAR system of claim 1, further comprising: at least one control circuit that is configured to switch the at least one switchable optical element between the first and second optical characteristics in first and second imaging modes, respectively.
Pacala discloses a LiDAR system including receiver optics and a detector for detecting reflected light, providing the LiDAR system context in which different receiver configurations may be employed (at least Figs. 1-3, para 34, 43, 45. See also the rejection of claim 2). Han discloses a LiDAR system including a control circuit configured to control a switchable optical element in the receiver path. Specifically, referring to Fig. 6, Han teaches a controller (603) including a mirror controller (632) configured to control a receive-side MEMS mirror (622) to switch between different orientations (Para 53-55). As explained in Han, switching the receive-side MEMS mirror between different orientations causes the receiver to collect reflected light from different angular directions, thereby providing different optical characteristics corresponding to different receiver configurations (Para 55-56). Accordingly, Han teaches a control circuit configured to switch the switchable optical element between different optical characteristics in different operating states. It would have been obvious to one of ordinary skill in the art to employ Han’s control circuit to operate the switchable optical element of the LiDAR system in Pacala so as to switch the receiver between different optical characteristics in different imaging modes, because both references address controlled operation of LiDAR receiver optics and such integration represents the predictable use of known control techniques.
Regarding claim 9, Pacala in view of Han teaches the LIDAR system of claim 8, further comprising: a lidar emitter comprising one or more emitter elements configured to be switched to emit optical signals over first and second fields of illumination corresponding to the first and second fields of view, respectively.
Pacala discloses a LiDAR system including a lidar emitter having multiple emitter elements (“a set of illumination sources 110”), each configured to output an illuminating beam (Para 34, 70). Pacala further teaches that the illumination sources are configured to be switched, such that the system actuates a first illumination source during a first sampling period and then shuts down that source and actuates a second illumination source during a subsequent sampling period (Para 89). Pacala also teaches that each illuminating beam corresponds to a respective receiver field of view, because each aperture defines a field of view that is coincident with a discrete spot output by a corresponding illumination source, and each illuminating beam is substantially the same size and geometry as the field of view of the corresponding sense channel (Pacala, para 34, 71). Accordingly, Pacala teaches a lidar emitter configured to be switched to emit optical signals over first and second fields of illumination corresponding to first and second fields of view, as recited in claim 9.
Regarding claim 15, Pacala in view of Han teaches the LIDAR system of claim 8, wherein the at least one control circuit is configured to operate the at least one switchable optical element to provide the first and second fields of view for respective strobe windows between pulses of the optical signals that provide the first and second fields of illumination, respectively, wherein the respective strobe windows correspond to respective acquisition subframes of the lidar detector, wherein each acquisition subframe comprises data collected for a respective distance sub-range of a distance range, and wherein an image frame comprises the respective acquisition subframes for each of the distance sub-ranges of the distance range.
Pacala discloses a pulsed LiDAR system including a control circuit configured to selectively operate switchable optical elements, such as electrochromic or MEMS shutters (182), to control which optical channels are active during operation (Para 91). Pacala further teaches that each channel defines a distinct field of view and corresponding field of illumination via apertures and aligned illumination sources (Para 43, 49, 70–72), and that the system operates using pulsed illumination and time-of-flight detection in which pixels collect photon data during defined sampling periods between emitted pulses (Para 36, 62). Such sampling periods inherently correspond to strobe windows in which distance information is collected for respective distance sub-ranges of a total distance range, as time-of-flight directly maps detection time to distance. Pacala further teaches forming images from time-resolved distance data collected across multiple sampling periods (Para 36, 90). Han teaches a LiDAR system in which a controller synchronizes transmitter operation, optical switching, and detector acquisition such that the detector collects reflected light during defined detection intervals corresponding to different optical configurations (Para 54-56). It would have been obvious to one of ordinary skill in the art to incorporate Han’s synchronized detector acquisition control into the Pacala system in order to ensure that detector acquisition occurs during appropriate strobe windows corresponding to different optical fields of view, thereby improving timing coordination and distance measurement accuracy, resulting in the subject matter of claim 15.
Claims 16-17 and 19-22 are method claims corresponding to system claims 1-2, 6-7, and 9-10, respectively. They are rejected for the same reasons set forth above with respect to the corresponding system claims.
Regarding claim 18, Pacala in view of Han teaches the method of claim 17, wherein operating the at least one switchable optical element further comprises:
operating one or more reflective elements to switch between first and second positions that direct the light from the first and second lens elements to the lidar detector, respectively (See rejection of claim 3);
operating one or more electrochromic elements to switch between first and second states that allow and block passage of the light, respectively, from the first or second lens element to the lidar detector (See rejection of claim 4); or
operating an actuatable lens element to switch between first and second discrete focal lengths to provide the first and second optical characteristics, respectively.
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Pacala in view of Han and Ferreira et al. (US 2020/0284883 A1, “Ferreira”).
Regarding claim 5, Pacala in view of Han fails to explicitly teach the LIDAR system of claim 1, wherein the at least one switchable optical element comprises an actuatable lens element that is configured to be switched between first and second discrete focal lengths to provide the first and second optical characteristics, respectively.
However, Ferreira discloses that, in the receiver path of a LiDAR system, adaptable or controllable optical components such as lenses (including movable lenses and liquid lenses) may be used to dynamically adjust optical characteristics of the receiver, including the angle of view, by adjusting the focal length of the lens (Para 1365-1366). Ferreira further teaches optics arrangements including controllable optical components whose optical properties, such as focal length, are adjusted under control of a controller to provide different optical characteristics mapped onto a sensor (Para 1407-1408; Figs. 101A–101B). These disclosures reasonably correspond to an actuatable lens element configured to be switched between different focal-length operating states, thereby providing different optical characteristics.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Ferreira’s actuatable lens element with controllable focal length into the receiver optical path of Pacala in order to provide selectable optical characteristics or fields of view, because both references address adjusting receiver optics to adapt sensing performance, and using a controllable lens focal length is a known, predictable technique for modifying receiver field of view and optical characteristics. The combination merely substitutes one known receiver-side optical adjustment mechanism (controllable lens focal length) for fixed optics in Pacala to achieve the predictable result of selectable receiver optical characteristics.
Claims 7, 10-13 are rejected under 35 U.S.C. 103 as being unpatentable over Pacala in view of Han and Arlen E. Breiholz (US 8072581 B1, “Breiholz”).
Regarding claim 7, Pacala in view of Han fails to explicitly teach the LIDAR system of claim 6, wherein the at least one switchable optical element is configured to direct the light to first and second subsets of the detector pixels to provide the first and second angular resolutions, respectively, wherein the first subset is smaller than the second subset.
However, Breiholz discloses a LiDAR system including a switchable optical element in the form of zoom optics that are selectively operated between different zoom positions (see, e.g., Breiholz, col 6: lines 1-6). Breiholz teaches that zooming the transmitted light pulses results in the received input light signals illuminating a relatively reduced area of the receiver focal plane array, such that in a narrow zoom position only a small portion of the detector pixels is illuminated, whereas in a wide zoom position a larger portion of the detector pixels is illuminated (see, e.g., Breiholz, col 6: lines 30-51; Figs. 3–4. See also claim 1). Breiholz further explains that illuminating a reduced area of the focal plane array provides a flash LIDAR image of relatively reduced resolution (see, e.g., Breiholz, col 6: lines 64-67. See also claim 1). Because, in an imaging LiDAR system, the angular resolution of the received scene is determined by the spatial sampling of the received light on the detector array, directing received light to a smaller subset of detector pixels provides a first angular resolution, and directing received light to a larger subset of detector pixels provides a second angular resolution. Accordingly, Breiholz teaches directing light to first and second subsets of detector pixels, wherein the first subset is smaller than the second subset, to provide different angular resolutions, as recited in claim 7.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the switchable zoom optics of Breiholz into the LiDAR receiver of Pacala in order to selectively illuminate different-sized subsets of detector pixels for different operating modes, because both references address controlling receiver resolution characteristics and such a combination would represent the predictable use of known optical techniques to vary detector illumination area and thereby provide different angular resolution modes. This combination would have yielded the predictable result of enabling a LiDAR system to trade off angular resolution and performance by directing received light to smaller or larger portions of the detector array.
Regarding claim 10, Pacala in view of Han teaches the LIDAR system of claim 9, wherein the at least one control circuit is configured to activate a first subset of the emitter elements to provide the first field of illumination in the first imaging mode, and to activate a second subset of the emitter elements to provide the second field of illumination in the second imaging mode (Pacala teaches activating different subsets of emitter elements in different operating periods by actuating a first illumination source (a first subset of the emitter elements) during a first sampling period and then actuating a second illumination source (a second subset of the emitter elements) during a subsequent sampling period (Para 89), thereby providing different fields of illumination in different imaging modes.).
Pacala in view of Han fails to explicitly teach wherein the first imaging mode is configured to image a farther distance range than the second imaging mode.
However, Breiholz teaches operating a LiDAR system in different illumination modes in which a narrow illumination mode illuminates a reduced area of the frame and provides enhanced (farther) range, while a wide illumination mode illuminates a larger area but provides a shorter range (col 6: lines 30-51; Figs. 3–4. See also claim 1). It would have been obvious to one of ordinary skill in the art to configure Pacala’s selective activation of emitter subsets to implement Breiholz’s wide and narrow illumination modes, such that a first imaging mode is configured to image a farther distance range than a second imaging mode, because both references address controlling transmitted illumination patterns and it was a known and predictable design choice to trade illumination coverage for increased ranging distance in LiDAR systems.
Regarding claim 11, Pacala in view of Han and Breiholz teaches the LIDAR system of claim 10, wherein the first subset comprises centrally-arranged ones of the emitter elements, and wherein the second subset comprises peripherally-arranged ones of the emitter elements (Pacala discloses a LiDAR system having multiple illumination sources spatially arranged to correspond to different receiver channels and apertures. As shown in FIG. 1 and described in para 34, a one-dimensional optical system 100 … includes a set of illumination sources 110 …, each aperture … defining a field of view … coincident [with] a discrete spot output by a corresponding illumination source. This teaches that the illumination sources are spatially arranged relative to the optical axis and the corresponding apertures/channels. In such an arrangement, illumination sources aligned with central apertures are centrally arranged, while illumination sources aligned with outer apertures are peripherally arranged under the broadest reasonable interpretation.).
Regarding claim 12, Pacala in view of Han and Breiholz teaches the LIDAR system of claim 11, wherein a first population density of the first subset of the emitter elements is greater than a second population density of the second subset of the emitter elements.
As discussed above, Pacala teaches a LIDAR system in which multiple illumination sources are arranged in an array and subsets of those emitters may be selectively activated to illuminate a reduced spatial region of the field of view (Para 71-73). Pacala further teaches configuring illumination geometry such that emitted beams are substantially aligned with and overlap a desired sensing field of view, while illumination of regions outside that field of view is reduced, thereby improving signal efficiency and reducing noise (Para 36: “The system therefore selectively projects illuminating beams into a field ahead of the system according to an illumination pattern that substantially matches—in size and geometry across a range of distances—the fields of view of the apertures.”; Para 47: “…light output by the first illumination source 111—paired with the first aperture 141—and projected into the field of view of the first aperture 141 can remain substantially outside the fields of view of other apertures…”; Para 49: “…a signal passed into the first lens 151… can exhibit a relatively high ratio of light rays originating from the first illumination source 111 to light rays originating from other illumination sources in the system.”).
Breiholz explicitly teaches that, in a narrowed illumination mode used to extend operational range, emitted light is concentrated into a smaller portion of the receiver frame, resulting in illumination of fewer pixels but increased range capability (Fig. 4, col 6: lines 42-51: “illuminating only a small portion of the receiver focal plane… the range is extended”). Concentrating illumination into a smaller spatial region inherently corresponds to a higher population density of active emitters within that region compared to surrounding regions.
It would have been obvious to one of ordinary skill in the art to configure the centrally-arranged emitter subset of Pacala to have a greater population density than peripheral emitters, as taught by Breiholz, in order to concentrate optical power and extend sensing range. Therefore, claim 12 is unpatentable.
Regarding claim 13, Pacala in view of Han and Breiholz teach the LIDAR system of claim 10, wherein the lidar emitter further comprises:
at least one first patterned surface aligned with the first subset of the emitter elements, wherein the first patterned surface is configured to alter the propagation of the optical signals from the first subset of the emitter elements to provide the first field of illumination; and at least one second patterned surface aligned with the second subset of the emitter elements, wherein the second patterned surface is configured to alter the propagation of the optical signals from the second subset of the emitter elements to provide the second field of illumination.
Pacala discloses a LIDAR system including a lidar emitter having multiple emitter elements (e.g., VCSELs or lasers) optically coupled to transmitter optics that shape and direct emitted optical signals to define a field of illumination (Para 15, 34, 70-74). Pacala further teaches that each emitter is aligned with optical structures, including patterned optical elements such as aperture layers and bulk transmitter optics, that alter the propagation of emitted optical signals to provide a defined illumination field corresponding to a sensing channel (Para 47, 71-72).
Breiholz discloses configuring a LIDAR system to selectively provide different fields of illumination by altering the propagation of emitted light using different optical configurations, including a wide illumination field and a narrow illumination field, in order to trade spatial coverage for extended range (Figs. 3–4, col 6: lines 30-51. See also, claim 1). These configurations correspond to distinct optical surfaces or optical arrangements aligned with different subsets of emitted light.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to configure the lidar emitter of Pacala to include a first patterned optical surface aligned with a first subset of emitter elements to provide a first field of illumination, and a second patterned optical surface aligned with a second subset of emitter elements to provide a second field of illumination, as taught by Breiholz, in order to selectively illuminate different regions or ranges while maintaining efficient use of optical power and improving operational flexibility of the LIDAR system.
Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Pacala in view of Han, Breiholz and Burroughs et al. (US 20180301872 A1, “Burroughs”).
Regarding claim 14, Pacala in view of Han and Breiholz teach the LIDAR system of claim 10, wherein the lidar emitter further comprises: a substrate having the one or more emitter elements on a surface thereof, wherein the surface of the substrate is (non-planar and) configured to orient the first and second subsets of the emitter elements to illuminate the first and second fields of illumination, respectively (Pacala discloses a lidar emitter comprising one or more emitter elements fabricated or mounted on a substrate as part of an integrated optical system (Para 15, 71). Pacala further teaches that emitter elements are optically aligned with transmitter optics to illuminate defined fields of illumination corresponding to different sensing channels (Para 47, 71).).
Pacala fails to explicitly teach that the surface of the substrate is non-planar. However, Burroughs teaches that a plurality of emitter elements (laser diodes) may be provided on a surface of a non-native substrate, and that the surface of the substrate may be non-planar (curved) (Para 16), such that emitter elements mounted on a central portion of the curved substrate face a forward direction while emitter elements mounted on peripheral portions face oblique directions, thereby orienting different subsets of emitter elements to emit light in different directions and illuminate different regions (Para 57, Fig. 3B). Thus, Burroughs teaches “a substrate having the one or more emitter elements on a surface thereof, wherein the surface of the substrate is non-planar and configured to orient the first and second subsets of the emitter elements to illuminate the first and second fields of illumination, respectively,” as recited in claim 14. It would have been obvious to one of ordinary skill in the art to implement Pacala’s first and second subsets of emitter elements on Burroughs’ non-planar (curved) substrate surface because doing so provides a compact, solid-state, and predictable way to orient different subsets of emitters to emit in different directions (e.g., forward versus oblique) to realize the different illumination fields taught by Pacala, thereby reducing reliance on complex beam-steering/optical components while maintaining desired coverage and performance.
Claim 25 is rejected under 35 U.S.C. 103 as being unpatentable over Pacala in view of Ferreira.
Regarding claim 25, Pacala teaches the LIDAR system of claim 24, wherein the at least one optical element comprises: one or more electrochromic elements configured to be switched between first and second states that allow and block passage of the light, respectively, from the first or second lens element to the lidar detector (Pacala discloses a LIDAR system including control circuitry configured to selectively operate optical elements in different operating modes, such as selectively opening and closing electrochromic or MEMS shutters (182) in coordination with illumination and detection timing to control which optical channels are active during operation (para 91).).
Pacala fails to explicitly teach one or more reflective elements configured to be switched between first and second positions that direct the light from the first and second lens elements to the lidar detector, respectively; or
an actuatable lens element configured to be switched between first and second discrete focal lengths to provide the first and second optical characteristics, respectively.
However, Ferreira discloses that, in the receiver path of a LiDAR system, adaptable or controllable optical components such as lenses (including movable lenses and liquid lenses) may be used to dynamically adjust optical characteristics of the receiver, including the angle of view, by adjusting the focal length of the lens (Para 1365-1366). Ferreira further teaches optics arrangements including controllable optical components whose optical properties, such as focal length, are adjusted under control of a controller to provide different optical characteristics mapped onto a sensor (Para 1407-1408; Figs. 101A–101B). These disclosures reasonably correspond to an actuatable lens element configured to be switched between different focal-length operating states, thereby providing different optical characteristics.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Ferreira’s actuatable lens element with controllable focal length into the receiver optical path of Pacala in order to provide selectable optical characteristics or fields of view, because both references address adjusting receiver optics to adapt sensing performance, and using a controllable lens focal length is a known, predictable technique for modifying receiver field of view and optical characteristics. The combination merely substitutes one known receiver-side optical adjustment mechanism (controllable lens focal length) for fixed optics in Pacala to achieve the predictable result of selectable receiver optical characteristics.
Claim 27 is rejected under 35 U.S.C. 103 as being unpatentable over Pacala in view of Breiholz.
Regarding claim 27, Pacala teaches the LIDAR system of claim 26, further comprising:
at least one control circuit that is configured to switch the at least one optical element between the first and second optical characteristics in first and second imaging modes, respectively (Pacala discloses a LIDAR system including control circuitry configured to selectively operate optical elements in different operating modes, such as selectively opening and closing electrochromic or MEMS shutters (182) in coordination with illumination and detection timing to control which optical channels are active during operation (para 91).).
Pacala fails to explicitly teach wherein the first imaging mode is configured to image a farther distance range than the second imaging mode.
However, Breiholz teaches that a LIDAR system may be operated in distinct imaging modes having different optical characteristics, wherein a first imaging mode configured with a narrower angular coverage (e.g., a “narrow” or zoomed configuration) images a farther distance range than a second imaging mode configured with a wider angular coverage, which trades spatial resolution for reduced maximum range (col 6: lines 30-51; Figs. 3–4. See also, claim 1). It would have been obvious to one of ordinary skill in the art to apply Breiholz’s distance-optimized imaging mode selection to Pacala’s controlled optical architecture in order to enable selectable imaging modes in which one mode images farther distance ranges than another, thereby improving system flexibility and performance depending on operational needs.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Pei et al. (US 20170307759 A1) teaches multi-range three-dimensional imaging systems.
Send et al. (US 20180210064 A1) teaches a detector for optically detecting at least one object.
Pacala et al. (US 20190011562 A1) teaches an electronically scanned light ranging device with multiplexed photosensors.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEMPSON NOEL whose telephone number is (571) 272-3376. The examiner can normally be reached on Monday-Friday 8:00-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Yuqing Xiao can be reached on (571) 270-3603. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JEMPSON NOEL/Examiner, Art Unit 3645
/YUQING XIAO/Supervisory Patent Examiner, Art Unit 3645