DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on 05/12/2022, 02/21/2024, 05/27/2024, 01/24/2025, 04/09/2025, and 07/31/2025 are being considered by the examiner.
Response to Arguments
The examiner acknowledges the amendments related to the 112(b) rejections of Claims 7, 8, 9, 15, 22-24, and 31-34, and withdraws the rejections.
Applicant's arguments filed 12/21/2025 have been fully considered.
Applicant’s arguments regarding the applicability of Bikumandla in view of Finkelstein to amended Claims 1, 24, 33, and 34 are not persuasive. The examiner notes that the embodiments in Bikumandla show a coupling junction (326, 526) between the first and second substrate layers. In [0028], Bikumandla notes that this coupling junction need not utilize any additional material. In such cases, the coupling junction is simply substrate-to-substrate contact. Given that the semiconductor substrate layers are necessarily electrically conductive during operation, under the broadest reasonable interpretation of the limitations, the further interconnect layer is in electrical contact with the first and second photo diode layers in this case.
Applicant’s arguments, with respect to the rejection of Claim 6 under 35 U.S.C. 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Abramovich (US 6,221,687 B1).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1, 2, 5, 7-10, 16, 17, 20-22, 24-27, and 33 are rejected under 35 U.S.C. 103 as being unpatentable over Bikumandla (US 2013/0234029 A1) in view of Finkelstein (US 2020/0057151 A1).
Regarding Claim 1, Bikumandla discloses an optical component (Fig. 5, [0045]) comprising:
a first photo diode implementing a TOF sensor pixel in a first semiconductor structure ([0047]: “BSI image sensor 531 includes TOF photodetector array 523 disposed in second semiconductor substrate 525”) and configured to absorb received light in a first wavelength region ([0047]: “TOF photodetector array 523 is operable to detect longer wavelength light 310L.”);
a second photo diode implementing a camera sensor pixel in a second semiconductor structure over the first semiconductor structure ([0047]: “BSI image sensor 531 is coupled under FSI image sensor 530 and coupling junction 526.”) and configured to absorb received light in a second wavelength region ([0046]: “A visible light photodetector array 522 is disposed in a frontside portion of first semiconductor substrate 524.”);
an interconnect layer comprising an electrically conductive structure configured to electrically contact the second photo diode ([0021]: “The interconnect portion 107 is used to convey signals to and from the pixel circuitry 103.”; [0046]: “FSI image sensor 530 also includes first semiconductor substrate 524, which is optically coupled to receive the light from the first interconnect portion 534”);
wherein the received light of the second wavelength region has a shorter wavelength than the received light of the first wavelength region ([0046]: “Visible light photodetector array 522 is operable to detect shorter wavelength light 310S (e.g. visible light).”); and
a further interconnect layer comprising an electrically conductive structure ([0021], [0045], the interconnects are electrically conductive) configured to electrically contact the second vertical photo diode and/or the first vertical photo diode ([0047]: “BSI image sensor 531 also includes a second interconnect portion 539 coupled underneath TOF photodetector array 523.”; [0028]: “In some examples, an additional material (e.g., an adhesive) may be included between the substrates in order to hold them together, whereas in other examples no such additional material may be used.” In the case where no additional materials are used in the coupling junction, the first and further interconnect layers would be in direct physical and electrical contact).
Bikumandla teaches that the optical component is integrated into a TOF device, but does not explicitly teach that the device is a LiDAR Sensor System. Finkelstein teaches a lidar sensor system (Abstract).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the TOF sensor system of Bikumandla with the teaching of Finkelstein to specifically use the LiDAR sensor modality. Finkelstein describes a semiconductor device in the same field of endeavor as Bikumandla and the instant application, in which an RGB camera and LiDAR sensor are monolithically integrated into a LiDAR sensor system. As noted in Finkelstein, TOF systems are a somewhat broader category of measurements than LiDAR. A worker skilled in the art would be aware of the differences between the more general TOF category and LiDAR systems, and incorporating a sensor into a LiDAR system instead of a TOF system would have predictable results.
Regarding Claim 2, which depends from rejected Claim 1, Bikumandla further discloses wherein the second photo diode is vertically stacked over the first photo diode (Figure 5 shows the visible light photodetector directly over the TOF photodetector).
Regarding Claim 5, which depends from rejected Claim 1, Bikumandla further discloses a microlens over the second semiconductor structure that laterally substantially covers the first vertical photo diode and/or the second vertical photo diode ([0048]: “As shown, light (e.g. reflected light 409) is directed along optical path 511 through a microlens of microlens array 532”; Figure 5 shows that the microlens completely covers the photodiodes in a lateral direction).
Regarding Claim 7, which depends from rejected Claim 1, Bikumandla further discloses wherein the received light of the second wavelength region has a wavelength in the range from 380 nm to 780 nm (Figure 8 showing the quantum efficiency of the detectors shows that they receive light in a wavelength range beginning at 350 nm and extending to at least 800 nm).
Bikumandla does not teach that the received light of the first wavelength region has a wavelength in the range from 800 nm to 1800 nm, only that light is passed to the detectors up to about 1100 nm (Figure 8).
Finkelstein does teach the received light of the first wavelength region has a wavelength in the range from 800 nm to 1800 nm ([0065]: “The NIR wavelength range may include wavelengths of about 780 nm to about 2500 nm.”; This is inclusive of the range limitation in the instant application.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the first wavelength region of Bikumandla with the teaching of Finkelstein to extend the wavelength region to a longer wavelength. It is well-known in the art that common LiDAR wavelengths go up to about 1550 nm, so using optical components that are inclusive of that wavelength could provide better interoperability with a variety of emitter systems.
Regarding Claim 8, which depends from rejected Claim 1, Bikumandla does not teach that the received light of the first wavelength region has a wavelength in the range from 800 nm to 1800 nm, only that light is passed to the detectors up to about 1100 nm (Figure 8).
Finkelstein does teach the received light of the first wavelength region has a wavelength in the range from 800 nm to 1800 nm ([0065]: “The NIR wavelength range may include wavelengths of about 780 nm to about 2500 nm.”; This is inclusive of the range limitation in the instant application.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the first wavelength region of Bikumandla with the teaching of Finkelstein to extend the wavelength region to a longer wavelength. It is well-known in the art that common LiDAR wavelengths go up to about 1550 nm, so using optical components that are inclusive of that wavelength could provide better interoperability with a variety of emitter systems.
Regarding Claim 9, which depends from rejected Claim 1, Bikumandla further discloses wherein the received light of the second wavelength region has a shorter wavelength than received light of the first wavelength region by at least 50 nm, by at least 100 nm (Figure 8 shows that the first (infrared) wavelength region extends from about 800 nm to 1100 nm, and the second (visible) wavelength region extends from about 380 nm to 800 nm. Thus, the second wavelength region contains a wavelength (e.g., 400 nm) that is more than 100 nm shorter than a wavelength (e.g., 800 nm) from the first wavelength region).
Regarding Claim 10, which depends from rejected Claim 1, Bikumandla further discloses wherein the received light of the second wavelength region has a wavelength in the visible spectrum wavelength region ([0046]: “A visible light photodetector array 522 is disposed in a frontside portion of first semiconductor substrate 524.”).
Regarding Claim 16, which depends from rejected Claim 1, Bikumandla further discloses wherein the first photo diode is a pin photo diode; and wherein the second photo diode is a pin photo diode ([0030]: “Representative examples of suitable photodiodes include, but are not limited to, P-N photodiodes, PIN photodiodes, and avalanche photodiodes. In some examples, P-N photodiodes and other types of photodiodes used in CMOS APS are used”).
Regarding Claim 17, which depends from rejected Claim 1, Bikumandla further discloses wherein the first photo diode is an avalanche photo diode; and wherein the second photo diode is a pin photo diode ([0030]: “Representative examples of suitable photodiodes include, but are not limited to, P-N photodiodes, PIN photodiodes, and avalanche photodiodes. In some examples, P-N photodiodes and other types of photodiodes used in CMOS APS are used”).
Regarding Claim 20, which depends from rejected Claim 1, Bikumandla further discloses wherein the first photo diode is an avalanche photo diode; and wherein the second photo diode is an avalanche photo diode ([0030]: “Representative examples of suitable photodiodes include, but are not limited to, P-N photodiodes, PIN photodiodes, and avalanche photodiodes. In some examples, P-N photodiodes and other types of photodiodes used in CMOS APS are used”).
Regarding Claim 21, which depends from rejected Claim 2, Bikumandla further discloses an array of a plurality of photo diode stacks, each photo diode stack comprising a second photo diode vertically stacked over a first photo diode ( [0029] states that the first and second photodiodes are formed into array and that one is over the other; Figure 5 shows the visible light photodetector directly over the TOF photodetector).
Regarding Claim 22, which depends from rejected Claim 21, Bikumandla further discloses wherein at least one photo diode stack of the plurality of photo diode stacks comprises at least one further second photo diode in the second semiconductor structure adjacent to the second photo diode;
wherein the first photo diode of the at least one photo diode stack of the plurality of photo diode stacks has a larger lateral extension than the second photo diode and the at least one further second photo diode of the at least one photo diode stack so that the second photo diode and the at least one further second photo diode are arranged laterally within the lateral extension of the first vertical photo diode ([0057]: “In one example, TOF photodetector array 523 is a lower resolution than visible light photodetector array 522 and a single TOF photodetector receives charge carriers from reflected light 409 that travels through sixteen different color filters and sixteen different photodetectors included in visible light photodetector array 522.”; This example describes a setup in which there are sixteen visible light photodetectors arranged laterally within the lateral extension of the TOF/infrared photodetector, which satisfies the ‘at least two’ limitation).
Regarding Claim 24, Bikumandla discloses a sensor for a TOF system, the sensor comprising:
a plurality of optical components (Figure 5 discloses an array of stacked microlenses, visible light photodetectors, and TOF photodetectors, [0029], [0045]), each optical component of the plurality of optical components comprising:
a first photo diode implementing a TOF sensor pixel in a first semiconductor structure ([0047]: “BSI image sensor 531 includes TOF photodetector array 523 disposed in second semiconductor substrate 525”) and configured to absorb received light in a first wavelength region ([0047]: “TOF photodetector array 523 is operable to detect longer wavelength light 310L.”);
a second photo diode implementing a camera sensor pixel in a second semiconductor structure over the first semiconductor structure ([0047]: “BSI image sensor 531 is coupled under FSI image sensor 530 and coupling junction 526.”) and configured to absorb received light in a second wavelength region ([0046]: “A visible light photodetector array 522 is disposed in a frontside portion of first semiconductor substrate 524.”);
an interconnect layer comprising an electrically conductive structure configured to electrically contact the second photo diode ([0021]: “The interconnect portion 107 is used to convey signals to and from the pixel circuitry 103.”; [0046]: “FSI image sensor 530 also includes first semiconductor substrate 524, which is optically coupled to receive the light from the first interconnect portion 534”);
wherein the received light of the second wavelength region has a shorter wavelength than the received light of the first wavelength region ([0046]: “Visible light photodetector array 522 is operable to detect shorter wavelength light 310S (e.g. visible light).”); and
a further interconnect layer comprising an electrically conductive structure ([0021], [0045], the interconnects are electrically conductive) configured to electrically contact the second vertical photo diode and/or the first vertical photo diode ([0047]: “BSI image sensor 531 also includes a second interconnect portion 539 coupled underneath TOF photodetector array 523.”; [0028]: “In some examples, an additional material (e.g., an adhesive) may be included between the substrates in order to hold them together, whereas in other examples no such additional material may be used.” In the case where no additional materials are used in the coupling junction, the first and further interconnect layers would be in direct physical and electrical contact).
wherein the plurality of optical components are monolithically integrated on the carrier as a common carrier (Figure 5, [0045]: “For simplicity, two pixels are shown, although typically the image sensor may include a two dimensional array with a multitude of pixels.”; [0047]: “BSI image sensor 531 is coupled under FSI image sensor 530 and coupling junction 526.”, The optical components are therefore coupled into a monolithic device).
Bikumandla teaches that the optical component is integrated into a TOF device, but does not explicitly teach that the device is a LiDAR Sensor System. Finkelstein teaches a lidar sensor system (Abstract).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the TOF sensor system of Bikumandla with the teaching of Finkelstein to specifically use the LiDAR sensor modality. Finkelstein describes a semiconductor device in the same field of endeavor as Bikumandla and the instant application, in which an RGB camera and LiDAR sensor are monolithically integrated into a LiDAR sensor system. As noted in Finkelstein, TOF systems are a somewhat broader category of measurements than LiDAR. A worker skilled in the art would be aware of the differences between the more general TOF category and LiDAR systems, and incorporating a sensor into a LiDAR system instead of a TOF system would have predictable results.
Regarding Claim 25, which depends from rejected Claim 24, Bikumandla further discloses wherein the sensor is configured as a front-side illuminated sensor ([0050]: “visible light photodetector array 522 is disposed in FSI image sensor 530 and TOF photodetector array 523 is disposed in BSI image sensor in the illustrated embodiment”).
Regarding Claim 26, which depends from rejected Claim 24, Bikumandla further discloses wherein the sensor is configured as a back-side illuminated sensor ([0050]: “In one example, visible light photodetector array 522 could be disposed in a BSI image sensor and TOF photodetector array 523 could be disposed in a FSI image sensor, if the semiconductor substrates of each image sensor (BSI and FSI) were appropriately thinned.”).
Regarding Claim 27, which depends from rejected Claim 24, Bikumandla further discloses a color filter layer covering at least some optical components of the plurality of optical components (Figure 5, element 533; [0045]: “A color filter array 533 of FSI image sensor 530 is optically coupled to receive the focused light from microlens array 532 and operable to filter the light.”).
Regarding Claim 33, Bikumandla discloses a TOF Sensor System, comprising:
a sensor for a TOF Sensor System (Abstract), the sensor comprising:
a plurality of optical components (Figure 5 discloses an array of stacked microlenses, visible light photodetectors, and TOF photodetectors, [0029], [0045]), each optical component of the plurality of optical components comprising:
a first photo diode implementing a TOF sensor pixel in a first semiconductor structure ([0047]: “BSI image sensor 531 includes TOF photodetector array 523 disposed in second semiconductor substrate 525”) and configured to absorb received light in a first wavelength region ([0047]: “TOF photodetector array 523 is operable to detect longer wavelength light 310L.”);
a second photo diode implementing a camera sensor pixel in a second semiconductor structure over the first semiconductor structure ([0047]: “BSI image sensor 531 is coupled under FSI image sensor 530 and coupling junction 526.”) and configured to absorb received light in a second wavelength region ([0046]: “A visible light photodetector array 522 is disposed in a frontside portion of first semiconductor substrate 524.”);
an interconnect layer comprising an electrically conductive structure configured to electrically contact the second photo diode ([0021]: “The interconnect portion 107 is used to convey signals to and from the pixel circuitry 103.”; [0046]: “FSI image sensor 530 also includes first semiconductor substrate 524, which is optically coupled to receive the light from the first interconnect portion 534”);
a further interconnect layer comprising an electrically conductive structure ([0021], [0045], the interconnects are electrically conductive) configured to electrically contact the second vertical photo diode and/or the first vertical photo diode ([0047]: “BSI image sensor 531 also includes a second interconnect portion 539 coupled underneath TOF photodetector array 523.”; [0028]: “In some examples, an additional material (e.g., an adhesive) may be included between the substrates in order to hold them together, whereas in other examples no such additional material may be used.” In the case where no additional materials are used in the coupling junction, the first and further interconnect layers would be in direct physical and electrical contact).
wherein the received light of the second wavelength region has a shorter wavelength than the received light of the first wavelength region ([0046]: “Visible light photodetector array 522 is operable to detect shorter wavelength light 310S (e.g. visible light).”); and
wherein the plurality of optical components are monolithically integrated on the carrier as a common carrier (Figure 5, [0045]: “For simplicity, two pixels are shown, although typically the image sensor may include a two dimensional array with a multitude of pixels.”; [0047]: “BSI image sensor 531 is coupled under FSI image sensor 530 and coupling junction 526.”, The optical components are therefore coupled into a monolithic device); and
a sensor controller configured to control the sensor (Figure 9, element 921, ‘control circuitry’; [0058]).
Bikumandla teaches that the optical component is integrated into a TOF device, but does not explicitly teach that the device is a LiDAR Sensor System. Finkelstein teaches a lidar sensor system (Abstract).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the TOF sensor system of Bikumandla with the teaching of Finkelstein to specifically use the LiDAR sensor modality. Finkelstein describes a semiconductor device in the same field of endeavor as Bikumandla and the instant application, in which an RGB camera and LiDAR sensor are monolithically integrated into a LiDAR sensor system. As noted in Finkelstein, TOF systems are a somewhat broader category of measurements than LiDAR. A worker skilled in the art would be aware of the differences between the more general TOF category and LiDAR systems, and incorporating a sensor into a LiDAR system instead of a TOF system would have predictable results.
Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Bikumandla in view of Finkelstein as applied to Claim 1 above, and further in view of Liu (WO 2017/024121 A1).
Regarding Claim 3, which depends from rejected Claim 1, neither Bikumandla nor Finkelstein teaches, and Liu does teach, wherein the photodiodes are vertical photodiodes (Figures 5 and 6 show vertical photodiodes; [0064]-[0066], [0068], [0069], and [0074] describe the configuration in detail).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sensor of Bikumandla in view of Finkelstein with the teaching of Liu to incorporate vertical photodiodes. Kaminski et al. teach that “enhancement of carrier collection efficiency in photodiodes is a major challenge” (Page 1, Col 1, Para 1). Kaminski notes in Page 1, Col 2, Para 2 that one possibility to increase carrier collection is to use vertical P/N junctions, and that “the carrier collection probability in depletion areas is nearly 100%, so the vertical cells are expected to have higher efficiencies.” Higher collection probabilities would be beneficial to sensors as this would result in better overall sensitivity.
Claims 11-13, 18, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Bikumandla in view of Finkelstein as applied to Claim 1 above, and further in view of Meynants (US 2020/0194474 A1).
Regarding Claim 11, which depends from rejected Claim 1, Bikumandla in view of Finkelstein does not teach and Meynants does teach a mirror structure comprising a bottom mirror and a top mirror (Figure 11, elements R1 and R2 are ‘reflecting layers’”; [0046] describes the layers as mirrors or reflecting layers);
wherein the second semiconductor structure is arranged between the bottom mirror and the top mirror ([0046]: “The reflecting layers R1, R2 are arranged above the rear side RS as a top mirror or top reflecting layer R1 and above the active part of the image sensor, which is represented by a photodiode, as a bottom mirror or bottom reflecting layer R2.”; Figure 11, element CG; [0045]: “In the following description, the charge carrier generating component CG is represented by a photodiode, by way of example.”);
wherein the bottom mirror is arranged between the interconnect layer and the second semiconductor structure (Figure 11, an interconnect passes through M1, M2, and M3 (metallized layers, [0053]) to which bond pads are connected).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sensor of Bikumandla in view of Finkelstein with the teaching of Meynants to use a top and bottom mirror around the second semiconductor photodetector. This is equivalent to a resonant cavity enhanced sensor, which, as Meynants notes in [0011], “provides high quantum efficiency and a reduced depth of the region where charge carriers are generated and detected.”
Regarding Claim 12, which depends from rejected Claim 11, Bikumandla in view of Finkelstein does not teach and Meynants does teach wherein the mirror structure comprises a Bragg mirror structure ([0014]: “The mirrors can especially be dielectric mirrors (Bragg mirrors, for example) tuned for optimized reflectance at the target wavelength.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to follow the teaching of Meynants to use Bragg mirrors in the resonant cavity of Bikumandla in view of Finkelstein and further in view of Meynants. A skilled worker in the art would be familiar with Bragg mirrors and their properties, and would find the effects of such mirrors routine and predictable within the sensor of Claim 11.
Regarding Claim 13, which depends from rejected Claim 11, Bikumandla in view of Finkelstein does not teach and Meynants does teach wherein the mirror structure and the second vertical photo diode are configured so that the second vertical photo diode forms a resonant cavity photo diode ([0018]: “The top reflecting layer and the bottom reflecting layer are appropriate to form opposite boundaries of a resonant cavity, which may especially be tuned to a wavelength of infrared radiation.”; [0072]: “FIG. 11 shows another partial cross section of a front illuminated image sensor comprising a resonant cavity in a dedicated photoconversion layer”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sensor of Bikumandla in view of Finkelstein with the teaching of Meynants to use a top and bottom mirror around the second semiconductor photodetector, which is a resonant cavity photodiode. Meynants notes in [0011] that this arrangement “provides high quantum efficiency and a reduced depth of the region where charge carriers are generated and detected.”
Regarding Claim 18, which depends from rejected Claim 1, Bikumandla further discloses wherein the first photo diode is an avalanche photo diode ([0030]: “Representative examples of suitable photodiodes include, but are not limited to, P-N photodiodes, PIN photodiodes, and avalanche photodiodes. In some examples, P-N photodiodes and other types of photodiodes used in CMOS APS are used”).
Bikumandla in view of Finkelstein does not teach and Meynants does teach wherein the second photo diode is a resonant cavity photo diode ([0018]: “The top reflecting layer and the bottom reflecting layer are appropriate to form opposite boundaries of a resonant cavity, which may especially be tuned to a wavelength of infrared radiation.”; [0072]: “FIG. 11 shows another partial cross section of a front illuminated image sensor comprising a resonant cavity in a dedicated photoconversion layer”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sensor of Bikumandla in view of Finkelstein with the teaching of Meynants to use a resonant cavity enhanced photodiode. Bikumandla and Finkelstein both suggest that a variety of photodetector types can be used for the image sensor, and Meynants notes in [0011] that this arrangement “provides high quantum efficiency and a reduced depth of the region where charge carriers are generated and detected.”
Regarding Claim 19, which depends from rejected Claim 1, Bikumandla does not teach and Finkelstein does teach wherein the first photo diode is a single-photon avalanche photo diode ([0037]: “Single-photon avalanche diodes (SPADs) may be used to detect single photons in time-of-flight sensing applications.”);
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sensor of Bikumandla with the teaching of Finkelstein to incorporate a single photon avalanche photodiode. Bikumandla already suggests the use of an avalanche photodiode, and the single photon avalanche photodiode is well-known in the LiDAR arts as a means of detecting radiation reflected off a target object. Thus, a skilled worker would find the results of including a single photon avalanche photodiode predictable.
Bikumandla in view of Finkelstein does not teach and Meynants does teach wherein the second photo diode is a resonant cavity photo diode ([0018]: “The top reflecting layer and the bottom reflecting layer are appropriate to form opposite boundaries of a resonant cavity, which may especially be tuned to a wavelength of infrared radiation.”; [0072]: “FIG. 11 shows another partial cross section of a front illuminated image sensor comprising a resonant cavity in a dedicated photoconversion layer”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sensor of Bikumandla in view of Finkelstein with the teaching of Meynants to use a resonant cavity enhanced photodiode. Bikumandla and Finkelstein both suggest that a variety of photodetector types can be used for the image sensor, and Meynants notes in [0011] that this arrangement “provides high quantum efficiency and a reduced depth of the region where charge carriers are generated and detected.”
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Bikumandla in view of Finkelstein as applied to Claim 5, and in view of Abramovich (US 6,221,687 B1).
Regarding Claim 6, which depends from rejected Claim 5, Bikumandla further discloses a filter layer over the second semiconductor structure that laterally substantially covers the first vertical photo diode and/or the second vertical photo diode (Figures 5 and 6 show that the filter layer completely spans the width of the photo diodes in the first and second semiconductor layers) and is configured to transmit received light having a wavelength within the first wavelength region (Figure 8 shows the quantum efficiency of photo diodes as a function of wavelength with overlying filters, which shows that light is transmitted in the first wavelength region) and within the second wavelength region ([0053]: “Blue filter 611, green filters 613 and 715, and red filter 717 also pass at least a portion of near-infrared light 710NIR.”), and to block light outside the first wavelength region and outside the second wavelength region (Figure 8 shows that the filter layer blocks light below about 300 nm and above 1100 nm wavelength).
Bikumandla in view of Finkelstein does not teach and Abramovich does teach wherein the filter is over the microlens (Figure 4A; Column 5, Lines 48-50: “Lower CT layer 252, color filter layer 255 and upper CT layer 257 form a color filter structure (protective layer) 250 over microlens 245 that functions, in part, to protect microlens 245.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sensor of Bikumandla in view of Finkelstein with the teaching of Abramovich to have the color filter layer above the microlens layer. Abramovich notes that light entering at oblique angles can cause problems in such photodiode devices, and that this problem can be mitigated by moving the microlens closer to the photodiode (Column 2, Lines 13-16). Abramovich further notes that one way to move the microlens closer to the photodiode is to form the microlens under the color filter layer (Column 2, Line 40). Thus, the arrangement taught by Abramovich can result in better performance of color image sensors with microlens arrays.
Claims 14 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Bikumandla in view of Finkelstein as applied to Claim 1 above, and further in view of Chuang (US 2017/0054924 A1).
Regarding Claim 14, which depends from rejected Claim 1, Bikumandla in view of Finkelstein does not teach and Chuang does teach a reflector layer over the second semiconductor structure (Figure 3, element 116, “infrared filter layer”; [0040]: “In some embodiments, the infrared filter layer 116 includes a reflective infrared filter structure (not shown).”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sensor of Bikumandla in view of Finkelstein with the teaching of Chuang to include a reflector layer over the semiconductor structure. Chuang notes in [0040] that by including the reflector layer on the semiconductor, “an additional infrared filter element can be omitted.” Omitting a traditional infrared filter, which would likely be deposited on a glass substrate, may allow for the thickness of the imaging module to be decreased.
Regarding Claim 15, which depends from rejected Claim 14, Bikumandla in view of Finkelstein does not teach and Chuang does teach wherein the reflector layer is configured as an infrared reflector layer (Figure 3, element 116, “infrared filter layer”; [0040]: “In some embodiments, the infrared filter layer 116 includes a reflective infrared filter structure (not shown).”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sensor of Bikumandla in view of Finkelstein with the teaching of Chuang to include a reflector layer over the semiconductor structure. Chuang notes in [0040] that by including the reflector layer on the semiconductor, “an additional infrared filter element can be omitted.” Omitting a traditional infrared filter, which would likely be deposited on a glass substrate, may allow for the thickness of the imaging module to be decreased.
Claims 28-30 and 32 are rejected under 35 U.S.C. 103 as being unpatentable over Bikumandla in view of Finkelstein as applied to Claim 27 above, and further in view of Sugiyama (US 2020/0249428 A1).
Regarding Claim 28, which depends from rejected Claim 27, Bikumandla teaches wherein the color filter layer comprises a first color filter sublayer (Figure 5, 533, “color filter layer”; [0045]: “A color filter array 533 of FSI image sensor 530 is optically coupled to receive the focused light from microlens array 532 and operable to filter the light.”) and a second color filter sublayer (Referring to semiconductor substrates 324 and 325, Bikumandla notes in [0033] that “the semiconductor material itself may serve as a color filter based on its wavelength-dependent absorption of light,” thus providing a second color filter sublayer directly below the first);
Bikumandla in view of Finkelstein does not teach and Sugiyama does teach wherein the first color filter sublayer is configured to transmit received light having a wavelength within the first wavelength region and within the second wavelength region, and block light outside the first wavelength region and outside the second wavelength region ([0185]: “The IR bandpass filter 142 is a bandpass filter that has transparency in a visible region and a portion of the wavelength region of the IR light”; [0192]: “the IR bandpass filter 142 completely (with a transmittance of 0) blocks light in wavelength regions other than a visible range and the wavelength region of the IR light”; Here the examiner takes the first wavelength region to be the infrared region and the second wavelength region to be the visible region); and
wherein the second color filter sublayer is configured to block received light having a wavelength outside the second wavelength region (Figure 19, “IR Cutoff Filter” which blocks light outside the second (visible) wavelength region; [0182]: “The IR cutoff filter blocks light in a wavelength region (for example, about 850 nm) of the IR light.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sensor of Bikumandla in view of Finkelstein with the teaching of Sugiyama to have one color filter sublayer which transmits light in a first and second region and blocks light outside those regions, and a second color filter sublayer which blocks light outside the first region. Sugiyama notes in [0223] that with this teaching “it becomes possible to separate the visible signal and the TOF signal, which are signals having different characteristics.” This can allow for easier integration of image and TOF/LiDAR sensors onto a single chip.
Regarding Claim 29, which depends from rejected Claim 28, Bikumandla further discloses wherein the first color filter sublayer and/or the second color filter sublayer comprises a plurality of second sublayer pixels (Figure 7 shows a plurality of sublayer pixels in a color pixel sublayer; [0052]: “A pixel of image sensor 600 may include a Red-Green-Green-Blue ("RGGB") color filter configuration, as illustrated.”).
Regarding Claim 30, which depends from rejected Claim 29, Bikumandla further discloses wherein the first color filter sublayer and/or the second color filter sublayer comprises a plurality of second sublayer pixels in accordance with a Bayer pattern ([0052]: “A pixel of image sensor 600 may include a Red-Green-Green-Blue ("RGGB") color filter configuration, as illustrated.”; Figure 7 shows the RGGB pixel pattern of a Bayer Filter.).
Regarding Claim 32, which depends from rejected Claim 28, Bikumandla in view of Finkelstein does not teach and Sugiyama does teach wherein the first color filter sublayer comprises a plurality of first sublayer pixels having a size larger than the size of the second sublayer pixels; wherein one first sublayer pixels laterally substantially overlaps with a plurality of the second sublayer pixels (Figure 18 shows that one color filter sublayer 163 has a larger pixel size in the IR cutoff filter pixel than the overlying color filter sublayer 164. In this case, three pixels in sublayer 164 correspond to one pixel in 163.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sensor of Bikumandla in view of Finkelstein with the teaching of Sugiyama to have different pixel sizes in the color filter sublayers. Sugiyama notes in [0199] that “it becomes possible to almost completely avoid mixing of the IR light projected onto the visible pixels.” This can be beneficial to suppress noise or crosstalk between the sensing modalities in the system.
Claim 31 is rejected under 35 U.S.C. 103 as being unpatentable over Bikumandla in view of Finkelstein and Sugiyama as applied to Claim 28 above, and further in view of Takita (US 2017/0048426 A1).
Regarding Claim 31, which depends from rejected Claim 28, Bikumandla in view of Finkelstein does not teach and Takita does teach wherein the first color filter sublayer comprises a plurality of first sublayer pixels having the same size as the second sublayer pixels wherein the first sublayer pixels and the second sublayer pixels coincide with each other ([0095]: “in this embodiment, each pixel 434 includes two pixel layers 436, one visible passband filter 442A, 442B, 442C, and one infrared light passband filters 444A, 444B, 444C.” Thus, there are two color filter sublayers having the same size and which coincide with each other, which can also be seen in Figures 4A-4C.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sensor of Bikumandla in view of Finkelstein with the teaching of Takita to use two color filter sublayers with coinciding pixels of the same size. Takita notes in [0091] that the plurality of stacked pixel layers allows for the detection and sensing of wavelengths of light from the visible to the longwave infrared. The ability to sense, in effect, a variety of different channels based on the various possible combinations between the two color filter sublayers provides more pieces of unique spectral information, which can be used to better evaluate target objects.
Claim 34 is rejected under 35 U.S.C. 103 as being unpatentable over Bikumandla in view of Finkelstein and further in view of Cherevatsky (US 10,699,421 B1).
Regarding Claim 34, Bikumandla discloses a method for a TOF Sensor System, wherein the TOF Sensor System comprises:
a sensor for a TOF Sensor System (Abstract), the sensor comprising:
a plurality of optical components (Figure 5 discloses an array of stacked microlenses, visible light photodetectors, and TOF photodetectors, [0029], [0045]), each optical component of the plurality of optical components comprising:
a first photo diode implementing a TOF sensor pixel in a first semiconductor structure ([0047]: “BSI image sensor 531 includes TOF photodetector array 523 disposed in second semiconductor substrate 525”) and configured to absorb received light in a first wavelength region ([0047]: “TOF photodetector array 523 is operable to detect longer wavelength light 310L.”);
a second photo diode implementing a camera sensor pixel in a second semiconductor structure over the first semiconductor structure ([0047]: “BSI image sensor 531 is coupled under FSI image sensor 530 and coupling junction 526.”) and configured to absorb received light in a second wavelength region ([0046]: “A visible light photodetector array 522 is disposed in a frontside portion of first semiconductor substrate 524.”);
an interconnect layer comprising an electrically conductive structure configured to electrically contact the second photo diode ([0021]: “The interconnect portion 107 is used to convey signals to and from the pixel circuitry 103.”; [0046]: “FSI image sensor 530 also includes first semiconductor substrate 524, which is optically coupled to receive the light from the first interconnect portion 534”);
a further interconnect layer comprising an electrically conductive structure ([0021], [0045], the interconnects are electrically conductive) configured to electrically contact the second vertical photo diode and/or the first vertical photo diode ([0047]: “BSI image sensor 531 also includes a second interconnect portion 539 coupled underneath TOF photodetector array 523.”; [0028]: “In some examples, an additional material (e.g., an adhesive) may be included between the substrates in order to hold them together, whereas in other examples no such additional material may be used.” In the case where no additional materials are used in the coupling junction, the first and further interconnect layers would be in direct physical and electrical contact);
wherein the received light of the second wavelength region has a shorter wavelength than the received light of the first wavelength region ([0046]: “Visible light photodetector array 522 is operable to detect shorter wavelength light 310S (e.g. visible light).”); and
wherein the plurality of optical components are monolithically integrated on the carrier as a common carrier (Figure 5, [0045]: “For simplicity, two pixels are shown, although typically the image sensor may include a two dimensional array with a multitude of pixels.”; [0047]: “BSI image sensor 531 is coupled under FSI image sensor 530 and coupling junction 526.” The optical components are therefore coupled into a monolithic device); and
a sensor controller configured to control the sensor (Figure 9, element 921, “control circuitry”; [0058]).
Bikumandla teaches that the optical component is integrated into a TOF device, but does not explicitly teach that the device is a LiDAR Sensor System. Finkelstein teaches a LiDAR sensor system (Abstract).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the TOF sensor system of Bikumandla with the teaching of Finkelstein to specifically use the LiDAR sensor modality. Finkelstein describes a semiconductor device in the same field of endeavor as Bikumandla and the instant application, in which an RGB camera and LiDAR sensor are monolithically integrated into a LiDAR sensor system. As noted in Finkelstein, TOF systems are a somewhat broader category of measurements than LiDAR. A worker skilled in the art would be aware of the differences between the more general TOF category and LiDAR systems, and incorporating a sensor into a LiDAR system instead of a TOF system would have predictable results.
Bikumandla does not teach and Finkelstein does teach integrating the LIDAR Sensor System into a LIDAR Sensor Device ([0041], Figure 1 shows the detector array 110 (sensor system) integrated into a complete LiDAR Sensor Device.).
Bikumandla in view of Finkelstein does not teach and Cherevatsky does teach a method comprising communicating with a second Sensor System (Column 3, lines 22-36 disclose that RGB (second sensor system) and RGBD (lidar sensor device) may be synchronized, and thus must communicate. Figure 1 shows that the cameras are connected to a computer, for example, facilitating communication) and using an object classification measured by the second Sensor System (Column 13, line 21 through Column 14, line 46 describe using the RGB camera images (second sensor system) for object classification, which is also an evaluation of current measurements) for evaluation of current and future measurements (Column 15, Lines 20-27 describe further training the tracking algorithm for subsequently captured visual images) and derived LIDAR Sensor Device control parameters as a function of these factors (Column 32, Lines 38-56 disclose synchronizing an RGB and RGBD camera based on, for example, a triggering event, which corresponds to a control parameter relating to time. Column 24, lines 5-18 notes that the seed box defined after the triggering event may be defined based on visual imaging data, such as outlines of the object. Thus, the visual imaging data may be used to define a control parameter for the RGBD/LiDAR sensor.)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sensor device of Bikumandla in view of Finkelstein with the teachings of Cherevatsky to use the device in a method of object classification which sets the sensor control parameters. Cherevatsky notes in Column 15, Lines 28-35 that “by using visual images and depth images to determine positions in 3D space, and training tracking algorithms to recognize objects based on such determined positions, some implementations of the systems and methods of the present disclosure may improve upon the computer-based tracking of target objects, thereby solving a fundamental computer vision problem.” Improved target tracking can result in increased safety in a warehouse facility or vehicle, for example.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Sato (US 2021/0006756 A1) discloses a pixelated filter layer with sub-pixels in both the visible and infrared regions.
Dal Mutto (US 10,691,979 B2) discloses a method for classifying physical objects using depth images and color images.
Borthakur (US 9,749,553 B2) discloses an imaging system in which a first image sensor die is stacked on a second image sensor die wherein photodiodes in each die may be optimized to detect visible or infrared radiation, respectively.
Qian (US 2017/0115436 A1) discloses color filter sublayers composed of pixels active in the visible and infrared, wherein the pixels of each sublayer coincide and can be stacked.
Wehner (US 9,536,917 B2) discloses methods and structures for providing single-color or multi-color photo-detectors leveraging cavity resonance for performance benefits.
Wang (US 8,203,155 B2) discloses a multispectral pixel structure that includes a plurality of stacked cavity arrangements for emitting or detecting a plurality of wavelengths.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BENJAMIN WADE CLOUSER whose telephone number is (571)272-0378. The examiner can normally be reached M-F 7:30 - 5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, ISAM ALSOMIRI can be reached at (571) 272-6970. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/B.W.C./ Examiner, Art Unit 3645
/ISAM A ALSOMIRI/ Supervisory Patent Examiner, Art Unit 3645