DETAILED ACTION
This is the First Action on the Merits for U.S. Patent Application No. 18/742,151, filed 14 June 2024.
Claims 1–20 are pending.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
Claims 3–7 and 14–16 are objected to for failing to comply with 37 C.F.R. § 1.75(i), which requires that each element or step of a claim be separated by a line indentation.
Claims 10 and 20 are objected to for the following informality: “tablet computing” should be “tablet computer”. Compare with the specification at paragraph 0032.
Claim Rejections - 35 U.S.C. § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. §§ 102 and 103 (or as subject to pre-AIA 35 U.S.C. §§ 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. § 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 4, 6, 7, 11, 12, and 14–16 are rejected under 35 U.S.C. § 102(a)(1) as being anticipated by U.S. Patent Application Publication No. 2014/0327783 A1 (“Nishiwaki”).
Nishiwaki, directed to an image sensor, teaches with respect to claim 1 an image sensor, comprising:
a pixel array (Figs. 1, 2; photodetector array) including:
a first photosensitive region configured to detect visible light within a color wavelength range (Fig. 7, ¶¶ 0084–88, visible light photodetector 5R, 5G, or 5B),
a second photosensitive region including a plurality of light scattering structures and configured to detect infrared light (id., plurality of infrared photodetectors 5IR under lenses 10),
a spectral router positioned over at least the first photosensitive region (Figs. 3–5, 7; micro lenses 4) and configured to:
route the visible light within the color wavelength range to the first photosensitive region (Fig. 5B, ¶ 0073; light-splitting element 30 including microlens 4 directs visible light beams onto photodetector type A), and
route the infrared light to the second photosensitive region (Fig. 5A, ¶ 0073; light-splitting element 30 including microlens 4 directs infrared light beams onto photodetector type B); and
a spectral filter positioned over the spectral router and configured to block visible light outside of the color wavelength range (Figs. 3–5, 7; phase filter 3 on microlens 4).
Regarding claim 4, Nishiwaki teaches the image sensor of claim 1, wherein the spectral filter is a first spectral filter (Figs. 3–5, 7; phase filter 3 on microlens 4), and
wherein the pixel array further includes a second spectral filter positioned over the second photosensitive region and configured to block visible light (Fig. 7, ¶ 0080; color filter 9IR that blocks visible light over infrared photodetector 5IR).
Regarding claim 6, Nishiwaki teaches the image sensor of claim 1, wherein the spectral router is a first spectral router (Figs. 3–5, 7; microlens 4),
wherein the spectral filter is a first spectral filter (Figs. 3–5, 7; phase filter 3 on microlens 4),
wherein the color wavelength range is a first color wavelength range (Fig. 7, one of red, green, or blue detected by photodetector 5R, 5G, or 5B respectively), and
wherein the pixel array further includes:
a third photosensitive region configured to detect visible light within a second color wavelength range (id., second of red, green, or blue photodetectors),
a second spectral router positioned over at least the third photosensitive region (Fig. 7, corresponding instance of microlens 4) and configured to:
route the visible light within the second color wavelength range to the third photosensitive region (Fig. 5B, ¶ 0073; light-splitting element 30 including microlens 4 directs visible light beams onto photodetector type A), and
route the infrared light to the second photosensitive region (Fig. 5A, ¶ 0073; light-splitting element 30 including microlens 4 directs infrared light beams onto photodetector type B), and
a second spectral filter positioned over at least the second spectral router and configured to block visible light outside of the second color wavelength range (Fig. 7, corresponding phase filter 3R, 3G, or 3B).
Regarding claim 7, Nishiwaki teaches the image sensor of claim 1, wherein the pixel array further includes a third photosensitive region configured to detect visible light within the color wavelength range (Fig. 7, third of red, green, or blue detected by photodetector 5R, 5G, or 5B respectively),
wherein the second photosensitive region is positioned between the first photosensitive region and the third photosensitive region (Fig. 6, periodic arrangement of red, green, and blue photodetectors), and
wherein the spectral router and the spectral filter are further positioned over the second photosensitive region and the third photosensitive region (Fig. 7, microlens 4 and phase filter 3 over each of red, green, and blue photodetectors).
Regarding claim 11, Nishiwaki teaches a method for constructing an image sensor, the method comprising:
forming (¶¶ 0051, 0054, 0084, 0089–93, forming or manufacturing the structure shown in Figure 7) [the claim 1 structure] (claim 1 rejection supra).
Regarding claim 12, Nishiwaki teaches the method of claim 11, further comprising forming (¶¶ 0051, 0054, 0084, 0089–93, forming or manufacturing the structure shown in Figure 7) a plurality of light scattering structures in the second photodetector (Fig. 7, ¶¶ 0084–88, plurality of infrared photodetectors 5IR under lenses 10).
Regarding claim 14, Nishiwaki teaches the method of claim 11, wherein the spectral filter is a first spectral filter,
wherein the method further comprises forming (¶¶ 0051, 0054, 0084, 0089–93, forming or manufacturing the structure shown in Figure 7) a second spectral filter over the second photodetector (Fig. 7, ¶ 0080; color filter 9IR that blocks visible light over infrared photodetector 5IR), and
wherein the second spectral filter is configured to block visible light (id.).
Regarding claim 15, Nishiwaki teaches the method of claim 11, wherein the spectral router is a first spectral router (Figs. 3–5, 7; microlens 4),
wherein the spectral filter is a first spectral filter (Figs. 3–5, 7; phase filter 3 on microlens 4),
wherein the color wavelength range is a first color wavelength range (Fig. 7, one of red, green, or blue detected by photodetector 5R, 5G, or 5B respectively), and
wherein the method further comprises:
forming (¶¶ 0051, 0054, 0084, 0089–93, forming or manufacturing the structure shown in Figure 7) a third photodetector configured to detect visible light within a second color wavelength range (Fig. 7, second of red, green, or blue photodetectors);
forming (¶¶ 0051, 0054, 0084, 0089–93, forming or manufacturing the structure shown in Figure 7) a second spectral router over at least the third photodetector (Fig. 7, corresponding instance of microlens 4), wherein the second spectral router is configured to:
route the visible light within the second color wavelength range to the third photosensitive region (Fig. 5B, ¶ 0073; light-splitting element 30 including microlens 4 directs visible light beams onto photodetector type A), and
route the infrared light to the second photosensitive region (Fig. 5A, ¶ 0073; light-splitting element 30 including microlens 4 directs infrared light beams onto photodetector type B), and
forming (¶¶ 0051, 0054, 0084, 0089–93, forming or manufacturing the structure shown in Figure 7) a second spectral filter over the second spectral router, wherein the second spectral filter is configured to block visible light outside of the second color wavelength range (Fig. 7, corresponding phase filter 3R, 3G, or 3B).
Regarding claim 16, Nishiwaki teaches the method of claim 11, further comprising forming a third photodetector to detect visible light within the color wavelength range (Fig. 7, third of red, green, or blue detected by photodetector 5R, 5G, or 5B respectively),
wherein the second photodetector is positioned between the first photodetector and the third photodetector (Fig. 6, periodic arrangement of red, green, and blue photodetectors), and
wherein the spectral router and the spectral filter are further positioned over the second photodetector and the third photodetector (Fig. 7, microlens 4 and phase filter 3 over each of red, green, and blue photodetectors).
Claim Rejections - 35 U.S.C. § 103
The following is a quotation of 35 U.S.C. § 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 9 and 10 are rejected under 35 U.S.C. § 103 as being unpatentable over Nishiwaki in view of U.S. Patent Application Publication No. 2018/0115730 A1 (“Velichko”).
Claims 9 and 10 recite the image sensor of claim 1 in situ. Claim 9 recites a lens and an image controller. Claim 10 presents a Markush group of various applications for the claim 9 imaging system. This appears to be a boilerplate claim by assignee Semiconductor Components Industries, LLC; substantially identical claims appear in US 2024/0371898 A1, US 2024/0314417 A1, and US 2023/0230989 A1. The cited ‘730 publication has the advantage of having been published more than one year before the filing date of the present application, and so qualifies as prior art under 35 U.S.C. § 102(a)(1).
Velichko, directed to an image sensor, teaches in combination with Nishiwaki with respect to claim 9 an imaging system, comprising:
a lens system (Fig. 12, ¶ 0105; lens 1114 on camera);
an imaging controller (id., CPU 1002); and
the image sensor of claim 1 (claim 1 rejection supra), wherein the image sensor is in operational relationship with the lens system and is electrically coupled to the imaging controller (id., imaging device 1008).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to place the Nishiwaki image sensor in a camera having a lens and controller, as taught by Velichko, in order to allow for various common functions including auto focus and image stabilization. Velichko ¶ 0104.
Regarding claim 10, Velichko teaches the imaging system of claim 9, wherein the imaging system is at least one selected from the group consisting of an automobile, a vehicle, a camera, a cellular telephone, a tablet computing [sic], a webcam, a video camera, a video surveillance system, and a video gaming system (¶ 0032, “Electronic device 10 of FIG. 1 may be a portable electronic device such as a camera, a cellular telephone, a tablet computer, a webcam, a video camera, a video surveillance system, an automotive imaging system, [or] a video gaming system with imaging capabilities”).
Allowable Subject Matter
Claims 18 and 19 are allowed.
Claims 2, 3, 5, 8, 13, and 17 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Claim 20 is objected to under 37 C.F.R. § 1.52(b)(1)(ii) for a grammar error, but would be allowable if amended to correct the error.
The following is a statement of reasons for the indication of allowable subject matter: claims 2, 3, 5, and 13 require the second infrared sensors to have an identical structure to the first visible light sensors, while in Nishiwaki, the infrared sensors do not have phase filters 3. Claims 8 and 17 require distinct microlenses over the spectral filter, while in Nishiwaki, the microlenses are already mapped with the claimed spectral routers beneath the spectral filter. Claim 18 recites that the photosensitive regions have pyramid trenches, which are not present in Nishiwaki.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
US 2024/0107782 A1
US 2024/0053447 A1
US 2024/0021634 A1
US 2023/0403478 A1
US 2023/0283866 A1
US 2023/0230987 A1
US 2023/0170363 A1
US 2023/0037261 A1
US 2022/0415949 A1
US 2022/0360759 A1
US 2022/0344388 A1
US 2022/0190016 A1
US 2022/0157869 A1
US 2022/0141399 A1
US 2022/0141427 A1
US 2022/0028909 A1
US 2021/0375977 A1
US 2021/0375975 A1
US 2021/0306580 A1
US 2021/0066383 A1
US 2019/0067346 A1
US 2018/0213205 A1
US 2018/0047773 A1
US 2017/0123452 A1
US 2016/0191822 A1
US 2016/0161332 A1
US 2015/0268392 A1
US 2013/0222603 A1
US 2013/0083214 A1
US 2009/0147101 A1
M. Miyata, M. Nakajima, & T. Hashimoto, “High-Sensitivity Color Imaging Using Pixel-Scale Color Splitters Based on Dielectric Metasurfaces”, 6 ACS Photonics 1442–1450 (25 March 2019).
The following prior art was found using an Artificial Intelligence assisted search using an internal AI tool that uses the classification of the application under the Cooperative Patent Classification (CPC) system, as well as the specification, including the claims and abstract, of the application as contextual information. Where possible, English-language equivalents are given, and redundant results within the same patent families are eliminated. See “New Artificial Intelligence Functionality in PE2E Search”, 1504 OG 359 (15 November 2022); “Automated Search Pilot Program”, 90 Fed. Reg. 48,161 (8 October 2025).
US 2025/0277698 A1
US 2025/0280613 A1
US 2024/0371898 A1
US 2024/0314417 A1
US 2023/0230989 A1
CN 114650359 A
US 2020/0280659 A1
Any inquiry concerning this communication or earlier communications from the examiner should be directed to David N Werner whose telephone number is (571) 272-9662. The examiner can normally be reached M-F 7:30-4:00 Central.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Dave Czekaj, can be reached at (571) 272-7327. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/David N Werner/Primary Examiner, Art Unit 2487