Prosecution Insights
Last updated: April 19, 2026
Application No. 18/658,571

APPARATUS AND METHOD FOR OBTAINING IMAGE USING LENS ARRAY

Non-Final OA (§102, §103)
Filed: May 08, 2024
Examiner: TISSIRE, ABDELAAZIZ
Art Unit: 2638
Tech Center: 2600 (Communications)
Assignee: Samsung Electronics Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 84% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 3m
With Interview: 98%

Examiner Intelligence

Career Allow Rate: 84% (584 granted / 693 resolved; +22.3% vs TC avg, above average)
Interview Lift: +13.2% (moderate lift, measured across resolved cases with interview)
Typical Timeline: 2y 3m average prosecution; 23 applications currently pending
Career History: 716 total applications across all art units

Statute-Specific Performance

§101: 3.4% (-36.6% vs TC avg)
§103: 49.3% (+9.3% vs TC avg)
§102: 27.0% (-13.0% vs TC avg)
§112: 7.6% (-32.4% vs TC avg)
Compared against the Tech Center average estimate; based on career data from 693 resolved cases.

Office Action

Rejections: §102, §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement

The information disclosure statements (IDS) submitted are in compliance with the provisions of 37 CFR 1.97 and have been considered by the Examiner.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 5-6, 15-17 and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Hayasaka et al. (US 20100128152 A1, hereinafter “Hayasaka”).

Regarding claim 1, Hayasaka teaches an image obtaining apparatus (Figs. 1 & 5, an image pickup apparatus 1) comprising: an image sensor (Figs. 1 & 5, an image pickup device 13) comprising a micro lens (Figs. 1-5, [0030]: a microlens array 12) and a plurality of neighboring pixels sharing the micro lens (Figs. 1-5, [0032]-[0034]: the two-dimensional arrangement of microlenses in the microlens array 12 corresponds to a pixel arrangement in the image pickup device 13); and a processor (Figs. 1-5, [0036]: image processing section 14) configured to: generate, based on an input image received from the image sensor, a plurality of parallax images having a same parallax (Figs. 3-6, [0052]-[0053] & [0056]: parallax image producing section 143 produces a plurality of parallax images from different viewpoints based on the image pickup data D1; pixel data corresponding to pixels located at the same position in the unit image formation regions 12D are extracted from the image pickup data D1, as illustrated in (A) in Fig. 6); group, from among the plurality of parallax images, first parallax images having similar image characteristics (Figs. 3-6, [0052]-[0053] & [0056]-[0057]: extracted pixel data (pixel data corresponding to pixels indicated by the same reference numeral in Fig. 3) are synthesized, thereby obtaining image data D2 as parallax images; interpolation processing for each of R, G and B is performed on the parallax image data D21, and thereby, as illustrated in (B) to (D) in Fig. 6, red parallax image data D31R, green parallax image data D31G and blue parallax image data D31B are obtained; the obtained parallax image data D31R, D31G and D31B are outputted to the noise reduction section 146 as image data D3); perform at least one image processing operation on the first parallax images (Figs. 1-5, [0052]-[0053]: the obtained image data D2 are outputted to image processing sections 146-149); and generate an output image by combining the first parallax images ([0067]: the parallax images obtained are used for such stereo system three-dimensional display; two parallax images for the right and left eyes are produced (claimed “combining”), and the produced parallax images are projected on a screen) on which the at least one image processing operation is performed ([0030], [0036] & [0050]: performs predetermined image processing on the image before outputting image data Dout).
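The extraction step the examiner maps to claim 1, pulling out the pixel at the same position under every microlens to form one parallax image per viewpoint, amounts to plain strided slicing. A minimal Python sketch; the array sizes, helper name, and offset-to-quadrant labels are illustrative and not taken from Hayasaka:

```python
# Toy 4x6 sensor readout: each 2x2 block of pixels shares one microlens,
# so pixels at the same offset within every block share a viewpoint.
H, W = 4, 6
raw = [[r * W + c for c in range(W)] for r in range(H)]

def parallax_image(raw, dy, dx, step=2):
    """Collect the pixel at offset (dy, dx) from every microlens block."""
    return [row[dx::step] for row in raw[dy::step]]

# One parallax image per intra-block position (labeled A-D as in claim 6;
# the quadrant assignment here is illustrative).
images = {name: parallax_image(raw, dy, dx)
          for name, (dy, dx) in {"A": (0, 0), "B": (0, 1),
                                 "C": (1, 0), "D": (1, 1)}.items()}

# Each extracted image is half the sensor resolution in each dimension.
assert all(len(img) == H // 2 and len(img[0]) == W // 2
           for img in images.values())
```

Combining the four half-resolution images back into one output image would then be the inverse interleaving step that claim 15 describes.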
Regarding claim 5, Hayasaka teaches the image obtaining apparatus of claim 2; in addition, Hayasaka discloses wherein the image sensor comprises: a quad Bayer pattern array in which pixels arranged in a 2×2 matrix comprise a color filter of a same color ([0035]: a color filter (not illustrated in Fig. 1) including regularly arranged filter elements of a plurality of colors corresponding to the arrangement of the pixels is arranged on a light-receiving surface of the image pickup device 13; as such a color filter, for example, a color filter in which filter elements of the three primary colors red (R), green (G) and blue (B) are arranged at a predetermined ratio is used, as illustrated in Fig. 3), or a quad square Bayer pattern array in which pixels arranged in a 4×4 matrix comprise a color filter of a same color ([0087]: the number of pixels allocated to one microlens is increased to, for example, 3×3 or 4×4).

Regarding claim 6, Hayasaka teaches the image obtaining apparatus of claim 5; in addition, Hayasaka discloses wherein the processor is further configured to generate, based on the input image received from the image sensor, an A parallax image, a B parallax image, a C parallax image, and a D parallax image (Figs. 3 & 6, [0056]: as illustrated in (A) in Fig. 6, parallax image data D21 produced by synthesizing pixel data extracted from pixels located at the same position in the unit image formation region U1 (for example, pixels indicated by the reference numeral 1) has the same color arrangement as that of the color filter 130 (claimed “A parallax image”); the same holds true where pixel data extracted from pixels located at another position (for example, pixels indicated by the reference numeral 2 or 3) are synthesized (claimed “B parallax image, C parallax image, and D parallax image”)), wherein the plurality of neighboring pixels sharing the one micro lens comprise four pixels (Figs. 3 & 6, [0056]: 2×2 = 4 pixels are allocated to one microlens in the microlens array 12, as illustrated in Fig. 3), and wherein the A, B, C, and D parallax images correspond to combinations of data values of pixels arranged in the second, first, third, and fourth quadrants, respectively (Figs. 3 & 6, [0056], as cited above).

Regarding claim 15, Hayasaka teaches the image obtaining apparatus of claim 6; in addition, Hayasaka discloses wherein the processor is further configured to generate the output image by combining the A, B, C, and D parallax images, and wherein those parallax images are image-processed by group (Figs. 3 & 6, [0056], as cited for claim 6 above).

Regarding claim 16, Hayasaka teaches the image obtaining apparatus of claim 1, wherein the similar image characteristics comprise at least one of a noise level, a color shift, and a color shading (Figs. 3-6, [0052]-[0053] & [0056]-[0058]: parallax image producing section 143 produces a plurality of parallax images from different viewpoints based on the image pickup data D1; pixel data corresponding to pixels located at the same position in the unit image formation regions 12D are extracted from the image pickup data D1, as illustrated in (A) in Fig. 6; the obtained image data D3 are outputted to the noise reduction section 146).

Regarding claim 17, method claim 17 is drawn to the method of using the corresponding apparatus claimed in claim 1; therefore, method claim 17, which corresponds to apparatus claim 1, is rejected for the same reasons of anticipation as set forth above.

Regarding claim 20, method claim 20 is drawn to the method of using the corresponding apparatus claimed in claim 16; therefore, method claim 20, which corresponds to apparatus claim 16, is rejected for the same reasons of anticipation as set forth above.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 2 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Hayasaka et al. (US 20100128152 A1, hereinafter “Hayasaka”), in view of Chiang; Kuo-Ching (US 20120176480 A1, hereinafter “CHIANG”).

Regarding claim 2, Hayasaka teaches the image obtaining apparatus of claim 1; in addition, Hayasaka discloses wherein the first parallax images comprise a first parallax image group and a second parallax image group (Figs. 1-5, [0017]: images with parallax can be generated due to the simultaneously captured images from different viewing angles), and wherein the processor is further configured to perform the at least one image processing operation on the first parallax images by performing image processing on the first parallax image group and the second parallax image group (id.). Hayasaka does not teach performing image processing using context switching. However, CHIANG discloses performing image processing using context switching ([0020]: referring to Fig. 3, a plurality of input images are transferred to multi-tasking module 500 for processing received images from a plurality of terminals.
Before the image data signals are transmitted to the display 160, the aforementioned images are previously processed by the image segmentation unit 126; the image process unit 510 can be introduced to adjust the processed image before it is displayed. Multi-tasking refers to a method where multiple tasks (also known as processes) share common processing resources such as a CPU; multitasking solves the one-process-one-task problem by scheduling which task may be the one running at any given time, and when another waiting task gets a turn. The multi-tasking module may reassign the control unit from one task to another different task to achieve parallelism or context switches; thus, the multi-tasking module may reassign the control unit to switch from one process to another different process to facilitate context switches.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate performing image processing using context switching, as taught by CHIANG, into Hayasaka's image processing section. The suggestion/motivation for doing so would be to achieve parallelism or context switches (CHIANG: [0030]).

Regarding claim 18, method claim 18 is drawn to the method of using the corresponding apparatus claimed in claim 2; therefore, method claim 18, which corresponds to apparatus claim 2, is rejected for the same reasons of obviousness as set forth above.

Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Hayasaka et al. (US 20100128152 A1, hereinafter “Hayasaka”), in view of Kim; Dong Ik (US 20240171871 A1, hereinafter “KIM”), or in view of Izawa; Katsutoshi (US 20140198188 A1, hereinafter “IZAWA”).
Regarding claim 7, Hayasaka teaches the image obtaining apparatus of claim 6, except wherein the quad Bayer pattern array or the quad square Bayer pattern array comprises a GRBG pattern comprising: red pixels in a first quadrant of a first unit pixel; green pixels in a second quadrant and a fourth quadrant of the first unit pixel; and blue pixels in a third quadrant of the first unit pixel. However, KIM discloses such a GRBG pattern (Fig. 3A, [0055]: the plurality of pixels may be arranged according to a quad Bayer pattern; the quad Bayer pattern may represent an arrangement in which pixels of the same color, among the red pixel R, the green pixel G, and the blue pixel B, are arranged in 2×2 in a unit of the pixel region 311b). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the GRBG quad Bayer pattern, as taught by KIM, into Hayasaka's image sensor. The suggestion/motivation for doing so would be to provide a design that mimics the human eye's higher sensitivity to green light, allowing for enhanced image quality and brightness using a single, cost-effective sensor; this emphasis on green also enables better capture of luminance (brightness) information.

Alternatively, IZAWA discloses an RGGB pattern comprising: green pixels in a first quadrant and a third quadrant of a second unit pixel; red pixels in a second quadrant of the second unit pixel; and blue pixels in a fourth quadrant of the second unit pixel (as illustrated by Fig. 15, [0155]: in an imaging element 16', four photodiodes A, B, C and D forming an RGGB pattern are two-dimensionally arranged, with one microlens ML' arranged so as to cover the four photodiodes assumed as one unit (four pixels per microlens)). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the RGGB pattern, as taught by IZAWA, into Hayasaka's image sensor, for the same reasons of green-light sensitivity and luminance capture stated above.

Allowable Subject Matter

Claims 3-4, 8-14 and 19 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Contact

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ABDELAAZIZ TISSIRE, whose telephone number is (571) 270-7204. The examiner can normally be reached Monday through Friday from 8 AM to 5 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ye Lin, can be reached at 571-272-7372. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR; status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/ABDELAAZIZ TISSIRE/
Primary Examiner, Art Unit 2638
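The quad Bayer layout at issue in claims 5 and 7, each color of a GRBG base pattern expanded into a same-color 2×2 (or, for the quad square variant, 4×4) block, can be made concrete in code. A minimal Python sketch; the helper name and the screen-coordinate reading of the claim's quadrants (G top-left, R top-right, B bottom-left, G bottom-right) are illustrative assumptions, not from the record:

```python
# GRBG base pattern: per claim 7's quadrant numbering, R in quadrant I,
# G in quadrants II and IV, B in quadrant III (read here row-major).
BASE = [["G", "R"],
        ["B", "G"]]

def quad_bayer_unit(n=2):
    """Expand each base color into an n x n same-color block.

    n=2 gives the 4x4 quad Bayer unit of claim 5's first alternative;
    n=4 gives the 8x8 quad square Bayer unit of the second.
    """
    size = 2 * n
    return [[BASE[r // n][c // n] for c in range(size)] for r in range(size)]

unit = quad_bayer_unit()
# unit rows: GGRR / GGRR / BBGG / BBGG
```

Each same-color 2×2 block here would sit under one microlens, which is what lets the four sub-pixels double as the A-D parallax samples of claim 6.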

Prosecution Timeline

May 08, 2024: Application Filed
Feb 06, 2026: Non-Final Rejection under §102 and §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593518: IMAGE SENSOR AND OPERATION METHOD THEREOF (granted Mar 31, 2026; 2y 5m to grant)
Patent 12587749: CONTROL APPARATUS AND CONTROL METHOD THEREFOR (granted Mar 24, 2026; 2y 5m to grant)
Patent 12587757: SOLID-STATE IMAGING DEVICE (granted Mar 24, 2026; 2y 5m to grant)
Patent 12581204: INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM (granted Mar 17, 2026; 2y 5m to grant)
Patent 12581177: CAMERA DEVICE (granted Mar 17, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 84%
With Interview: 98% (+13.2% lift)
Median Time to Grant: 2y 3m
PTA Risk: Low
Based on 693 resolved cases by this examiner. Grant probability is derived from the career allow rate.
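The headline 84% figure follows directly from the career counts given above (584 grants of 693 resolved cases). How the with-interview figure is combined with the +13.2% lift is not stated on the page; the additive model below is purely an illustration, and it lands near but not exactly on the page's 98%:

```python
# Career allow rate from the stated counts: 584 granted / 693 resolved.
granted, resolved = 584, 693
allow_rate = granted / resolved
print(f"allow rate: {allow_rate:.0%}")   # -> allow rate: 84%

# Reported interview lift is +13.2 percentage points. A naive additive
# model (an assumption, not the site's stated formula):
interview_lift = 0.132
with_interview = min(allow_rate + interview_lift, 1.0)
print(f"with interview (additive model): {with_interview:.1%}")
```

Since the additive model yields roughly 97.5% rather than the displayed 98%, the site evidently applies some other adjustment (e.g. rounding or a conditional rate) that it does not disclose.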
