Prosecution Insights
Last updated: April 19, 2026
Application No. 18/778,428

ELECTRONIC DEVICE INCLUDING IMAGE SENSOR AND OPERATING METHOD THEREOF

Non-Final OA — §102, §103
Filed: Jul 19, 2024
Examiner: DAGNEW, MEKONNEN D
Art Unit: 2638
Tech Center: 2600 — Communications
Assignee: Samsung Electronics Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 83% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 83% — above average (604 granted / 728 resolved; +21.0% vs TC avg)
Interview Lift: +15.8% on resolved cases with interview — a strong lift
Typical Timeline: 2y 6m average prosecution; 29 applications currently pending
Career History: 757 total applications across all art units

Statute-Specific Performance

§101: 4.5% (-35.5% vs TC avg)
§103: 63.7% (+23.7% vs TC avg)
§102: 21.5% (-18.5% vs TC avg)
§112: 6.3% (-33.7% vs TC avg)
Tech Center averages are estimates • Based on career data from 728 resolved cases

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Drawings

The drawing filed on 07/19/24 is in compliance with MPEP 608.03 and therefore is accepted.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1 and 11 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Lee et al. (US 20220359597 A1; hereinafter Lee).

As of Claim 1: Lee teaches, in FIGS. 2 and 3, an electronic device (¶0044 and note that an image sensor 200 is included in a camera module (e.g., a camera module 110 of FIG. 1)) comprising: an image sensor (¶0044 and note the 4PD image sensor in FIG. 2) comprising a plurality of unit pixels (¶0045 and note that one 210 of a plurality of pixels may include a microlens 205, a color filter 206, a first PD (or a first sub-pixel) (PD1) 211, a second PD (or a second sub-pixel) (PD2) 212, a third PD (or a third sub-pixel) (PD3) 213, and a fourth PD (or a fourth sub-pixel) (PD4) 214); and at least one processor electrically connected with the image sensor, wherein a first unit pixel of the plurality of unit pixels comprises: a micro lens (¶0049 and note microlens 205); and a first photodiode (PD), a second PD disposed in a first direction of the first PD (¶0053 and note that the two pixels (e.g., the first PD 211 and the second PD 212) are separated horizontally), a third PD disposed in a second direction of the first PD, and a fourth PD disposed in the second direction of the second PD, the first PD, the second PD, the third PD, and the fourth PD being disposed under the micro lens (¶¶0049-0053), wherein the at least one processor is configured to: acquire a first image frame from the image sensor; determine an operation mode of the image sensor, based on information related to at least part of the first image frame (¶0057 and note that the image sensor 410 may generate image data for taking a picture or a moving image); in response to the operation mode of the image sensor being determined to a first mode, control the image sensor to output first auto focus (AF) data corresponding to a quantity of light inputted to at least one PD of the first PD or the third PD, and second AF data corresponding to a quantity of light inputted to at least one PD of the second PD or the fourth PD (¶¶0068-0070 and note that the PAF circuit 451 may perform PAF calculation using AF data extracted from pixels (e.g., the first PD and the second PD or the third PD and the fourth PD) separated horizontally or may perform PAF calculation using AF data extracted from pixels (e.g., the first PD and the third PD or the second PD and the fourth PD) separated vertically; hereinafter, a description will be given of a 2PD PAF circuit), and perform an AF function based on a first phase difference of the first direction which is acquired based on the first AF data and the second AF data; and in response to the operation mode of the image sensor being determined to a second mode (¶¶0097, 0112), control the image sensor to output third AF data corresponding to a quantity of light inputted to at least one PD of the first PD or the second PD, and fourth AF data corresponding to a quantity of light inputted to at least one PD of the third PD or the fourth PD, and perform an AF function based on a second phase difference of the second direction which is acquired based on the third AF data and the fourth AF data (¶¶0066-0069 and note that the AF processing unit 450 may perform AF (PAF) using a phase difference based on the received AF data; the AF processing unit 450 may include at least one or more PAF circuits 451).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 2-5 are rejected under 35 U.S.C. 103 as being unpatentable over Lee et al. (US 20220359597 A1; hereinafter Lee) in view of Chino et al. (US 20210067704 A1; hereinafter Chino).

As of Claim 2: Chino is a similar or analogous system to the claimed invention, as evidenced by Chino's teaching of determining the tilt/focus driving amount based on distance information, such as a phase difference, or the method of acquiring the tilt angle, the focal length information, and the object distance information of the image sensor 100, which would have prompted a predictable variation of Lee by applying Chino's known principle that the information related to at least part of the first image frame is contrast information of the first image frame (¶¶0022, 0024, 0034).
In view of motivations such as improving focus, and thereby further improving image quality, one of ordinary skill in the art would have implemented the claimed variation of the prior art system of Lee. Therefore, the claimed invention would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.

As of Claim 3: Lee in view of Chino further teaches the at least one processor is configured to: identify a first contrast value corresponding to contrast information of the first direction associated with the first image frame (Chino ¶¶0022, 0024, 0034 and note that a contrast-related evaluation value of a specific frequency is acquired for each target object area); acquire a second contrast value corresponding to contrast information of the second direction associated with the first image frame from the image sensor; and based on the first contrast value and the second contrast value, determine the operation mode of the image sensor (Chino ¶¶0022, 0024, 0034 and note that FIG. 7 is mainly executed by the in-focus evaluation value calculator 112, which evaluates contrast as well).

As of Claim 4: Lee in view of Chino further teaches the at least one processor is configured to: identify a first contrast value corresponding to contrast information of the first direction associated with the first image frame (Chino ¶¶0022, 0024, 0034 and note that FIG. 7 is mainly executed by the in-focus evaluation value calculator 112, which evaluates contrast as well); and determine the operation mode of the image sensor, based on whether the first contrast value is less than a threshold value (Chino ¶¶0022, 0024, 0034 and FIG. 7).
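Read together, claims 4 and 5 recite a simple decision rule: measure contrast along each phase-detection direction and pick the AF mode whose direction has usable contrast. A minimal sketch of that rule follows; the function name, mode labels, and the tie-breaking branch for the case where neither direction clears the threshold are illustrative assumptions, not language from the claims or the cited references.

```python
FIRST_MODE = "first_mode"    # phase difference measured in the first (horizontal) direction
SECOND_MODE = "second_mode"  # phase difference measured in the second (vertical) direction

def determine_operation_mode(first_contrast: float,
                             second_contrast: float,
                             threshold: float) -> str:
    """Paraphrase of the claims 4-5 logic: stay in the first mode while
    first-direction contrast clears the threshold; otherwise fall back to
    the second mode and re-evaluate using both contrast values."""
    if first_contrast >= threshold:
        return FIRST_MODE
    if second_contrast >= threshold:
        return SECOND_MODE
    # Neither direction clears the threshold: prefer the stronger one
    # (an assumed tie-break; the claims leave this case open).
    return FIRST_MODE if first_contrast >= second_contrast else SECOND_MODE
```

In practice the two contrast values would come from horizontal and vertical edge filters over the first image frame, and the threshold would be tuned per sensor.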
As of Claim 5: Lee in view of Chino further teaches the at least one processor is configured to: in response to the first contrast value being greater than or equal to the threshold value, determine the operation mode of the image sensor to the first mode (Lee ¶¶0097, 0112); in response to the first contrast value being less than the threshold value, determine the operation mode of the image sensor to the second mode; while controlling the image sensor to operate in the second mode (Lee ¶¶0097, 0112), acquire a second contrast value corresponding to contrast information of the second direction associated with the first image frame; and determine the operation mode of the image sensor, based on the first contrast value and the second contrast value (Chino ¶¶0022, 0024, 0034).

Claims 6-9 are rejected under 35 U.S.C. 103 as being unpatentable over Lee et al. (US 20220359597 A1; hereinafter Lee) in view of LI et al. (US 20190082130 A1).

As of Claim 6: LI is a similar or analogous system to the claimed invention, as evidenced by LI's teaching of providing phase detection auto-focus (PDAF) information based on edges having more than one orientation (e.g., vertical and horizontal edges), to improve PDAF performance (especially in low light conditions), to reduce or eliminate the need for signal correction, or to increase the resolution of an image sensor, which would have prompted a predictable variation of Lee by applying LI's known principle that the first unit pixel comprises: one floating diffusion (FD) (¶0067 and note the set of charge transfer transistors 706 configured to simultaneously transfer charge from multiple photodetectors 702 (e.g., a pair of photodetectors) to the sense region 708 or floating diffusion node) connected with the first PD, the second PD, the third PD, and the fourth PD; and a first transfer gate (TG) configured to connect the first PD and the FD (¶0067 and note that the gates of first and second charge transfer transistors 706a (TX_A) and 706b (TX_B) (i.e., the charge transfer transistors of the first row) may be simultaneously driven to transfer charges accumulated by the first and second photodetectors 702a, 702b to the sense region 708), a second TG configured to connect the second PD and the FD (¶0067), a third TG configured to connect the third PD and the FD, and a fourth TG configured to connect the fourth PD and the FD (¶0067), and wherein the image sensor is configured to: when the operation mode of the image sensor is the first mode, switch at least one TG of the first TG or the third TG into an ON state and acquire the first AF data corresponding to at least one PD of the first PD or the third PD (¶0067), and switch at least one TG of the second TG or the fourth TG into an ON state and acquire the second AF data corresponding to at least one PD of the second PD or the fourth PD, and when the operation mode of the image sensor is the second mode, switch at least one TG of the first TG or the second TG into an ON state and acquire the third AF data corresponding to at least one PD of the first PD or the second PD, and switch at least one TG of the third TG or the fourth TG into an ON state and acquire the fourth AF data corresponding to at least one PD of the third PD or the fourth PD (¶0067).

In view of motivations such as improved phase detection auto-focus (PDAF) performance for a device having a camera or other image capture device, thereby further improving image quality by improving focus, one of ordinary skill in the art would have implemented the claimed variation of the prior art system of Lee. Therefore, the claimed invention would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
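Claim 6's readout scheme reduces to a per-mode transfer-gate pairing: the two gates driven together determine which photodiodes share the floating diffusion for each AF sample. A hypothetical sketch of that pairing (the gate labels TG1-TG4 track the claim language; nothing here is taken from LI's actual 706a/706b circuit):

```python
def tg_pairs(mode: str) -> list[tuple[str, str]]:
    """Per claim 6: in the first mode the column-wise gate pairs are
    driven together (TG1/TG3, then TG2/TG4); in the second mode the
    row-wise pairs are driven (TG1/TG2, then TG3/TG4)."""
    if mode == "first_mode":
        return [("TG1", "TG3"), ("TG2", "TG4")]  # left column, then right column
    return [("TG1", "TG2"), ("TG3", "TG4")]      # top row, then bottom row
```

Each tuple is one charge-domain AF sample: the paired photodiodes dump charge onto the shared FD, so the binning happens before readout rather than in digital logic.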
As of Claim 7: Lee in view of LI further teaches the image sensor is configured to: acquire first readout data, second readout data, third readout data, and fourth readout data by reading out from the first PD, the second PD, the third PD, and the fourth PD, respectively; when the operation mode of the image sensor is the first mode (Lee ¶¶0169-0176), acquire the first AF data resulting from addition of the first readout data and the third readout data, and the second AF data resulting from addition of the second readout data and the fourth readout data; and when the operation mode of the image sensor is the second mode, acquire the third AF data resulting from addition of the first readout data and the second readout data, and the fourth AF data resulting from addition of the third readout data and the fourth readout data (Lee ¶0068 and note that the PAF circuit 451 may perform PAF calculation using AF data extracted from pixels (e.g., the first PD and the second PD or the third PD and the fourth PD) separated horizontally or may perform PAF calculation using AF data extracted from pixels (e.g., the first PD and the third PD or the second PD and the fourth PD) separated vertically; hereinafter, a description will be given of a 2PD PAF circuit).
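The additions recited in claim 7 amount to binning the four per-photodiode readouts into one AF pair per mode. A minimal sketch, with the PD1-PD4 positions following the claim's 2x2 arrangement (PD2 in the first direction of PD1, PD3/PD4 in the second direction); the function name and mode labels are illustrative:

```python
def af_data(pd1: float, pd2: float, pd3: float, pd4: float,
            mode: str) -> tuple[float, float]:
    """Claim 7's additions: the first mode sums column-wise
    (PD1+PD3 vs PD2+PD4) to form a horizontally separated AF pair;
    the second mode sums row-wise (PD1+PD2 vs PD3+PD4) to form a
    vertically separated pair."""
    if mode == "first_mode":
        return pd1 + pd3, pd2 + pd4  # first AF data, second AF data
    return pd1 + pd2, pd3 + pd4      # third AF data, fourth AF data
```

This is the digital counterpart of the charge-domain binning in claim 6: the same pairings, applied to already-read-out values.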
As of Claim 8: Lee in view of LI further teaches a lens assembly which is electrically connected with the at least one processor and comprises at least one lens aligned along an optical axis (LI ¶0041 and note an image capture device (e.g., a camera 200) including an image sensor 202, a lens 204, and an auto-focus mechanism 206), wherein the at least one processor is configured to: determine the operation mode of the image sensor to the first mode, based on the information related to at least part of the first image frame; acquire, from the image sensor, the first AF data (Lee ¶0138 and note that although the PAF circuit 1235 performs determined calculation irrespective of information about directionality, it may perform PAF calculation depending on directionality of an edge to obtain a PAF result with high reliability; the PAF circuit 1235 may extract a statistical value for each direction and may perform AF determination), the second AF data, and a second image frame related to the first AF data and the second AF data; acquire the first phase difference of the first direction, based on the first AF data and the second AF data; control the lens assembly to move on the optical axis based on the first phase difference; and acquire, from the image sensor (LI ¶0044 and note that the auto-focus mechanism 206 may include (or the functions of the auto-focus mechanism 206 may be provided by) a processor; the auto-focus mechanism 206 may receive signals from the image sensor 202 and, in response to the signals, adjust a focus setting of the camera 200; in some embodiments, the signals may include PDAF information, which may include both horizontal phase detection signals and vertical phase detection signals; in response to the PDAF information (e.g., in response to an out-of-focus condition identified from the PDAF information), the auto-focus mechanism 206 may adjust a focus setting of the camera 200 by, for example, adjusting a relationship between the image sensor 202 (or plurality of pixels) and the lens 204 (e.g., by adjusting a physical position of the lens 204 or the image sensor 202)), a third image frame corresponding to light passing through the moved lens assembly.

As of Claim 9: Lee in view of LI further teaches a lens assembly which is electrically connected with the at least one processor and comprises at least one lens aligned along an optical axis, wherein the at least one processor is configured to: determine the operation mode of the image sensor to the second mode based on the first image frame (LI ¶¶0087-0090 and note analyzing horizontal phase detection signals output from a first set of the multiple pixels while capturing the one or more images; the operation(s) at 1104 may be performed, for example, by the auto-focus mechanism described with reference to FIG. 2, the image processor described with reference to FIG. 4, or the processor or image processor described with reference to FIG. 12); acquire, from the image sensor, the third AF data, the fourth AF data, a second image frame related to the third AF data and the fourth AF data, and correlation data between the third AF data and the fourth AF data; acquire the second phase difference of the second direction, based on the third AF data, the fourth AF data, and the correlation data; control the lens assembly to move on the optical axis, based on the second phase difference; and acquire, from the image sensor, a third image frame corresponding to light passing through the moved lens assembly (LI ¶¶0053-0057).

Claims 11-15: Claims 11-15 are method claims corresponding to Claims 1, 3, 4, 8, and 9 and are addressed above.
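Each of these claims ultimately performs AF from a phase difference between two AF signals. A common, generic way to estimate that shift is to take the peak of the signals' cross-correlation; the sketch below is a textbook PDAF-style estimate, not an implementation drawn from Lee, Chino, or LI.

```python
import numpy as np

def phase_difference(af_a, af_b) -> int:
    """Estimate the integer shift (in samples) between two 1-D AF
    signals as the lag that maximizes their cross-correlation."""
    a = np.asarray(af_a, dtype=float)
    b = np.asarray(af_b, dtype=float)
    corr = np.correlate(a, b, mode="full")
    # In "full" mode, index len(b)-1 corresponds to zero lag.
    return int(np.argmax(corr)) - (len(b) - 1)
```

A positive result means `af_a` is the delayed copy of `af_b`; an AF controller would map this offset (typically refined to sub-sample precision) onto a lens displacement along the optical axis.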
Allowable Subject Matter

Claim 10 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

As of Claim 10: the prior art of record fails to teach or fairly suggest the limitations of claim 10, in combination with claim 1, that include: “further comprising an illuminance sensor configured to measure ambient illuminance of the electronic device, wherein the at least one processor is configured to: determine whether the ambient illuminance of the electronic device is greater than or equal to a threshold value by using the illuminance sensor; in response to the ambient illuminance of the electronic device being greater than or equal to the threshold value, control the image sensor operating in the first mode to output the first AF data corresponding to a quantity of light inputted to any one PD of the first PD or the third PD, and the second AF data corresponding to a quantity of light inputted to any one PD of the second PD or the fourth PD; and in response to the ambient illuminance of the electronic device being less than the threshold value, control the image sensor operating in the first mode to output the first AF data corresponding to a quantity of light inputted to the first PD and the third PD, and the second AF data corresponding to a quantity of light inputted to the second PD and the fourth PD.”

Claim 16 is allowed, because the prior/related art fails to explicitly teach or suggest “a first PD, a second PD disposed in a first direction of the first PD, a third PD disposed in a second direction of the first PD, and a fourth PD disposed in the second direction of the second PD; determining an operation mode of the image sensor, based on information related to at least part of the first image frame; in response to the operation mode of the image sensor being determined to a first mode, controlling the image sensor to output first AF data corresponding to a quantity of light inputted to at least one PD of the first PD or the third PD, and second AF data corresponding to a quantity of light inputted to at least one PD of the second PD or the fourth PD, and performing an AF function based on a first phase difference of the first direction which is acquired based on the first AF data and the second AF data; and in response to the operation mode of the image sensor being determined to a second mode, controlling the image sensor to output third AF data corresponding to a quantity of light inputted to at least one PD of the first PD or the second PD, and fourth AF data corresponding to a quantity of light inputted to at least one PD of the third PD or the fourth PD, and performing an AF function based on a second phase difference of the second direction which is acquired based on the third AF data and the fourth AF data.”

Regarding claims 17-20: claims 17-20 are allowed based on their dependence from allowed independent claim 16.

Any comments considered necessary by applicant must be submitted no later than the payment of the issue fee and, to avoid processing delays, should preferably accompany the issue fee. Such submissions should be clearly labeled “Comments on Statement of Reasons for Allowance.”

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Uenishi et al. (US 20180063412 A1) teaches in ¶0050 that each pixel of the image sensor 201 used in that embodiment includes two (a pair of) photodiodes A and B and one microlens that is provided for (shared by) the pair of photodiodes A and B. That is, the image sensor 201 includes the pair of photodiodes (first photoelectric converter and second photoelectric converter) for one microlens, and a plurality of microlenses are arrayed in two dimensions.
Each pixel divides incident light by the microlens to form a pair of optical images on the pair of photodiodes A and B and to output, from the pair of photodiodes A and B, a pair of pixel signals (A image signal and B image signal) that are used as AF signals. Further, by adding the outputs of the pair of photodiodes A and B, it is possible to obtain an imaging signal (A+B image signal).

Murata (US 2015/0062391) discloses an image capture device (image sensor 12), comprising: an imaging area comprising: a plurality of pixels (pixel array), the plurality of pixels including multiple pixels in which each pixel of the multiple pixels comprises a two-dimensional array of photodetectors (PD1-PD4), each photodetector in the array of photodetectors electrically isolated from each other photodetector in the array of photodetectors (figures 2a-2b, 19-22; paragraphs 54-65, 124-132); and a microlens (40) disposed over the array of photodetectors; and a pixel readout circuit comprising, for each pixel in the multiple pixels: a shared readout circuit (FD and pixel readout circuitry) associated with the array of photodetectors (PD1-PD4) for the pixel; and a set of charge transfer transistors (TX1-TX4), each charge transfer transistor operable to connect a photodetector (PD1-PD4) in the array of photodetectors to the shared readout circuit (FD and pixel readout circuitry) (figures 2a-2b, 19-22; paragraphs 54-65, 124-132).

Contacts

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MEKONNEN D DAGNEW whose telephone number is (571) 270-5092. The examiner can normally be reached 8:00AM-5:00PM M-Th. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Lin Ye, can be reached at 571-272-7372. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MEKONNEN D DAGNEW/
Primary Examiner, Art Unit 2638

Prosecution Timeline

Jul 19, 2024
Application Filed
Jan 09, 2026
Non-Final Rejection — §102, §103
Mar 19, 2026
Applicant Interview (Telephonic)
Mar 21, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by the same examiner in similar technology

Patent 12593143
SOLID-STATE IMAGING DEVICE
2y 5m to grant — Granted Mar 31, 2026
Patent 12586142
IMAGE CAPTURING METHOD AND DISPLAY METHOD FOR RECOGNIZING A RELATIONSHIP AMONG A PLURALITY OF IMAGES DISPLAYED ON A DISPLAY SCREEN
2y 5m to grant — Granted Mar 24, 2026
Patent 12585173
LENS BARREL
2y 5m to grant — Granted Mar 24, 2026
Patent 12581022
DATA CREATION METHOD AND DATA CREATION PROGRAM
2y 5m to grant — Granted Mar 17, 2026
Patent 12574662
THRESHOLD VALUE DETERMINATION METHOD, THRESHOLD VALUE DETERMINATION PROGRAM, THRESHOLD VALUE DETERMINATION DEVICE, PHOTON NUMBER IDENTIFICATION SYSTEM, PHOTON NUMBER IDENTIFICATION METHOD, AND PHOTON NUMBER IDENTIFICATION PROCESSING PROGRAM
2y 5m to grant — Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 83%
With Interview: 99% (+15.8%)
Median Time to Grant: 2y 6m
PTA Risk: Low
Based on 728 resolved cases by this examiner. Grant probability derived from career allow rate.
