Prosecution Insights
Last updated: April 19, 2026
Application No. 18/713,163

Monolithic Image Sensor, a Camera Module, an Electronic Device and a Method for Operating a Camera Module

Status: Final Rejection (§102)
Filed: May 23, 2024
Examiner: GILES, NICHOLAS G
Art Unit: 2639
Tech Center: 2600 — Communications
Assignee: Telefonaktiebolaget LM Ericsson (Publ)
OA Round: 2 (Final)
Grant Probability: 82% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 6m
With Interview: 98%

Examiner Intelligence

Career Allow Rate: 82%, above average (683 granted / 834 resolved; +19.9% vs TC avg)
Interview Lift: +16.5% for resolved cases with interview (strong)
Typical Timeline: 2y 6m avg prosecution; 25 currently pending
Career History: 859 total applications across all art units
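The headline figures in this panel follow from simple arithmetic on the career totals. A minimal sketch, assuming the interview lift is applied as additive percentage points (the dashboard's actual model is not documented here):

```python
# Sketch of the arithmetic behind the examiner panel above.
# The additive interview-lift model is an assumption, not the
# dashboard's documented methodology.

granted, resolved = 683, 834                 # career totals from the panel

allow_rate_pct = round(granted / resolved * 100, 1)   # 81.9, shown as 82%
interview_lift = 16.5                        # percentage points, per the panel

with_interview = round(allow_rate_pct + interview_lift)   # 98, if additive

print(f"Career allow rate: {allow_rate_pct}%")
print(f"With interview (assumed additive): {with_interview}%")
```

This reproduces the displayed 82% career allow rate and the 98% with-interview figure from the raw counts.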

Statute-Specific Performance

§101: 4.0% (-36.0% vs TC avg)
§103: 39.2% (-0.8% vs TC avg)
§102: 24.4% (-15.6% vs TC avg)
§112: 23.7% (-16.3% vs TC avg)
Deltas are measured against a Tech Center average estimate • Based on career data from 834 resolved cases
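Since each "vs TC avg" delta is the statute rate minus the Tech Center average estimate, the implied baseline can be recovered directly from the panel. A minimal sketch using only the values shown above:

```python
# Each "vs TC avg" delta above is the statute rate minus the Tech Center
# average estimate, so the implied TC average is rate - delta.
# Values are read straight off the panel; nothing else is assumed.

panel = {
    "§101": (4.0,  -36.0),
    "§103": (39.2, -0.8),
    "§102": (24.4, -15.6),
    "§112": (23.7, -16.3),
}

for statute, (rate, delta) in panel.items():
    tc_avg = round(rate - delta, 1)   # implied Tech Center average estimate
    print(f"{statute}: rate {rate}% -> implied TC avg {tc_avg}%")
```

Run on these values, every statute implies the same 40.0% baseline, consistent with the deltas being measured against a single Tech Center average estimate.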

Office Action

§102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed 12/29/2025 have been fully considered but they are not persuasive. Regarding claim 1, applicant argues that Wong does not teach: “the change detector area is a distinct part of the image sensor which is separate from the pixel area.” and: “Since independent claim 31 specifies that the "pixel area" includes the "hybrid second image sensor pixels," the claimed "change detector area" must be "separate from" the "hybrid second image sensor pixels." ”, further stating that: “However, Wong [0067] and Fig. 6A describe and show "address event detection unit 400" as being included in or part of the "shared event detection and image sensing pixel 501," which is essentially the opposite of being a "distinct part" of the image sensor that is "separate from" the "hybrid second image sensor pixels." ” The examiner disagrees and points out that, as stated in the rejection, the pixels 310 with light-receiving area 330 including a photoelectric conversion element 333, such as a photodiode, were considered to be the pixel area. The examiner further notes that the address event detection unit (or readout circuit) 400 cited as the “change detector area” is a distinct area that is different from the light-receiving area 330, as the address event detection unit (or readout circuit) 400 and light-receiving area 330 are neither touching nor overlapping, as can be seen in Fig. 6A.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. Claim(s) 32-51 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Wong et al. (U.S. Pub. No. 20210185264). Regarding claim 32, Wong discloses: A monolithic image sensor (substrate on which the image sensor 200 is arranged, par. 46) comprising: a pixel area sensitive to electromagnetic irradiation (pixels 310 with light-receiving area 330 are arranged in the pixel array 300 and pixels 310 include a photoelectric conversion element 333, such as a photodiode, par. 54, 55, 66-72) and comprising a first pixel area comprising an array of synchronous first image sensor pixels (half of image sensing pixels 502 arrayed in Bayer array groups of image sensor 200 having an array of unit pixels 310 that includes a plurality of image sensing pixels 502 arrayed in Bayer array groups formed on the same light receiving chip or substrate 201, and event detection readout circuit 400 can trigger operation of the image signal generation readout circuit 320 based on charge generated by a photoelectric conversion element (or photoelectric conversion region) 333 and based on operation of the logic circuit 210, and in response to the event trigger, image frame capture begins, where the image frame capture can be a full frame image capture that involves all of the image sensing pixels 502 included in the pixel array 300, par. 
65, 67, 68, 71, 77, 110) and further comprising a second pixel area comprising hybrid second image sensor pixels (shared event detection and image sensing pixels 501, where event detection operates asynchronously, where different shared event detection and image sensing pixels 501 can respond to light within different wavelength ranges using color filters that can be configured as a Bayer array, and where shared event detection and image sensing pixels 501 can be selectively operated in event detection or image sensing modes, and where the second pixel area further includes the other half of image sensing pixels 502 arrayed in Bayer array groups, par. 42, 68, 71, 77, 78, 99-107, 110, 134, and Fig. 10A), a change detector area comprising multiple asynchronous change detectors (address event detection unit (or readout circuit) 400 for each unit pixel 310, where after TG2 is raised to a high level at timing T0 an address event detection unit 400 detects when address event ignition eventually occurs and raises a detection signal at T1, par. 71, 78, 99-107, 134, and Figs. 6A and 10A), and a synchronous intensity read-out circuitry (pixel imaging signal generation unit (or readout circuit) 320, where selection transistor 323 of pixel imaging signal generation unit (or readout circuit) 320 switches a connection between the amplification transistor 322 and the vertical signal line VSL in accordance with a selection signal SEL transmitted from the drive circuit 211, where the analog pixel signal that appears in the vertical signal line VSL is read out by the column ADC 220 and is converted into a digital pixel signal, par. 71-77 and Fig. 
6A), wherein a first photoreceptor of a respective first image sensor pixel is electrically coupled to the synchronous intensity read-out circuitry (first transmission transistor 331a transmits a charge generated in the photoelectric conversion element 333a to the floating diffusion layer 324 of the image signal generation readout circuit 320 in accordance with the first control signal TG1a, par. 76, 84, and Figs. 6A and 6B), and wherein a second photoreceptor of a respective hybrid second image sensor pixel is electrically coupled to the synchronous intensity read-out circuitry with a first connection (first transmission transistor 331 transmits a charge generated in the photoelectric conversion element 333 to the floating diffusion layer 324 of the image signal generation readout circuit 320 in accordance with the first control signal TG1, par. 71, 76, and Fig. 6A) and electrically coupled to a respective asynchronous change detector out of the multiple asynchronous change detectors with a second connection (a photocurrent generated in the photoelectric conversion element 333 of the light-receiving unit 330 is supplied to the address event detection unit 400 of each unit pixel 310 through the second transmission transistor 332, par. 71, 78, and Fig. 6A), wherein the change detector area is a distinct part of the image sensor which is separate from the pixel area (address event detection unit (or readout circuit) 400 for each unit pixel 310 is separate from and next to light-receiving unit 330, par. 71 and Fig. 6A). Regarding claim 33, Wong further discloses: change detector area is arranged outside an imaging area of the image sensor (address event detection unit (or readout circuit) 400 for each unit pixel 310 is separate from and next to light-receiving unit 330, par. 71 and Fig. 6A). 
Regarding claim 34, Wong further discloses: change detector area is arranged to at least partly surround the pixel area (address event detection unit (or readout circuit) 400 for each unit pixel 310 is separate from and next to light-receiving unit 330, par. 71 and Fig. 6A). Regarding claim 35, Wong further discloses: first pixel density of the first pixel area equals a second pixel density of the second pixel area (groups of shared event detection and image sensing pixels 501, and groups of image sensing pixels 502, formed on the same light receiving chip or substrate 201, where the individual groups can be configured as Bayer arrays that alternate between Bayer array groups of shared event detection and image sensing pixels 501, and Bayer array groups of image sensing pixels 502, and where it can be seen in Fig. 5D that the density of the shared event detection and image sensing pixels 501 and image sensing pixels 502 are the same, par. 68 and Fig. 5D). Regarding claim 36, Wong further discloses: second pixel area further comprises third image sensor pixels, wherein a third photoreceptor of a respective third image sensor pixel is electrically coupled to the synchronous intensity read-out circuitry but not electrically coupled to the asynchronous change detectors (other half of image sensing pixels 502 arrayed in Bayer array groups, which are not shared event detection and image sensing pixels 501 and not coupled to address event detection unit 400 and are operated the same way as the half of image sensing pixels 502 previously cited, par. 68, 71, 77, 78, 99-107, 110, 134, and Fig. 10A). 
Regarding claim 37, Wong further discloses: respective second image sensor pixel comprises a green color filter (second pixel group includes a first plurality of pixels that generate signals based on an intensity of red light, a second plurality of pixels that generate signals based on an intensity of green light, and a third plurality of pixels that generate signals based on an intensity of blue light, par. 43, 62, 64, 168 and Figs. 4 and 5D). Regarding claim 38, Wong further discloses: respective second image sensor pixel comprises two or more different color filters (second pixel group includes a first plurality of pixels that generate signals based on an intensity of red light, a second plurality of pixels that generate signals based on an intensity of green light, and a third plurality of pixels that generate signals based on an intensity of blue light, par. 43, 62, 64, 168 and Figs. 4 and 5D). Regarding claim 39, Wong further discloses: image sensor is configured to operate the second image sensor pixels either in an asynchronous mode, in which the respective asynchronous change detector asynchronously outputs a signal if a significant change in illumination intensity of the corresponding photoreceptor is detected (the shared event detection and image sensing pixels 501 can be selectively operated in event detection or image sensing modes, where a photocurrent generated in the photoelectric conversion element 333 of the light-receiving unit 330 is supplied to the address event detection unit 400 of each unit pixel 310 through the second transmission transistor 332, par. 68, 71, 78, and Fig. 
6A), or in a synchronous mode, in which the synchronous intensity read-out circuitry synchronously outputs a respective pixel value corresponding to a respective illumination intensity of the corresponding photoreceptor (the shared event detection and image sensing pixels 501 can be selectively operated in event detection or image sensing modes, where first transmission transistor 331 transmits a charge generated in the photoelectric conversion element 333 to the floating diffusion layer 324 of the image signal generation readout circuit 320 in accordance with the first control signal TG1, par. 68, 71, 76, and Fig. 6A). Regarding claim 40, Wong discloses: A camera module (imaging device 200 implemented as connected light receiving 201 and logic 202 chips can include image sensor 200 components disposed as part of the light receiving chip 201, with some or all of the processor system 130 components disposed as part of the logic chip 202, par. 53) comprising the monolithic image sensor of claim 32 (see rejection of claim 32) and further comprising a Digital Processing Unit (DPU) (processor system 130 is, for example, constituted by a central processing unit (CPU), and processor system 130 can execute application programming or routines, stored as software or firmware in memory or data storage included in or interconnected to the processor system 130 to perform various functions and methods as described herein, par. 48) configured to: determine a setting of the image sensor based on output from the asynchronous change detectors comprised in the image sensor (determination of an object velocity based on the event detection, par. 119-120), and control the image sensor by implementing the setting (change the frame rate of the image sensing pixels based on the determined object velocity, par. 119-120). 
Regarding claim 41, Wong further discloses: DPU and the monolithic image sensor are monolithically integrated (imaging device 200 implemented as connected light receiving 201 and logic 202 chips can include image sensor 200 components disposed as part of the light receiving chip 201, with some or all of the processor system 130 components disposed as part of the logic chip 202, par. 53). Regarding claim 42, Wong further discloses: DPU is configured to: determine a characteristic of an object captured by the image sensor based on the output from the asynchronous change detectors, and then determine the setting of the image sensor based on the characteristic (velocity is determined and frame rate is set based on velocity, par. 119-120). Regarding claim 43, Wong further discloses: DPU is configured to: operate the camera module in an asynchronous operating mode in which the camera module reads output from the change detectors (after TG2 is raised to a high level at timing T0 an address event detection unit 400 detects when address event ignition eventually occurs for its respective shared event detection and image sensing pixel 501 and raises a detection signal at T1, par. 71, 78, 99-107, 134, and Figs. 
6A and 10A), and control the image sensor by implementing the setting by being configured to change operating mode from the asynchronous operating mode to a synchronous operating mode, in which the camera module reads output from the synchronous intensity read-out circuitry, based on the output from the change detectors (event detection readout circuit 400 can trigger operation of the image signal generation readout circuit 320 based on charge generated by a photoelectric conversion element (or photoelectric conversion region) 333 and based on operation of the logic circuit 210, and in response to the event trigger, image frame capture begins, where the image frame capture can be a full frame image capture that involves all of the image sensing pixels 502 included in the pixel array 300, where the velocity is determined and frame rate is set based on velocity, par. 65, 67, 68, 71, 77, 110, 119, and 120). Regarding claim 44, Wong further discloses: DPU is configured to: capture, in the synchronous operating mode, a synchronous image frame based on the output from the synchronous intensity read-out circuitry (in response to the event trigger, image frame capture begins, where the image frame capture can be a full frame image capture that involves all of the image sensing pixels 502 included in the pixel array 300 at a frame rate, par. 65, 67, 68, 71, 77, 110), transmit the image to a host device and/or discard the image (images are transmitted from the imaging unit 12031 to display unit 12062 where the pedestrian is displayed, where the imaging unit 12031 can include an image sensor 200 incorporating a pixel array unit 300, par. 139, 152 and Fig. 
15), and change operating mode from the synchronous operating mode to the asynchronous operating mode in response to transmitting or discarding the image (collection of image data from a region 1208 containing a stationary object can be discontinued after the acquisition of a single frame or image data and event detection pixels of the image sensor 200 are operated to detect the existence or nonexistence of address event ignition, par. 112, 119, 122-124). Regarding claim 45, Wong further discloses: A method for operating a camera module comprising the image sensor of claim 32 (see rejection of claim 32) and a Digital Processing Unit (DPU) (processor system 130 is, for example, constituted by a central processing unit (CPU), and processor system 130 can execute application programming or routines, stored as software or firmware in memory or data storage included in or interconnected to the processor system 130 to perform various functions and methods as described herein, par. 48), wherein the method comprises: determining, by the DPU, a setting of the image sensor based on output from the asynchronous change detectors of the image sensor (determination of an object velocity based on the event detection, par. 119-120), and controlling the image sensor by implementing the setting (change the frame rate of the image sensing pixels based on the determined object velocity, par. 119-120). Regarding claim 46, Wong further discloses: determining the setting comprises determining one or more of: a power setting, an exposure setting, a white balance setting, a focus setting, a resolution setting, an image size setting, and a frame rate (change the frame rate of the image sensing pixels based on the determined object velocity, par. 119-120). 
Regarding claim 47, Wong further discloses: determining the setting comprises determining a characteristic of an object captured by the image sensor based on the output from the asynchronous change detectors and determining a setting of the image sensor based on the determined characteristic (determination of an object velocity based on the event detection, par. 119-120). Regarding claim 48, Wong further discloses: operating the camera module in an asynchronous operating mode in which the camera module reads output from the change detectors (using shared event detection and image sensing pixels 501, after TG2 is raised to a high level at timing T0 an address event detection unit 400 detects when address event ignition eventually occurs and raises a detection signal at T1, par. 71, 78, 99-107, 134, and Figs. 6A and 10A), wherein controlling the image sensor by implementing the setting comprises changing operating mode from the asynchronous operating mode to a synchronous operating mode in which the camera module reads output from the synchronous intensity read-out circuitry (in response to the event trigger, image frame capture begins, where the image frame capture can be a full frame image capture that involves all of the image sensing pixels 502 included in the pixel array 300 at a frame rate, par. 65, 67, 68, 71, 77, 110), and wherein changing operating mode is based on the output from the change detectors (change the frame rate of the image sensing pixels based on the determined object velocity, par. 119-120). Regarding claim 49, Wong further discloses: capturing, in the synchronous operating mode, a synchronous image frame based on the output from the synchronous intensity read-out circuitry (in response to the event trigger, image frame capture begins, where the image frame capture can be a full frame image capture that involves all of the image sensing pixels 502 included in the pixel array 300 at a frame rate, par. 
65, 67, 68, 71, 77, 110), transmitting the synchronous image frame to a host device and/or discarding the synchronous image frame (images are transmitted from the imaging unit 12031 to display unit 12062 where the pedestrian is displayed, where the imaging unit 12031 can include an image sensor 200 incorporating a pixel array unit 300, par. 139, 152 and Fig. 15), and changing operating mode from the synchronous operating mode to the asynchronous operating mode in response to transmitting or discarding the synchronous image frame (collection of image data from a region 1208 containing a stationary object can be discontinued after the acquisition of a single frame or image data and event detection pixels of the image sensor 200 are operated to detect the existence or nonexistence of address event ignition, par. 112, 119, 122-124). Regarding claim 50, Wong further discloses: output from the asynchronous change detectors comprises a first output associated with a first hybrid pixel followed by a second output from a neighbouring hybrid pixel (determining whether multiple events occur in adjacent or nearby event detection pixels, par. 42, 68, 71, 77, 78, 99-107, 110, 114, 134, and Fig. 10A), and wherein the method further comprises: capturing multiple synchronous image frames with the synchronous intensity read-out circuitry in response to detecting the output from the asynchronous change detectors (event detection readout circuit 400 can trigger operation of the image signal generation readout circuit 320 based on charge generated by a photoelectric conversion element (or photoelectric conversion region) 333 and based on operation of the logic circuit 210, and in response to the event trigger, image frame capture begins, where the image frame capture can be a full frame image capture that involves all of the image sensing pixels 502 included in the pixel array 300, where the velocity is determined and frame rate is set based on velocity, par. 
65, 67, 68, 71, 77, 110, 119, and 120). Regarding claim 51, Wong further discloses: analyzing the synchronous image frame (data from the activated area of the image sensor, corresponding to the region of interest, can be analyzed, for example by a neural network or other decision making facility, to perform object recognition, object classification, gesture recognition, or the like, and image information can continue to be acquired for as long as a detected object remains within the field of view 114 of the imaging system 100, par. 10, 116, 122, and 129), determining, based on analyzing the synchronous image frame, to change how the setting of the image sensor is determined by the output from the asynchronous change detectors (acquisition of image information related to an object can be discontinued after the object is determined to have left the field of view of the imaging device 100, and after a determination that image sensing operations can be discontinued determination is made to not discontinue operation of the image sensor system 100 and the process can return to step 1108 where shared event detection and image sensing 501 or address event detection 503 pixels of the image sensor 200 are operated to detect the existence or nonexistence of address event ignition, par. 112, 122-124).

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NICHOLAS G GILES whose telephone number is (571)272-2824. The examiner can normally be reached M-F 6:45AM-3:15PM EST (HOTELING).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Twyler Haskins, can be reached at 571-272-7406. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /NICHOLAS G GILES/Primary Examiner, Art Unit 2639
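The reply-deadline rule quoted in the action's conclusion can be sketched as date arithmetic. The helper function and all dates below are illustrative assumptions for demonstration, not values from this application:

```python
# Sketch of the final-rejection reply-period rule quoted above (extensions
# under 37 CFR 1.136(a) after a first reply within two months of a final
# action). All dates are hypothetical; this is not legal advice or the
# USPTO's computation code.
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping the day where the month is shorter."""
    m = d.month - 1 + months
    y, m = d.year + m // 12, m % 12 + 1
    leap = y % 4 == 0 and (y % 100 != 0 or y % 400 == 0)
    days_in_month = [31, 29 if leap else 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
    return date(y, m, min(d.day, days_in_month[m - 1]))

def extension_period_start(mailed: date, reply_filed: date, advisory_mailed: date) -> date:
    """Per the quoted rule: if the first reply is filed within two months of
    the final action and the advisory action mails after the three-month
    shortened statutory period, extension fees run from the advisory mailing
    date; otherwise from the end of the shortened period. The absolute cap
    is six months from the final action (not modeled here)."""
    ssp_end = add_months(mailed, 3)
    if reply_filed <= add_months(mailed, 2) and advisory_mailed > ssp_end:
        return advisory_mailed
    return ssp_end

# Hypothetical dates:
mailed = date(2026, 1, 29)       # final rejection mailed
reply = date(2026, 3, 20)        # first reply, within two months
advisory = date(2026, 5, 15)     # advisory action, after the 3-month date

print(extension_period_start(mailed, reply, advisory))  # 2026-05-15
print(add_months(mailed, 6))                            # statutory cap: 2026-07-29
```

Here the reply beat the two-month date (Mar 29) and the advisory action mailed after the three-month date (Apr 29), so extensions would run from the advisory mailing date, capped at six months from the final action.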

Prosecution Timeline

May 23, 2024: Application Filed
Sep 25, 2025: Non-Final Rejection — §102
Dec 29, 2025: Response Filed
Jan 29, 2026: Final Rejection — §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12604111
IMAGE SENSING DEVICE FOR OBTAINING HIGH DYNAMIC RANGE IMAGE AND IMAGING DEVICE INCLUDING THE SAME
2y 5m to grant; granted Apr 14, 2026
Patent 12598402
Partial Pixel Oversampling for High Dynamic Range Imaging
2y 5m to grant; granted Apr 07, 2026
Patent 12581213
SOLID-STATE IMAGING DEVICE AND METHOD OF CONTROLLING SOLID-STATE IMAGING DEVICE FOR SUPPRESSING DETERIORATION IN IMAGE QUALITY
2y 5m to grant; granted Mar 17, 2026
Patent 12581221
COMPARATOR AND IMAGE SENSOR INCLUDING THE SAME
2y 5m to grant; granted Mar 17, 2026
Patent 12581580
APPARATUSES AND METHODOLOGIES FOR FLICKER CONTROL
2y 5m to grant; granted Mar 17, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 82%
With Interview: 98% (+16.5%)
Median Time to Grant: 2y 6m
PTA Risk: Moderate
Based on 834 resolved cases by this examiner. Grant probability derived from career allow rate.
