Prosecution Insights
Last updated: April 19, 2026
Application No. 18/847,851

SENSOR DEVICE AND METHOD FOR OPERATING A SENSOR DEVICE

Non-Final OA (§102, §112)
Filed: Sep 17, 2024
Examiner: CUTLER, ALBERT H
Art Unit: 2637
Tech Center: 2600 — Communications
Assignee: Sony Semiconductor Solutions Corporation
OA Round: 1 (Non-Final)
Grant Probability: 79% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 8m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 79% (811 granted / 1024 resolved), +17.2% vs TC avg (above average)
Interview Lift: +21.3% among resolved cases with interview (strong)
Avg Prosecution: 2y 8m (typical timeline)
Currently Pending: 33
Career History: 1,057 total applications across all art units
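
For reference, the headline figures reduce to simple ratio arithmetic. A quick Python sketch, assuming the standard definitions; the dashboard does not publish the with/without-interview split, so the split counts below are hypothetical:

granted, resolved = 811, 1024
allow_rate = granted / resolved                  # 0.792 -> the 79% shown above

# Hypothetical split for illustration: suppose 100 of the 1024 resolved
# cases involved an examiner interview.
with_n, with_granted = 100, 99
without_n, without_granted = resolved - with_n, granted - with_granted

interview_lift = with_granted / with_n - without_granted / without_n
print(f"allow rate {allow_rate:.1%}, interview lift {interview_lift:+.1%}")
# -> allow rate 79.2%, interview lift +21.9%
# (the dashboard's actual, unpublished split yields the +21.3% shown)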

Statute-Specific Performance

§101: 3.3% (-36.7% vs TC avg)
§103: 45.9% (+5.9% vs TC avg)
§102: 29.0% (-11.0% vs TC avg)
§112: 16.1% (-23.9% vs TC avg)

Deltas are measured against a Tech Center average estimate. Based on career data from 1024 resolved cases.
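
Because each rate above is paired with its delta against the Tech Center average, the implied TC baseline can be backed out by subtraction. A quick sanity check on the displayed numbers, nothing more:

examiner_rate = {"101": 3.3, "103": 45.9, "102": 29.0, "112": 16.1}
delta_vs_tc = {"101": -36.7, "103": 5.9, "102": -11.0, "112": -23.9}

for statute, rate in examiner_rate.items():
    tc_avg = rate - delta_vs_tc[statute]
    print(f"§{statute}: examiner {rate:.1f}% vs implied TC avg {tc_avg:.1f}%")
# Every implied baseline comes out to 40.0%, suggesting the chart draws a
# single Tech Center average estimate rather than a per-statute one.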

Office Action

Grounds of rejection: §102, §112
DETAILED ACTION

This office action is responsive to application 18/847,851 filed on September 17, 2024. Claims 1-15 are pending in the application and have been examined by the Examiner.

Information Disclosure Statement

The Information Disclosure Statements (IDS) filed on 9/17/2024 and 9/18/2025 were received and have been considered by the Examiner.

Priority

Receipt is acknowledged of papers submitted under 35 U.S.C. 119(a)-(d), which papers have been placed of record in the file.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 12 and 13 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 12 recites events that have been detected “preferably after the end of the previous frame period”. It is unclear, based on this language, whether the events are required to be detected after the end of the previous frame period or not. As such, the Examiner is unable to determine the metes and bounds of claim 12. As recited in MPEP 2173.02, “During examination, after applying the broadest reasonable interpretation consistent with the specification to the claim, if the metes and bounds of the claimed invention are not clear, the claim is indefinite and should be rejected.” As such, claim 12 is deemed indefinite by the Examiner.

Claim 13 is indefinite as depending from claim 12 and not remedying the deficiencies of claim 12. Due to the indefinite nature of claims 12 and 13, the Examiner is unable to determine whether a prior art rejection of these claims is appropriate at this time.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-11, 14 and 15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Krishnappa et al. (US 2020/0137287).

Consider claim 1, Krishnappa et al. teaches: A sensor device (electronic device, 200, figure 2A) comprising: a plurality of pixels (i.e. a “pixel array” (paragraph 0052) of a sensor (230), figures 2A and 4) each configured to receive light and perform photoelectric conversion to generate an electrical signal (The sensor (230) is a monocular CMOS image sensor, paragraphs 0064, 0051, 0052, 0058 and 0059.); event detection circuitry (event circuitry, 410, event signal processing, 4102, figure 4, paragraphs 0084-0086) that is configured to generate event data (“event data”, paragraph 0086) by detecting as events intensity changes above a predetermined threshold of the light received by each of event detecting pixels (“The event data of the scene may include a change in pixel intensity in an image” paragraph 0064. See also “threshold value”, paragraphs 0075 and 0087.) that form a first subset of the pixels (e.g. a “row” of pixels, paragraphs 0081, 0069, 0087 and 0099); pixel signal generating circuitry (RGB circuitry, 420, figure 4, paragraphs 0084 and 0088) that is configured to generate for each of a series of frame periods (e.g. RGB frame periods, paragraphs 0078 and 0095, see figure 5) pixel signals constituting a frame image that indicates intensity values of the light received by each of intensity detecting pixels (i.e. pixel signals constituting an RGB frame, paragraphs 0078 and 0095) that form a second subset of the pixels (i.e. for “rows in the image”, paragraph 0088) during respective exposure periods (e.g. during the three exposure periods shown in figure 5, paragraphs 0095-0097); and a control unit (RGB sensor controller, 430, figure 4, paragraphs 0084 and 0088) that is configured to associate with each other event detecting pixels and intensity detecting pixels that have a corresponding field of view (The sensor (230) including the event detecting pixels and intensity detecting pixels is a monocular sensor, paragraph 0064. The event detecting pixels and intensity detecting pixels are associated with each other, as a change in brightness level for each row is determined by the event detecting pixels (paragraph 0081) and the exposure time of each row of the RGB pixels is then set based on the determined change in brightness, paragraphs 0087 and 0088.) and to dynamically change the exposure periods of the intensity detecting pixels based on the events detected by the associated event detecting pixels (“An exposure time of each row may be determined based on the change in brightness level.” See paragraphs 0087 and 0088. See also the “exposure setting” in figure 6A, paragraphs 0099-0101.).

Consider claim 2, and as applied to claim 1 above, Krishnappa et al. further teaches that the control unit is configured to deduce a brightness level from the events detected by the event detecting pixels; and a larger brightness level leads to a shorter exposure period, while a smaller amount of motion and/or a smaller brightness level leads to a longer exposure period (As shown in figure 6A, very bright regions have longer exposure times than low intensity regions, paragraphs 0099-0101.).

Consider claim 3, and as applied to claim 1 above, Krishnappa et al. further teaches that the control unit is configured to adjust the exposure period of each intensity detecting pixel separately (i.e. such that different rows of intensity detecting pixels have different exposure periods, see figure 6A, paragraphs 0099-0101).

Consider claim 4, and as applied to claim 1 above, Krishnappa et al. further teaches that the pixel signal generating circuitry generates during each frame period at least two sets of pixel signals with at least two differing exposure periods; and the control unit is configured to adjust the shorter exposure period, while the longer exposure period is fixed (For instance, as shown in figure 6B, a long high exposure period is fixed, whereas a short exposure period differs between low exposure and medium exposure, paragraphs 0102-0104.).

Consider claim 5, and as applied to claim 4 above, Krishnappa et al. further teaches that the control unit is configured to set different frame periods for each set of pixel signals and to adjust the frame periods concurrently with the exposure period (As shown in figure 6A, images of rows having different exposure settings are captured during different frame periods, paragraphs 0099-0101.).

Consider claim 6, and as applied to claim 4 above, Krishnappa et al. further teaches that the intensity detecting pixels are arranged in a two-dimensional array comprising a plurality of rows (“pixel array”, paragraph 0052); and the control unit is configured to read out pixel signals of the intensity detecting pixels in a row based manner such that for each row pixel signals of different exposure periods are generated simultaneously (see figure 6B, paragraphs 0102-0104); or the control unit is configured to read out pixel signals of the intensity detecting pixels in a row based manner such that for each row pixel signals of different exposure periods are read out consecutively (see figure 6A, paragraphs 0099-0101).

Consider claim 7, and as applied to claim 4 above, Krishnappa et al. further teaches that the control unit is configured to execute a neural network (“Here, the single processor or the plurality of processors may include general-purpose processors, such as a central processing unit (CPU), an application processor (AP), a digital signal processor (DSP), etc., graphic-exclusive processors, such as a graphics processing unit (GPU), a vision processing unit (VPU), etc., or AI-exclusive processors, such a neural processing unit (NPU), etc.” paragraph 0049) that receives for each frame period all sets of pixel signals (“RGB data”, paragraph 0066) and the event data (“event data”, paragraph 0066) generated during the frame period (see paragraph 0066) and outputs a frame image (“RGB frames”, paragraph 0095, figure 5).

Consider claim 8, and as applied to claim 1 above, Krishnappa et al. further teaches that the control unit is configured to set different exposure periods in different parts of a frame image (see figures 6A and 6B, paragraphs 0099-0104).

Consider claim 9, and as applied to claim 1 above, Krishnappa et al. further teaches that the control unit is configured to adjust the frame periods concurrently with the exposure periods (i.e. such that rows with the same exposure periods have the same frame periods, as shown in figure 6A and detailed in paragraphs 0099-0101).

Consider claim 10, and as applied to claim 1 above, Krishnappa et al. further teaches that the control unit is configured to evaluate the events detected during a current frame period and to adjust the exposure periods within the next frame period based on the result of the evaluation (Event data (602, 604, figure 6A) is detected in a pre-capture phase prior to adjusting the exposure periods within the next frame period, paragraphs 0099-0101, figure 5, paragraphs 0094-0096.).

Claim 11 is directed toward the operation of the control unit with respect to the number of events reaching the predetermined value. However, parent claim 10 recites “wherein the control unit is configured to evaluate the events detected during a current frame period and to adjust the exposure periods within the next frame period based on the result of the evaluation; or the control unit is configured to count events detected during a current exposure period and to end the exposure period, when the number of events reaches a predetermined value”. With regard to claim 10, Krishnappa et al. teaches that the control unit is configured to evaluate the events detected during a current frame period and to adjust the exposure periods within the next frame period based on the result of the evaluation (see claim 10 rationale). As such, Krishnappa et al. is not required to additionally teach that the control unit is configured to count events detected during a current exposure period and to end the exposure period, when the number of events reaches a predetermined value. Because claim 11 modifies a limitation not required in claim 10, Krishnappa et al. anticipates claim 11 for the reasons provided with respect to claim 10.

Consider claim 14, and as applied to claim 1 above, Krishnappa et al. further teaches that the first subset of pixels overlaps or is equal to the second subset of pixels (The sensor (230) is a monocular sensor, paragraph 0064. The fields of view of the event data and the image data overlap, as shown in figures 6A and 6B, and detailed in paragraphs 0099-0104.).

Consider claim 15, Krishnappa et al. teaches: A method for operating a sensor device (electronic device, 200, figure 2A), the method comprising: receiving light and performing photoelectric conversion with each of a plurality of pixels of the sensor device (i.e. a “pixel array” (paragraph 0052) of a sensor (230), figures 2A and 4) to generate an electrical signal (The sensor (230) is a monocular CMOS image sensor, paragraphs 0064, 0051, 0052, 0058 and 0059.); generating, with event detection circuitry of the sensor device (event circuitry, 410, event signal processing, 4102, figure 4, paragraphs 0084-0086), event data (“event data”, paragraph 0086) by detecting as events intensity changes above a predetermined threshold of the light received by each of event detecting pixels (“The event data of the scene may include a change in pixel intensity in an image” paragraph 0064. See also “threshold value”, paragraphs 0075 and 0087.) that form a first subset of the pixels (e.g. a “row” of pixels, paragraphs 0081, 0069, 0087 and 0099); generating, with pixel signal generating circuitry (RGB circuitry, 420, figure 4, paragraphs 0084 and 0088), for each of a series of frame periods (e.g. RGB frame periods, paragraphs 0078 and 0095, see figure 5) pixel signals constituting a frame image that indicates intensity values of the light received by each of intensity detecting pixels (i.e. pixel signals constituting an RGB frame, paragraphs 0078 and 0095) that form a second subset of the pixels (i.e. for “rows in the image”, paragraph 0088) during respective exposure periods (e.g. during the three exposure periods shown in figure 5, paragraphs 0095-0097); associating with each other event detecting pixels and intensity detecting pixels that have a corresponding field of view (The sensor (230) including the event detecting pixels and intensity detecting pixels is a monocular sensor, paragraph 0064. The event detecting pixels and intensity detecting pixels are associated with each other, as a change in brightness level for each row is determined by the event detecting pixels (paragraph 0081) and the exposure time of each row of the RGB pixels is then set based on the determined change in brightness, paragraphs 0087 and 0088.); and dynamically changing the exposure periods of the intensity detecting pixels based on the events detected by the associated event detecting pixels (“An exposure time of each row may be determined based on the change in brightness level.” See paragraphs 0087 and 0088. See also the “exposure setting” in figure 6A, paragraphs 0099-0101.).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Li et al. (US 2023/0039867) teaches a hybrid sensor (figure 3) having color pixels (112) and event sensing pixels (122), paragraph 0027. Kim et al. (US 2014/0009648) teaches multiple hybrid sensor configurations (see figures 2-11, 14-18 and 20-24). Niwa et al. (US 2020/0358977) teaches a hybrid sensor (figure 27) having normal pixels (312) and event sensing pixels (313), and driving normal pixels in response to detecting an event from event sensing pixels (see figure 10). Oshima (US 2022/0141402) teaches calculating an exposure time on the basis of integrated data from an event detection unit (see paragraph 0050).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALBERT H CUTLER whose telephone number is (571)270-1460. The examiner can normally be reached approximately Mon - Fri 8:00-4:30. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Sinh Tran, can be reached at (571)272-7564. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ALBERT H CUTLER/
Primary Examiner, Art Unit 2637
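
The control behavior the examiner maps onto Krishnappa (evaluate the events detected during the current frame period, then set per-row exposure periods for the next frame period, with claim 10's alternative event-count cutoff) is easier to see as code. A minimal Python sketch, assuming NumPy; all names, constants, and the count-to-exposure mapping are illustrative assumptions, taken neither from the application nor from the reference:

import numpy as np

MIN_EXPOSURE_US = 500      # shortest allowed row exposure (assumed)
MAX_EXPOSURE_US = 30_000   # longest allowed row exposure (assumed)

def next_row_exposures(event_counts: np.ndarray) -> np.ndarray:
    """Map per-row event counts from frame N to exposures for frame N+1.

    More events are read as a larger brightness level/change, so the row
    gets a shorter exposure; quiet rows get a longer one (cf. claim 2).
    """
    activity = event_counts / max(int(event_counts.max()), 1)  # 0.0 .. 1.0
    # Interpolate log-linearly between the longest and shortest exposure.
    log_exp = np.log(MAX_EXPOSURE_US) + activity * (
        np.log(MIN_EXPOSURE_US) - np.log(MAX_EXPOSURE_US))
    return np.exp(log_exp).round().astype(int)

def exposure_done(events_so_far: int, budget: int = 64) -> bool:
    """Claim 10's second alternative: end the current exposure period once
    the number of events detected during it reaches a predetermined value."""
    return events_so_far >= budget

print(next_row_exposures(np.array([0, 8, 120])))  # approx. [30000 22834 500]

The log-linear mapping is just one plausible choice; the claims only require that the exposure periods change dynamically based on the detected events, not any particular mapping.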

Prosecution Timeline

Sep 17, 2024
Application Filed
Feb 06, 2026
Non-Final Rejection — §102, §112 (current)

Precedent Cases

Applications with similar technology granted by the same examiner

Patent 12592997: PERIPHERAL BUS VIDEO COMMUNICATION USING INTERNET PROTOCOL
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12587765: IMAGING DEVICE AND ELECTRONIC APPARATUS COMPRISING IMAGING DEVICE
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12587763: COMPARISON CIRCUIT AND IMAGE SENSING DEVICE INCLUDING THE SAME
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12581765: ACTIVE PIXEL IMAGE SENSOR AND DISPLAY DEVICE
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12563286: METHOD AND MOBILE DEVICE FOR CAPTURING AN IMAGE OF A FOOT USING AUGMENTED REALITY
Granted Feb 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 79%
With Interview: 99% (+21.3%)
Median Time to Grant: 2y 8m
PTA Risk: Low

Based on 1024 resolved cases by this examiner. Grant probability derived from career allow rate.

Free tier: 3 strategy analyses per month