Prosecution Insights
Last updated: April 19, 2026
Application No. 18/855,384

SIGNAL PROCESSING CIRCUIT, SIGNAL PROCESSING METHOD, AND PROGRAM

Non-Final OA: §101, §103
Filed: Oct 09, 2024
Examiner: GILES, NICHOLAS G
Art Unit: 2639
Tech Center: 2600 — Communications
Assignee: Sony Interactive Entertainment Inc.
OA Round: 1 (Non-Final)
Grant Probability: 82% (Favorable)
OA Rounds: 1-2
To Grant: 2y 6m
With Interview: 98%

Examiner Intelligence

Career Allow Rate: 82% (above average; +19.9% vs TC avg)
683 granted / 834 resolved
Interview Lift: +16.5% (strong; resolved cases with interview vs. without)
Typical Timeline
Avg Prosecution: 2y 6m
Currently Pending: 25
Career History
Total Applications: 859 (across all art units)
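The career-history figures above are internally consistent; here is a quick arithmetic check (a sketch only — the variable names are mine, not from the dashboard):

```python
# Consistency check on the career-history figures
# (assumption: the 859 total equals resolved cases plus currently pending).
granted = 683    # from "683 granted / 834 resolved"
resolved = 834
pending = 25     # "currently pending"
total = 859      # "Total Applications"

assert resolved + pending == total  # 834 + 25 == 859
allow_rate = granted / resolved
print(f"{allow_rate:.0%}")  # 82%
```

The 82% career allow rate shown in the header is simply granted over resolved, rounded to a whole percent.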

Statute-Specific Performance

§101: 4.0% (-36.0% vs TC avg)
§103: 39.2% (-0.8% vs TC avg)
§102: 24.4% (-15.6% vs TC avg)
§112: 23.7% (-16.3% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 834 resolved cases
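Assuming each "vs TC avg" figure is simply the examiner's rate minus the Tech Center average, the implied baseline can be recovered from the numbers above (a sketch; the dictionaries are my own restatement of the displayed table):

```python
# Recover the implied Tech Center average for each statute,
# assuming: delta = examiner rate - TC average, all in percent.
examiner_rates = {"101": 4.0, "103": 39.2, "102": 24.4, "112": 23.7}
deltas = {"101": -36.0, "103": -0.8, "102": -15.6, "112": -16.3}

tc_averages = {s: round(examiner_rates[s] - deltas[s], 1) for s in examiner_rates}
print(tc_averages)  # {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}
```

Every statute backs out to the same 40.0% baseline, consistent with the legend's single "Tech Center average estimate" line.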

Office Action

§101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Specification

The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

The broadest reasonable interpretation of a claim drawn to a computer readable medium (also called machine readable medium and other such variations) typically covers forms of non-transitory tangible media and transitory propagating signals per se in view of the ordinary and customary meaning of computer readable media, particularly when the specification is silent. See MPEP 2111.01. When the broadest reasonable interpretation of a claim covers a signal per se, the claim must be rejected under 35 U.S.C. § 101 as covering non-statutory subject matter. See In re Nuijten, 500 F.3d 1346, 1356-57 (Fed. Cir. 2007) (transitory embodiments are not directed to statutory subject matter) and Interim Examination Instructions for Evaluating Subject Matter Eligibility Under 35 U.S.C. § 101, Aug. 24, 2009, p. 2.

A claim drawn to such a computer readable medium that covers both transitory and non-transitory embodiments may be amended to narrow the claim to cover only statutory embodiments, and thereby avoid a rejection under 35 U.S.C. § 101, by adding the limitation "non-transitory" to the claim. Such an amendment would typically not raise the issue of new matter, even when the specification is silent, because the broadest reasonable interpretation relies on the ordinary and customary meaning that includes signals per se. The limited situations in which such an amendment could raise issues of new matter occur, for example, when the specification does not support a non-transitory embodiment because a signal per se is the only viable embodiment, such that the amended claim is impermissibly broadened beyond the supporting disclosure. See, e.g., Gentry Gallery, Inc. v. Berkline Corp., 134 F.3d 1473 (Fed. Cir. 1998). For additional information, please see the Official Gazette notice published February 23, 2010 (1351 OG 212).

Claim 12 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. Claim 12 does not fall within at least one of the four categories of patent eligible subject matter because a program is not a statutory class and the program is not embodied on a non-transitory computer readable medium.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-12 are rejected under 35 U.S.C. 103 as being unpatentable over Watanabe et al. (U.S. Pub. No. 2018/0167575).

Regarding claim 1, Watanabe discloses: A signal processing circuit that processes an event signal generated in an event-based vision sensor, the signal processing circuit comprising: a circuit for executing operation (control circuit 10 operating the device, par. 52-55), wherein the operation includes detecting a relation among at least either positions in a block obtained by dividing a detection region of the event-based vision sensor regarding the event signals generated in the block or clock times at which the event signals have been generated (the entire region of the pixel array section 40 is divided into a plurality of blocks segmented according to the predetermined number of rows and the predetermined number of columns, where a reading region of normal pixels 51 from which the pixel signals are read corresponding to motion detection pixels 52 having output event pixel signals may be determined (relation) in consideration of a region of division regions as the entire region of the above-described pixel array section 40, and where a pixel 50 (a pixel 50 configured to generate a charge signal for motion detection, i.e., to output a pixel signal of motion detection, is referred to as a "motion detection pixel 52") configured to generate a charge signal for motion detection asynchronously (clock times, and not synchronous driving of pixels) outputs a pulse signal indicating a change over time in the charge signal and a change direction and adds address information indicating a position of the pixel 50 itself to the pulse signal and outputs the pulse signal with the address information as a pixel signal, par. 50-53 and 150).

Watanabe is silent with regard to a memory for storing a program code; and a processor for executing operation in accordance with the program code. Official Notice is taken that it was well known before the effective filing date of the claimed invention to use a memory storing program code that a processor executes to operate a camera. This is advantageous in that the camera can be reprogrammed to operate in a desired manner. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include a memory for storing a program code; and a processor for executing operation in accordance with the program code.

Regarding claim 2, Watanabe further discloses: the relation includes a figure formed by a set of the positions of the event signals (object, where pixel signals of the normal photographing are output (read) from only the normal pixels 51 arranged in a predetermined small reading region around a position at which the motion detection pixel 52 having detected the motion of the object is arranged, par. 104-106).

Regarding claim 3, Watanabe further discloses: the figure includes at least one line segment (event pixel signals are output in the same period from motion detection pixels 52a and 52b forming a line, where the event signals are from a moving object, par. 104-106, 108, 109, 115-117, and Figs. 6 and 7).

Regarding claim 4, Watanabe further discloses: the figure includes a curve (motion detection pixel distribution D1 shows a range in which the motion detection pixels 52 having output the event pixel signals in the same period among the plurality of motion detection pixels 52 arranged in the entire region of the pixel array section 40 are distributed, and where in Fig. 7 motion detection pixel distribution D1 is seen as a curved region, par. 115-116 and Figs. 6 and 7).

Regarding claim 5, Watanabe further discloses: the relation includes a variance on a time series regarding the clock times at which the event signals have been generated (generate a charge signal for motion detection asynchronously, par. 50-53 and 150).

Regarding claim 6, Watanabe further discloses: the operation further includes setting a size of the block different between a first part and a second part of the detection region (the size of the predetermined reading region in which the pixel signal is output (read) from the normal pixel 51 is not limited to the above-described 5 rows and 5 columns, and various sizes (numbers of rows/columns) are conceivable in accordance with an arrangement of the normal pixels 51 and the motion detection pixels 52 within the pixel array section 40 or the like, where each reading region is not limited to a reading region in which adjacent reading regions overlap, i.e., a reading region including the same normal pixel 51, and the reading address control circuit 100 determines a reading region in which pixel signals are read so that pixel signals are not redundantly read from the normal pixels 51 overlapping each other in the reading region, par. 109-112).

Regarding claim 7, Watanabe further discloses: dynamically changing a size of the block according to a detection result of the relation (the size of the predetermined reading region in which the pixel signal is output (read) from the normal pixel 51 is not limited to the above-described 5 rows and 5 columns, and various sizes (numbers of rows/columns) are conceivable in accordance with an arrangement of the normal pixels 51 and the motion detection pixels 52 within the pixel array section 40 or the like, where each reading region is not limited to a reading region in which adjacent reading regions overlap, i.e., a reading region including the same normal pixel 51, and the reading address control circuit 100 determines a reading region in which pixel signals are read so that pixel signals are not redundantly read from the normal pixels 51 overlapping each other in the reading region, par. 109-112).

Regarding claim 8, Watanabe further discloses: the blocks have a lattice shape (5 rows and 5 columns around the position of the motion detection pixel 52a, par. 91).

Regarding claim 9, Watanabe further discloses: the blocks include a first block and a second block adjacent to each other, and the first block and the second block partly overlap (the normal pixels 51 arranged in the reading region overlap each other, and the reading address control circuit 100 determines a reading region in which pixel signals are read so that pixel signals are not redundantly read from the normal pixels 51 overlapping each other in the reading region, par. 109-112).

Regarding claim 10, Watanabe further discloses: the blocks include a first block group and a second block group that overlaps with the first block group and has a boundary different from that of the first block group, and at least part of the detection region is covered by the first block group and the second block group in an overlapping manner (event pixel signals are output in the same period from motion detection pixels 52a and 52b, where 5 rows and 5 columns around the position of the motion detection pixel 52a and 5 rows and 5 columns around the position of the motion detection pixel 52b are first and second block groups with different edges and an overlapping region, and the reading address control circuit 100 determines a reading region in which pixel signals are read so that pixel signals are not redundantly read from the normal pixels 51 overlapping each other in the reading region, par. 91, 104-106, 108-112, 115-117, and Figs. 6 and 7).

Regarding claim 11, see the rejection of claim 1 and note that the limitations of claim 11 were shown. Regarding claim 12, see the rejection of claim 1 and note that the limitations of claim 12 were shown.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NICHOLAS G GILES, whose telephone number is (571) 272-2824. The examiner can normally be reached M-F 6:45AM-3:15PM EST (HOTELING). Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Twyler Haskins, can be reached at 571-272-7406. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/NICHOLAS G GILES/
Primary Examiner, Art Unit 2639

Prosecution Timeline

Oct 09, 2024
Application Filed
Jan 17, 2026
Non-Final Rejection — §101, §103
Apr 14, 2026
Interview Requested

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12604111
IMAGE SENSING DEVICE FOR OBTAINING HIGH DYNAMIC RANGE IMAGE AND IMAGING DEVICE INCLUDING THE SAME
Granted Apr 14, 2026 • 2y 5m to grant

Patent 12598402
Partial Pixel Oversampling for High Dynamic Range Imaging
Granted Apr 07, 2026 • 2y 5m to grant

Patent 12581213
SOLID-STATE IMAGING DEVICE AND METHOD OF CONTROLLING SOLID-STATE IMAGING DEVICE FOR SUPPRESSING DETERIORATION IN IMAGE QUALITY
Granted Mar 17, 2026 • 2y 5m to grant

Patent 12581221
COMPARATOR AND IMAGE SENSOR INCLUDING THE SAME
Granted Mar 17, 2026 • 2y 5m to grant

Patent 12581580
APPARATUSES AND METHODOLOGIES FOR FLICKER CONTROL
Granted Mar 17, 2026 • 2y 5m to grant
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 82%
With Interview: 98% (+16.5%)
Median Time to Grant: 2y 6m
PTA Risk: Low
Based on 834 resolved cases by this examiner. Grant probability derived from career allow rate.
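The projection figures follow directly from the examiner's career record; this sketch shows one plausible derivation (assumption: the interview lift is added to the unrounded base rate before rounding to whole percents):

```python
# Derive the headline grant-probability figures from career data
# (683 granted of 834 resolved; +16.5 percentage-point interview lift).
granted, resolved = 683, 834
interview_lift = 16.5  # percentage points

base = 100 * granted / resolved          # ~81.9 -> displayed as 82%
with_interview = base + interview_lift   # ~98.4 -> displayed as 98%

print(round(base), round(with_interview))  # 82 98
```

Both displayed values (82% and 98%) match this calculation to the nearest whole percent.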
