Prosecution Insights
Last updated: April 19, 2026
Application No. 18/557,466

IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD

Final Rejection — §101, §102
Filed: Oct 26, 2023
Examiner: YILMAKASSAYE, SURAFEL
Art Unit: 2639
Tech Center: 2600 — Communications
Assignee: Sony Semiconductor Solutions Corporation
OA Round: 2 (Final)

Grant Probability: 50% (Moderate)
OA Rounds: 3-4
To Grant: 2y 6m
With Interview: 84%

Examiner Intelligence

Grants 50% of resolved cases.

Career Allow Rate: 50% (17 granted / 34 resolved; -12.0% vs TC avg)
Interview Lift: +33.6% (grant rate of resolved cases with an interview vs. without; a strong lift)
Avg Prosecution: 2y 6m typical timeline (31 applications currently pending)
Total Applications: 65 (career history, across all art units)
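
As a sanity check, the headline figures above reduce to simple ratios. A minimal sketch (hypothetical variable names, not this tool's API); note that the +33.6% lift implies a without-interview grant rate of roughly 50.4%, consistent with the 50% career average:

```python
# Minimal sketch (hypothetical variable names, not this tool's API) showing
# how the headline examiner figures reduce to simple ratios.

granted, resolved = 17, 34
career_allow_rate = granted / resolved          # 17/34 = 0.50 -> "50% Career Allow Rate"

with_interview = 0.84                           # grant rate of resolved cases with an interview
interview_lift = 0.336                          # "+33.6% Interview Lift"
without_interview = with_interview - interview_lift  # ~0.504, consistent with the career average

print(f"allow rate {career_allow_rate:.1%}; "
      f"without interview ~{without_interview:.1%}; with interview {with_interview:.1%}")
```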

Statute-Specific Performance

§101: 2.4% (-37.6% vs TC avg)
§103: 58.7% (+18.7% vs TC avg)
§102: 34.3% (-5.7% vs TC avg)
§112: 4.5% (-35.5% vs TC avg)

Tech Center averages are estimates. Based on career data from 34 resolved cases.
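
A quick consistency check on these figures (hypothetical structure, not this tool's API): subtracting each "vs TC avg" delta from its per-statute figure recovers the same Tech Center average estimate, 40.0%, for all four statutes:

```python
# Consistency check: each per-statute figure minus its "vs TC avg" delta
# recovers the Tech Center average estimate the delta was measured against.

stats = {"§101": (2.4, -37.6), "§103": (58.7, +18.7),
         "§102": (34.3, -5.7), "§112": (4.5, -35.5)}

for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta  # 40.0 in every case
    print(f"{statute}: {rate:5.1f}% (TC avg estimate {tc_avg:.1f}%)")
```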

Office Action

Grounds: §101, §102
Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

Acknowledgements

2. Applicant's arguments, filed on 08/08/2025, are acknowledged. The amendment of claims 1 and 3-14 and the cancellation of claim 2 are acknowledged. Claims 1 and 3-14 remain pending and have been examined.

Response to Arguments

3. With regard to the claims rejected under 35 U.S.C. 101: Applicant's arguments, see pg. 13 of 17, filed 8/8/2025, with respect to the rejections of claims 9 and 14 under 35 U.S.C. 101, have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of Burns (US 2018/0098082 A1; see below for the rejection).

4. With regard to newly amended independent claims 1, 8-10, 13, and 14, previously rejected under 35 U.S.C. 102(a)(1): Applicant's arguments have been fully considered but are not persuasive.

Applicant argues (pg. 13-14 of 17): “…Burns does not expressly or inherently describe at least, for example, the features of 'generate a first interpolation frame based on the generated first frame data and the generated first event data; integrate the generated first event data for a specific period of time to generate first integrated event data; add the generated first interpolation frame to the first integrated event data; and generate a second interpolation frame based on the addition of the first interpolation frame to the first integrated event data,' as recited in amended independent claim 1…” Applicant further argues (pg. 16 of 17): “…Burns in its entirety does not describe that the system generates a second interpolated video frame by adding first interpolated video frame to the integrated pixel events…”

Burns, in [0020], teaches that events associated with pixel changes include pixel addresses that correlate each event to a pixel in the 2D image plane of the frame camera output, and that each event also includes a timestamp used for integration and interpolation between frames. [0028] teaches that an event integration circuit integrates a subset of the sequence of pixel events occurring within the frame sampling period between pairs of captured image frames, and that the integration generates a pixel motion vector. As such, when it is desired to insert a new interpolation frame between two existing frames, pixel events may be integrated to generate motion vectors used to predict the new frame (pixel events can be integrated over different portions of a frame period). [0034] further teaches that circuit 112 is configured to perform frame rate up-conversion on a sequence of image frames using motion-compensated interpolation based, at least in part, on estimated tile motion vectors. Burns further teaches circuit 502, which generates interpolated video frames corresponding to time periods between the captured video frames by applying the tile motion vectors to tiles of a captured video frame. As such, this may be viewed as generating a plurality of interpolated video frames that occur sequentially in time.

5. Similarly, the rejections of independent claims 8-10 and 13-14 and dependent claims 3 and 11-12 are maintained (see below for further details).
6. With regard to amended dependent claims 4-7, the claims are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Claim Rejections - 35 USC § 102

7. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

8. Claims 1, 3, and 8-14 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Burns et al. (US 2018/0098082 A1; hereinafter Burns).

9. Regarding claim 1, an image processing device (…Burns teaches a system for hybrid motion estimation in [0015], Fig. 1, and Fig. 8, system 800 [0044]…) comprising: a frame-based imaging element (…[0015] teaches a frame-based video camera 104, Fig. 1…) configured to generate first frame data in a first exposure time (…wherein [0019], in accordance with Fig. 2 (202, e.g.), teaches frame camera output as a time sequence of frames at a time interval…); an event-based imaging element (…[0015] teaches an event-based video camera 102, Fig. 1…) configured to generate first event data in the first exposure time (…wherein [0020], in accordance with Fig. 2 (204, e.g.), teaches event camera output at some frame rate of timing…); and a generation unit (…wherein [0034] teaches a motion-compensated interpolation circuit…) configured to: generate a first interpolation frame based on the generated first frame data and the generated first event data (…[0015] teaches a hybrid motion estimation circuit 106 which accepts event-based information and frame-based information and generates motion vectors from a hybrid combination of the obtained information; further, [0030] teaches tile mapping circuit 310, which spatially maps motion vectors to one or more tile frames; [0034] teaches that circuit 502 is configured to generate interpolated video frames corresponding to time periods between the captured video frames by applying the tile motion vectors to tiles of a captured video frame to predict a new video frame at the next up-converted time period…); integrate the generated first event data for a specific period of time to generate first integrated event data (…[0040]-[0041] teach method 700, wherein a sequence of pixel events and a sequence of image frames are received (in subsequent operations) and, at an operation 730, a subset of the sequence of pixel events occurring within a frame sampling period is integrated to generate a pixel motion vector representing the motion of that pixel between two image frames…); add the generated first interpolation frame to the first integrated event data (…wherein [0034] teaches that circuit 502 is configured to generate interpolated video frames corresponding to time periods between the captured video frames by applying the tile motion vectors to tiles of a captured video frame (wherein the motion vectors result from the integration process of operation 730)…); and generate a second interpolation frame based on the addition of the first interpolation frame to the first integrated event data (…[0034] teaches that circuit 502 is configured to generate interpolated video frames corresponding to time periods between the captured video frames by applying the tile motion vectors (resulting from the integration process of operation 730) to tiles of a captured video frame to predict a new video frame at the next up-converted time period; as such, the process takes place on a frame-by-frame basis wherein interpolated video frames are generated…).

10. Regarding claim 3, Burns teaches the image processing device according to claim 1 (see claim 1 above), wherein the generation unit is further configured to: generate a first plurality of pieces of integrated event data in the first exposure time (…wherein [0028] teaches that an integration circuit 306 integrates a subset (or all) of a sequence of pixel events which occur within a frame sampling period between pairs of captured image frames…), wherein the generated first plurality of pieces of integrated event data includes the first integrated event data (…wherein [0028] further teaches that integration is employed to generate a pixel motion vector representing motion of the pixel between the captured image frames; see, for example, Fig. 2, which depicts a series of captured image frames…); and generate, based on the generated first frame data and the generated first plurality of pieces of integrated event data, a top interpolation frame in the first exposure time (…wherein [0028] also teaches that the subset of pixel events selected for integration depends on the application; for example, in a frame rate up-conversion application, as described below, where it is desired to insert a new interpolated frame between two existing frames (e.g., at a 2× up-convert rate), the pixel events may be integrated over half of the frame capture period to generate motion vectors used to predict the new frame at the halfway point…).

11. Regarding claim 8, claim 8 is rejected for reasons corresponding to claim 1.
12. Regarding claim 9, a non-transitory computer-readable medium having stored thereon computer-executable instructions which, when executed by a computer, cause the computer to execute operations (…Burns, in [0059], teaches a machine-readable memory unit that may store instructions that, when executed, may cause the machine to perform operations; Fig. 8…), the operations comprising: controlling a frame-based imaging element to generate frame data in an exposure time (…[0015] teaches a frame-based video camera 104, Fig. 1; wherein [0019], in accordance with Fig. 2 (202, e.g.), teaches frame camera output as a time sequence of frames at a time interval…); controlling an event-based imaging element to generate event data in the exposure time (…[0015] teaches an event-based video camera 102, Fig. 1; wherein [0020], in accordance with Fig. 2 (204, e.g.), teaches event camera output at some frame rate of timing…); generating a first interpolation frame based on the generated frame data and the generated event data (…[0015] teaches a hybrid motion estimation circuit 106 which accepts event-based information and frame-based information and generates motion vectors from a hybrid combination of the obtained information; further, [0030] teaches tile mapping circuit 310, which spatially maps motion vectors to one or more tile frames; [0034] teaches that circuit 502 is configured to generate interpolated video frames corresponding to time periods between the captured video frames by applying the tile motion vectors to tiles of a captured video frame to predict a new video frame at the next up-converted time period…); integrating the generated event data for a specific period of time to generate integrated event data (…[0040]-[0041] teach method 700, wherein a sequence of pixel events and a sequence of image frames are received (in subsequent operations) and, at an operation 730, a subset of the sequence of pixel events occurring within a frame sampling period is integrated to generate a pixel motion vector representing the motion of that pixel between two image frames…); adding the generated first interpolation frame to the integrated event data (…wherein [0034] teaches that circuit 502 is configured to generate interpolated video frames corresponding to time periods between the captured video frames by applying the tile motion vectors to tiles of a captured video frame (wherein the motion vectors result from the integration process of operation 730)…); and generating a second interpolation frame based on the addition of the first interpolation frame and the integrated event data (…[0034] teaches that circuit 502 is configured to generate interpolated video frames corresponding to time periods between the captured video frames by applying the tile motion vectors (resulting from the integration process of operation 730) to tiles of a captured video frame to predict a new video frame at the next up-converted time period; as such, the process takes place on a frame-by-frame basis wherein interpolated video frames are generated…).

13. Regarding claim 10, an image processing device comprising: an event-based imaging element configured to generate event data (…[0015] teaches an event-based video camera 102, Fig. 1; wherein [0020], in accordance with Fig. 2 (204, e.g.), teaches event camera output at some frame rate of timing…); and a generation unit configured to: generate, based on the generated event data, a first interpolation frame that interpolates between a first frame and a second frame (…[0015] teaches a hybrid motion estimation circuit 106 which accepts event-based information and frame-based information and generates motion vectors from a hybrid combination of the obtained information; wherein [0020] teaches that each event includes a timestamp which is used for interpolation between frames; further, [0030] teaches tile mapping circuit 310, which spatially maps motion vectors to one or more tile frames; [0034] teaches that circuit 502 is configured to generate interpolated video frames corresponding to time periods between the captured video frames by applying the tile motion vectors to tiles of a captured video frame to predict a new video frame at the next up-converted time period…); integrate the generated event data for a specific period of time to generate integrated event data (…wherein [0020] teaches that each event also includes a timestamp which is used for integration between frames; [0040]-[0041] teach method 700, wherein a sequence of pixel events and a sequence of image frames are received (in subsequent operations) and, at an operation 730, a subset of the sequence of pixel events occurring within a frame sampling period is integrated to generate a pixel motion vector representing the motion of that pixel between two image frames…); add the generated first interpolation frame to the integrated event data (…wherein [0034] teaches that circuit 502 is configured to generate interpolated video frames corresponding to time periods between the captured video frames by applying the tile motion vectors to tiles of a captured video frame (wherein the motion vectors result from the integration process of operation 730)…); and generate a second interpolation frame based on the addition of the generated first interpolation frame to the integrated event data (…[0034] teaches that circuit 502 is configured to generate interpolated video frames corresponding to time periods between captured video frames by applying the tile motion vectors (resulting from the integration process of operation 730) to tiles of a captured video frame to predict a new video frame at the next up-converted time period; as such, the process takes place on a frame-by-frame basis wherein interpolated video frames are generated…).

14. Regarding claim 11, the image processing device according to claim 10 (see claim 10 above) further comprises a frame-based imaging element configured to capture the first frame and the second frame (…Burns, in [0015], teaches a frame-based video camera 104, Fig. 1; wherein [0019], in accordance with Fig. 2 (202, e.g.), teaches frame camera output as a time sequence of frames at a time interval…).

15. Regarding claim 12, the image processing device according to claim 10 (see claim 10 above), wherein the generation unit is further configured to generate a plurality of interpolation frames, and the generated plurality of interpolation frames includes the first frame, the second frame, the generated first interpolation frame, and the generated second interpolation frame (…wherein Burns, in [0034], teaches generating interpolated video frames corresponding to time periods between captured video frames…).

16. Regarding claim 13, claim 13 is rejected for reasons corresponding to claim 10.
17. Regarding claim 14, a non-transitory computer-readable medium having stored thereon computer-executable instructions which, when executed by a computer, cause the computer to execute operations (…Burns, in [0059], teaches a machine-readable memory unit that may store instructions that, when executed, may cause the machine to perform operations; Fig. 8…), the operations comprising: controlling an event-based imaging element to generate event data (…Burns, in [0015], teaches an event-based video camera 102, Fig. 1; wherein [0020], in accordance with Fig. 2 (204, e.g.), teaches event camera output at some frame rate of timing…); generating, based on the generated event data, a first interpolation frame that interpolates between a first frame and a second frame (…[0015] teaches a hybrid motion estimation circuit 106 which accepts event-based information and frame-based information and generates motion vectors from a hybrid combination of the obtained information; wherein [0020] teaches that each event includes a timestamp which is used for interpolation between frames; further, [0030] teaches tile mapping circuit 310, which spatially maps motion vectors to one or more tile frames; [0034] teaches that circuit 502 is configured to generate interpolated video frames corresponding to time periods between the captured video frames by applying the tile motion vectors to tiles of a captured video frame to predict a new video frame at the next up-converted time period…); integrating the generated event data for a specific period of time to generate integrated event data (…wherein [0020] teaches that each event also includes a timestamp which is used for integration between frames; [0040]-[0041] further teach method 700, wherein a sequence of pixel events and a sequence of image frames are received (in subsequent operations) and, at an operation 730, a subset of the sequence of pixel events occurring within a frame sampling period is integrated to generate a pixel motion vector representing the motion of that pixel between two image frames…); adding the generated first interpolation frame to the integrated event data (…wherein [0034] teaches that circuit 502 is configured to generate interpolated video frames corresponding to time periods between the captured video frames by applying the tile motion vectors to tiles of a captured video frame (wherein the motion vectors result from the integration process of operation 730)…); and generating a second interpolation frame based on the addition of the generated first interpolation frame to the integrated event data (…[0034] teaches that circuit 502 is configured to generate interpolated video frames corresponding to time periods between captured video frames by applying the tile motion vectors (resulting from the integration process of operation 730) to tiles of a captured video frame to predict a new video frame at the next up-converted time period; as such, the process takes place on a frame-by-frame basis wherein interpolated video frames are generated…).

Allowable Subject Matter

18. Claims 4-7 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

19. THIS ACTION IS MADE FINAL. Applicant is reminded of the extension-of-time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SURAFEL YILMAKASSAYE, whose telephone number is (703) 756-1910. The examiner can normally be reached Monday-Friday, 8:30am-5:00pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, TWYLER HASKINS, can be reached at (571) 272-7406. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SURAFEL YILMAKASSAYE/ Examiner, Art Unit 2639
/JAMES M HANNETT/ Primary Examiner, Art Unit 2639
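
The rejection above turns on whether Burns' integrate-then-apply-motion-vectors flow anticipates the claimed step of adding the first interpolation frame to the integrated event data. As a reading aid, here is a minimal NumPy sketch of the flow the examiner attributes to Burns [0028]/[0034]: integrate pixel events over part of a frame period, derive a motion estimate, and shift a captured frame to predict the interpolated frame at the halfway point (the 2× up-conversion case). Helper names are hypothetical, and the centroid-drift estimator is a deliberately crude stand-in for Burns' per-pixel and tile motion vectors; this is neither Burns' circuitry nor the claimed generation unit.

```python
import numpy as np

# Illustrative sketch only: integrate events over part of a frame period,
# estimate motion, and motion-compensate a captured frame to predict the
# interpolated frame at the halfway point (2x up-conversion).

def event_centroid(events, t0, t1):
    """Mean (x, y) position of events with timestamps in [t0, t1)."""
    pts = np.array([(x, y) for x, y, t, _pol in events if t0 <= t < t1])
    return pts.mean(axis=0) if len(pts) else np.zeros(2)

def integrate_motion(events, t0, t1):
    """Crude motion estimate over [t0, t1): centroid drift between the
    first and second halves of the window. The half-window centroids sit
    (t1 - t0) / 2 apart in time, so the drift is doubled to approximate
    the displacement across the whole window."""
    mid = (t0 + t1) / 2
    return 2.0 * (event_centroid(events, mid, t1) - event_centroid(events, t0, mid))

def interpolate_midpoint(frame_a, events, t0, frame_period):
    """Predict the frame at t0 + T/2 by shifting frame_a by the motion
    integrated over the first half of the frame period."""
    dx, dy = integrate_motion(events, t0, t0 + frame_period / 2)
    return np.roll(frame_a, shift=(int(round(dy)), int(round(dx))), axis=(0, 1))

# Tiny demo: a bright square drifting right by 1 px per tick (8 px per
# frame period of 8 ticks), so the midpoint frame should shift by 4 px.
frame_a = np.zeros((32, 32))
frame_a[12:20, 8:16] = 1.0
events = [(x, 16, t, +1) for t, x in enumerate(range(8, 16))]  # one event per tick
mid_frame = interpolate_midpoint(frame_a, events, t0=0, frame_period=8)
```

The point of contention in the applicant's argument is that this flow applies motion vectors directly to a captured frame; whether that is the same as adding an already-generated interpolation frame to the integrated event data is exactly what the response disputes.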

Prosecution Timeline

Oct 26, 2023
Application Filed
May 02, 2025
Non-Final Rejection — §101, §102
Aug 08, 2025
Response Filed
Nov 07, 2025
Final Rejection — §101, §102 (current)
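
For reference, the reply windows recited in the final action's conclusion, applied to the Nov 07, 2025 mailing date above, work out as in this sketch (uses python-dateutil, an assumed dependency; extension fees under 37 CFR 1.136(a) and the advisory-action interaction are not modeled):

```python
from datetime import date
from dateutil.relativedelta import relativedelta

mailed = date(2025, 11, 7)                       # mailing date of the final rejection
reply_due = mailed + relativedelta(months=3)     # 2026-02-07: shortened statutory period
first_reply_window = mailed + relativedelta(months=2)  # 2026-01-07: two-month window for a first reply
absolute_max = mailed + relativedelta(months=6)  # 2026-05-07: statutory maximum with extensions

print(reply_due, first_reply_window, absolute_max)
```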

Precedent Cases

Applications with similar technology granted by the same examiner

Patent 12538047: Ambient Light Sensing with Image Sensor
Granted Jan 27, 2026 (2y 5m to grant)

Patent 12506981: PHOTOELECTRIC CONVERSION APPARATUS, METHOD FOR CONTROLLING PHOTOELECTRIC CONVERSION APPARATUS, AND STORAGE MEDIUM
Granted Dec 23, 2025 (2y 5m to grant)

Patent 12495224: IMAGE SENSING DEVICE AND IMAGE PROCESSING METHOD OF THE SAME
Granted Dec 09, 2025 (2y 5m to grant)

Patent 12470797: OPTICAL ELEMENT DRIVING MECHANISM
Granted Nov 11, 2025 (2y 5m to grant)

Patent 12452534: CONTROL APPARATUS, LENS APPARATUS, IMAGE PICKUP APPARATUS, IMAGE PICKUP SYSTEM, CONTROL METHOD, AND A NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM
Granted Oct 21, 2025 (2y 5m to grant)

Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 50%
With Interview: 84% (+33.6%)
Median Time to Grant: 2y 6m
PTA Risk: Moderate
Based on 34 resolved cases by this examiner. Grant probability derived from career allow rate.
