Prosecution Insights
Last updated: April 19, 2026
Application No. 18/514,428

IMAGE GENERATION DEVICE, IMAGE GENERATION METHOD, AND PROGRAM

Final Rejection (§103)
Filed: Nov 20, 2023
Examiner: BITAR, NANCY
Art Unit: 2664
Tech Center: 2600 (Communications)
Assignee: DENSO CORPORATION
OA Round: 2 (Final)
Grant Probability: 83% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 11m
With Interview: 91%

Examiner Intelligence

Career Allow Rate: 83% (786 granted / 946 resolved; +21.1% vs TC avg)
Interview Lift: +8.2% (moderate lift, measured across resolved cases with interview)
Avg Prosecution: 2y 11m typical timeline; 32 applications currently pending
Total Applications: 978 across all art units
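The headline figures above reduce to simple ratios. As a quick check of the arithmetic (the additive treatment of the interview lift is an assumption about how the dashboard combines the figures, not a documented formula):

```python
granted, resolved = 786, 946              # examiner's career totals from the dashboard
allow_rate = granted / resolved           # career allow rate
interview_lift_pts = 8.2                  # reported percentage-point lift with interview

print(round(allow_rate * 100))                        # 83
print(round(allow_rate * 100 + interview_lift_pts))   # 91
```

The 91% "with interview" figure is consistent with simply adding the reported lift to the career allow rate.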

Statute-Specific Performance

§101: 13.3% (-26.7% vs TC avg)
§103: 62.1% (+22.1% vs TC avg)
§102: 6.4% (-33.6% vs TC avg)
§112: 8.9% (-31.1% vs TC avg)
Deltas are relative to the Tech Center average estimate • Based on career data from 946 resolved cases

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's response to the last Office Action, filed 10/02/2025, has been entered and made of record. Applicant has amended claims 1-3 and 5. Claims 1-5 are currently pending.

Applicant's arguments filed 12/31/2025 have been fully considered, but they are not persuasive. Applicant argues that Yousefian is silent regarding any form of object detection among the sequence of images, Yousefian instead seeking to "provide a three-dimensional object detection device which has high precision in detecting other vehicles traveling in adjacent traffic lanes adjacent to the lane traveled by the vehicle." Additionally, Applicant argues that Tsuchiya describes the detection of stationary and moving objects relative to the vehicle, but that none of the detected objects are objects that move integrally with the vehicle.

In response, the claim language requires "detect whether an object that integrally moves with the vehicle appears in the image within the specific range." Yousefian clearly teaches in paragraph [0032] that image sequences (or video sequences) from perimeter view cameras (front view camera 125, side view cameras 120a and 120b, and rear view camera 130, for example) may be stored in a memory to enable the driver assist system to construct a view of the ground and terrain immediately below the vehicle (referred to herein as an "under view") which is not normally captured live by any of the perimeter view cameras. In addition, superimposing tire and body positions (for example, a partially or fully transparent vehicle body) on an image of the ground and terrain below a vehicle may provide the driver with detailed knowledge of the vehicle's surroundings that may not be visible from inside the vehicle (i.e., detecting whether an object integrally moves with the vehicle; note that the terrain includes the object detected among the sequence of images). Displaying information related to wheel position, wheel slippage, projected wheel path, vehicle pitch and roll, and vehicle location may provide the knowledge to maneuver a vehicle that would otherwise require spotters outside of the vehicle. Note that Yousefian's rear camera can easily detect an exterior component such as a step, a towing hitch, a bull bar, or a kangaroo bar retrofit to the vehicle VH in accordance with the preference of the owner, a purpose of use of the vehicle, and the like. Note also that "integrally moves" by definition means constituent movement, and the rear camera integrally captures images of parts under the vehicle whether or not they are connected.

The Examiner relies on the secondary reference Tsuchiya, which teaches that the natural object assessment unit 38 creates an offset differential image taking into account the movement amount of the vehicle V. The offset amount d' corresponds to the movement amount in the bird's-eye view image data corresponding to the actual travel distance of the vehicle V shown in FIG. 4(a), and the offset amount d' is determined based on a signal from the vehicle speed sensor 20 and the time duration from the immediately preceding point in time to the current time. The first integrated value is the total value of the predetermined areas or all the values plotted as the first differential waveform information. The three-dimensional object assessment unit 34 finds a second integrated value of second differential waveform information, created by counting the pixel number representing a predetermined differential and creating a frequency distribution in the differential image of the first bird's-eye view image obtained at the first time and the second bird's-eye view image obtained at the second time, which is after the first time. In other words, the natural object assessment unit 38 acquires a differential image in which the images are not offset. The second integrated value is the total value of the predetermined areas or all the values plotted as the second differential waveform information. The three-dimensional object assessment unit 34 assesses that the three-dimensional object detected by the three-dimensional object detection unit 33 is a moving object when an evaluation value, which corresponds to the number of times the first integrated value is assessed to be greater than the second integrated value, is equal to or greater than a predetermined evaluation threshold (paragraphs [0242]-[0250]).

Tsuchiya also clearly teaches safe distance lines 325a and 325b that may show a computed safe distance from the vehicle body or bumpers (to help the driver avoid obstacles near the vehicle) and may be superimposed on the display. Safe distance lines 325a and 325b may also be shown in a different (contrasting) color, or may be displayed by superimposing an extension of the sides of the vehicle, allowing a driver to see how closely they have moved the vehicle to or from an obstacle. Meter 315 may be configured to show a number of vehicle parameters, including, for example, pitch, yaw, and roll, and/or display limits 318, the limits 318 being a limit to which the vehicle should be bound regarding pitch, yaw, roll, and the like (paragraphs [0030]-[0031], [0047]).

All remaining arguments are reliant on the aforementioned and addressed arguments and thus are considered to be wholly addressed herein.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

Claims 1-5 are rejected under 35 U.S.C.
103 as being unpatentable over Yousefian et al. (US 2019/0031101) in view of Tsuchiya et al. (US 2014/0168440).

As to claim 1, Yousefian teaches an image generation device for generating an underfloor image indicating at least a ground below a vehicle, the image generation device comprising: at least one processor configured to: take in an image within a specific range from a captured image captured by a camera enabled to capture at least a ground in a periphery of the vehicle, and to generate the underfloor image based on the taken-in image (paragraph [0016] teaches images from perimeter view cameras used to construct a view of the ground and terrain below a vehicle; paragraph [0026] teaches stitching images together to form an image mosaic at an on-board computer; paragraphs [0027]-[0029] teach a CPU, computer, or microprocessor; paragraph [0034] teaches computing the under view for any direction of vehicle travel); and detect whether an object that integrally moves with the vehicle appears in the image within the specific range (paragraph [0032] teaches superimposing vehicle tire/body positions on the image of the ground below the vehicle; paragraph [0048] teaches a partial under view including the position and orientation of tires; paragraph [0050] teaches hashmarks; Figure 7, item 730 shows the striped hashmarks in relation to the tire guidelines showing the position of the tires).
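The under-view technique cited from Yousefian, in which perimeter-camera frames are buffered in memory so that ground imaged earlier can be replayed once the vehicle has driven over it, can be sketched as follows. This is a rough illustration only; the class, parameters, and frame-selection rule are the author's assumptions, not disclosure from the reference:

```python
from collections import deque

class UnderViewBuffer:
    """Illustrative ring buffer for synthesizing an "under view".

    A forward perimeter camera images the ground ahead of the vehicle;
    after the vehicle has driven camera_lead_m meters, that same patch
    of ground lies underneath it and can be replayed from the buffer.
    """

    def __init__(self, camera_lead_m, maxlen=100):
        self.camera_lead_m = camera_lead_m  # how far ahead of the vehicle the camera images
        self.frames = deque(maxlen=maxlen)  # (odometer_m, frame) pairs, oldest dropped first

    def push(self, odometer_m, frame):
        # Store each captured frame tagged with the vehicle's odometer reading.
        self.frames.append((odometer_m, frame))

    def under_view(self, odometer_m):
        # The ground now under the vehicle was imaged when the odometer
        # read (current - camera_lead_m); pick the closest stored frame.
        if not self.frames:
            return None
        target = odometer_m - self.camera_lead_m
        return min(self.frames, key=lambda p: abs(p[0] - target))[1]
```

For example, with a 2 m camera lead, the frame captured at odometer 3 m supplies the under view once the odometer reads 5 m.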
While Yousefian teaches the limitations above, Yousefian fails to teach an offset processing module configured to execute offset processing of offsetting the specific range by a predetermined amount in the captured image when the object detection module has detected the object. However, Tsuchiya teaches that the natural object assessment unit 38 creates an offset differential image taking into account the movement amount of the vehicle V. The offset amount d' corresponds to the movement amount in the bird's-eye view image data corresponding to the actual travel distance of the vehicle V shown in FIG. 4(a), and the offset amount d' is determined based on a signal from the vehicle speed sensor 20 and the time duration from the immediately preceding point in time to the current time. The first integrated value is the total value of the predetermined areas or all the values plotted as the first differential waveform information. The three-dimensional object assessment unit 34 finds a second integrated value of second differential waveform information, created by counting the pixel number representing a predetermined differential and creating a frequency distribution in the differential image of the first bird's-eye view image obtained at the first time and the second bird's-eye view image obtained at the second time, which is after the first time. In other words, the natural object assessment unit 38 acquires a differential image in which the images are not offset. The second integrated value is the total value of the predetermined areas or all the values plotted as the second differential waveform information. The three-dimensional object assessment unit 34 assesses that the three-dimensional object detected by the three-dimensional object detection unit 33 is a moving object when an evaluation value, which corresponds to the number of times the first integrated value is assessed to be greater than the second integrated value, is equal to or greater than a predetermined evaluation threshold (paragraphs [0242]-[0250]).

It would have been obvious to one skilled in the art before the filing of the claimed invention to use the offset module taught by Tsuchiya in order to prevent the images of natural objects, including plants and snow located along the road traveled by the vehicle, from being mistakenly detected as other vehicles traveling in adjacent traffic lanes adjacent to the lane traveled by the vehicle. As a result, it is possible to provide a three-dimensional object detection device which has high precision in detecting other vehicles traveling in adjacent traffic lanes adjacent to the lane traveled by the vehicle. Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
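The moving-object assessment the rejection draws from Tsuchiya (paragraphs [0242]-[0250]) can be sketched roughly as follows: compare a differential image that has been offset by the vehicle's own movement amount d' against a non-offset differential, and count a vote toward "moving object" whenever the offset differential is the larger. All function names, and the simple pixel-count stand-in for the waveform integration, are the author's assumptions, not the reference's implementation:

```python
def differential_count(img_a, img_b, threshold=20):
    # Count pixel pairs whose absolute difference exceeds the threshold,
    # a crude stand-in for integrating Tsuchiya's differential waveform.
    return sum(
        1
        for row_a, row_b in zip(img_a, img_b)
        for a, b in zip(row_a, row_b)
        if abs(a - b) > threshold
    )

def offset_rows(img, d_prime):
    # Shift the bird's-eye view by d' rows along the travel axis (with
    # wrap-around), compensating for the vehicle's own motion.
    return img[-d_prime:] + img[:-d_prime]

def assess_moving_object(prev_bev, curr_bev, d_prime, votes, vote_threshold=3):
    # First integrated value: differential of the *offset* previous frame
    # against the current frame (stationary ground features cancel out).
    first = differential_count(offset_rows(prev_bev, d_prime), curr_bev)
    # Second integrated value: differential with no offset applied.
    second = differential_count(prev_bev, curr_bev)
    # Accumulate a vote each time first > second; the object is assessed
    # as moving once the votes reach the evaluation threshold.
    votes += 1 if first > second else 0
    return votes, votes >= vote_threshold
```

A feature that stays put in the image across frames (i.e., one moving integrally with the vehicle) survives the ego-motion offset and keeps accumulating votes, while a stationary ground feature cancels in the offset differential and does not.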
As to claim 2, Tsuchiya teaches the image generation device according to claim 1, wherein the at least one processor is further configured to repeat, when the offset processing has been executed and the object is detected in the image within the specific range that has been offset, the offset processing until the object is no longer detected in the specific range.

As to claim 3, Tsuchiya teaches the image generation device according to claim 2, wherein the at least one processor is further configured to set a predetermined range apart from the vehicle in the captured image as a range inappropriate for the generation of the underfloor image, and to offset the specific range toward a direction away from the vehicle when the offset processing is executed, and wherein the at least one processor is further configured to avoid the generation of the underfloor image in a case in which at least a part of the specific range that has been offset reaches the predetermined range when the offset processing is being executed as a result of the detection of the object by the object detection module (safe distance lines 325a and 325b, which may show a computed safe distance from the vehicle body or bumpers to help the driver avoid obstacles near the vehicle, may be superimposed on the display; safe distance lines 325a and 325b may also be shown in a different (contrasting) color, or may be displayed by superimposing an extension of the sides of the vehicle, allowing a driver to see how closely they have moved the vehicle to or from an obstacle; meter 315 may be configured to show a number of vehicle parameters, including, for example, pitch, yaw, and roll, and/or display limits 318, the limits 318 being a limit to which the vehicle should be bound regarding pitch, yaw, roll, and the like; paragraphs [0030]-[0031], [0047]).

The limitations of claims 4-5 have been addressed above.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Contact information

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NANCY BITAR, whose telephone number is (571) 270-1041. The examiner can normally be reached Monday-Friday from 8:00 am to 5:00 pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Mrs. Jennifer Mahmood, can be reached at 571-272-2976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

NANCY BITAR
Examiner, Art Unit 2669

/NANCY BITAR/
Primary Examiner, Art Unit 2664

Prosecution Timeline

Nov 20, 2023
Application Filed
Sep 30, 2025
Non-Final Rejection — §103
Dec 31, 2025
Response Filed
Mar 05, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12599437
PRE-PROCEDURE PLANNING, INTRA-PROCEDURE GUIDANCE FOR BIOPSY, AND ABLATION OF TUMORS WITH AND WITHOUT CONE-BEAM COMPUTED TOMOGRAPHY OR FLUOROSCOPIC IMAGING
2y 5m to grant; granted Apr 14, 2026
Patent 12597132
IMAGE PROCESSING METHOD AND APPARATUS
2y 5m to grant; granted Apr 07, 2026
Patent 12597240
METHOD AND SYSTEM FOR AUTOMATED CENTRAL VEIN SIGN ASSESSMENT
2y 5m to grant; granted Apr 07, 2026
Patent 12597189
METHODS AND APPARATUS FOR SYNTHETIC COMPUTED TOMOGRAPHY IMAGE GENERATION
2y 5m to grant; granted Apr 07, 2026
Patent 12591982
MOTION DETECTION ASSOCIATED WITH A BODY PART
2y 5m to grant; granted Mar 31, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 83%
With Interview: 91% (+8.2%)
Median Time to Grant: 2y 11m
PTA Risk: Moderate
Based on 946 resolved cases by this examiner. Grant probability derived from career allow rate.
