Prosecution Insights
Last updated: April 19, 2026
Application No. 18/976,138

VIDEO DISPLAY DEVICE

Status: Non-Final OA, §103
Filed: Dec 10, 2024
Examiner: BLACK-CHILDRESS, RAJSHEED O
Art Unit: 2685
Tech Center: 2600 (Communications)
Assignee: DENSO CORPORATION
OA Round: 1 (Non-Final)
Grant Probability: 62% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 2y 9m
Grant Probability with Interview: 86%

Examiner Intelligence

Grants 62% of resolved cases.

Career Allow Rate: 62% (279 granted / 448 resolved), at TC average
Interview Lift: +23.9% (strong); resolved cases with vs. without an interview
Avg Prosecution: 2y 9m (typical timeline); 39 applications currently pending
Total Applications: 487 across all art units (career history)

Statute-Specific Performance

§101: 2.0% (-38.0% vs TC avg)
§103: 52.5% (+12.5% vs TC avg)
§102: 17.0% (-23.0% vs TC avg)
§112: 21.7% (-18.3% vs TC avg)

Compared against an estimated Tech Center average; based on career data from 448 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 1 is rejected under 35 U.S.C. 103 as being unpatentable over Huebner (US 2012/0262580 A1) in view of Liu (US 2004/0001146 A1).

Regarding claim 1, Huebner discloses a video display device (Huebner teaches a video display device (Huebner abstract, fig. 1)) comprising:

a camera with a wide-angle lens configured to capture a two-dimensional wide-angle video around a vehicle (Huebner teaches a surround view system for a vehicle using multiple cameras with fisheye/wide-angle lenses. Huebner states that the surround view system can utilize fisheye cameras having a horizontal opening angle >170° and that the cameras can include image recording devices combined with a wide-angle lens, such as a fisheye lens, and further describes cameras positioned around a vehicle generating image data used for surround view generation (Huebner [0021], [0028]));

a sensor configured to measure a distance to an object in a real space around the vehicle (Huebner explicitly teaches a sensor that gathers distance information of objects located in the vehicle's surroundings, including receiving distance information from a 3D sensor such as a PMD sensor, and also teaches other distance-measuring sensors such as ultrasonic and radar sensors (Huebner [0026], [0028]));

a controller connected to the camera and the sensor (Huebner teaches an image processing device that receives image data from the cameras and includes an interface that receives signals from distance sensors (Huebner [0028]));

and a display device configured to display video based on signals output from the controller (Huebner teaches that the processing device generates the surround view that "can be displayed on a display," i.e., a display driven by the processing device output (Huebner [0028])),

wherein the controller calculates a position of the object in the wide-angle video captured by the camera based on the distance to the object in the real space measured by the sensor (Huebner teaches using received distance information of objects in the surroundings so that the processing device "can then render the objects in the surround view in a location representative of an actual location with respect to the vehicle," which is a disclosure of determining an object's position in the generated view based on measured distance (Huebner [0026]). Additionally, Huebner teaches that additional sensors can be used to "correct positions…of objects…to more accurately reflect spatial realities" (Huebner [0037])).
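For context, neither the claim nor the cited passages of Huebner recite a particular projection model, but the step the rejection relies on here (mapping a sensor-measured object position to a pixel location in the wide-angle video) can be illustrated with a standard equidistant fisheye model. A minimal sketch, assuming an equidistant lens, identity camera extrinsics, and hypothetical intrinsic parameters, none of which come from Huebner or the application:

```python
import numpy as np

def object_pixel_in_fisheye(p_cam, f_px, cx, cy):
    """Map a 3D point in the camera frame (meters) to fisheye pixel coordinates.

    Assumes an equidistant projection model (image radius = f * theta) with the
    camera looking along +Z; all parameters here are hypothetical, not values
    taken from Huebner or the application.
    """
    x, y, z = p_cam
    theta = np.arctan2(np.hypot(x, y), z)  # angle off the optical axis
    phi = np.arctan2(y, x)                 # azimuth around the optical axis
    r = f_px * theta                       # equidistant fisheye: r = f * theta
    return cx + r * np.cos(phi), cy + r * np.sin(phi)

# Example: a distance sensor reports an object 3 m ahead and 1 m to the right,
# already expressed in the camera frame (identity extrinsics assumed).
u, v = object_pixel_in_fisheye((1.0, 0.0, 3.0), f_px=320.0, cx=640.0, cy=400.0)
print(f"object appears near pixel ({u:.1f}, {v:.1f})")
```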
However, Huebner does not expressly disclose that the controller corrects a distortion of the wide-angle video so that the calculated position of the object in the wide-angle video becomes horizontal on a screen of the display device. Specifically, Huebner teaches that wide-angle lenses generate distorted images and that the system may use additional sensors to "correct positions and distortions of objects…to more accurately reflect spatial realities" (Huebner [0028], [0037]), but does not provide the specific technique for horizontal distortion correction.

In an analogous art, Liu discloses a real-time wide-angle image correction system and method that generates a warp table from pixel coordinates of a wide-angle image and applies the warp table to the wide-angle image to create a corrected wide-angle image (Liu abstract, [0012]). Liu further discloses performing vertical and horizontal scaling using parametric warping functions to produce a preliminary warp table, and then performing a "horizontal distortion correction" on the preliminary warp table to correct for distortion affecting horizontal content (Liu [0012]). Liu expressly explains that the warping approach maintains vertical lines as vertical but causes "slanting and distorted horizontal lines," and therefore performs horizontal distortion correction to correct the wide-angle image for horizontal distortion and produce a distortion-free corrected wide-angle image (Liu [0066]–[0068]). Liu further discloses a horizontal distortion correction module that outputs a warp table mapping corrected image pixel coordinates to original wide-angle image pixel coordinates (Liu [0058]–[0062]). Accordingly, Liu teaches correcting distortion of a wide-angle image via warp-table based processing, including explicit horizontal distortion correction, thereby providing the technique for correcting distortion such that displayed content at the corrected location is horizontally aligned (i.e., not slanted or curved by horizontal distortion).

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Huebner's surround view system to implement distortion correction using Liu's warp-table based wide-angle correction, including Liu's horizontal distortion correction, because Huebner employs wide-angle/fisheye cameras that generate distorted imagery and expressly teaches correcting positions and distortions of objects to reflect spatial realities (Huebner [0028], [0037]), while Liu provides a known, real-time technique for correcting wide-angle distortion, specifically correcting horizontal distortion artifacts introduced by wide-angle warping, using a warp table (Liu [0012], [0058]–[0062], [0066]–[0068]). Incorporating Liu's horizontal distortion correction into Huebner's processing would have been a predictable use of known image correction techniques to improve the fidelity and interpretability of the displayed surround view, including at and around the object location determined using distance information.
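For context, the rejection describes Liu's warp-table processing only at the level of the cited paragraphs. The general pattern (precompute a table that maps every corrected-image pixel back to a source pixel of the wide-angle image, then sample through it) can be sketched as follows. The radial model below is a generic stand-in, not Liu's parametric vertical/horizontal scaling or its horizontal distortion correction step, and all names and parameters are hypothetical:

```python
import numpy as np

def build_warp_table(h, w, cx, cy, k):
    """Precompute a warp table: for every pixel of the corrected image, the
    (x, y) coordinate in the original wide-angle image to sample from.

    The inverse mapping is a simple radial model, used here only as a stand-in
    for the parametric warping described in the cited Liu paragraphs.
    """
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    dx, dy = xs - cx, ys - cy
    scale = 1.0 + k * (dx * dx + dy * dy)  # corrected -> source radial stretch
    return cx + dx * scale, cy + dy * scale

def apply_warp_table(src, src_x, src_y):
    """Create the corrected image by sampling the source through the table
    (nearest-neighbor for brevity; a real implementation would interpolate)."""
    h, w = src.shape[:2]
    xi = np.clip(np.rint(src_x), 0, w - 1).astype(int)
    yi = np.clip(np.rint(src_y), 0, h - 1).astype(int)
    return src[yi, xi]

# Example with a synthetic 480x640 frame and a small, hypothetical distortion term.
frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
sx, sy = build_warp_table(480, 640, cx=320.0, cy=240.0, k=5e-7)
corrected = apply_warp_table(frame, sx, sy)
print(corrected.shape)  # (480, 640, 3)
```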
Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Huebner (US 2012/0262580 A1) in view of Liu (US 2004/0001146 A1) as applied to claim 1 above, and further in view of Gupta (US 2015/0049193 A1).

Regarding claim 2, Huebner in view of Liu discloses the video display device according to claim 1, but does not expressly disclose wherein the controller determines whether the object exists around the vehicle and corrects the distortion of the wide-angle video so that a predetermined position in the wide-angle video becomes horizontal on the screen of the display device when the object does not exist around the vehicle.

In an analogous art, Gupta teaches a vehicular camera system in which a controller (processor) determines whether an object or feature is detected in an overlapping region and compares the detected pixel position to an expected, predetermined pixel position. Gupta further teaches that if the object is not detected at the predicted/expected pixels (i.e., the cue object or feature is absent or not found as expected), the system adjusts or shifts image processing to accommodate the condition (Gupta [0008], [0011], [0150]–[0153]). Gupta also teaches a predetermined reference condition that is horizontal, namely that when properly calibrated a vanishing line is "perfectly horizontal" and located at a preordained vertical pixel height, and that processing may be shifted or re-centered relative to expected pixel locations (Gupta [0072]–[0073], [0148]).

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Huebner's distortion-correction controller (as modified by Liu) to first determine whether an object or feature is present (i.e., exists) in the relevant portion of the scene around the vehicle, and, when the object or feature is not present or not detected, to perform the distortion correction using a predetermined/expected reference position as the horizontal alignment basis (i.e., relying on the preordained or expected reference when an object-based cue is unavailable), as taught by Gupta's use of expected and preordained pixel locations for calibration and alignment, including a predetermined horizontal reference condition in which the vanishing line is perfectly horizontal at a preordained vertical pixel height (Gupta [0072]–[0073], [0148], [0150]–[0153]). This yields the predictable result of maintaining a stable horizontal dewarping and alignment reference when an object-based cue is unavailable, improving robustness of distortion correction under varying scene conditions.
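For context, the claim 2 combination reduces to a conditional choice of the horizontal alignment reference: use the calculated object position when an object is detected, and fall back to a predetermined position when no object exists around the vehicle. A minimal control-flow sketch, with the correction itself left abstract and all names and values illustrative rather than drawn from the references:

```python
def choose_alignment_row(detected_object_px, predetermined_row=240):
    """Pick the image row that the distortion correction should render horizontal.

    If an object is detected, align to its calculated pixel row; if no object
    exists around the vehicle, fall back to a predetermined row (for example,
    an expected horizon line). All values here are illustrative.
    """
    if detected_object_px is not None:
        _, v = detected_object_px  # (u, v) pixel position from the preceding step
        return v
    return predetermined_row

print(choose_alignment_row((743.0, 400.0)))  # object present -> align to its row
print(choose_alignment_row(None))            # no object -> predetermined reference
```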
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RAJSHEED O BLACK-CHILDRESS, whose telephone number is (571) 270-7838. The examiner can normally be reached M to F, 10am to 5pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Quan-Zhen Wang, can be reached at (571) 272-3114. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RAJSHEED O BLACK-CHILDRESS/
Examiner, Art Unit 2685

Prosecution Timeline

Dec 10, 2024: Application Filed
Feb 20, 2026: Non-Final Rejection under §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602983: SYSTEM, METHOD AND STORAGE MEDIUM FOR VEHICLE INSPECTION AUTHORIZATION INFORMATION MANAGEMENT. Granted Apr 14, 2026; 2y 5m to grant.
Patent 12597901: RECONFIGURABLE INTELLIGENT SURFACE REALIZED WITH INTEGRATED CHIP TILING. Granted Apr 07, 2026; 2y 5m to grant.
Patent 12592145: FIRE DETECTION SYSTEM TESTING. Granted Mar 31, 2026; 2y 5m to grant.
Patent 12580074: METHODS, DEVICES AND SYSTEMS FOR MEDICAL CODE EVENT INFORMATION TRACKING. Granted Mar 17, 2026; 2y 5m to grant.
Patent 12573273: Audio Assisted File Sharing. Granted Mar 10, 2026; 2y 5m to grant.
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 62% (86% with interview, +23.9%)
Median Time to Grant: 2y 9m
PTA Risk: Low

Based on 448 resolved cases by this examiner. Grant probability derived from career allow rate.
