Prosecution Insights
Last updated: April 19, 2026
Application No. 17/783,403

DISPLAY CONTROL DEVICE AND HEAD-UP DISPLAY DEVICE

Final Rejection — §102, §103
Filed: Jun 08, 2022
Examiner: JORDAN, DANIEL JEFFERY
Art Unit: 2872
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: Nippon Seiki Co., Ltd.
OA Round: 2 (Final)
Grant Probability: 62% (Moderate)
OA Rounds: 3-4
To Grant: 3y 9m
With Interview: 62%

Examiner Intelligence

Grants 62% of resolved cases.

Career Allow Rate: 62% (30 granted / 48 resolved; -5.5% vs TC avg)
Interview Lift: +0.0% (minimal)
Avg Prosecution: 3y 9m (typical timeline; 41 currently pending)
Total Applications: 89 (career history, across all art units)

Statute-Specific Performance

§103: 51.9% (+11.9% vs TC avg)
§102: 22.9% (-17.1% vs TC avg)
§112: 25.2% (-14.8% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 48 resolved cases
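The statute-specific rates and their stated deltas let you back out the Tech Center average baseline the tool is comparing against. A quick check (values taken from the table above):

```python
# Back out the implied Tech Center average from each statute's
# overcome rate and its stated delta vs. the TC average.
rates = {
    "103": (51.9, +11.9),
    "102": (22.9, -17.1),
    "112": (25.2, -14.8),
}

for statute, (rate, delta) in rates.items():
    tc_avg = round(rate - delta, 1)
    print(f"Section {statute}: implied TC average = {tc_avg}%")
```

All three statutes imply the same ~40.0% baseline, which suggests the tool uses a single Tech Center average estimate rather than per-statute baselines.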

Office Action

§102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

2. Applicant's arguments (see Remarks dated 08/08/2025) have been fully considered, but they are not persuasive. On page 7, the applicant argues that Wang does not disclose wherein “when the position of the at least one of right and left viewpoints is re-detected after the viewpoint lost period, the control unit disables at least one warping processing that uses the warping parameter, the at least one warping processing corresponding to the re-detected viewpoint position.” However, the audio disclosed in paragraph [0039] of Wang must inherently be disabled after a period of time when the user’s eye position is detected. Since [0039] discloses that the speaker will emit an audio prompt indicating when the driver’s eye position is being detected, it follows that the speaker will not continue indicating that the driver’s eye position is being detected when the driver’s eye position is no longer being detected. The applicant appears to suggest that the audio will play indefinitely or on a loop, which is not supported by Wang’s disclosure. Regarding the examiner’s previous 103 rejection citing Furui in view of Cui, the applicant has failed to argue against the combination of references provided. In response to applicant’s arguments against the references individually, one cannot show non-obviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).

Claim Objections

3. Claims 1 and 10 are objected to because of the following informalities: In claims 1 and 10 (line 3 of each claim), “a driver visually recognize” should read “a driver to visually recognize”. Appropriate correction is required.
Claim Rejections - 35 USC § 102

4. The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

5. Claims 1-11 are rejected under 35 USC 102(a)(1) as being anticipated by Wang et al. (CN 109507799 A, of record).

Regarding claim 1, Wang discloses a display control device (Abstract, control HUD projection arrangement) that controls a head-up display (HUD) device mounted on a vehicle (Abstract, vehicle-mounted HUD display) and projecting an image onto a projection member provided in the vehicle to thereby allow a driver visually recognize a virtual image of the image (Fig. 4, 42), the display control device comprising a control unit that performs a viewpoint position warping control to update a warping parameter in accordance with a viewpoint position of the driver in an eye box (Abstract, target display location is set according to an eye position sensor) and to pre-distort an image to be displayed on a display unit with use of the warping parameter in such a manner that the image has a characteristic that counteracts an effect of a distortion characteristic of the virtual image of the image (claims 6 & 13, curvature correction), wherein when a viewpoint lost state in which a position of at least one of right and left viewpoints of the driver is unclear is detected, the control unit maintains, in a viewpoint lost period, the warping parameter set immediately before the viewpoint lost period ([0038], wherein the warping parameter inherently stays the same when an eye view is unclear, when set to), and when the position of the at least one of right and left viewpoints is re-detected after the viewpoint lost period, the control unit disables at least one warping processing that uses the warping parameter, the at least one warping processing corresponding to the re-detected viewpoint position ([0039]).
Regarding claims 2 and 3, Wang fails to disclose wherein the control unit compares the viewpoint lost period with a threshold value, and if the viewpoint lost period is shorter than the threshold value, the control unit performs a control to lengthen or shorten a period during which the warping processing is disabled as compared to a case where the viewpoint lost period is longer than the threshold value. However, due to the nature of optics/optical engineering, the process of optical system design includes manipulation of variables such as sizes/arrangements of components, lens shapes/materials, image processing procedures, and other similar concerns, in order to allow an optical system to meet its particular utility (usually based on aberration elimination). This manipulation would normally be considered routine experimentation since the results are governed by known optics/physics equations and are known to be result-effective (unless the particular range of values meets secondary considerations). Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to adjust the time period during which the warping processing of Wang is disabled, since it has been held that where the general conditions of a claim are disclosed in the prior art, discovering the optimum or workable ranges involves only routine skill in the art, In re Aller, 105 USPQ 233 (C.C.P.A. 1955). In this case, it would have been obvious to one of ordinary skill in the art as of the effective filing date of the invention to lengthen or shorten the period during which the warping processing was disabled, motivated by optimizing power consumption. 
Regarding claim 4, Wang discloses wherein when an update cycle of the warping parameter before the viewpoint lost state occurs and during the viewpoint lost period is defined as a first update cycle RT1 and an update cycle of the warping parameter during a period during which the warping processing is disabled is defined as a second update cycle RT2, the control unit changes a parameter update cycle in such a manner that a time length of the first update cycle RT1 is less than that of the second update cycle RT2 ([0038], when the audio prompt ends indefinitely).

Regarding claim 5, Wang discloses wherein after changing the parameter update cycle from the RT1 to the RT2, at an end timing of a period during which the warping processing is disabled, the control unit restores the parameter update cycle from the RT2 to the RT1 ([0038], when eye position has been sensed and the audio prompt ends), or at a timing when a predetermined time has further elapsed from the end timing of the period during which the warping processing is disabled, the control unit restores the parameter update cycle from the RT2 to the RT1, or the control unit starts changing the parameter update cycle starting from the end timing of the period during which the warping processing is disabled, and gradually restores the parameter update cycle from the RT2 to the RT1 with a lapse of time.

Regarding claim 6, Wang discloses a low-speed state determination unit that determines whether a speed of the vehicle is in a low-speed state, wherein the control unit lengthens a period during which the warping processing is disabled when the vehicle is in the low-speed state including a stopped state, as compared to a period during which the warping processing is disabled in a state where the vehicle is in a state faster than the low-speed state (column 40 lines 53-67).
Regarding claim 7, Wang discloses wherein the control unit changes a period during which the warping processing is disabled in accordance with a vehicle speed of the vehicle, and in this case, when a speed of the vehicle is within a range of equal to or higher than a first speed value U1 (U1 > 0) and equal to or lower than a second speed value U2 that is higher than the first speed value, the control unit performs a control to reduce the period during which the warping processing is disabled with respect to the vehicle speed as the vehicle speed becomes fast, or performs a control to moderate a degree of the reduction when the vehicle speed is in a range close to the first speed value and to make the degree of the reduction steeper as the vehicle speed becomes away from the first speed value, or performs a control to moderate the degree of the reduction when the vehicle speed is in a range close to the first speed value, to make the degree of the reduction further steeper as the vehicle speed becomes away from the first speed value, and to moderate the degree of the reduction as the vehicle speed approaches the second speed value (column 40 lines 59-67).

Regarding claim 8, Wang discloses wherein when adjusting a position of the eye box in accordance with a height position of the viewpoint of the driver, the head-up display device does not move an optical member and changes a reflection position of display light of the image in the optical member (claim 4, reflective optical member).

Regarding claim 9, Wang discloses wherein a virtual image display surface corresponding to an image display surface of the display unit is arranged so as to be superimposed on a road surface in front of the vehicle (Fig. 5, 51), or is arranged at an angle with respect to the road surface in such a manner that a distance between a near end portion that is an end portion of the virtual image display surface on a side closer to the vehicle and the road surface is smaller, and a distance between a far end portion that is an end portion of the virtual image display surface on a side further from the vehicle and the road surface is larger (Fig. 2).

Regarding claim 10, Wang discloses a head-up display device comprising: the display control device (Abstract, control HUD projection arrangement); a display unit that displays an image (Fig. 4, 41); and an optical system including an optical member that reflects (claim 4) and projects display light of the image onto the projection member (claim 4).

Regarding claim 11, Wang discloses a display control device (Abstract, control HUD projection arrangement) that controls a head-up display (HUD) device mounted on a vehicle (Abstract, vehicle-mounted HUD display) and projecting an image onto a projection member provided in the vehicle to thereby allow a driver visually recognize a virtual image of the image (Fig. 4, 42), the display control device comprising a control unit that performs a viewpoint position warping control to update a warping parameter in accordance with a viewpoint position of the driver in an eye box (Abstract, target display location is set according to an eye position sensor) and to pre-distort an image to be displayed on a display unit with use of the warping parameter in such a manner that the image has a characteristic that counteracts an effect of a distortion characteristic of the virtual image of the image (claims 6 & 13, curvature correction), wherein when a viewpoint lost state in which a position of at least one of right and left viewpoints of the driver is unclear is detected, the control unit maintains, in a viewpoint lost period, the warping parameter set immediately before the viewpoint lost period ([0038], wherein the warping parameter inherently stays the same when an eye view is unclear, when set to), and when the position of the at least one of right and left viewpoints is re-detected after the viewpoint lost period, the control unit disables at least one warping processing that uses the warping parameter for a predetermined time period after the viewpoint lost period ends and before updating to the warping parameter ([0039], the audio prompt is disabled for a period of time until it is played again in the future, to indicate that a driver’s eye position is being detected once more), the at least one warping processing corresponding to the re-detected viewpoint position ([0039]).

Claim Rejections - 35 USC § 103

6. The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

7. Claims 1-3 and 9-10 are rejected under 35 USC 103 as being unpatentable over Furui (US 20150049117 A1, of record) in view of Cui et al. (US 8564502 B2, of record).

Regarding claim 1, Furui discloses a display control device (Fig. 1, 120 & 131) that controls a display device ([0076], 100) and projecting an image onto a projection member to thereby allow a user visually recognize a virtual image of the image ([0076], user sees images), the display control device comprising a control unit to pre-distort an image to be displayed on a display unit with use of the warping parameter in such a manner that the image has a characteristic that counteracts an effect of a distortion characteristic of the virtual image of the image ([0077] & [0081], 132), wherein when a viewpoint lost state in which a position of at least one of right and left viewpoints of the driver is unclear is detected, the control unit maintains, in a viewpoint lost period, the warping parameter set immediately before the viewpoint lost period ([0076]-[0077], wherein the warping parameter inherently stays the same when an eye view is unclear, unless otherwise set). Furui fails to disclose wherein a HUD device is mounted on a vehicle; wherein a control unit performs a viewpoint position warping control in accordance with a viewpoint position of the driver in an eye box; and when the position of the at least one of right and left viewpoints is re-detected after the viewpoint lost period, the control unit disables at least one warping processing that uses the warping parameter, the at least one warping processing corresponding to the re-detected viewpoint position. However, Cui teaches a similar projection system, and discloses a HUD device (Fig. 1, 150) mounted on a vehicle (Fig. 1, 100); wherein a control unit performs a viewpoint position warping control in accordance with a viewpoint position of the driver in an eye box (column 5 lines 22-40; column 39 line 63 - column 40 line 38); and when a position of the at least one of right and left viewpoints is re-detected after the viewpoint lost period, the control unit disables at least one warping processing that uses the warping parameter, the at least one warping processing corresponding to the re-detected viewpoint position (column 40 lines 53-59, the graphic ends when a view is re-detected). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine Furui and Cui such that image projection was to correspond to a viewpoint position of a driver in an eye box, motivated by allowing a driver to glean information from a HUD without needing to look in different directions.

Regarding claims 2 and 3, modified Furui fails to disclose wherein the control unit compares the viewpoint lost period with a threshold value, and if the viewpoint lost period is shorter than the threshold value, the control unit performs a control to lengthen or shorten a period during which the warping processing is disabled as compared to a case where the viewpoint lost period is longer than the threshold value. However, due to the nature of optics/optical engineering, the process of optical system design includes manipulation of variables such as sizes/arrangements of components, lens shapes/materials, image processing procedures, and other similar concerns, in order to allow an optical system to meet its particular utility (usually based on aberration elimination). This manipulation would normally be considered routine experimentation since the results are governed by known optics/physics equations and are known to be result-effective (unless the particular range of values meets secondary considerations).
Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to adjust the time period during which the warping processing of modified Furui is disabled, since it has been held that where the general conditions of a claim are disclosed in the prior art, discovering the optimum or workable ranges involves only routine skill in the art, In re Aller, 105 USPQ 233 (C.C.P.A. 1955). In this case, it would have been obvious to one of ordinary skill in the art as of the effective filing date of the invention to lengthen or shorten the time period during which the warping processing was disabled, motivated by optimizing power consumption.

Regarding claim 9, modified Furui discloses wherein a virtual image display surface corresponding to an image display surface of the display unit is arranged so as to be superimposed on a road surface in front of the vehicle (Furui - Abstract, superimposed), or is arranged at an angle with respect to the road surface in such a manner that a distance between a near end portion that is an end portion of the virtual image display surface on a side closer to the vehicle and the road surface is smaller, and a distance between a far end portion that is an end portion of the virtual image display surface on a side further from the vehicle and the road surface is larger (Cui - Fig. 1).

Regarding claim 10, modified Furui discloses a head-up display device comprising: the display control device (Furui - Fig. 1, 120 & 131); a display unit that displays an image (Furui - [0040], 141 & 154); and an optical system including an optical member that reflects (Cui - column 7 lines 17-18, mirrors) and projects display light of the image onto the projection member (Cui - column 7 lines 17-18).

Conclusion

8. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a).
Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

9. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Daniel Jeffery Jordan whose telephone number is 571-270-7641. The examiner can normally be reached 9:30a-6:00p. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Stephone Allen, can be reached at 571-272-2434. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/D. J. J./
Examiner, Art Unit 2872

/STEPHONE B ALLEN/
Supervisory Patent Examiner, Art Unit 2872
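The viewpoint-lost warping control at the heart of the §102 dispute (claims 1 and 11: hold the warping parameter while the viewpoint is lost, then disable the warping processing for a predetermined period once the viewpoint is re-detected) can be sketched as a simple per-frame state machine. This is an illustrative reconstruction, not code from the application; the class, method names, and the frame-based disable period are assumptions.

```python
class WarpController:
    """Hypothetical sketch of the claimed viewpoint-lost warping control."""

    DISABLE_PERIOD = 5  # assumed "predetermined time period", in frames

    def __init__(self):
        self.warp_param = None   # last computed warping parameter
        self.lost = False        # currently in a viewpoint-lost state?
        self.disable_left = 0    # frames for which warping stays disabled

    def on_frame(self, viewpoint):
        """viewpoint: (x, y) eye position in the eye box, or None if lost.

        Returns the warping parameter to apply, or None while the
        warping processing is disabled."""
        if viewpoint is None:
            # Viewpoint lost: maintain the parameter set immediately
            # before the lost period (claim 1).
            self.lost = True
            return self.warp_param
        if self.lost:
            # Re-detected after a lost period: disable the warping
            # processing for a predetermined period before updating
            # the parameter (claim 11).
            self.lost = False
            self.disable_left = self.DISABLE_PERIOD
        if self.disable_left > 0:
            self.disable_left -= 1
            return None  # warping processing disabled this frame
        # Normal operation: update the parameter from the viewpoint.
        self.warp_param = self._compute_param(viewpoint)
        return self.warp_param

    def _compute_param(self, viewpoint):
        # Placeholder: a real implementation would look up a distortion
        # map for this eye-box position and invert it.
        return {"viewpoint": viewpoint}
```

The disable step avoids applying a stale pre-distortion computed for the old viewpoint to a freshly re-detected one, which is the behavior the applicant argued Wang's audio prompt does not teach.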

Prosecution Timeline

Jun 08, 2022
Application Filed
Apr 29, 2025
Non-Final Rejection — §102, §103
Aug 08, 2025
Response Filed
Dec 09, 2025
Final Rejection — §102, §103
Mar 23, 2026
Interview Requested

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591113
LENS ASSEMBLY AND ELECTRONIC APPARATUS INCLUDING THE SAME
2y 5m to grant • Granted Mar 31, 2026
Patent 12566316
CAMERA OPTICAL LENS
2y 5m to grant • Granted Mar 03, 2026
Patent 12461343
OPTICAL IMAGING LENS
2y 5m to grant • Granted Nov 04, 2025
Patent 12429711
OPHTHALMIC DEVICE WITH BUILT-IN SELF-TEST CIRCUITRY FOR TESTING AN ADJUSTABLE LENS
2y 5m to grant • Granted Sep 30, 2025
Patent 12429715
Synthesis and Application of Light Management with Thermochromic Hydrogel Microparticles
2y 5m to grant • Granted Sep 30, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 62%
With Interview: 62% (+0.0%)
Median Time to Grant: 3y 9m
PTA Risk: Moderate
Based on 48 resolved cases by this examiner. Grant probability derived from career allow rate.
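Since the note says grant probability is derived from the career allow rate, the 62% figure follows directly from the examiner's career numbers above (the page apparently truncates rather than rounds):

```python
# Grant probability as the examiner's career allow rate.
granted, resolved = 30, 48   # from "Examiner Intelligence" above
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 62.5%, shown as 62%
```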
