Prosecution Insights
Last updated: April 19, 2026
Application No. 18/454,000

VEHICLE DISPLAY DEVICE

Status: Non-Final OA (§103)
Filed: Aug 22, 2023
Examiner: BAAJOUR, SHAHIRA
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Yazaki Corporation
OA Round: 3 (Non-Final)
Grant Probability: 72% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 11m
With Interview: 93%

Examiner Intelligence

Career allow rate: 72% (114 granted / 159 resolved; +19.7% vs Tech Center average)
Interview lift: +21.7% (resolved cases with interview)
Typical timeline: 2y 11m average prosecution; 29 applications currently pending
Career history: 188 total applications across all art units

Statute-Specific Performance

§101: 10.5% (-29.5% vs TC avg)
§103: 41.0% (+1.0% vs TC avg)
§102: 14.3% (-25.7% vs TC avg)
§112: 32.6% (-7.4% vs TC avg)
Tech Center averages are estimates. Based on career data from 159 resolved cases.
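The per-statute figures above appear to be simple differences between the examiner's rate and the Tech Center average. A minimal sketch of that arithmetic; note that the 40.0% baseline is inferred from the displayed deltas (e.g. 41.0% minus a +1.0% delta), not stated in the source:

```python
# Examiner's statute-specific rates, as displayed in the panel above.
examiner_rates = {"101": 10.5, "103": 41.0, "102": 14.3, "112": 32.6}

# Assumed Tech Center average baseline, inferred from the displayed deltas.
TC_AVG = 40.0

for statute, rate in examiner_rates.items():
    delta = rate - TC_AVG
    print(f"§{statute}: {rate:.1f}% ({delta:+.1f}% vs TC avg)")
```

Each printed delta matches the panel, which suggests a single shared baseline rather than per-statute Tech Center averages.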

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 02/20/2026 has been entered.

Status of the Claims

Claims 1 and 5 have been amended; claims 2, 4, and 6 have been canceled; no claims have been added. Accordingly, claims 1, 3, 5, and 7-9 are pending.

Response to Arguments

Applicant's arguments, filed on 02/20/2026, with respect to the rejection of the claims under 35 U.S.C. § 103 have been fully considered and are persuasive. The rejection has therefore been withdrawn. However, upon further consideration, a new ground of rejection is made in view of the amendments and the applicant's arguments.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term "means" or "step" or a term used as a substitute for "means" that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term "means" or "step" or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word "for" (e.g., "means for") or another linking word or phrase, such as "configured to" or "so that"; and
(C) the term "means" or "step" or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word "means" (or "step") in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. That presumption is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Conversely, absence of the word "means" (or "step") in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph; that presumption is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word "means" (or "step") are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations that do not use the word "means" (or "step") are not being so interpreted, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word "means" but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitations use a generic placeholder coupled with functional language, without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitations are:

- Claim 1: a position information acquisition unit that acquires;
- Claim 3: a traveling trajectory acquisition unit that acquires a traveling trajectory of the preceding vehicle;
- Claims 5 and 7: a road shape detection unit that detects a road shape.

Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof (the ECU on page 9 of the specification). If applicant does not intend to have these limitations so interpreted, applicant may: (1) amend the claim limitations to avoid interpretation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitations recite sufficient structure to perform the claimed function.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 1 is rejected under 35 U.S.C. 103 as being unpatentable over KOSAKA (US 2018/0240258 A1) in view of SUZUKI (JP 5826375 B2; the Examiner relies on the English translation attached herein).

Regarding claim 1, KOSAKA discloses a vehicle display device comprising: a display mounted on a vehicle ([0009]), the display superimposing and displaying an image on a real landscape in front of the vehicle with respect to a front windshield of the vehicle ([0026]-[0029]; Fig. 1); a controller configured to control the display (Fig. 2, display processor; [0046]); and a position information acquisition unit that acquires a position of a preceding vehicle followed by the vehicle ([0037]); wherein the controller calculates a first straight line connecting the position of the preceding vehicle acquired by the position information acquisition unit and a position of the vehicle ([0043], left-right difference; [0044]; [0056]-[0057]; Fig. 7); when the controller determines that the first straight line is along a straight direction of the vehicle, the controller displays a marker image at a position below the preceding vehicle as viewed from a viewpoint position of the vehicle (Fig. 7, [0050]; Fig. 8B, [0127]-[0128]: when the center of the preceding-vehicle marker matches the center of the host vehicle, i.e., a straight line); when the controller determines that the first straight line is inclined in an inclination direction with respect to the straight direction, the controller performs display movement control to display the marker image so as to be shifted from the position below the preceding vehicle and in a direction opposite to the inclination direction (Figs. 8A-B; [0127]-[0131]); and, in the display movement control, determining an amount by which the preceding vehicle image is shifted based on the inclination direction of the first straight line with respect to the straight direction (Figs. 8A-E; [0050]; [0127]-[0131]).
However, KOSAKA does not explicitly state: a plurality of sensors, one of which acquires a steering angle of a steering wheel of the vehicle; determining an amount by which the marker image is shifted from the position below the preceding vehicle based on the inclination of the first straight line with respect to the straight direction; determining an amount by which the preceding vehicle image is shifted from the position below the preceding vehicle based on a steering angle of the steering wheel of the vehicle; or that, when an angle formed by the straight direction and the first straight line is A1, a steering angle of the steering wheel of the vehicle is θ, and an amount by which the preceding vehicle image is shifted is an angle AS formed by the first straight line and a second straight line connecting a center of the marker image and the vehicle, the amount by which the marker image is shifted from the position below the preceding vehicle is expressed by AS = A1 ÷ 2 + θ.

On the other hand, SUZUKI teaches: a plurality of sensors, one of which acquires a steering angle of a steering wheel of the vehicle ([0011]: "The steering sensor 4 detects the steering angle of the own vehicle and outputs it to the display controller 2."); determining an amount by which the marker image is shifted from the position below the preceding vehicle based on the inclination of the first straight line with respect to the straight direction, and determining an amount by which the preceding vehicle image is shifted from the position below the preceding vehicle based on a steering angle of the steering wheel of the vehicle ([0017]: "The display controller 2 determines the display position of the recommended viewpoint marker 20 according to the vehicle speed measured by the vehicle speed sensor 3 and the steering angle detected by the steering sensor 4"; [0018]; [0019]: "the display controller 2 determines the reach distance according to the vehicle speed and the display direction according to the steering angle, corrects the reach distance according to the steering angle, and displays the recommended viewpoint marker 20 at that position. Do. Note that the display controller 2 may adjust the display position so that the recommended viewpoint marker 20 does not deviate from the road, based on the road curvature, the road width, and the like recognized by the image processing unit 8 by the image recognition process. Alternatively, the display position may be adjusted so that the recommended viewpoint marker 20 does not deviate from the road, based on the degree of curvature of the road, the road width"; [0020]); and shifting the marker according to steering angle and travel direction ([0020]: "When the steering angle changes from 0 degrees to 180 degrees, the display controller 2 moves the display position of the recommended viewpoint marker 20 from the point 34 to the point 35."; [0041]: "the display controller 2 moves the display position of the recommended viewpoint marker 20 in the left and right direction according to the steering angle"; i.e., the marker position is shifted according to the steering angle, which corresponds to the claimed "display movement control" for shifting a marker position based on vehicle direction).

Note: With respect to the mathematical equation, SUZUKI teaches the parameters used in determining the marker display position, including the steering angle and the vehicle travel direction. Once these parameters are known, calculating the shift amount using a mathematical relationship such as AS = A1 ÷ 2 + θ would have been an obvious mathematical implementation for determining the marker display offset. Selecting a particular mathematical expression for combining known parameters represents routine optimization and a matter of design choice within the ordinary skill in the art.

It would have been obvious to a person of ordinary skill in the art to modify the system of KOSAKA to determine the amount by which the marker image is shifted from the position below the preceding vehicle based on the vehicle steering angle and the direction of travel, as taught by SUZUKI, in order to improve alignment of the displayed marker with the driving trajectory of the vehicle and provide more accurate driving assistance information to the driver.

Claims 3, 5, and 8 are rejected under 35 U.S.C. 103 as being unpatentable over KOSAKA and SUZUKI in further view of IDE (US 2018/0181820).

Regarding claim 3, KOSAKA discloses: when the controller determines that the first straight line is inclined with respect to the straight direction and when the controller cannot acquire the traveling trajectory of the preceding vehicle by the traveling trajectory acquisition unit, the controller performs the display movement control ([0127]-[0131]; Figs. 5, 7, 8; under the BRI of the claims, the KOSAKA system, which cannot acquire a trajectory, yields the display movement control as described and mapped in the rejection of claim 1 above).
However, KOSAKA does not explicitly state a traveling trajectory acquisition unit that acquires a traveling trajectory of the preceding vehicle, wherein, when the controller determines that the first straight line is inclined with respect to the straight direction and when the controller can acquire the traveling trajectory of the preceding vehicle by the traveling trajectory acquisition unit, the controller performs traveling trajectory reference control of displaying a center of the preceding vehicle image so as to overlap a center of the traveling trajectory instead of the display movement control.

On the other hand, IDE teaches a traveling trajectory acquisition unit that acquires a traveling trajectory of the preceding vehicle, wherein, when the controller determines that the first straight line is inclined with respect to the straight direction and when the controller can acquire the traveling trajectory of the preceding vehicle by the traveling trajectory acquisition unit, the controller performs traveling trajectory reference control of displaying a center of the preceding vehicle image so as to overlap a center of the traveling trajectory instead of the display movement control ([0069]-[0074]; Figs. 2, 3, 5, 7, and 9; Note: given that the system in IDE can calculate the trajectory of the preceding vehicle, the display movement control does not need to be performed, since it is satisfied by the KOSAKA reference above).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of KOSAKA to include the features of IDE, with a reasonable expectation of success.
Combining the display system in KOSAKA, which yields the display movement control, with IDE, which uses trajectory estimation to determine the amount of deviation of the preceding vehicle from the host vehicle based on the inclination between the lanes of the two vehicles, allows the system to more accurately trace and track the traveling lane of the preceding vehicle, as disclosed by IDE ([0006]-[0007]).

Regarding claim 5, KOSAKA discloses: when the controller determines that the first straight line is inclined with respect to the straight direction and when the controller cannot acquire the shape of the traveling lane of the preceding vehicle by the road shape detection unit, the controller performs the display movement control ([0127]-[0131]; Fig. 8; under the BRI of the claims, the KOSAKA system, which cannot acquire a trajectory, yields the display movement control as described and mapped in the rejections of claims 1 and 3 above; the same reasoning applies to the capability of the KOSAKA system to estimate the road shape and perform the display control).

However, KOSAKA does not explicitly state a road shape detection unit that detects a road shape, wherein, when the controller determines that the first virtual straight line is inclined with respect to the straight direction and when the controller can acquire a shape of the traveling lane of the preceding vehicle by the road shape detection unit, the controller performs, instead of the display movement control, road shape reference control of displaying the preceding vehicle image at a center of the traveling lane based on the detected shape of the traveling lane.
On the other hand, IDE teaches a road shape detection unit that detects a road shape, wherein, when the controller determines that the first virtual straight line is inclined with respect to the straight direction and when the controller can acquire a shape of the traveling lane of the preceding vehicle by the road shape detection unit, the controller performs, instead of the display movement control, road shape reference control of displaying the preceding vehicle image at a center of the traveling lane based on the detected shape of the traveling lane ([0044]-[0045]; [0091]-[0092]; Note: given that the system in IDE can calculate the road shape, the display movement control does not need to be performed, since it is satisfied by the KOSAKA reference above).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of KOSAKA to include the features of IDE, with a reasonable expectation of success. Combining the display system in KOSAKA, which yields the display movement control, with IDE, which uses the road shape to determine the amount of deviation of the preceding vehicle from the host vehicle based on the inclination between the lanes of the two vehicles, allows the system to more accurately trace and track the traveling lane of the preceding vehicle, as disclosed by IDE ([0006]-[0007]).

Regarding claim 8, KOSAKA discloses: the controller displays the marker image at the center of the traveling lane based on a position of the travel road (see Fig. 8).
However, KOSAKA does not explicitly state: the road shape detection unit detects at least one of a pair of colored lines sandwiching the traveling lane of the preceding vehicle as the road shape; and, in the road shape reference control, the controller displays the preceding vehicle image at the center of the traveling lane based on a position of the colored line detected by the road shape detection unit.

On the other hand, IDE teaches: the road shape detection unit detects at least one of a pair of colored lines sandwiching the traveling lane of the preceding vehicle as the road shape; and, in the road shape reference control, the controller displays the preceding vehicle image at the center of the traveling lane based on a position of the colored line detected by the road shape detection unit ([0044]-[0045]; [0082]-[0085]; [0091]-[0092]).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of KOSAKA to include the features of IDE, with a reasonable expectation of success. Combining the display system in KOSAKA, which yields the display movement control, with IDE, which uses the road shape to determine the amount of deviation of the preceding vehicle from the host vehicle based on the inclination between the lanes of the two vehicles, allows the system to more accurately trace and track the traveling lane of the preceding vehicle, as disclosed by IDE ([0006]-[0007]).

Allowable Subject Matter

Claims 7 and 9 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Onda (US 10095039 B2) teaches a head-up display that includes a display unit displaying images in a display region defined on a vehicle windshield, a vehicle information acquisition unit acquiring vehicle information, a forward view information acquisition unit acquiring forward view information, a display object detection unit detecting a display object for which guidance information is required to be displayed in the display region, a display control unit, and a display form setting unit. The display control unit generates a guidance image indicating the guidance information of the detected display object and displays the generated guidance image over the display object in a superimposed manner on the windshield. The display form setting unit calculates an annoyance value indicative of the annoyance level felt by a vehicle occupant when the guidance image is displayed in a superimposed manner, and sets a display form of the guidance image so that the calculated annoyance value is within a predetermined appropriate range.

SUN (US 2011/0301813 A1) discloses a method of displaying virtual lane markings relative to a vehicle's position within a roadway lane, which may entail reading vehicle data such as speed into a vehicle control module, determining whether the vehicle data is above a particular threshold, switching a virtual lane display switch, determining weather conditions, and displaying virtual lane markings upon a vehicle windshield based upon the result of determining weather conditions. Detecting actual lane markings on one or both of the left side and the right side of the roadway lane may be accomplished with a vehicle-mounted camera. Moreover, determining whether a steering wheel has rotated a predetermined number of degrees may further play a role in displaying the virtual lane markings on a windshield of the vehicle. From a driver's viewing perspective through the windshield, virtual lane markings may be displayed to overlay actual lane markings.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHAHIRA BAAJOUR, whose telephone number is (313) 446-6602. The examiner can normally be reached 9:00 am - 6:00 pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, SCOTT BROWNE, can be reached at (571) 270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/SHAHIRA BAAJOUR/
Examiner, Art Unit 3666
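The claim 1 dispute turns on the shift equation AS = A1 ÷ 2 + θ, which the Office Action treats as routine once its parameters are known. A minimal sketch of the claimed relationship; the function name and example values are illustrative only, not from the prosecution record:

```python
def marker_shift_angle(a1_deg: float, steering_deg: float) -> float:
    """Claimed shift amount AS = A1/2 + theta, where A1 is the angle between
    the vehicle's straight direction and the first straight line to the
    preceding vehicle, and theta is the steering-wheel angle (degrees)."""
    return a1_deg / 2 + steering_deg

# Illustrative values: a 10-degree inclination with 5 degrees of steering
# yields a 10-degree marker shift.
print(marker_shift_angle(10.0, 5.0))  # -> 10.0
```

As the equation shows, the claimed shift blends two independent signals (half the line-of-sight inclination plus the full steering angle), which is the combination the Examiner maps to KOSAKA's inclination-based shift and SUZUKI's steering-based shift.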

Prosecution Timeline

Aug 22, 2023
Application Filed
Apr 19, 2025
Non-Final Rejection — §103
Jul 17, 2025
Applicant Interview (Telephonic)
Jul 24, 2025
Response Filed
Sep 13, 2025
Examiner Interview Summary
Oct 18, 2025
Final Rejection — §103
Jan 16, 2026
Response after Non-Final Action
Feb 20, 2026
Request for Continued Examination
Mar 05, 2026
Response after Non-Final Action
Mar 07, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596374
TRAVELING VEHICLE SYSTEM
2y 5m to grant Granted Apr 07, 2026
Patent 12597345
METHODS, SYSTEMS, AND DEVICES FOR E-MIRROR TRAFFIC LANE IDENTIFICATION
2y 5m to grant Granted Apr 07, 2026
Patent 12589319
A system and method for controlling a plurality of karts implementing at least two communication networks.
2y 5m to grant Granted Mar 31, 2026
Patent 12592089
Method of Classifying a Road Surface Object, Method of Training an Artificial Neural Network, and Method of Operating a Driver Warning Function or an Automated Driving Function
2y 5m to grant Granted Mar 31, 2026
Patent 12583315
DISPLAY CONTROL DEVICE, DISPLAY DEVICE, VEHICLE, DISPLAY CONTROL METHOD, AND RECORDING MEDIUM RECORDED WITH DISPLAY CONTROL PROGRAM
2y 5m to grant Granted Mar 24, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 72%
With Interview: 93% (+21.7%)
Median Time to Grant: 2y 11m
PTA Risk: High
Based on 159 resolved cases by this examiner. Grant probability derived from career allow rate.
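The note above says the grant probability is derived from the career allow rate. A minimal sketch of that arithmetic, assuming the with-interview figure is simply the base rate plus the interview lift in percentage points (the source does not state the exact model):

```python
# Career figures from the examiner panel above.
granted, resolved = 114, 159
interview_lift_pts = 21.7  # percentage points, per the panel

allow_rate_pct = granted / resolved * 100  # 71.7%, shown rounded as 72%
print(f"Grant probability: {allow_rate_pct:.0f}%")                      # -> 72%
print(f"With interview:   {allow_rate_pct + interview_lift_pts:.0f}%")  # -> 93%
```

Note that the lift must be added to the unrounded base rate (71.7 + 21.7 = 93.4) to reproduce the displayed 93%; adding it to the rounded 72% would give 94%.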
