Prosecution Insights
Last updated: April 19, 2026
Application No. 18/327,926

DISPLAY CONTROL DEVICE FOR A VEHICLE, DISPLAY SYSTEM FOR A VEHICLE, AND DISPLAY CONTROL METHOD FOR A VEHICLE FOR DISPLAYING AND DELETING SUPERIMPOSED INFORMATION

Status: Final Rejection (§103)
Filed: Jun 02, 2023
Examiner: MATTA, ALEXANDER GEORGE
Art Unit: 3668
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Toyota Jidosha Kabushiki Kaisha
OA Round: 4 (Final)

Grant Probability: 72% (Favorable)
Predicted OA Rounds: 5-6
Predicted Time to Grant: 3y 0m
Grant Probability with Interview: 94%

Examiner Intelligence

Career Allow Rate: 72% (above average); 98 granted / 137 resolved (+19.5% vs TC avg)
Interview Lift: +22.6% (strong); allowance rate with vs. without an interview, among resolved cases with an interview
Typical Timeline: 3y 0m average prosecution; 42 applications currently pending
Career History: 179 total applications across all art units

Statute-Specific Performance

§101: 8.5% (-31.5% vs TC avg)
§103: 54.2% (+14.2% vs TC avg)
§102: 13.0% (-27.0% vs TC avg)
§112: 21.7% (-18.3% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 137 resolved cases
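The headline numbers above are simple ratios of the career counts. A quick check reproduces them; note that treating "resolved" as granted plus abandoned and "total" as resolved plus pending are assumptions about how the dashboard defines its labels:

```python
# Figures taken from the dashboard above; the "resolved"/"pending" semantics are assumptions.
granted = 98
resolved = 137
pending = 42

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 71.5%, displayed rounded as 72% on the card

total_applications = resolved + pending
print(f"Total applications: {total_applications}")  # 179, matching the career history figure

# The "+19.5% vs TC avg" delta implies a Tech Center average of roughly:
tc_avg = allow_rate - 0.195
print(f"Implied TC average: {tc_avg:.1%}")
```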

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This Office Action is in response to Applicant Amendment and Arguments filed on 11/03/2025. Claim(s) 1-10 are pending for examination. This Action is made FINAL.

Response to Arguments

With regards to claim(s) 1-10 previously rejected under 35 U.S.C. 103, applicant's arguments have been fully considered, but are deemed moot in view of new grounds of rejection necessitated by Applicant's amendment.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 1-3 and 5-9 are rejected under 35 U.S.C. 103 as being unpatentable over Mori (US 20200180519 A1) in view of Nagasawa et al. (US 20140092134 A1, hereinafter known as Nagasawa), Thompson et al. (US 20220179223 A1, hereinafter known as Thompson), Edwards (US 20210072831 A1), and Hiraiwa et al. (US 20200148105 A1; hereinafter known as Hiraiwa).
Mori, Nagasawa, Thompson and Edwards were cited in a previous office action.

Regarding claim 1, Mori teaches A display control device for a vehicle, comprising a processor configured to: acquire a position of an obstacle in a vehicle's surroundings; {Abstract “A display control device for a vehicle include: an acquisition unit that acquires information of an object located at a progress path of the vehicle; and a display control unit that, on the basis of the information acquired by the acquisition unit, causes information of the object to be displayed at a display unit of a spectacles-form wearable terminal, the wearable terminal being provided with the display unit and being configured to be worn by an occupant of the vehicle.” Fig. 6 shows the system acquiring a pedestrian crossing the street, which can be considered an obstacle. Para [0045] “The ITS-ECU 26 executes object notification processing, which is described below, on the basis of infrastructure information received from the traffic condition provision system 12. The ITS-ECU 26 includes at least a central processing unit (CPU), a memory that serves as a temporary storage area, and a nonvolatile storage unit. A control program that causes the ITS-ECU 26 to execute the object notification processing is stored in the storage unit. The CPU of the ITS-ECU 26 reads the control program from the storage unit, loads the control program into the memory, and executes the loaded control program.” } determine a distance from the vehicle to the obstacle; {Para [0057] “In step 104, the ITS-ECU 26 makes a determination as to whether the automobile has entered the processing target intersection. If the result of the determination in step 104 is negative, the ITS-ECU 26 proceeds to step 106.
In step 106, the ITS-ECU 26 calculates a distance between the position of the intersection contained in the received infrastructure information and a position of the present vehicle detected by the GPS sensor 28, and makes a determination as to whether the present vehicle is within a predetermined distance L2 from the processing targeting intersection (L2<L1). If the result of the determination in step 106 is negative, the ITS-ECU 26 returns to step 100, and if the result of the determination in step 106 is affirmative, the ITS-ECU 26 proceeds to step 108.” Para [0059] “On the other hand, if an object such as a walker, cyclist or the like is present within the predetermined range from the path along which the present vehicle turning at the processing target intersection is expected to proceed, there is a possibility of the present vehicle approaching to less than a predetermined distance from the object when the present vehicle turns at the processing target intersection. Therefore, if the result of the determination in step 108 is affirmative, the ITS-ECU 26 proceeds to step 110 and specifies that the walker, cyclist or the like located within the predetermined range from the expected progress route is a notification object. Then, in step 112, the ITS-ECU 26 executes AR display control processing.” } perform superimposed display of information on { Fig. 6 shows the system acquiring a pedestrian crossing the street, which can be considered an obstacle. Para [0065-0066] “FIG. 6 illustrates an example of a field of view seen through the AR glasses 40 by the occupant wearing the AR glasses 40 as a result of the processing described above in the situation illustrated in FIG. 5. In the example illustrated in FIG. 6, a graphic 66 representing the walker 62 who is obscured by the obstruction 64 as seen by the occupant wearing the AR glasses 40 is included in the field of view.
Therefore, the occupant can be made aware of the presence of the walker 62 who is obscured by the obstruction 64. In the example illustrated in FIG. 6, a rectangular frame 68 that emphasizes the area in which the walker 62 who is obscured by the obstruction 64 is located is also included. Therefore, because the graphic 66 is displayed, a case of the occupant failing to notice the presence of the notified walker 62 who is obscured by the obstruction 64 can be suppressed. The frame 68 emphasizing the area in which the notification object is located is also displayed in a case in which the notification object is not obscured by an obstruction as seen by the occupant wearing the AR glasses 40. Therefore, even in a case in which a notification object is not obscured by an obstruction, a case of the occupant failing to notice the presence of the notification object can be suppressed.” } determine whether the vehicle occupant has perceived the obstacle; and delete the display of the information when it is determined that the vehicle occupant has perceived the obstacle while the information is displayed so the vehicle occupant can identify a new obstacle at an early stage. {Para [0076-0078] “In step 130, the ITS-ECU 26 acquires results of detection of the eyeline of the occupant wearing the AR glasses 40 according to the eyeline cameras 50 of the AR glasses 40. In step 132, on the basis of the eyeline detection results acquired in step 130, the ITS-ECU 26 makes a determination as to whether the graphic 66 and frame 68 or the like displayed at the display unit 46 of the AR glasses 40 has been seen by the occupant wearing the AR glasses 40. More specifically, the ITS-ECU 26 makes a determination as to whether, for example, the eyeline of the occupant wearing the AR glasses 40 has spent at least a predetermined duration (for example, a duration of around one second) at the display position of the graphic 66 and frame 68 or the like in the display unit 46 of the AR glasses 40. 
If the result of the determination in step 132 is negative, the ITS-ECU 26 proceeds to step 136. Alternatively, if the result of the determination in step 132 is affirmative, the ITS-ECU 26 may determine that the occupant wearing the AR glasses 40 is aware of the presence of the notification object corresponding with the graphic 66 and frame 68 or the like seen by the occupant wearing the AR glasses 40, and the ITS-ECU 26 proceeds to step 134. In step 134, the ITS-ECU 26 causes a display at the display unit 46 of the AR glasses 40, in which the display of the graphic 66 and frame 68 or the like that the ITS-ECU 26 determines has been seen by the occupant wearing the AR glasses 40 is erased. Thus, overcrowding of displays of the graphic 66, the frame 68 and the like in the display unit 46 of the AR glasses 40 may be suppressed.” } Mori does not teach, determine a distance from the vehicle to the obstacle; perform superimposed display of information on a windshield of the vehicle wherein the display of information is performed in a first manner if the determined distance is greater than a predetermined threshold and in a second manner if the determined distance is less than or equal to the predetermined threshold; wherein the processor is further configured to: determine that a sightline direction of the vehicle occupant is directed toward the obstacle, and while the occupant’s sightline is directed toward the object, change the color or the shape of the information; wherein the processor displays a mark and displays a type of the obstacle in the vicinity of the mark. However, Nagasawa teaches determine a distance from the vehicle to the obstacle; {Para [0037] “A vehicle speed sensor 5 is provided at a rear wheel of the vehicle to detect vehicle speed information, which is input to the processing unit 10. In the cabin, a stereo camera unit 8 including two cameras is provided to capture an image in front of and outside of the vehicle. 
The stereo camera unit 8 uses the parallax thereof to measure the distance to the road surface or an object in front of the vehicle. Further, in the cabin, provided are a gaze detection unit 9 for detecting the gaze of the driver, an operation unit 11 for setting a destination desired by the driver, a microphone 15 for setting the destination by voice, and an illuminance sensor 14 for detecting brightness.” } perform superimposed display of information on a windshield of the vehicle with a heads-up-displayed device at the position of the obstacle as viewed by a vehicle occupant, at a display region provided in front of the vehicle occupant; {Abstract “A visual guidance system includes an image display to present an image overlaid on a windshield in front of a driver of a vehicle, a processor to output image information on a virtual line to display visual guidance to the image display, and a steering input detector to detect a steering input. The processor presents an attention attracting indication about an object outside of the vehicle in synchronization with the virtual line in such a manner that the virtual line extends from above the driver along a course of the vehicle seen within the windshield, and a pointing end of the virtual line is overlaid on a road surface on the course seen within the windshield. The processor outputs image information for changing the attention attracting indication to be less conspicuous than the virtual line in accordance with a steering input signal given by the steering input detector.” Fig. 6- fig. 
11 are illustrations of the display system } It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Mori to incorporate the teachings of Nagasawa to use a heads up display on a windshield because it reduces driving burden (Nagasawa para [0013] “In order to reduce the burden imposed on the driver, it is an object of the present invention is a visual guidance system configured to timely present a visual guide for notifying a driving direction and objects outside of a vehicle on the basis of a steering input signal.”) Mori in view of Nagasawa does not teach, wherein the display of information is performed in a first manner if the determined distance is greater than a predetermined threshold and in a second manner if the determined distance is less than or equal to the predetermined threshold; wherein the processor is further configured to: determine that a sightline direction of the vehicle occupant is directed toward the obstacle, and while the occupant’s sightline is directed toward the object, change the color or the shape of the information; wherein the processor displays a mark and displays a type of the obstacle in the vicinity of the mark. However, Thompson teaches perform superimposed display of information on a windshield of the vehicle with a heads-up-displayed device at the position of the obstacle as viewed by a vehicle occupant, at a display region provided in front of the vehicle occupant; {Para [0037] “Referring also to FIG. 2B, in embodiments the HUD 100 traffic overlay may arrange and/or modify interactive symbology 212a-e based on one or more criteria (e.g., horizontal distance from ownship, angular displacement from boresight) selectable by the pilot and modifiable via the heads-up controller 206. 
For example, the control processors 202 may determine that the position of the proximate aircraft corresponding to the interactive symbol 212a is closest to the ownship position, and that therefore the interactive symbol 212a is to be placed first in the ordered sequence of interactive symbols 212a-d. In some embodiments, the interactive symbol 212a may also be displayed with increased or reduced prominence (e.g., greater or lesser brightness, increased or decreased size, change in color) relative to other interactive symbols to reflect the proximity of the corresponding aircraft 112 (e.g., or more generally the priority of the corresponding aircraft with respect to the ordered sequence of aircraft reporting position information). Similarly, the pilot may (e.g., via the heads-up controller 206, advance in turn through the interactive symbols 212b-d corresponding to more distant aircraft 112 (e.g., but still within the FOV of the HUD 100), and then to the interactive symbol 212e corresponding to a “parked” aircraft behind the ownship and/or its pilot (and thereby represented by a dashed symbol positioned at an edge of the FOV of the HUD 100 and corresponding to the relative position of the “parked” aircraft).” } It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Mori in view of Nagasawa to incorporate the teachings of Thompson to arrange and adjust Hud markers based on proximity (e.g. distance) because it improves situational awareness (Thompson para [0035] “Broadly speaking, embodiments of the inventive concepts disclosed herein are directed to a heads-up display (HUD) and user interface incorporating interactive heads-up display and control of traffic targets. 
For example, traffic information may be displayed via the HUD in conformance to actual proximate traffic, and interactive controls allow the pilot to directly manage traffic targets through the HUD while eyes-out, eliminating the need to cycle back and forth with heads-down traffic displays and enhancing general situational awareness.”) Mori in view of Nagasawa and Thompson does not teach, wherein the processor is further configured to: determine that a sightline direction of the vehicle occupant is directed toward the obstacle, and while the occupant’s sightline is directed toward the object, change the color or the shape of the information; wherein the processor displays a mark and displays a type of the obstacle in the vicinity of the mark. However, Edwards teaches wherein the processor is further configured to: determine that a sightline direction of the vehicle occupant is directed toward object, and while the occupant’s sightline is directed toward the object, change the color or the shape of the information; {Para [0021] “Controller/GUI driver 124 is communicatively coupled to computer 116 to receive user commands that computer 116 has determined correspond to the gesture and gaze combination made by hand 128 and eye 129. Although in the illustrated embodiment it is shown as a separate component, in other embodiments the functions of controller/GUI driver 124 can be incorporated into and performed by computer 116. Controller/GUI driver 124 is also coupled to display 126, which can display a set of one or more graphic user interface controls that can then be selected, manipulated, or otherwise interacted with based on the user commands received from computer 116 (e.g., the combination of gestures with gazes). Furthermore, controller/GUI driver 124 may react to gaze detection by altering a user interface (e.g., shadowing, highlighting, adjusting color, etc.) 
in response to gaze detection in order for a vehicle occupant to visually identify what region their gaze is detected in.” } It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Mori in view of Nagasawa and Thompson to incorporate the teachings of Edwards to change the color of the information when the occupant’s sightline is directed toward the object because it allows the occupant to identify what region their gaze is being detected in, as discussed in Edwards Para [0021] (“... Furthermore, controller/GUI driver 124 may react to gaze detection by altering a user interface (e.g., shadowing, highlighting, adjusting color, etc.) in response to gaze detection in order for a vehicle occupant to visually identify what region their gaze is detected in.”) Mori in view of Nagasawa, Thompson, and Edwards does not teach, wherein the processor displays a mark and displays a type of the obstacle in the vicinity of the mark. However, Hiraiwa teaches wherein the processor displays a mark and displays a type of the obstacle in the vicinity of the mark. {Fig. 9 and fig. 10 and Para [0083] “Namely, during restricted display decided by the determination section 60, as illustrated in FIG.
9, an output section 98 (see FIG. 5) of the information control device 91 displays masking images 64, 100 overlaid on potential hazards 52, 102 (see FIG. 10), and displays the icons 94, 96 corresponding to the potential hazards 52, 102 superimposed on the respective masking images 64, 100. As an example, the masking image 100 covers the entirety of another vehicle, this being a potential hazard 102, and is formed in a rectangular shape filled in with colors approximated to the scenery in the surroundings of the potential hazard 102. Note that although the masking image 100 is illustrated with an outline in FIG. 10 in order to facilitate understanding of the masking image 100, this outline does not have to be present.” Para [0085] “As an example, the icon 96 is a simplified depiction of a vehicle (a symbol in the shape of a vehicle), this being a potential hazard 102, and is displayed on the masking image 100 displayed overlaid on the potential hazard 102 (see FIG. 10). Note that although the icons 94, 96 described above are displayed superimposed on the respective masking images 64, 100, there is no limitation thereto, and the icons 94, 96 may be displayed close to the respective masking images 64, 100. 
Moreover, there is no limitation to the icons 94, 96 described above, and although not illustrated in the drawings, icons representing other vehicle types, such as a bicycle, or another type of obstacle, may be displayed in accordance with to the type of potential hazard.” Box (64, 100) can be considered a mark and the icon (94, 96) can be considered as displaying a type of object } It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Mori in view of Nagasawa, Thompson, and Edwards to incorporate the teachings of Hiraiwa to display type of object because as discussed in para [0088] of Hiraiwa “With the exception of the fact that the icons 94, 96 serving as simplified information are displayed during restricted display, the configuration described above is similar to that of the vehicle information provision device 10 of the first exemplary embodiment. The configuration obtains similar advantageous effects to the first exemplary embodiment. Moreover, in a case in which the driver is not observing the situation in the surroundings of the vehicle 14 during self-driving of the vehicle 14, the information control device 91 displays the icons 94, 96 corresponding to information regarding the potential hazards 52, 102 superimposed on the masking images 64, 100, thereby enabling the driver to ascertain that the potential hazards 52, 102 are present in the surroundings of the vehicle 14, without being overly distracted by the potential hazards 52, 102. This enables the comfort of the driver during self-driving to be still further improved.” Regarding claim 2, Mori in view of Nagasawa, Thompson, Edwards, and Hiraiwa teaches The display control device for a vehicle of claim 1. 
Mori teaches wherein the processor acquire a sightline direction of the vehicle occupant, wherein delete the display of the information in a case in which a sightline direction of the vehicle occupant, which has been acquired, has been directed toward the obstacle for a predetermined time. {Para [0076-0078] “In step 130, the ITS-ECU 26 acquires results of detection of the eyeline of the occupant wearing the AR glasses 40 according to the eyeline cameras 50 of the AR glasses 40. In step 132, on the basis of the eyeline detection results acquired in step 130, the ITS-ECU 26 makes a determination as to whether the graphic 66 and frame 68 or the like displayed at the display unit 46 of the AR glasses 40 has been seen by the occupant wearing the AR glasses 40. More specifically, the ITS-ECU 26 makes a determination as to whether, for example, the eyeline of the occupant wearing the AR glasses 40 has spent at least a predetermined duration (for example, a duration of around one second) at the display position of the graphic 66 and frame 68 or the like in the display unit 46 of the AR glasses 40. If the result of the determination in step 132 is negative, the ITS-ECU 26 proceeds to step 136. Alternatively, if the result of the determination in step 132 is affirmative, the ITS-ECU 26 may determine that the occupant wearing the AR glasses 40 is aware of the presence of the notification object corresponding with the graphic 66 and frame 68 or the like seen by the occupant wearing the AR glasses 40, and the ITS-ECU 26 proceeds to step 134. In step 134, the ITS-ECU 26 causes a display at the display unit 46 of the AR glasses 40, in which the display of the graphic 66 and frame 68 or the like that the ITS-ECU 26 determines has been seen by the occupant wearing the AR glasses 40 is erased. 
Thus, overcrowding of displays of the graphic 66, the frame 68 and the like in the display unit 46 of the AR glasses 40 may be suppressed.” } Regarding Claim 3, Mori in view of Nagasawa, Thompson, Edwards, and Hiraiwa teaches The display control device for a vehicle of claim 1. Nagasawa teaches wherein the processor acquire a steering direction of the vehicle, wherein delete the display of the information in a case in which the steering direction of the vehicle, which has been acquired, is a direction of moving away from the obstacle. {Para [0064] “FIG. 11 indicates that the attention attracting indication LA is erased and only the virtual line L is presented so that the attention attracting indication LA is displayed less conspicuously than the virtual line L after the driver A steered around the object outside of the vehicle (bicycle B) by steering operation.” Para [0074] “With the configuration as described above, natural visual guidance can be achieved in such a manner that the gaze of the driver is guided from the course based on the virtual line to the object outside of the vehicle to which it is necessary to pay attention, and further, the evasion by an steering operation of the driver is detected, and the attention attracting indication is erased, so that the indication on the windshield is simplified, which reduces the burden imposed on the driver.” } It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified Mori in view of Nagasawa, Thompson, Edwards, and Hiraiwa to incorporate additional teachings of Nagasawa to erase the visual indication due to steering away because as discussed in para [0074] of Nagasawa “With the configuration as described above, natural visual guidance can be achieved in such a manner that the gaze of the driver is guided from the course based on the virtual line to the object outside of the vehicle to which it is necessary to pay 
attention, and further, the evasion by an steering operation of the driver is detected, and the attention attracting indication is erased, so that the indication on the windshield is simplified, which reduces the burden imposed on the driver.” Regarding Claim 5, Mori in view of Nagasawa, Thompson, Edwards, and Hiraiwa teaches The display control device for a vehicle of claim 1. Nagasawa teaches wherein the display region comprises a portion of a windshield glass onto which images are projected in front of a sightline of a driver by a head-up display device. {Abstract “A visual guidance system includes an image display to present an image overlaid on a windshield in front of a driver of a vehicle, a processor to output image information on a virtual line to display visual guidance to the image display, and a steering input detector to detect a steering input. The processor presents an attention attracting indication about an object outside of the vehicle in synchronization with the virtual line in such a manner that the virtual line extends from above the driver along a course of the vehicle seen within the windshield, and a pointing end of the virtual line is overlaid on a road surface on the course seen within the windshield. The processor outputs image information for changing the attention attracting indication to be less conspicuous than the virtual line in accordance with a steering input signal given by the steering input detector.” Fig. 6- fig. 
11 are illustrations of the display system } Regarding Claim 6, Mori in view of Nagasawa, Thompson, Edwards, and Hiraiwa teaches A display system for a vehicle, the system comprising: the display control device for a vehicle of claim 1; {Mori abstract “A display control device for a vehicle include: an acquisition unit that acquires information of an object located at a progress path of the vehicle; and a display control unit that, on the basis of the information acquired by the acquisition unit, causes information of the object to be displayed at a display unit of a spectacles-form wearable terminal, the wearable terminal being provided with the display unit and being configured to be worn by an occupant of the vehicle.” Fig. 1 shows the overall display system } Nagasawa teaches and a head-up display device configured to project images onto a windshield glass. {Abstract “A visual guidance system includes an image display to present an image overlaid on a windshield in front of a driver of a vehicle, a processor to output image information on a virtual line to display visual guidance to the image display, and a steering input detector to detect a steering input. The processor presents an attention attracting indication about an object outside of the vehicle in synchronization with the virtual line in such a manner that the virtual line extends from above the driver along a course of the vehicle seen within the windshield, and a pointing end of the virtual line is overlaid on a road surface on the course seen within the windshield. The processor outputs image information for changing the attention attracting indication to be less conspicuous than the virtual line in accordance with a steering input signal given by the steering input detector.” Fig. 6- fig. 11 are illustrations of the display system } Regarding claim 7, it recites A display control method for a vehicle having limitations similar to those of claim 1 and therefore is rejected on the same basis. 
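Stepping back from the claim-by-claim mapping for a moment: the core control flow recited in device claim 1 and mirrored in method claim 7 (a threshold-dependent display manner, then deletion once the occupant's gaze has dwelt on the obstacle) can be sketched in a few lines. This is purely a hypothetical illustration: the 30 m threshold, the one-second dwell (borrowed from Mori's example), and all identifiers are assumptions, not anything disclosed in the application or the cited art.

```python
from dataclasses import dataclass

DISTANCE_THRESHOLD_M = 30.0   # hypothetical threshold; the claim leaves the value open
GAZE_DWELL_S = 1.0            # Mori's example dwell duration (around one second)

@dataclass
class Obstacle:
    distance_m: float          # determined distance from the vehicle to the obstacle
    gaze_dwell_s: float = 0.0  # how long the occupant's sightline has rested on the information
    displayed: bool = True     # whether the superimposed information is currently shown

def display_manner(obs: Obstacle) -> str:
    """First manner beyond the threshold, second manner at or below it (claims 1 and 9)."""
    return "first_color" if obs.distance_m > DISTANCE_THRESHOLD_M else "second_color"

def update_display(obs: Obstacle) -> Obstacle:
    """Delete the superimposed information once the occupant is deemed to have perceived
    the obstacle (gaze dwell at or above the threshold), so a new obstacle can be
    identified at an early stage."""
    if obs.displayed and obs.gaze_dwell_s >= GAZE_DWELL_S:
        obs.displayed = False
    return obs

far = update_display(Obstacle(distance_m=45.0, gaze_dwell_s=0.2))
near = update_display(Obstacle(distance_m=12.0, gaze_dwell_s=1.3))
print(display_manner(far), far.displayed)    # far obstacle: first manner, still displayed
print(display_manner(near), near.displayed)  # near obstacle: second manner, deleted
```

Claim 2's "directed toward the obstacle for a predetermined time" and claim 9's first/second colors correspond to `GAZE_DWELL_S` and `display_manner` in this sketch.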
Regarding claim 8, it recites A non-transitory storage medium having limitations similar to those of claim 1 and therefore is rejected on the same basis. Additionally Mori teaches A non-transitory storage medium storing a program that is executable by a computer to execute processing {Para [0045] “The ITS-ECU 26 includes at least a central processing unit (CPU), a memory that serves as a temporary storage area, and a nonvolatile storage unit. A control program that causes the ITS-ECU 26 to execute the object notification processing is stored in the storage unit. The CPU of the ITS-ECU 26 reads the control program from the storage unit, loads the control program into the memory, and executes the loaded control program.” } Regarding Claim 9, Mori in view of Nagasawa, Thompson, Edwards, and Hiraiwa teaches The display control device for a vehicle of claim 1. Thompson teaches wherein the first manner comprises displaying the information in first color, and the second manner comprises displaying the information in a second color. {Para [0037] “Referring also to FIG. 2B, in embodiments the HUD 100 traffic overlay may arrange and/or modify interactive symbology 212a-e based on one or more criteria (e.g., horizontal distance from ownship, angular displacement from boresight) selectable by the pilot and modifiable via the heads-up controller 206. For example, the control processors 202 may determine that the position of the proximate aircraft corresponding to the interactive symbol 212a is closest to the ownship position, and that therefore the interactive symbol 212a is to be placed first in the ordered sequence of interactive symbols 212a-d. 
In some embodiments, the interactive symbol 212a may also be displayed with increased or reduced prominence (e.g., greater or lesser brightness, increased or decreased size, change in color) relative to other interactive symbols to reflect the proximity of the corresponding aircraft 112 (e.g., or more generally the priority of the corresponding aircraft with respect to the ordered sequence of aircraft reporting position information). Similarly, the pilot may (e.g., via the heads-up controller 206, advance in turn through the interactive symbols 212b-d corresponding to more distant aircraft 112 (e.g., but still within the FOV of the HUD 100), and then to the interactive symbol 212e corresponding to a “parked” aircraft behind the ownship and/or its pilot (and thereby represented by a dashed symbol positioned at an edge of the FOV of the HUD 100 and corresponding to the relative position of the “parked” aircraft).” } Claim(s) 4 is rejected under 35 U.S.C. 103 as being unpatentable over Mori (US 20200180519 A1) in view of Nagasawa et al. (US 20140092134 A1, hereinafter known as Nagasawa), Thompson et al. (US 20220179223 A1, hereinafter known as Thompson), Edwards (US 20210072831 A1), Hiraiwa et al. (US 20200148105 A1; hereinafter known as Hiraiwa), and Arai (US 20060049927 A1). Arai was cited in a previous office action. 
Regarding Claim 4, Mori in view of Nagasawa, Thompson, Edwards, and Hiraiwa teaches The display control device for a vehicle of claim 1. Mori in view of Nagasawa, Thompson, Edwards, and Hiraiwa does not teach wherein the processor acquire an acceleration of the vehicle, wherein delete the display of the information in a case in which the acceleration of the vehicle, which has been acquired, has decreased by a predetermined amount or more.

However, Arai teaches wherein the processor acquire an acceleration of the vehicle, wherein delete the display of the information in a case in which the acceleration of the vehicle, which has been acquired, has decreased by a predetermined amount or more {Para [0070] “In this embodiment, the system provides a vehicle-stop mark (NB) as an alarm display for indicating stop at a crossing place, an intersection, and the like, although the system 100 of the first embodiment provides the navigation arrow-mark. Therefore, an alarm object is a stop position (PS) for a vehicle to stop next and an alarm-distance (D) is a distance between a present vehicle-position (PP) and the stop position.” A crossing place or intersection can be considered a type of obstacle; however, it should be noted that Mori already teaches an obstacle. Para [0089] “Then, the steps S103 to S105 are repeated until the alarm-distance D becomes 20 m (D1 in the table of FIG. 7). When the alarm-distance D becomes 20 m, the multistage display control unit judges whether or not the brake-pedal is operated as shown in the step S109. If the pedal is operated, the value of i is set to zero as shown in the step S108, the stop mark is erased as shown in steps S103 and S111.” Under BRI, “acquire an acceleration” need not be a precise value but could be as simple as a Boolean indicating whether deceleration is occurring, as the claimed invention does not specify how the acceleration is acquired or its format. Determining that the brake pedal is pressed can be considered a determination that the acceleration has decreased by more than zero.}

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Mori in view of Nagasawa, Thompson, Edwards, and Hiraiwa to incorporate the teachings of Arai to erase the visual indication upon decelerating the vehicle, because the driver will be less distracted if less information is displayed; see para [0090] of Arai: “Therefore, the driver can concentrate his or her attention upon confirming safety at surrounding circumstance of the stop position and driving operation of the steering, brake, and the like when the driver stops the vehicle, because of lack of continuously displaying the mark during the stop.”

Claim(s) 10 is rejected under 35 U.S.C. 103 as being unpatentable over Mori (US 20200180519 A1) in view of Nagasawa et al. (US 20140092134 A1, hereinafter known as Nagasawa), Thompson et al. (US 20220179223 A1, hereinafter known as Thompson), Edwards (US 20210072831 A1), Hiraiwa et al. (US 20200148105 A1; hereinafter known as Hiraiwa), and Yoshida et al. (US 20160082840 A1; hereinafter known as Yoshida).

Regarding Claim 10, Mori in view of Nagasawa, Thompson, Edwards, and Hiraiwa teaches The display control device for a vehicle of claim 1. Mori in view of Nagasawa, Thompson, Edwards, and Hiraiwa does not teach wherein the obstacle is a pedestrian having an upper portion and a foot portion below the upper portion, and the processor displays the mark superposed around only the foot portion. However, Yoshida teaches wherein the obstacle is a pedestrian having an upper portion and a foot portion below the upper portion, and the processor displays the mark superposed around only the foot portion {Paras [0036]-[0037] “FIG. 2 is a diagram illustrating an example in a case where the risk level corresponds to the range of (3) B≧D>0 and the virtual image of the obstacle is generated and displayed. The components used in FIG. 1 will be denoted by the same symbols, and the reference number 105 is the combiner, the scene displayed in the range is the front scene viewed by the driver 102 (the same will be applied hereinafter). The reference number 201 represents a steering, the reference number 202 represents an instrument panel, the reference number 203 represents an obstacle (a pedestrian in this case), the reference number 204 represents a virtual image of the estimated vehicle traffic line, the reference number 205 represents the generated virtual image of the obstacle, the reference number 109L represents a left speaker, and the reference number 109R represents a right speaker. In this case, the virtual image 205 is generated and displayed as a partial outline of only a lower half of an obstacle 203. The driver 102 can recognize at least a lower half of the pedestrian (that is, feet of pedestrian) and thus the safety can be secured. Further, since an unnecessary virtual image display is not performed in a state where the risk level is not high, the displayed information does not hinder the driving.” See also para [0062].}

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Mori in view of Nagasawa, Thompson, Edwards, and Hiraiwa to incorporate the teachings of Yoshida to display a mark around the foot portion because, as discussed in para [0037] of Yoshida, “In this case, the virtual image 205 is generated and displayed as a partial outline of only a lower half of an obstacle 203. The driver 102 can recognize at least a lower half of the pedestrian (that is, feet of pedestrian) and thus the safety can be secured.
Further, since an unnecessary virtual image display is not performed in a state where the risk level is not high, the displayed information does not hinder the driving.”

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALEXANDER MATTA, whose telephone number is (571) 272-4296. The examiner can normally be reached Mon-Fri, 10:00-6:00. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, James Lee, can be reached at (571) 270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/A.G.M./
Examiner, Art Unit 3668

/JUSTIN S LEE/
Primary Examiner, Art Unit 3668

Prosecution Timeline

Jun 02, 2023
Application Filed
Feb 08, 2025
Non-Final Rejection — §103
Mar 28, 2025
Applicant Interview (Telephonic)
Mar 31, 2025
Examiner Interview Summary
Apr 10, 2025
Response Filed
May 11, 2025
Final Rejection — §103
Jun 24, 2025
Interview Requested
Jul 15, 2025
Applicant Interview (Telephonic)
Jul 16, 2025
Examiner Interview Summary
Jul 17, 2025
Request for Continued Examination
Jul 22, 2025
Response after Non-Final Action
Aug 23, 2025
Non-Final Rejection — §103
Oct 17, 2025
Interview Requested
Oct 23, 2025
Applicant Interview (Telephonic)
Oct 27, 2025
Examiner Interview Summary
Nov 03, 2025
Response Filed
Jan 05, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12589770
SAFETY CONTROLLER FOR AUTOMATED DRIVING
2y 5m to grant Granted Mar 31, 2026
Patent 12570148
ACCESSORY MANAGEMENT SYSTEM THAT IDENTIFIES ACCESSORIES TO ALLOW FOR CONNECTION
2y 5m to grant Granted Mar 10, 2026
Patent 12552253
VEHICLE AND A METHOD OF CONTROLLING A DISPLAY TO OUTPUT A VISUAL INDICATION FOR INDUCING SELECTION OF A SPECIFIC DRIVING MODE
2y 5m to grant Granted Feb 17, 2026
Patent 12534132
SYSTEM AND METHOD FOR PROVIDING A VISUAL AID FOR STEERING ANGLE OFFSET IN A STEER-BY-WIRE SYSTEM
2y 5m to grant Granted Jan 27, 2026
Patent 12522245
COMPUTER-IMPLEMENTED METHOD FOR MANAGING AN OPERATIONAL DESIGN DOMAIN'S EXPANSION FOR AN AUTOMATED DRIVING SYSTEM
2y 5m to grant Granted Jan 13, 2026
Based on the examiner's 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
72%
Grant Probability
94%
With Interview (+22.6%)
3y 0m
Median Time to Grant
High
PTA Risk
Based on 137 resolved cases by this examiner. Grant probability derived from career allow rate.
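The projection figures above are presumably simple ratios over the examiner's resolved cases. A minimal sketch of that arithmetic, using only the page's 98-granted / 137-resolved figures; the interview/no-interview subgroup counts below are invented for illustration and are not from the page:

```python
# Sketch of how this page's headline figures could be computed.
# Only "98 granted / 137 resolved" comes from the page; the
# interview split is hypothetical.

def allow_rate_pct(granted: int, resolved: int) -> float:
    """Allow rate as a percentage of resolved cases."""
    return 100 * granted / resolved

# Career allow rate: 98 granted out of 137 resolved cases -> 72%
career = allow_rate_pct(98, 137)
assert round(career) == 72

# Interview lift: allow rate among resolved cases that had an examiner
# interview minus the rate among those that did not. The subgroup
# counts are hypothetical; only the resolved totals (55 + 82 = 137)
# match the page.
with_interview = allow_rate_pct(47, 55)       # hypothetical split
without_interview = allow_rate_pct(51, 82)    # hypothetical split
lift = with_interview - without_interview

# Projected grant probability with interview = career rate + lift,
# capped at 100%.
projected = min(career + lift, 100.0)
print(f"career {career:.0f}%, lift +{lift:.1f} pts, "
      f"with interview {projected:.0f}%")
```

With the real subgroup counts in place of the hypothetical ones, this arithmetic would reproduce the page's +22.6-point lift and 94% with-interview figure.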
