Prosecution Insights
Last updated: April 19, 2026
Application No. 18/283,003

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM

Final Rejection (§103)
Filed: Sep 20, 2023
Examiner: HARVEY II, KEVIN JEROME
Art Unit: 3664
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Sony Group Corporation
OA Round: 2 (Final)
Grant Probability: 0% (At Risk)
OA Rounds: 3-4
To Grant: 3y 0m
With Interview: 0%

Examiner Intelligence

Career Allow Rate: 0% (0 granted / 1 resolved; -52.0% vs TC avg)
Interview Lift: +0.0% (minimal; among resolved cases with interview)
Avg Prosecution: 3y 0m (typical timeline); 48 applications currently pending
Total Applications: 49 across all art units (career history)

Statute-Specific Performance

§101: 9.7% (-30.3% vs TC avg)
§103: 70.8% (+30.8% vs TC avg)
§102: 8.7% (-31.3% vs TC avg)
§112: 10.8% (-29.2% vs TC avg)
Tech Center averages are estimates, based on career data from 1 resolved case.
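A quick consistency check on the table above: subtracting each signed "vs TC avg" delta from the examiner's per-statute allowance rate recovers the same implied Tech Center average, 40.0%, for every statute. A minimal sketch of that arithmetic (the dict layout is illustrative):

```python
# Per-statute career allowance rate and the reported delta vs. the
# Tech Center average, both in percent, as listed in the table above.
stats = {
    "101": (9.7, -30.3),
    "103": (70.8, +30.8),
    "102": (8.7, -31.3),
    "112": (10.8, -29.2),
}

# Assuming delta = examiner_rate - tc_average, the implied TC average
# comes out the same for every statute.
implied_tc_avg = {
    statute: round(rate - delta, 1) for statute, (rate, delta) in stats.items()
}
print(implied_tc_avg)  # every statute implies a 40.0% TC average
```

That the four deltas all point back to one number suggests they were computed against a single Tech Center baseline rather than per-statute averages.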

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

2. This office action is in response to application number 18/283,003, filed on 09/20/2023, and to the amendments and arguments filed on 08/20/2025. Claims 1, 4-6, 8-11, 13-15, 19, and 20 have been amended. Claims 21-24 have been added. Claims 2-3 and 16-18 have been cancelled. Claims 1, 4-6, 8-11, 13-15, 19, and 20 are currently pending and have been examined.

Information Disclosure Statement

3. The information disclosure statements (IDS) submitted on 09/20/2023 and 06/14/2024 have been received and considered.

Response to Amendment

4. Applicant's amendments to the Claims have not overcome the rejection previously set forth in the Non-Final Office Action mailed 05/21/2025. Applicant's arguments, see pages 13-19, filed on 08/20/2025, with respect to the rejection of claims 1 and 4-15 under 35 USC 112(f) are persuasive. Furthermore, the arguments with respect to the rejection of claim 20 under 35 USC 101 are persuasive, the arguments with respect to the rejection of claims 1-20 under 35 USC 103 are also persuasive, and the arguments with respect to all objections are persuasive. Therefore, new grounds for rejection are made under 35 USC 103, as necessitated by amendment, as being unpatentable over Jung (US 20160061613 A1) in view of Iida (US 20200408559 A1), further in view of Wan (US 11127373 B2), further in view of Osterhout (US 20120194551 A1), further in view of Mase (JP 2015217798 A), and further in view of Beaurepaire (US 20210389152 A1).

Examiner Notes

5. The Examiner cites particular paragraphs (or columns and lines) in the references as applied to Applicant's claims for the convenience of the Applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claim, other passages and figures may apply as well.
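For orientation, the new grounds of rejection turn on amended claim 1's display-switching limitation: navigation is routed to the user-wearable AR device when the user is walking or is the sole detected occupant, and to the vehicle-mounted display when a plurality of occupants is detected during manual or autonomous driving. A minimal sketch of that claimed decision logic follows; every name is hypothetical, and this is an illustration of the claim language, not the claimed implementation:

```python
from enum import Enum, auto

class MovementState(Enum):
    MANUAL_DRIVING = auto()
    AUTONOMOUS_DRIVING = auto()
    WALKING = auto()

def select_display(state: MovementState, occupant_count: int) -> str:
    """Pick the navigation display per the decision logic recited in
    amended claim 1 (all identifiers hypothetical)."""
    if state is MovementState.WALKING:
        # Walking state: always the wearable AR device.
        return "ar_device"
    # Manual or autonomous driving: route by occupant detection.
    if occupant_count > 1:
        return "vehicle_mounted_display"  # plurality of occupants detected
    return "ar_device"                    # only the user detected
```

For example, `select_display(MovementState.MANUAL_DRIVING, 2)` yields the vehicle-mounted display, while a lone driver or a pedestrian gets the AR device; the examiner maps the first branch to Jung, the multi-occupant branch to Iida, and the sole-occupant branch to Wan.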
It is respectfully requested that, in preparing responses, the Applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the examiner. The prompt development of a clear issue requires that the replies of the Applicant meet the objections to and rejections of the claims. Applicant should also specifically point out the support for any amendments made to the disclosure. See MPEP §2163.06. Applicant is reminded that the Examiner is entitled to give the Broadest Reasonable Interpretation (BRI) to the language of the claims. Furthermore, the Examiner is not limited to Applicant's definition where that definition is not specifically set forth in the claims.

Claim Objections

6. Claims 15 and 23 are objected to because of the following informalities: claim 15 depends on cancelled claim 3, and claim 23 reads “user% in response”, which appears to be a typo and should read “user in response”. Appropriate correction is required. For purposes of examination, the examiner is rejecting claim 15 over Osterhout (US 20120194551 A1) because claim 15 is dependent on cancelled claim 3.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C.
103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. 7. Claim(s) 1, 4-7, 19, and 20-24 is/are rejected under 35 U.S.C. 103 as being unpatentable over (US 20160061613 A1) to Jung et al. (hereinafter Jung) in view of (US 20200408559 A1) to Iida et al. (hereinafter Iida) and further in view of (US 11127373 B2) to Wan et al. (hereinafter Wan). Regarding claim 1, Jung discloses An information processing device comprising: circuitry configured to determine a movement state of a user, that indicates whether the user is in a manual driving state, an autonomous driving state, or a walking state, (Jung Paragraph 0023: “According to an example associated with the present disclosure, the controller may determine whether or not the user gets off the vehicle prior to arriving at the destination of the vehicle, and activate a pedestrian mode to output a walking direction to the destination of the vehicle through the output unit when the user gets off the vehicle prior to arriving at the destination of the vehicle.”) […] control to switch automatically, between a vehicle-mounted display and a user- wearable Augmented Reality (AR) device, to display navigation information based on the movement state of the user (Jung Paragraph 0023: “According to an example associated with the present disclosure, the controller may determine whether or not the user gets off the vehicle prior to arriving at the destination of the vehicle, and activate a pedestrian mode to output a walking direction to the destination of the vehicle through the output unit when the user gets off the vehicle prior to arriving at the destination of the vehicle.”) (Jung Paragraph 0027: “According to an example associated with the present 
disclosure, the controller may activate the pedestrian mode when the mobile terminal is away from the vehicle by more than a predetermined distance.”) (Jung Paragraph 0078: “FIGS. 16A through 16C are views illustrating a mobile terminal for providing a pedestrian mode according to an embodiment disclosed in the present disclosure.”) (Jung Paragraph 0230: “In such a manner, the image output through the display unit 251″ may be viewed while overlapping with the general visual field. The mobile terminal 200″ may provide an augmented reality (AR) by overlaying a virtual image on a realistic image or background using the display.”) (Jung Paragraph 0285: “In this case, the wearable device may include a communication unit that performs communication with the vehicle control apparatus installed in the vehicle and a body worn on the user's wrist portion and formed to always contact with the wrist portion.”) (Jung Paragraph 0387: “the vehicle control apparatus 400 is configured to include one first display unit D100 in front of the driver seat,”) (Jung Paragraph 0454: “All the functions (for example, including the navigation function) that are performed by the vehicle 400 described above are performed the mobile terminal 100 or the wearable device 200 that is connected to the vehicle control apparatus 400 in a wired or wireless manner.”) (Jung Paragraph 0455: “In addition, the vehicle control apparatus 400 and the mobile terminal 100 performs all the functions in cooperation with each other or in conjunction with each other.”) […] control to display the navigation information on the user-wearable AR device, under a condition that the determined movement state of the user is the walking state, (Jung Paragraph 0023: “According to an example associated with the present disclosure, the controller may determine whether or not the user gets off the vehicle prior to arriving at the destination of the vehicle, and activate a pedestrian mode to output a walking direction to the 
destination of the vehicle through the output unit when the user gets off the vehicle prior to arriving at the destination of the vehicle.”) (Jung Paragraph 0027: “According to an example associated with the present disclosure, the controller may activate the pedestrian mode when the mobile terminal is away from the vehicle by more than a predetermined distance.”) (Jung Paragraph 0078: “FIGS. 16A through 16C are views illustrating a mobile terminal for providing a pedestrian mode according to an embodiment disclosed in the present disclosure.”) (Jung Paragraph 0230: “In such a manner, the image output through the display unit 251″ may be viewed while overlapping with the general visual field. The mobile terminal 200″ may provide an augmented reality (AR) by overlaying a virtual image on a realistic image or background using the display.”) (Jung Paragraph 0285: “In this case, the wearable device may include a communication unit that performs communication with the vehicle control apparatus installed in the vehicle and a body worn on the user's wrist portion and formed to always contact with the wrist portion.”) (Jung Paragraph 0387: “the vehicle control apparatus 400 is configured to include one first display unit D100 in front of the driver seat,”) (Jung Paragraph 0454: “All the functions (for example, including the navigation function) that are performed by the vehicle 400 described above are performed the mobile terminal 100 or the wearable device 200 that is connected to the vehicle control apparatus 400 in a wired or wireless manner.”) (Jung Paragraph 0455: “In addition, the vehicle control apparatus 400 and the mobile terminal 100 performs all the functions in cooperation with each other or in conjunction with each other.”) Jung does not teach […] determine whether a plurality of occupants is detected in a vehicle on which the user rides, in a case that the user is in the manual driving state or the autonomous driving state; […] and the determination of the 
plurality of occupants, […] control to display the navigation information on the vehicle-mounted display, under a condition that the plurality of occupants is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state, and control to display the navigation information on the user-wearable AR device, under a condition that only the user is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state. However, Iida does teach […] determine whether a plurality of occupants is detected in a vehicle on which the user rides, in a case that the user is in the manual driving state or the autonomous driving state, (Iida Paragraph 0005: “The present disclosure provides a method for controlling a vehicle navigation system, the vehicle navigation system including: an in-vehicle camera configured to capture at least one occupant in a vehicle”) (Iida Paragraph 0027: “The vehicle C1 according to the first embodiment is not limited to a vehicle manually driven by a driver, and may be an automated driving vehicle.”) (Iida Paragraph 0035: “The camera CR performs an image processing on a captured image in the vehicle C1, and detects a destination of a line of sight (hereinafter, referred to as line-of-sight direction) of an occupant positioned in a driver seat (hereinafter, referred to as driver) among occupants in the vehicle C1.”) […] and the determination of the plurality of occupants, (Iida Paragraph 0005: “The present disclosure provides a method for controlling a vehicle navigation system, the vehicle navigation system including: an in-vehicle camera configured to capture at least one occupant in a vehicle”) (Iida Paragraph 0035: “The camera CR performs an image processing on a captured image in the vehicle C1, and detects a destination of a line of sight (hereinafter, referred to as line-of-sight direction) of an occupant positioned in a driver seat (hereinafter, referred to 
as driver) among occupants in the vehicle C1.”) […] control to display the navigation information on the vehicle-mounted display, under a condition that the plurality of occupants is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state, (Iida Paragraph 0005: “The present disclosure provides a method for controlling a vehicle navigation system, the vehicle navigation system including: an in-vehicle camera configured to capture at least one occupant in a vehicle”) (Iida Paragraph 0027: “The vehicle C1 according to the first embodiment is not limited to a vehicle manually driven by a driver, and may be an automated driving vehicle.”) (Iida Paragraph 0035: “The camera CR is installed at a position where the occupant in the vehicle C1 can be captured, and is communicably connected to the in-vehicle device CN1. The camera CR performs an image processing on a captured image in the vehicle C1, and detects a destination of a line of sight (hereinafter, referred to as line-of-sight direction) of an occupant positioned in a driver seat (hereinafter, referred to as driver) among occupants in the vehicle C1. 
The camera CR transmits the captured image and the detected line-of-sight direction of the driver to the in-vehicle device CN1.”) (Iida Paragraph 0037: “The in-vehicle device CN1 generates, for example, information on a route to an optional destination set by the occupant and outputs the generated route information to a monitor 24.”) Therefore, it would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Jung to include […] determine whether a plurality of occupants is detected in a vehicle on which the user rides, in a case that the user is in the manual driving state or the autonomous driving state, […] and the determination of the plurality of occupants, […] control to display the navigation information on the vehicle-mounted display, under a condition that the plurality of occupants is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state, taught by Iida. This would have been for the benefit of performing the input operation and the information provision in consideration of safety according to the vehicle information and the traveling state of the vehicle. [Iida Paragraph 0006] Iida does not teach […] and control to display the navigation information on the user-wearable AR device, under a condition that only the user is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state. However, Wan does teach […] and control to display the navigation information on the user-wearable AR device, under a condition that only the user is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state. 
(Wan Column 2, line number 30-34: “The systems and methods disclosed herein describe a wearable integrated augmented reality (AR) system for a vehicle configured to display road-side information in front of the wearer's direct gaze when they are driving and/or riding in the vehicle,”) (Wan Column 5, line number 60-64: “The sensor(s) 125 may include any number of devices configured or programmed to generate signals that help navigate the vehicle 105 while operating in a manual and/or an autonomous (e.g., driverless) mode.”) (Wan Column 6, line number 37-40: “The AR controller 120, as described herein, may do this by receiving, from one or more integrated cameras (not shown in FIG. 1) associated with the AR wearable system 145, a video feed of the interior surfaces of the vehicle 105,”) (Wan Column 10, line number 34-42: “The occupant ID system 226 may identify one or more riders and/or drivers (collectively occupants) when they enter the vehicle, and retrieve occupant identifiers associated with the one or more occupants. The occupant ID system 226 may assign a unique ID to individual users such that the occupant identifier includes occupant-specific information that may be used to provide a unique AR experience to each vehicle occupant. Example information may include, for example, navigation preferences,”) Therefore, it would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Jung in view of Iida to include […] and control to display the navigation information on the user-wearable AR device, under a condition that only the user is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state taught by Wan. 
This would have been for the benefit of providing an AR system that may determine an identity of a user of the AR wearable system, and generate, based at least in part on the user ID associated with the user of the AR wearable device, a first virtual representation of the roadside object aligned with a GPS location and a direction of the vehicle. [Wan Column 2, line number 52-57] Regarding claim 4, Jung discloses The information processing device according to claim 1, wherein the circuitry is configured to control to display the navigation information indicating a traveling direction is superimposed on a real or on-screen roadway as the navigation to the user (Jung Paragraph 0262: “The vehicle control apparatus disclosed in the present specification is described below referring to FIGS. 6A to 8C.”) (Jung Paragraph 0391: “FIG. 8A illustrates a case where the vehicle control apparatus 400 is realized as in the form of an image display apparatus, a head unit of the vehicle, or a telematics terminal.”) (Jung Paragraph 0392: “As illustrated in FIG. 8A, a vehicle control apparatus 400′ is configured to include a main board 410′. A controller (for example, a central processing unit (CPU) 412′ that controls all operations of the vehicle control apparatus 400′, a program for processing or controlling the controller 412′, a key controller 411′ that controls various key signals, and an LCD controller 414′ that controls a liquid crystal display (LCD) are built into the main board 410′.”) (Jung Paragraph 0393: “Map information (map data) for displaying directions-suggestion information on a digital map is stored in the memory 413′.”) (Jung Paragraph 0443: “As illustrated in FIG. 8C, an icon I1 indicating a compass direction of the map is displayed on one region of the screen on the display unit to which a screen associated with the navigation function is provided. 
The map is displayed on the display unit to which the screen associated with the navigation function is provided, in such a manner that a specific direction (for example, the true north direction of the Earth), a moving direction of a moving object, a direction of the destination point”) Jung does not teach in a case where the user is in the manual driving state or the autonomous driving state. However, Iida does teach in a case where the user is in the manual driving state or the autonomous driving state. (Iida Paragraph 0027: “The vehicle C1 according to the first embodiment is not limited to a vehicle manually driven by a driver, and may be an automated driving vehicle.”) Therefore, it would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Jung to include where the user is in the manual driving state or the autonomous driving state, taught by Iida. This would have been for the benefit of performing the input operation and the information provision in consideration of safety according to the vehicle information and the traveling state of the vehicle. [Iida Paragraph 0006] Regarding claim 5, Jung discloses The information processing device according to claim 4, wherein the circuitry is configured to control to display, as the navigation information, traffic information regarding a destination or traffic information regarding a road on which the user is currently moving (Jung Paragraph 0262: “The vehicle control apparatus disclosed in the present specification is described below referring to FIGS. 6A to 8C.”) (Jung Paragraph 0391: “FIG. 8A illustrates a case where the vehicle control apparatus 400 is realized as in the form of an image display apparatus, a head unit of the vehicle, or a telematics terminal.”) (Jung Paragraph 0392: “As illustrated in FIG. 8A, a vehicle control apparatus 400′ is configured to include a main board 410′. 
A controller (for example, a central processing unit (CPU) 412′ that controls all operations of the vehicle control apparatus 400′, a program for processing or controlling the controller 412′, a key controller 411′ that controls various key signals, and an LCD controller 414′ that controls a liquid crystal display (LCD) are built into the main board 410′.”) (Jung Paragraph 0443: “As illustrated in FIG. 8C, an icon I1 indicating a compass direction of the map is displayed on one region of the screen on the display unit to which a screen associated with the navigation function is provided. The map is displayed on the display unit to which the screen associated with the navigation function is provided, in such a manner that a specific direction (for example, the true north direction of the Earth), a moving direction of a moving object, a direction of the destination point”) (Jung Paragraph 0445: “An icon I3 indicating whether or not a path search function is activated that is in accordance with Transport Portal Experts Group (TPEG) specifications for transmission of traffic information is displayed on one region of the screen on the display unit.”) However, Iida does teach in a case where the user is in the manual driving state or the autonomous driving state. (Iida Paragraph 0027: “The vehicle C1 according to the first embodiment is not limited to a vehicle manually driven by a driver, and may be an automated driving vehicle.”) Therefore, it would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Jung to include where the user is in the manual driving state or the autonomous driving state, taught by Iida. This would have been for the benefit of performing the input operation and the information provision in consideration of safety according to the vehicle information and the traveling state of the vehicle. 
[Iida Paragraph 0006] Regarding claim 6, Jung discloses The information processing device according to claim 4, wherein the circuitry is configured to control to display, as the navigation information, vehicle information regarding the vehicle (Jung Paragraph 0041: “According to an example associated with the present disclosure, the vehicle information may be information associated with at least one of an air-conditioning function for the vehicle, whether or not a door is open or closed, whether or not a window is open or closed, whether or not a sunroof is open or closed, a battery charging state of the vehicle, a parking position of the vehicle, a navigation function provided in the vehicle, a theft state of the vehicle, and a fueling state of the vehicle.”) (Jung Paragraph 0433: “The image information (or directions-suggestion map) included in the direction-suggestion information generated by the controller 407 is displayed on the display unit 405″. At this point, the display unit 405 is configured to include the touch sensor (touch screen) and the proximity sensor. 
In addition, the directions-suggestion information includes not only the map data, but also the various types of information relating to driving, such as the traffic lane information, the speed limit information, the turn-by-turn (TBT) information, the traffic safety information, the traffic condition information, the vehicle information, the path-finding information and the like.”) However, Iida does teach on which the user rides in a case where the user is in the manual driving state or the autonomous driving state (Iida Paragraph 0027: “The vehicle C1 according to the first embodiment is not limited to a vehicle manually driven by a driver, and may be an automated driving vehicle.”) Therefore, it would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Jung to include on which the user rides in a case where the user is in the manual driving state or the autonomous driving state, taught by Iida. This would have been for the benefit of performing the input operation and the information provision in consideration of safety according to the vehicle information and the traveling state of the vehicle. [Iida Paragraph 0006] Regarding claim 7, Jung discloses The information processing device according to claim 6, wherein the vehicle information includes information on a speed or remaining energy of the vehicle. (Jung Paragraph 0359: “In addition, the vehicle information is configured to further include at least information relating to at least one, among current driving speed of the vehicle,”) (Jung Paragraph 0433: “The image information (or directions-suggestion map) included in the direction-suggestion information generated by the controller 407 is displayed on the display unit 405″. At this point, the display unit 405 is configured to include the touch sensor (touch screen) and the proximity sensor. 
In addition, the directions-suggestion information includes not only the map data, but also the various types of information relating to driving, such as the traffic lane information, the speed limit information, the turn-by-turn (TBT) information, the traffic safety information, the traffic condition information, the vehicle information, the path-finding information and the like.”) Regarding claim 19, Jung discloses An information processing method comprising: (Jung Paragraph 0049: “A control method of a mobile terminal for providing a navigation function according to an embodiment of the present disclosure may include”) determining a movement state of a user, that indicates whether the user is in a manual driving state, an autonomous driving state, or a walking state; (Jung Paragraph 0023: “According to an example associated with the present disclosure, the controller may determine whether or not the user gets off the vehicle prior to arriving at the destination of the vehicle, and activate a pedestrian mode to output a walking direction to the destination of the vehicle through the output unit when the user gets off the vehicle prior to arriving at the destination of the vehicle.”) […] controlling to switch automatically, between a vehicle-mounted display and a user- wearable Augmented Reality (AR) device, to display navigation information based on the movement state of the user (Jung Paragraph 0023: “According to an example associated with the present disclosure, the controller may determine whether or not the user gets off the vehicle prior to arriving at the destination of the vehicle, and activate a pedestrian mode to output a walking direction to the destination of the vehicle through the output unit when the user gets off the vehicle prior to arriving at the destination of the vehicle.”) (Jung Paragraph 0027: “According to an example associated with the present disclosure, the controller may activate the pedestrian mode when the mobile terminal is away 
from the vehicle by more than a predetermined distance.”) (Jung Paragraph 0078: “FIGS. 16A through 16C are views illustrating a mobile terminal for providing a pedestrian mode according to an embodiment disclosed in the present disclosure.”) (Jung Paragraph 0230: “In such a manner, the image output through the display unit 251″ may be viewed while overlapping with the general visual field. The mobile terminal 200″ may provide an augmented reality (AR) by overlaying a virtual image on a realistic image or background using the display.”) (Jung Paragraph 0285: “In this case, the wearable device may include a communication unit that performs communication with the vehicle control apparatus installed in the vehicle and a body worn on the user's wrist portion and formed to always contact with the wrist portion.”) (Jung Paragraph 0387: “the vehicle control apparatus 400 is configured to include one first display unit D100 in front of the driver seat,”) (Jung Paragraph 0454: “All the functions (for example, including the navigation function) that are performed by the vehicle 400 described above are performed the mobile terminal 100 or the wearable device 200 that is connected to the vehicle control apparatus 400 in a wired or wireless manner.”) (Jung Paragraph 0455: “In addition, the vehicle control apparatus 400 and the mobile terminal 100 performs all the functions in cooperation with each other or in conjunction with each other.”) […] controlling to display the navigation information on the user-wearable AR device, under a condition that the determined movement state of the user is the walking state, (Jung Paragraph 0023: “According to an example associated with the present disclosure, the controller may determine whether or not the user gets off the vehicle prior to arriving at the destination of the vehicle, and activate a pedestrian mode to output a walking direction to the destination of the vehicle through the output unit when the user gets off the vehicle prior to 
arriving at the destination of the vehicle.”) (Jung Paragraph 0027: “According to an example associated with the present disclosure, the controller may activate the pedestrian mode when the mobile terminal is away from the vehicle by more than a predetermined distance.”) (Jung Paragraph 0078: “FIGS. 16A through 16C are views illustrating a mobile terminal for providing a pedestrian mode according to an embodiment disclosed in the present disclosure.”) (Jung Paragraph 0230: “In such a manner, the image output through the display unit 251″ may be viewed while overlapping with the general visual field. The mobile terminal 200″ may provide an augmented reality (AR) by overlaying a virtual image on a realistic image or background using the display.”) (Jung Paragraph 0285: “In this case, the wearable device may include a communication unit that performs communication with the vehicle control apparatus installed in the vehicle and a body worn on the user's wrist portion and formed to always contact with the wrist portion.”) (Jung Paragraph 0387: “the vehicle control apparatus 400 is configured to include one first display unit D100 in front of the driver seat,”) (Jung Paragraph 0454: “All the functions (for example, including the navigation function) that are performed by the vehicle 400 described above are performed the mobile terminal 100 or the wearable device 200 that is connected to the vehicle control apparatus 400 in a wired or wireless manner.”) (Jung Paragraph 0455: “In addition, the vehicle control apparatus 400 and the mobile terminal 100 performs all the functions in cooperation with each other or in conjunction with each other.”) Jung does not teach […] determining whether a plurality of occupants is detected in a vehicle on which the user rides, in a case that the user is in the manual driving state or the autonomous driving state; […] and the determination of the plurality of occupants, […] controlling to display the navigation information on the 
vehicle-mounted display, under a condition that the plurality of occupants is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state, and controlling to display the navigation information on the user-wearable AR device, under a condition that only the user is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state. However, Iida does teach […] determining whether a plurality of occupants is detected in a vehicle on which the user rides, in a case that the user is in the manual driving state or the autonomous driving state; (Iida Paragraph 0005: “The present disclosure provides a method for controlling a vehicle navigation system, the vehicle navigation system including: an in-vehicle camera configured to capture at least one occupant in a vehicle”) (Iida Paragraph 0027: “The vehicle C1 according to the first embodiment is not limited to a vehicle manually driven by a driver, and may be an automated driving vehicle.”) (Iida Paragraph 0035: “The camera CR performs an image processing on a captured image in the vehicle C1, and detects a destination of a line of sight (hereinafter, referred to as line-of-sight direction) of an occupant positioned in a driver seat (hereinafter, referred to as driver) among occupants in the vehicle C1.”) […] and the determination of the plurality of occupants, (Iida Paragraph 0005: “The present disclosure provides a method for controlling a vehicle navigation system, the vehicle navigation system including: an in-vehicle camera configured to capture at least one occupant in a vehicle”) (Iida Paragraph 0035: “The camera CR performs an image processing on a captured image in the vehicle C1, and detects a destination of a line of sight (hereinafter, referred to as line-of-sight direction) of an occupant positioned in a driver seat (hereinafter, referred to as driver) among occupants in the vehicle C1.”) […] controlling to display 
the navigation information on the vehicle-mounted display, under a condition that the plurality of occupants is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state, (Iida Paragraph 0005: “The present disclosure provides a method for controlling a vehicle navigation system, the vehicle navigation system including: an in-vehicle camera configured to capture at least one occupant in a vehicle”) (Iida Paragraph 0027: “The vehicle C1 according to the first embodiment is not limited to a vehicle manually driven by a driver, and may be an automated driving vehicle.”) (Iida Paragraph 0035: “The camera CR is installed at a position where the occupant in the vehicle C1 can be captured, and is communicably connected to the in-vehicle device CN1. The camera CR performs an image processing on a captured image in the vehicle C1, and detects a destination of a line of sight (hereinafter, referred to as line-of-sight direction) of an occupant positioned in a driver seat (hereinafter, referred to as driver) among occupants in the vehicle C1. 
The camera CR transmits the captured image and the detected line-of-sight direction of the driver to the in-vehicle device CN1.”) (Iida Paragraph 0037: “The in-vehicle device CN1 generates, for example, information on a route to an optional destination set by the occupant and outputs the generated route information to a monitor 24.”) Therefore, it would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Jung to include […] determining whether a plurality of occupants is detected in a vehicle on which the user rides, in a case that the user is in the manual driving state or the autonomous driving state, […] and the determination of the plurality of occupants, […] controlling to display the navigation information on the vehicle-mounted display, under a condition that the plurality of occupants is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state, taught by Iida. This would have been for the benefit to perform the input operation and the information provision in consideration of safety according to the vehicle information and the traveling state of the vehicle. [Iida Paragraph 0006] Iida does not teach […] and controlling to display the navigation information on the user-wearable AR device, under a condition that only the user is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state. However, Wan does teach […] and controlling to display the navigation information on the user-wearable AR device, under a condition that only the user is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state. 
(Wan Column 2, line number 30-34: “The systems and methods disclosed herein describe a wearable integrated augmented reality (AR) system for a vehicle configured to display road-side information in front of the wearer's direct gaze when they are driving and/or riding in the vehicle,”) (Wan Column 5, line number 60-64: “The sensor(s) 125 may include any number of devices configured or programmed to generate signals that help navigate the vehicle 105 while operating in a manual and/or an autonomous (e.g., driverless) mode.”) (Wan Column 6, line number 37-40: “The AR controller 120, as described herein, may do this by receiving, from one or more integrated cameras (not shown in FIG. 1) associated with the AR wearable system 145, a video feed of the interior surfaces of the vehicle 105,”) (Wan Column 10, line number 34-42: “The occupant ID system 226 may identify one or more riders and/or drivers (collectively occupants) when they enter the vehicle, and retrieve occupant identifiers associated with the one or more occupants. The occupant ID system 226 may assign a unique ID to individual users such that the occupant identifier includes occupant-specific information that may be used to provide a unique AR experience to each vehicle occupant. Example information may include, for example, navigation preferences,”) Therefore, it would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Jung in view of Iida to include […] and controlling to display the navigation information on the user-wearable AR device, under a condition that only the user is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state taught by Wan. 
This would have been for the benefit of providing an AR system that "may determine an identity of a user of the AR wearable system, and generate, based at least in part on the user ID associated with the user of the AR wearable device, a first virtual representation of the roadside object aligned with a GPS location and a direction of the vehicle." [Wan Column 2, line number 52-57] Regarding claim 20, Jung discloses […] determining a movement state of a user, that indicates whether the user is in a manual driving state, an autonomous driving state, or a walking state; (Jung Paragraph 0023: “According to an example associated with the present disclosure, the controller may determine whether or not the user gets off the vehicle prior to arriving at the destination of the vehicle, and activate a pedestrian mode to output a walking direction to the destination of the vehicle through the output unit when the user gets off the vehicle prior to arriving at the destination of the vehicle.”) […] controlling to switch automatically, between a vehicle-mounted display and a user-wearable Augmented Reality (AR) device, to display navigation information based on the movement state of the user (Jung Paragraph 0023: “According to an example associated with the present disclosure, the controller may determine whether or not the user gets off the vehicle prior to arriving at the destination of the vehicle, and activate a pedestrian mode to output a walking direction to the destination of the vehicle through the output unit when the user gets off the vehicle prior to arriving at the destination of the vehicle.”) (Jung Paragraph 0027: “According to an example associated with the present disclosure, the controller may activate the pedestrian mode when the mobile terminal is away from the vehicle by more than a predetermined distance.”) (Jung Paragraph 0078: “FIGS. 
16A through 16C are views illustrating a mobile terminal for providing a pedestrian mode according to an embodiment disclosed in the present disclosure.”) (Jung Paragraph 0230: “In such a manner, the image output through the display unit 251″ may be viewed while overlapping with the general visual field. The mobile terminal 200″ may provide an augmented reality (AR) by overlaying a virtual image on a realistic image or background using the display.”) (Jung Paragraph 0285: “In this case, the wearable device may include a communication unit that performs communication with the vehicle control apparatus installed in the vehicle and a body worn on the user's wrist portion and formed to always contact with the wrist portion.”) (Jung Paragraph 0387: “the vehicle control apparatus 400 is configured to include one first display unit D100 in front of the driver seat,”) (Jung Paragraph 0454: “All the functions (for example, including the navigation function) that are performed by the vehicle 400 described above are performed the mobile terminal 100 or the wearable device 200 that is connected to the vehicle control apparatus 400 in a wired or wireless manner.”) (Jung Paragraph 0455: “In addition, the vehicle control apparatus 400 and the mobile terminal 100 performs all the functions in cooperation with each other or in conjunction with each other.”) […] controlling to display the navigation information on the user-wearable AR device, under a condition that the determined movement state of the user is the walking state, (Jung Paragraph 0023: “According to an example associated with the present disclosure, the controller may determine whether or not the user gets off the vehicle prior to arriving at the destination of the vehicle, and activate a pedestrian mode to output a walking direction to the destination of the vehicle through the output unit when the user gets off the vehicle prior to arriving at the destination of the vehicle.”) (Jung Paragraph 0027: “According to an 
example associated with the present disclosure, the controller may activate the pedestrian mode when the mobile terminal is away from the vehicle by more than a predetermined distance.”) (Jung Paragraph 0078: “FIGS. 16A through 16C are views illustrating a mobile terminal for providing a pedestrian mode according to an embodiment disclosed in the present disclosure.”) (Jung Paragraph 0230: “In such a manner, the image output through the display unit 251″ may be viewed while overlapping with the general visual field. The mobile terminal 200″ may provide an augmented reality (AR) by overlaying a virtual image on a realistic image or background using the display.”) (Jung Paragraph 0285: “In this case, the wearable device may include a communication unit that performs communication with the vehicle control apparatus installed in the vehicle and a body worn on the user's wrist portion and formed to always contact with the wrist portion.”) (Jung Paragraph 0387: “the vehicle control apparatus 400 is configured to include one first display unit D100 in front of the driver seat,”) (Jung Paragraph 0454: “All the functions (for example, including the navigation function) that are performed by the vehicle 400 described above are performed the mobile terminal 100 or the wearable device 200 that is connected to the vehicle control apparatus 400 in a wired or wireless manner.”) (Jung Paragraph 0455: “In addition, the vehicle control apparatus 400 and the mobile terminal 100 performs all the functions in cooperation with each other or in conjunction with each other.”) Jung does not teach A non-transitory computer-readable storage medium including computer executable instructions, wherein the instructions, when executed by an information processing device, cause the information processing device to perform a method, the method comprising […] determining whether a plurality of occupants is detected in a vehicle on which the user rides, in a case that the user is in the manual 
driving state or the autonomous driving state; […] and the determination of the plurality of occupants, […] controlling to display the navigation information on the vehicle-mounted display, under a condition that the plurality of occupants is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state, and controlling to display the navigation information on the user-wearable AR device, under a condition that only the user is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state. However, Iida does teach […] determining whether a plurality of occupants is detected in a vehicle on which the user rides, in a case that the user is in the manual driving state or the autonomous driving state; (Iida Paragraph 0005: “The present disclosure provides a method for controlling a vehicle navigation system, the vehicle navigation system including: an in-vehicle camera configured to capture at least one occupant in a vehicle”) (Iida Paragraph 0027: “The vehicle C1 according to the first embodiment is not limited to a vehicle manually driven by a driver, and may be an automated driving vehicle.”) (Iida Paragraph 0035: “The camera CR performs an image processing on a captured image in the vehicle C1, and detects a destination of a line of sight (hereinafter, referred to as line-of-sight direction) of an occupant positioned in a driver seat (hereinafter, referred to as driver) among occupants in the vehicle C1.”) […] and the determination of the plurality of occupants, (Iida Paragraph 0005: “The present disclosure provides a method for controlling a vehicle navigation system, the vehicle navigation system including: an in-vehicle camera configured to capture at least one occupant in a vehicle”) (Iida Paragraph 0035: “The camera CR performs an image processing on a captured image in the vehicle C1, and detects a destination of a line of sight (hereinafter, referred to as 
line-of-sight direction) of an occupant positioned in a driver seat (hereinafter, referred to as driver) among occupants in the vehicle C1.”) […] controlling to display the navigation information on the vehicle-mounted display, under a condition that the plurality of occupants is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state, (Iida Paragraph 0005: “The present disclosure provides a method for controlling a vehicle navigation system, the vehicle navigation system including: an in-vehicle camera configured to capture at least one occupant in a vehicle”) (Iida Paragraph 0027: “The vehicle C1 according to the first embodiment is not limited to a vehicle manually driven by a driver, and may be an automated driving vehicle.”) (Iida Paragraph 0035: “The camera CR is installed at a position where the occupant in the vehicle C1 can be captured, and is communicably connected to the in-vehicle device CN1. The camera CR performs an image processing on a captured image in the vehicle C1, and detects a destination of a line of sight (hereinafter, referred to as line-of-sight direction) of an occupant positioned in a driver seat (hereinafter, referred to as driver) among occupants in the vehicle C1. 
The camera CR transmits the captured image and the detected line-of-sight direction of the driver to the in-vehicle device CN1.”) (Iida Paragraph 0037: “The in-vehicle device CN1 generates, for example, information on a route to an optional destination set by the occupant and outputs the generated route information to a monitor 24.”) Therefore, it would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Jung to include […] determining whether a plurality of occupants is detected in a vehicle on which the user rides, in a case that the user is in the manual driving state or the autonomous driving state, […] and the determination of the plurality of occupants, […] controlling to display the navigation information on the vehicle-mounted display, under a condition that the plurality of occupants is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state, taught by Iida. This would have been for the benefit to perform the input operation and the information provision in consideration of safety according to the vehicle information and the traveling state of the vehicle. [Iida Paragraph 0006] Iida does not teach A non-transitory computer-readable storage medium including computer executable instructions, wherein the instructions, when executed by an information processing device, cause the information processing device to perform a method, the method comprising […] and controlling to display the navigation information on the user-wearable AR device, under a condition that only the user is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state. 
However, Wan does teach A non-transitory computer-readable storage medium including computer executable instructions, wherein the instructions, when executed by an information processing device, cause the information processing device to perform a method, the method comprising (Wan Column 16, line number 30-36: “A computer-readable medium (also referred to as a processor-readable medium) includes any non-transitory (e.g., tangible) medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including, but not limited to, nonvolatile media and volatile media.”) […] and controlling to display the navigation information on the user-wearable AR device, under a condition that only the user is detected in the vehicle in a case that the user is in the manual driving state or the autonomous driving state. (Wan Column 2, line number 30-34: “The systems and methods disclosed herein describe a wearable integrated augmented reality (A
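The display-switching control recited in the rejected claims (as characterized in this office action) reduces to a small decision table: walking routes navigation to the wearable AR device; driving (manual or autonomous) routes it to the vehicle-mounted display when a plurality of occupants is detected, and to the AR device when only the user is detected. The sketch below illustrates that logic only; the names (`MovementState`, `select_display`) are hypothetical and do not appear in the application or the cited references.

```python
from enum import Enum, auto

class MovementState(Enum):
    MANUAL_DRIVING = auto()
    AUTONOMOUS_DRIVING = auto()
    WALKING = auto()

class Display(Enum):
    VEHICLE_MOUNTED = auto()
    WEARABLE_AR = auto()

def select_display(state: MovementState, occupant_count: int) -> Display:
    """Illustrative sketch of the claimed switching logic; not from the record."""
    if state is MovementState.WALKING:
        # Walking state: navigation information goes to the user-wearable AR device.
        return Display.WEARABLE_AR
    # Manual or autonomous driving state: branch on occupant detection.
    if occupant_count > 1:
        # Plurality of occupants detected in the vehicle: vehicle-mounted display.
        return Display.VEHICLE_MOUNTED
    # Only the user detected in the vehicle: user-wearable AR device.
    return Display.WEARABLE_AR
```

Framed this way, the examiner's combination maps Jung to the walking branch, Iida to the multi-occupant branch, and Wan to the single-occupant branch.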

Prosecution Timeline

Sep 20, 2023
Application Filed
May 16, 2025
Non-Final Rejection — §103
Aug 20, 2025
Response Filed
Nov 26, 2025
Final Rejection — §103 (current)


Prosecution Projections

3-4
Expected OA Rounds
0%
Grant Probability
0%
With Interview (+0.0%)
3y 0m
Median Time to Grant
Moderate
PTA Risk
Based on 1 resolved case by this examiner. Grant probability is derived from the career allow rate.
