DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Application
Claims 1-27 are pending. Claims 1 and 17 are the independent claims. Claims 2, 4, 6, 8-11, and 19 have been cancelled. Claims 1, 3, 5, 7, 12-13, 15, 17, and 20 have been amended. Claims 21-27 have been newly added. This Office action is in response to the amendments received on 11/06/2025.
Response to Arguments
Applicant's remarks filed on 11/06/2025 ("Applicant Arguments/Remarks Made in an Amendment") have been fully considered. With respect to the previous rejections under 35 U.S.C. § 102 and 35 U.S.C. § 103 (non-final Office action mailed on 08/12/2025), Applicant has amended independent claims 1 and 17 and dependent claims 3, 5, 7, and 12-16, and these amendments have changed the scope of the original claims. Applicant's arguments are directed to the newly added limitations in the amended claims and to the newly added claims; therefore, the arguments are moot and respectfully not persuasive. New grounds of rejection have been made in view of the amended claims (see the Office action below).
Office Note: Due to applicant’s amendments, further claim rejections appear on the record as stated in the below Office Action.
It is the Office's position that all of Applicant's arguments have been considered.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 3, 5, 14-18, 21-25, and 27 are rejected under 35 U.S.C. 103 as being unpatentable over Bronder et al., US 20240404489 A1, hereinafter “Bronder”, in view of Syed et al., US20220047951A1, hereinafter, “Syed”.
Regarding claim 1, Bronder discloses a method, comprising: transmitting, from a vehicle to an Augmented Reality (AR) source, vehicle data relating to a location of the vehicle (at least Abstract, “An augmented reality display system included in a vehicle”, [0050], [0051], “information can include location information,”, [0020], “the VNS (vehicle navigation system) 140 [] provides an augmented reality display of one or more portions of the environment perceived by the occupant.”), receiving, at the vehicle from the AR source, an AR element selected based on the vehicle data (at least [0028], “The augmented reality display system can generate an augmented reality display on a transparent surface of the vehicle which includes a display element”, “the augmented reality display can comprise an overlay of at least a portion of the environment visible, also referred to herein as perceptible, via the transparent surface.”, [0038], [0076]), displaying the AR element on a display of the vehicle, wherein the AR element is displayed in context with an area outside the vehicle ([0003], [0020], [0028], “The overlay can include display elements, also referred to herein as representations, which provide three-dimensional representations of one or more various graphical icons, such that the three-dimensional representations are perceptible as being positioned in the external environment when the environment is perceived via the augmented reality display”) and the AR element is displayed on or near a window through which the area can be viewed or on a display on which is provided image data from a camera of the vehicle that has a field of view including the area (at least [0024], [0029]-[0032], __an augmented reality display can be displayed on surface 220 (which can be a windshield)__, __also, according to, e.g., [0021], the representation of the portion of the external environment is provided by external sensors such as a camera__).
Bronder doesn’t explicitly disclose selecting at the vehicle a game application, where the game application is provided via the AR source; the selected game application where the AR element is specific to the selected game application; and wherein the display of the AR element is responsive to one or more of movement of a vehicle steering wheel, movement of a vehicle throttle or movement of a vehicle brake.
However, Syed discloses a gaming system of a vehicle and teaches selecting at the vehicle a game application, where the game application is provided via the AR source ([0050], [0061], “one or more games”, “determine which game to execute for gameplay (e.g., the avatar traversing the virtual environment)”), the selected game application where the AR element is specific to the selected game application ([0061], “Each of the games 308 may include an avatar navigating a virtual environment. The processor module 304 may determine which game to execute for gameplay (e.g., the avatar traversing the virtual environment) based on user input from one or more of the input devices 185.”); and wherein the display of the AR element is responsive to one or more of movement of a vehicle steering wheel, movement of a vehicle throttle or movement of a vehicle brake (at least [0020], “display a virtual environment of the interactive game via a display in the vehicle; and control action within the virtual environment of the interactive game based on input received via at least one of: a steering wheel of the vehicle;”,[0021]-[0022], “move an avatar left and right within the virtual environment of the interactive game in response to turning of the steering wheel of the vehicle left and right”, [0050], [0089], “In the virtual environment, the gaming module 182 controls steering of the avatar of the user based on the SWA 142 measured by the SWA (steering wheel angle (SWA)) sensor 1016.”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality display system in a vehicle that can generate various display elements as part of an interactive game, as taught by Bronder (according to at least paragraph [0090]), with the option of selecting a game application provided via the AR source, including an AR element specific to the selected game that is responsive to a vehicle steering wheel, as taught by Syed, with a reasonable expectation of success and with the motivation of enhancing the game user’s experience by relating the games to the vehicle’s dynamic movements.
Regarding claim 3, Bronder discloses wherein the AR element includes one or more icons that are provided on the display so that they appear in a predetermined context relative to the area outside the vehicle ([0038], “overlay one or more portions of the environment which include one or more objects in the environment”, [0039], [0040], “includes display elements 360, 380 which overlay and conform to one or more boundaries of one or more particular portions of particular objects 350, 370 in the environment 390.”).
Bronder doesn’t explicitly disclose wherein the AR element is a game element that is separate from and does not represent a feature of the environment in which the vehicle is located.
However, Syed teaches wherein the AR element is a game element that is separate from and does not represent a feature of the environment in which the vehicle is located (__according to the disclosure of Syed, for example, at least paragraphs [0075], [0067], [0069], Avatar of the user reads on AR element as recited in the claim which does not represent a feature of the environment__).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality display system in a vehicle that can generate various display elements as part of an interactive game, as taught by Bronder (according to at least paragraph [0090]), with an AR element that is a game element, as taught by Syed as a game avatar, with a reasonable expectation of success and with the motivation of enhancing the game user’s experience.
Regarding claim 5, Bronder discloses wherein the AR element is displayed as a function of the current location of the vehicle, with a different AR element displayed when the vehicle is at a different location ([0072], “an augmented reality display which includes a representation of a particular driving zone in the environment,”, [0073]-[0076], “A zone identification element 824 can include one or more of a sign, visual icon,”).
Bronder doesn’t teach wherein the AR element and the different AR element are both game elements that are separate from and do not represent a feature of the environment in which the vehicle is located.
However, Syed teaches wherein the AR element and the different AR element are both game elements that are separate from and do not represent a feature of the environment in which the vehicle is located ([0067], “updates a position and orientation of an avatar in the virtual world based on the current GPS data 316.”, __Note: while Bronder teaches the AR elements displayed being different according to different locations, Syed teaches the AR element, i.e., an avatar (which is different from and does not represent a feature in the environment), being updated (position and location), which reads on different AR elements as recited in the claims, where the different AR elements represent the AR element being a function of the current location of the vehicle__).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality display system in a vehicle that can generate various display elements as part of an interactive game, as taught by Bronder (according to at least paragraph [0090]), with an AR element that is a game element, as taught by Syed as a game avatar, which is a function of a current location of the vehicle, with a reasonable expectation of success and with the motivation of enhancing the game user’s experience.
Regarding claim 13, Bronder discloses the method of claim 1 (see the rejection of claim 1); however, Bronder doesn’t explicitly disclose that it also includes moving the AR element in response to actuation of an input within the vehicle and wherein actuation of the input is determined by detecting movement of one or more of a vehicle steering wheel, vehicle throttle or vehicle brake.
Nevertheless, Syed teaches moving the AR element in response to actuation of an input within the vehicle and wherein actuation of the input is determined by detecting movement of one or more of a vehicle steering wheel, vehicle throttle or vehicle brake ([0089], “the gaming module 182 controls steering of the avatar of the user based on the SWA (steering wheel angle) 142 measured by the SWA sensor 1016.”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality display system in a vehicle, as taught by Bronder, with the possibility of moving the AR element by actuating an input within the vehicle, as taught by Syed, with a reasonable expectation of success and with the motivation of improving the user’s gaming experience by being able to move the AR elements in response to actuation of an input within the vehicle.
Regarding claim 14, Bronder discloses the method of claim 1 which also includes controlling at least one of steering, throttle or braking of the vehicle as a function of information from a safety system of the vehicle ([0023], [0068]).
Regarding claim 15, Bronder discloses wherein the safety system includes sensors used by an Advanced Driver Assist System, and the sensors include one or more of a camera, Lidar, Radar, or location sensor ([0021]-[0022], __according to at least [0020] and [0062], the vehicle is autonomously navigated; therefore, an advanced driver assist system is included__).
Regarding claim 16, Bronder discloses the method of claim 1 which also includes providing sound in the vehicle as a function of the AR element or the vehicle location ([0081], “element 880 can be associated with [] instance of audio content, [] which indicates the identity of the particular zone indicated by the element 880 (e.g., school zone).”).
Regarding claim 17, Bronder discloses a system, comprising: a display within a vehicle (at least in [0005] and [0020], “augmented reality display”); a communication device of the vehicle permitting wireless communication to and from the vehicle (at least [0056], [0099], [0103]); a vehicle control system coupled to the display and to the communication device ([0024], [0068]), the vehicle control system including a controller and memory including instructions executable by the processor ([0096]-[0097]); and a location sensor of the vehicle, wherein the display includes an image or a view of an area outside the vehicle (e.g., [0020], “vehicle navigation system (VNS)”, [0021], “representations of one or more portions of the external environment”, “geographic position detection devices”, [0012]), and the control system is responsive to information from the location sensor to provide on the display at least one augmented reality (AR) element that is positioned within the image or view of the area outside the vehicle so that the display shows both the image or view and the AR element (e.g., [0028], “various graphical icons, such that the three-dimensional representations are perceptible as being positioned in the external environment when the environment is perceived via the augmented reality display presented on the transparent surface.”, [0020], “VNS (Vehicle Navigation System) can control a display of information to the occupant via a graphical overlay on the transparent surface which provides an augmented reality display of one or more portions of the environment perceived by the occupant via the transparent surface.”, [0033], [0061]).
Bronder doesn’t explicitly teach wherein the vehicle control system defines or is communicated with an AR source including multiple game applications, and the AR element is specific to one of the multiple game applications and does not indicate the location of a feature of the environment in which the vehicle is located, and wherein either: a) the display of the AR element is responsive to one or more of movement of a vehicle steering wheel, movement of a vehicle throttle or actuation of a vehicle brake, or b) the vehicle is driven relative to the displayed AR element by actuation of the vehicle steering wheel, vehicle throttle or vehicle brake.
However, Syed teaches wherein the vehicle control system defines or is communicated with an AR source including multiple game applications ([0036], [0050], [0061], “one or more games”, “determine which game to execute for gameplay (e.g., the avatar traversing the virtual environment)”), and the AR element is specific to one of the multiple game applications and does not indicate the location of a feature of the environment in which the vehicle is located ([0061], “Each of the games 308 may include an avatar navigating a virtual environment. The processor module 304 may determine which game to execute for gameplay (e.g., the avatar traversing the virtual environment) based on user input from one or more of the input devices 185.”, __Note: the avatar as disclosed by Syed reads on the claimed AR element as it does not indicate the location of a feature of the environment__); and wherein either: a) the display of the AR element is responsive to one or more of movement of a vehicle steering wheel, movement of a vehicle throttle or actuation of a vehicle brake, or b) the vehicle is driven relative to the displayed AR element by actuation of the vehicle steering wheel, vehicle throttle or vehicle brake (at least [0020], “display a virtual environment of the interactive game via a display in the vehicle; and control action within the virtual environment of the interactive game based on input received via at least one of: a steering wheel of the vehicle;”,[0021]-[0022], “move an avatar left and right within the virtual environment of the interactive game in response to turning of the steering wheel of the vehicle left and right”, [0050], [0089], “In the virtual environment, the gaming module 182 controls steering of the avatar of the user based on the SWA 142 measured by the SWA (steering wheel angle (SWA)) sensor 1016.”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality display system in a vehicle as taught by Bronder with the possibility of moving the AR element being responsive to the vehicle’s steering wheel, throttle and braking actuations as taught by Syed, with a reasonable expectation of success, with the motivation of improving the user’s gaming experience by being able to move the AR elements in response to actuation of an input within the vehicle.
Regarding claim 18, Bronder discloses wherein the AR element is a game element ([0089], “an augmented reality display system can generate various display elements, which are presented on a transparent surface of the vehicle, as part of implementing an entertainment program, game, etc.”, “the augmented reality display system can generate various display elements as part of an interactive game,”) and the AR element is located relative to the area surrounding the vehicle so that as the vehicle moves, the vehicle moves relative to the AR element ([0091] "Based on generating a display element which identifies a particular object in the environment, the augmented reality display system can adjust the display element to follow the object in the environment, present information associated with the identified element, receive occupant commands associated with the identified element, etc.").
Regarding claim 21, Bronder teaches the method of claim 1, however, Bronder doesn’t disclose wherein the AR element is a game element that is part of a game objective of the selected game application, and wherein the AR element displayed is selected as a function of achievement of the game objective.
Syed teaches wherein the AR element is a game element that is part of a game objective of the selected game application, and wherein the AR element displayed is selected as a function of achievement of the game objective ([0061], “Each of the games 308 may include an avatar navigating a virtual environment. The processor module 304 may determine which game to execute for gameplay (e.g., the avatar traversing the virtual environment) based on user input from one or more of the input devices 185.”, __Note: the avatar disclosed in the cited reference is a game element that is part of the game and its traversing the virtual environment is the objective of the game application__, __further according to for example [0067], “The gaming module 182 adjusts the avatar in the virtual world of the game 308 based on the user input, if any.”, reads on the AR element displayed is selected as a function of achievement of the game objective which is interpreted as adjusting/updating the avatar (e.g., position and orientation) based on the user input__)
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality display system in a vehicle as taught by Bronder with the AR element being a game element that is part of a game objective of the selected game application, as taught by Syed, with a reasonable expectation of success, with the motivation of improving the user’s gaming experience.
Regarding claim 22, modified Bronder teaches the method of claim 21; however, Bronder doesn’t explicitly disclose wherein achievement of the game objective requires movement of the vehicle by drive actuation of a steering input and a throttle input, and the step of displaying the AR element includes changing the size or location of the AR element in response to movement of the vehicle.
Syed teaches wherein achievement of the game objective requires movement of the vehicle by drive actuation of a steering input and a throttle input (at least [0035]-[0037], [0039], [0050]), and the step of displaying the AR element includes changing the size or location of the AR element in response to movement of the vehicle ([0067], “updates a position and orientation of an avatar in the virtual world based on the current GPS data 316.”, [0023], “accelerate movement of an avatar in the virtual environment of the interactive game in response to actuation of the accelerator pedal of the vehicle.”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality display system in a vehicle that can generate various display elements as part of an interactive game, as taught by Bronder (according to at least paragraph [0090]), with the option of selecting a game application provided via the AR source, including an AR element specific to the selected game that is responsive to a vehicle steering wheel, as taught by Syed, with a reasonable expectation of success and with the motivation of enhancing the game user’s experience by relating the games to the vehicle’s dynamic movements.
Regarding claim 23, Bronder teaches the method of claim 22, however, Bronder doesn’t explicitly disclose wherein the AR element denotes a location through which the vehicle must be moved for achievement of the objective, and wherein the step of displaying the AR element includes changing the AR element that is displayed upon detecting movement of the vehicle through the location.
Syed teaches wherein the AR element denotes a location through which the vehicle must be moved for achievement of the objective ([0061], “Each of the games 308 may include an avatar navigating a virtual environment. The processor module 304 may determine which game to execute for gameplay (e.g., the avatar traversing the virtual environment) based on user input from one or more of the input devices 185.”, __traversing the virtual environment reads on objective of the game__, [0075]), and wherein the step of displaying the AR element includes changing the AR element that is displayed upon detecting movement of the vehicle through the location ([0067], “updates a position and orientation of an avatar in the virtual world based on the current GPS data 316.”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality display system in a vehicle that can generate various display elements as part of an interactive game, as taught by Bronder (according to at least paragraph [0090]), with the AR element denoting the location through which the vehicle must be moved in the game and changing the AR element upon detecting the vehicle’s movement, as taught by Syed, with a reasonable expectation of success and with the motivation of enhancing the game user’s experience by relating the games to the vehicle’s dynamic movements and environment.
Regarding claims 24 and 25, Bronder in view of Syed teaches the method of claim 21; however, Bronder doesn’t explicitly teach a method which also includes sensing actuation of an input to cause the AR element to move and displaying movement of the AR element as a function of the input, and wherein the input includes one or more of movement of a vehicle steering wheel, vehicle throttle or vehicle brake.
Syed teaches sensing actuation of an input to cause the AR element to move and displaying movement of the AR element as a function of the input, wherein the input includes one or more of movement of a vehicle steering wheel, vehicle throttle or vehicle brake (at least [0020], “control action within the virtual environment of the interactive game based on input received via at least one of: a steering wheel of the vehicle; a horn input device of the vehicle; an accelerator pedal of the vehicle; and a brake pedal of the vehicle”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality display system in a vehicle that can generate various display elements as part of an interactive game, as taught by Bronder (according to at least paragraph [0090]), with an AR element that is responsive to a vehicle steering wheel, as taught by Syed, with a reasonable expectation of success and with the motivation of enhancing the game user’s experience by relating the games to the vehicle’s dynamic movements.
Regarding claim 27, modified Bronder teaches the method of claim 21, and although Bronder teaches, according to [0068], that the augmented reality display system can simulate objects in the environment and command the suspension system to actuate differently, Bronder doesn’t explicitly disclose that the system actuates the brake to prevent the vehicle from moving through the area including the virtual obstacle.
However, Syed teaches that the AR element represents a virtual obstacle over or through which the vehicle cannot be driven, and which also includes causing a vehicle control system to actuate a vehicle brake to prevent the vehicle from moving through the area including the virtual obstacle (at least [0052], “a perception module 187 perceives objects around the vehicle and locations of the objects relative to the vehicle. [] the EBCM 150 may adjust braking based on input from the perception module 187.”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality display system in a vehicle that can generate various display elements as part of an interactive game, as taught by Bronder (according to at least paragraph [0090]), with an AR element representing a virtual obstacle and preventing the vehicle from moving through it by actuating the vehicle brake, as taught by Syed, with the motivation of enhancing the game user’s experience.
Claims 7 and 26 are rejected under 35 U.S.C. 103 as being unpatentable over Bronder in view of Syed or, as an alternative rejection, further in view of Rathod, US 20180345129 A1, hereinafter “Rathod”.
Regarding claim 7, Bronder discloses a method which also includes determining one or more vehicle parameters, comparing each of the one or more vehicle parameters to a corresponding threshold, and terminating the display of the AR element when one of the vehicle parameters is outside of the corresponding threshold for that vehicle parameter (__according to at least [0067], it is disclosed that the AR display elements will be displayed when the vehicle reaches an excessive speed threshold value; therefore, terminating the display when the speed is outside the threshold is obvious__, also, according to paragraph [0008] of the specification of the instant application: “vehicle parameters is a geofence and the corresponding threshold is a location of the vehicle relative to the geofence.”, accordingly see Bronder, [0072]-[0073], “vehicle navigating through an environment can identify a particular zone in the environment and can, in response, generate an augmented reality display”, [0074], [0075], “The augmented reality display system included in vehicle 810 can, based at least in part upon a determination that region 820 is associated with a particular zone, generate an augmented reality display 812, presented on a transparent surface 811 of the vehicle 810 [] display element [] terminates, at opposite ends, at boundaries of the zone 820.”), and Bronder further teaches that the one of the one or more vehicle parameters is a geofenced area, the corresponding threshold is a location of the vehicle relative to the geofenced area, and the step of terminating the display of the AR element is performed when the location of the vehicle is outside of the geofence (__Note: the limitations are recited in the alternative; therefore, coverage of one limitation by the prior art suffices for rejection of the claim. The two alternative limitations have been struck through for clarification, and only the last limitation is addressed in the rejection__, Bronder, [0074] and [0075], “The augmented reality display system included in vehicle 810 can, based at least in part upon a determination that region 820 is associated with a particular zone, generate an augmented reality display 812, presented on a transparent surface 811 of the vehicle 810 [] display element [] terminates, at opposite ends, at boundaries of the zone 820.”).
Further, although Bronder implicitly teaches wherein the step of terminating the display of the AR element includes preventing the display of any AR element specific to the selected game application when the vehicle is outside of the geofenced area (__according to the cited paragraphs of Bronder, the teaching of turning off the display element entirely would necessarily involve preventing the display of the AR element__), for the purpose of compact prosecution, Rathod also more explicitly teaches the limitation wherein the step of terminating the display of the AR element includes preventing the display of any AR element specific to the selected game application when the vehicle is outside of the geofenced area (see at least [0397], “The player also continuously moves about in a range of coordinates in the real world digital map or virtual world. In an embodiment in the event of exiting from said pre-defined geofence boundary 3535, [] automatically close virtual world map user interface 3610 by sever module 184”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality display system in a vehicle that can generate various display elements as part of an interactive game, as taught by Bronder (according to at least paragraph [0090]), with terminating the display of the AR element by preventing the display of any AR element when the vehicle is outside the geofenced area, as taught by Bronder and Rathod, with a reasonable expectation of success and with the motivation of improving safety.
Regarding claim 26, modified Bronder teaches the method of claim 21; however, it doesn’t explicitly teach wherein the step of displaying the AR element includes changing the AR element that is displayed as a function of a change in location of the vehicle, where a first AR element is displayed in a first location and a second AR element that is different than the first AR element is displayed in a second location.
Syed teaches wherein the step of displaying the AR element includes changing the AR element that is displayed as a function of a change in location of the vehicle, where a first AR element is displayed in a first location and a second AR element that is different than the first AR element is displayed in a second location ([0067], “updates a position and orientation of an avatar in the virtual world based on the current GPS data 316.”, __Note: while Bronder teaches the AR elements displayed being different according to different locations, Syed teaches the AR element, i.e., an avatar (which is different from and does not represent a feature in the environment), being updated (position and location), which reads on different AR elements as recited in the claims, where the different AR elements represent the AR element being a function of the current location of the vehicle__). However, for the purpose of compact prosecution and as an alternative rejection, Rathod also teaches the claimed limitation that the step of displaying the AR element includes changing the AR element that is displayed as a function of a change in location of the vehicle, where a first AR element is displayed in a first location and a second AR element that is different than the first AR element is displayed in a second location (see Rathod, at least [0003], “Pokémon Go™ enables user to identify and get particular type of Pokémon at particular location, pre-defined place or spot or location, gym and like.”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality display system in a vehicle that can generate various display elements as part of an interactive game, as taught by Bronder (according to at least paragraph [0090]), with an AR element that is a game element as taught by Syed, i.e., a game avatar whose location changes as a function of a current location of the vehicle, or in the alternative as taught by Rathod, with a reasonable expectation of success and with the motivation of enhancing the game user’s experience.
Claims 12 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Bronder, in view of Syed, further in view of Wang et al., US20210150814A1, hereinafter “Wang”.
Regarding claim 12, modified Bronder teaches the method of claim 1; however, it does not explicitly teach wherein the AR element represents a different vehicle that is connected to the AR source and the step of displaying the AR element includes changing the location of the displayed AR element as a function of movement of the different vehicle.
Wang discloses wherein the AR element represents a different vehicle that is connected to the AR source (at least Abstract “at a first vehicle, a set of presentation attributes for a second vehicle that is in an external environment of the first vehicle,”) and the step of displaying the AR element includes changing the location of the displayed AR element as a function of movement of the different vehicle (at least [0003], “the virtual-reality display apparatus in a virtual-reality space, the second vehicle in accordance with the received set of presentation attributes for the second vehicle while the second vehicle is visible from the first vehicle in the external environment of the first vehicle.”, [0052], “vehicles frequently report their positions (e.g., as GPS coordinates) to cloud server 185 to enable cloud server 185 to perform the function of distributing sets of presentation attributes to vehicles near a given vehicle.”, [0090], “In one or more arrangements, the autonomous driving module(s) 160 can use such data to generate one or more driving scene models. The autonomous driving module(s) 160 can determine the position and velocity of the vehicle 100.”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality display system in a vehicle that can generate various display elements as part of an interactive game, as taught by Bronder (according to at least paragraph [0090]), with an AR element (game element) as taught by Syed, further including a different vehicle whose displayed corresponding AR element changes as the location of the different vehicle changes, as taught by Wang, with a reasonable expectation of success and with the motivation of enhancing the game user’s experience by simulating more of the real visual environment, including moving vehicles as interacting elements in the game.
Regarding claim 20, Bronder discloses the system of claim 17 (see rejection of claim 17); however, Bronder does not explicitly disclose that the AR source is located remotely from the vehicle and includes memory and instructions saved in the memory and relating to providing the at least one AR element, wherein the AR source includes a second communication device that communicates with the communication device of the vehicle and provides the at least one AR element to the vehicle, wherein the AR element that is displayed is a first AR element and wherein the AR source also is adapted to communicate with a second vehicle to enable the second vehicle to display in the second vehicle a second AR element that is related to said one of the multiple game applications at the same time as the first AR element is displayed.
Nevertheless, Wang teaches the AR source is located remotely from the vehicle ([0024], “While the various elements are shown as being located within the vehicle 100 in FIG. 1, it will be understood that one or more of these elements can be located external to the vehicle 100.”) and includes memory and instructions saved in the memory and relating to providing the at least one AR element ([0003], “The memory stores a communication module including instructions [] to receive, at a first vehicle, a set of presentation attributes for a second vehicle that is in an external environment of the first vehicle,”), wherein the AR source includes a second communication device that communicates with the communication device of the vehicle and provides the at least one AR element to the vehicle ([0028]; Note: the VR system can communicate (reads on the second communication device as claimed) with communication system 130 of the vehicle, which reads on the communication device of the vehicle in the claim; see also Fig. 1 of Wang), wherein the AR element that is displayed is a first AR element and wherein the AR source also is adapted to communicate with a second vehicle to enable the second vehicle to display in the second vehicle a second AR element that is related to said one of the multiple game applications at the same time as the first AR element is displayed (at least Abstract and [0022], [0035]-[0039], [0041], “present to an occupant of vehicle 100, via a VR display apparatus 250 in a VR space, another vehicle in accordance with the received set of presentation attributes for the other vehicle while the other vehicle is visible from vehicle 100 in the external environment of the vehicle 100.”, [0045]-[0046], [0048]; see also Fig. 6).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include the augmented reality source in a vehicle as taught by Bronder with an AR source that includes memory and instructions to provide at least one AR element and that also communicates with the vehicle to provide the AR element to the vehicle, as taught by Wang, i.e., communicating the presentation attributes (of the own vehicle and also of other vehicles) to enable displaying of the presentation attributes (the AR element in the claim) at the same time for the own vehicle and other vehicles in the surroundings of the own vehicle, with a reasonable expectation of success and with the motivation of enabling data exchange with the vehicle’s communication system to provide and display AR elements of other vehicles as well as the own vehicle in order to enhance the occupant’s experience of the gaming system by increasing the interaction between the user of the AR source and the own vehicle and other vehicles.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HAJAR HASSANIARDEKANI whose telephone number is (571) 272-1448. The examiner can normally be reached Monday through Friday, 8 am-5 pm ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Erin Piateski, can be reached at (571) 270-7429. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/H.H./Examiner, Art Unit 3669
/Erin M Piateski/Supervisory Patent Examiner, Art Unit 3669