Prosecution Insights
Last updated: April 19, 2026
Application No. 18/514,477

SYSTEM AND METHODS FOR PROVIDING IMMERSIVE VIRTUAL CONTENT DURING A SELF-DRIVING MODE OF AN AUTONOMOUS VEHICLE

Non-Final OA §103
Filed
Nov 20, 2023
Examiner
KARWAN, SIHAR A
Art Unit
3658
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Adeia Guides Inc.
OA Round
1 (Non-Final)
56%
Grant Probability (Moderate)
1-2
OA Rounds
3y 3m
To Grant
82%
With Interview

Examiner Intelligence

Grants 56% of resolved cases
56%
Career Allow Rate
215 granted / 385 resolved
+3.8% vs TC avg
Strong +26% interview lift
+25.8%
Interview Lift (resolved cases with interview)
Typical timeline
3y 3m
Avg Prosecution
41 currently pending
Career history
426
Total Applications
across all art units
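The headline figures in this panel are simple arithmetic on the career counts shown above. A minimal Python sketch reproduces them, under the assumption (not stated by the tool) that the "with interview" rate is the career allow rate plus the reported lift:

```python
# Reproduce the examiner-intelligence headline numbers from the raw counts.
# Assumption: "with interview" = career allow rate + reported interview lift.
granted, resolved = 215, 385          # career counts shown above
interview_lift = 0.258                # reported +25.8% lift

allow_rate = granted / resolved
with_interview = allow_rate + interview_lift

print(f"Career allow rate: {allow_rate:.1%}")      # ~55.8%, displayed as 56%
print(f"With interview:    {with_interview:.1%}")  # ~81.6%, displayed as 82%
```

Both results round to the displayed 56% and 82%, which suggests the panel's numbers are internally consistent.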

Statute-Specific Performance

§101
11.2%
-28.8% vs TC avg
§103
27.8%
-12.2% vs TC avg
§102
33.4%
-6.6% vs TC avg
§112
16.4%
-23.6% vs TC avg
Black line = Tech Center average estimate • Based on career data from 385 resolved cases
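The "vs TC avg" deltas above all point at a single Tech Center baseline. A small sketch (values copied from the panel; the common ~40% baseline is inferred from the arithmetic, not stated anywhere on the page) backs it out:

```python
# Back out the implied Tech Center average (the "black line") from each
# statute's rate and its "vs TC avg" delta, as shown in the panel.
panel = {                    # statute: (examiner rate %, delta vs TC avg %)
    "§101": (11.2, -28.8),
    "§103": (27.8, -12.2),
    "§102": (33.4, -6.6),
    "§112": (16.4, -23.6),
}
for statute, (rate, delta) in panel.items():
    tc_avg = rate - delta    # examiner rate minus delta = implied baseline
    print(f"{statute}: implied TC avg ~{tc_avg:.1f}%")  # 40.0% for each statute
```

Every statute implies the same ~40.0% Tech Center estimate, so the four deltas are measured against one shared baseline rather than per-statute averages.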

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

Claims 1-20 are pending. Claims 1-20 are rejected.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-8 and 12-19 are rejected under 35 U.S.C. 103 as being unpatentable over Moustafa (US 2022026864) in view of Rochford (US 20190243599).
1. Moustafa teaches a method comprising:

determining a navigation path including one or more autonomous mode segments and one or more non-autonomous mode segments (¶ 166: such autonomous driving stacks may allow vehicles to self-control or provide driver assistance to detect roadways and navigate from one point to another; ¶ 286: the exchange and collection of behavioral models; operating in an autonomous mode or human driver mode);

determining that a vehicle is entering an autonomous mode, for an autonomous mode segment, based on at least one of real-time location data associated with the vehicle or metadata associated with the navigation path that the vehicle is traveling along (¶ 286: the exchange and collection of behavioral models; operating in an autonomous mode or human driver mode);

based on determining that the vehicle is entering an autonomous mode, initiating a virtual journey view on one or more displays integrated within the vehicle, including causing presentation of immersive content on the one or more displays (¶ 183: user interfaces (e.g., 230) may include driving controls (e.g., a physical or virtual steering wheel, accelerator, brakes, clutch, etc.) to allow a human driver to take control from the autonomous driving system (e.g., in a handover or following a driver assist action));

operating a motion simulated accessory within the vehicle based on either data associated with the immersive content or data indicative of current and future motion status of the vehicle (¶ 183: user interfaces (e.g., 230) may include driving controls [operating a motion simulated accessory] (e.g., a physical or virtual steering wheel, accelerator, brakes, clutch, etc.) to allow a human driver to take control from the autonomous driving system (e.g., in a handover or following a driver assist action));
determining that the vehicle will no longer operate in autonomous mode based on at least one of the data indicative of current and future motion status of the vehicle or navigation path metadata (¶ 243: the request may be reactionary (e.g., in response to a pullover event, sensor outage, or emergency), while in other cases the request may be sent to preemptively cause the remote valet service 1505 to take over control of the vehicle, based on a prediction that a pullover event or other difficulty is likely given conditions ahead on a route); and

modifying a display of the immersive content based on determining that the vehicle will no longer operate in autonomous mode (¶ 212: generate alerts for presentation on the vehicle's audio and/or graphic displays, such as to alert a driver of potential areas of concern, or prepare one or more passengers for a handover or pullover event).

Moustafa (¶¶ 834-835: combining overlapping, including stitching, i.e., transparency) teaches all of the limitations of the claim but does not explicitly teach adjusting a transparency of the one or more displays to cause the presentation of the immersive content to fade away while a real-world environment becomes visible. However, Rochford (¶ 60) teaches synchronizing the FOVs of virtual reality displays and the creation of an immersive, realistic experience: a transition animation may be used when synchronizing the follower FOV 504 to the master FOV 502; the display of the follower HMD 106 may fade to black and fade back in at the new "position" that matches the master FOV 502; the FOV 504 can be zoomed out to a distant point of view that encompasses both the virtual location of the follower FOV 504 and the virtual location of the master FOV 502, and then zoomed in to the virtual location of the master FOV 502; the HMD 106 causes a warning to be displayed in the follower FOV 504 to warn the operator of the impending transition; and the follower FOV 504 transitions to match the master FOV 502.
Therefore, it was well known at the time the invention was filed, and it would have been obvious to one of ordinary skill in the art to combine the teachings with a reasonable expectation of success in order to prevent motion sickness using an adaptive display, such that the claimed invention as a whole would have been obvious.

2. (Original) The method of claim 1, wherein the data associated with the immersive content indicates motion status of items depicted within the immersive content. (¶ 204: augment or direct operation of devices within the vehicle, which may be used by the users while the vehicle is in motion, through an in-vehicle environment adjustment phase 625; as an illustrative example, on a curvy road, it may be detected that a passenger is using a VR/AR headset, and the autonomous vehicle may signal the headset to cause the screen inside the headset to tilt to adjust the visual to make the ride and viewing experience smoother.)

3. (Original) The method of claim 1, wherein the current and future motion status comprise a current acceleration and an anticipated deceleration, respectively. (¶ 926: whether the driver's acceleration/deceleration was within the expected acceleration/deceleration range after the handoff.)

4. (Original) The method of claim 1, wherein the virtual journey view is at least one of a video game, a recording of a popular travel route, or a gamification of a real-world environment surrounding the vehicle. (¶ 250: the view of the vehicle surroundings and road conditions that are displayed in near real-time; controls the vehicle (similar to immersive video games where the player sees the car's view and drives and controls it with a wheel, handheld controller, etc.).)

5. (Original) The method of claim 1, wherein a playback speed of at least a portion of the immersive content displayed on the one or more displays is based on the current motion status of the vehicle. (¶ 250: the view of the vehicle surroundings and road conditions that are displayed in near real-time [playback speed 1 to 1]; controls the vehicle (similar to immersive video games where the player sees the car's view and drives and controls it with a wheel, handheld controller, etc.).)

6. (Original) The method of claim 1, further comprising: styling the immersive content based on real-time conditions associated with a current real-world environment. (¶ 250: the view of the vehicle surroundings and road conditions that are displayed in near real-time [playback speed 1 to 1].)

7. (Original) The method of claim 1, wherein operating the motion simulated accessory further comprises: providing haptic feedback by way of the motion simulated accessory, the haptic feedback comprising at least one of simulating a motion associated with an item depicted within the immersive content or compensating a motion associated with the current or future motion status of the vehicle. (¶ 592: the vehicle can confirm driver engagement through the use of certain sensors and monitoring; for example, the vehicle can use gaze monitoring, haptic feedback, audio feedback, etc.)

8. (Original) The method of claim 1, wherein operating the motion simulated accessory further comprises: determining the current or future motion status of the vehicle exceeds a threshold (¶ 926: whether the driver's acceleration/deceleration was within the expected acceleration/deceleration range after the handoff); adjusting the immersive content to depict an adjusted immersive content item, the adjusted immersive content item having a motion status matching the current or future motion status of the vehicle (¶ 250: the view of the vehicle surroundings and road conditions that are displayed in near real-time [matching playback speed 1 to 1]); and providing haptic feedback by way of the motion simulated accessory, wherein the haptic feedback is aligned with a motion associated with the current or future motion status of the vehicle (¶ 592: the vehicle can confirm driver engagement through the use of certain sensors and monitoring; for example, the vehicle can use gaze monitoring, haptic feedback, audio feedback, etc.).

12-19. Claims 12-19 are rejected using the same rejections as made to claims 1-8, respectively.

21-33. (Canceled)

Claim Rejections - 35 USC § 103

Claims 9 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Moustafa and Rochford as applied to the claims above, and further in view of Alcaidinho (US 20190047498).

9. Moustafa and Rochford teach all of the limitations of claim 1 but do not teach: further comprising modifying the display of the immersive content based on at least one of user gaze data, user head position data, or user head orientation data. However, Alcaidinho teaches (¶ 22): based on the direction of gaze for the passenger, the projector 210 in the vehicle may project an image 220 such that it is within the direction of gaze of the passenger. Therefore, it was well known at the time the invention was filed, and it would have been obvious to one of ordinary skill in the art to combine the teachings with a reasonable expectation of success in order to prevent motion sickness using an adaptive display, such that the claimed invention as a whole would have been obvious.

20. Claim 20 is rejected using the same rejections as made to claim 9.

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Moustafa and Rochford as applied to the claims above, and further in view of Frolov (US 20150190726).

10. Moustafa and Rochford teach all of the limitations of claim 1 but do not teach: further comprising adjusting a start location and end location of the virtual journey view to match a start location and end location of the autonomous mode segment. However, Frolov teaches (¶ 32): in a separate game scene, the visitors may believe they are on a lifelike ride within a virtual reality; and (¶ 55): controlling system 510 can control the operation of transportation system 505 and cause starting and stopping of the actuators, motors, engines, and/or drivers to cause visitors to move inside the interactive amusement attraction (Skull Island). Therefore, it was well known at the time the invention was filed, and it would have been obvious to one of ordinary skill in the art to combine the teachings with a reasonable expectation of success in order to provide interactive amusement, such that the claimed invention as a whole would have been obvious.

Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Moustafa and Rochford as applied to the claims above, and further in view of Tsang (US 20180088682).

11. Moustafa and Rochford teach all of the limitations of claim 1 but do not teach: wherein the presentation of immersive content on the one or more displays further comprises adjusting the transparency of the one or more displays to cause the presentation of immersive content to fade in while the real-world environment becomes less visible. However, Tsang teaches (¶ 56): adjusting a position, a transparency, a size, and a blending effect of the second window to provide the semi-immersive virtual reality to the user with both the virtual reality and the keyboard simultaneously viewable on the display.
Therefore, it was well known at the time the invention was filed, and it would have been obvious to one of ordinary skill in the art to combine the teachings with a reasonable expectation of success in order to provide a semi-immersive virtual reality, such that the claimed invention as a whole would have been obvious.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SIHAR A KARWAN, whose telephone number is (571) 272-2747. The examiner can normally be reached M-F, 11am-7pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ramon Mercado, can be reached at 571-270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SIHAR A KARWAN/
Examiner, Art Unit 3664

Prosecution Timeline

Nov 20, 2023
Application Filed
Nov 12, 2025
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12589502
CARGO-HANDLING APPARATUS, CONTROL DEVICE, CONTROL METHOD, AND STORAGE MEDIUM
2y 5m to grant Granted Mar 31, 2026
Patent 12589750
VEHICULAR CONTROL SYSTEM
2y 5m to grant Granted Mar 31, 2026
Patent 12589504
SYSTEM AND METHOD FOR COGNITIVE SURVEILLANCE ROBOT FOR SECURING INDOOR SPACES
2y 5m to grant Granted Mar 31, 2026
Patent 12583100
ROBOT TO WHICH DIRECT TEACHING IS APPLIED
2y 5m to grant Granted Mar 24, 2026
Patent 12576516
HUMAN SKILL BASED PATH GENERATION
2y 5m to grant Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

1-2
Expected OA Rounds
56%
Grant Probability
82%
With Interview (+25.8%)
3y 3m
Median Time to Grant
Low
PTA Risk
Based on 385 resolved cases by this examiner. Grant probability derived from career allow rate.
