Prosecution Insights
Last updated: April 19, 2026
Application No. 18/231,607

METHODS, SYSTEMS, APPARATUSES, AND DEVICES FOR FACILITATING PROVISIONING OF A VIRTUAL EXPERIENCE

Non-Final OA — §102, §103
Filed: Aug 08, 2023
Examiner: SNYDER, ADAM J
Art Unit: 2623
Tech Center: 2600 — Communications
Assignee: Red Six Aerospace Inc.
OA Round: 3 (Non-Final)

Grant Probability: 69% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 7m
Grant Probability with Interview: 88%

Examiner Intelligence

Grants 69% — above average

Career Allow Rate: 69% (622 granted / 896 resolved; +7.4% vs TC avg)
Interview Lift: +18.8% (strong), comparing resolved cases with vs. without an interview
Typical Timeline: 2y 7m average prosecution; 30 applications currently pending
Career History: 926 total applications across all art units
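
The headline percentages are consistent with simple arithmetic on the career counts: 622 grants out of 896 resolved cases gives the 69% allow rate, and adding the +18.8% interview lift yields the 88% with-interview figure. A minimal sketch of that arithmetic (the tool's exact methodology is not published; the assumption that the lift is simply added to the career rate is mine, but the numbers line up):

```python
# Hypothetical reconstruction of the headline figures above; the dashboard's
# exact methodology is not stated, but the numbers are consistent with this.
granted, resolved = 622, 896

allow_rate = granted / resolved            # career allow rate
interview_lift = 0.188                     # +18.8 percentage points
with_interview = allow_rate + interview_lift

print(f"Career allow rate: {allow_rate:.1%}")      # 69.4% -> shown as 69%
print(f"With interview:    {with_interview:.1%}")  # 88.2% -> shown as 88%
```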

Statute-Specific Performance

§101: 0.5% (-39.5% vs TC avg)
§103: 59.3% (+19.3% vs TC avg)
§102: 26.6% (-13.4% vs TC avg)
§112: 6.8% (-33.2% vs TC avg)
Comparisons are against a Tech Center average estimate • Based on career data from 896 resolved cases
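
The Tech Center baseline implied by each delta can be recovered by subtracting the delta from the examiner's rate. A quick, purely illustrative sketch using the percentages shown above:

```python
# Back-calculating the implied Tech Center baseline from the figures above:
# implied TC average = examiner rate - delta. All values in percent.
examiner_rate = {"§101": 0.5, "§103": 59.3, "§102": 26.6, "§112": 6.8}
delta_vs_tc   = {"§101": -39.5, "§103": 19.3, "§102": -13.4, "§112": -33.2}

for statute, rate in examiner_rate.items():
    implied_tc_avg = rate - delta_vs_tc[statute]
    print(f"{statute}: examiner {rate}% vs implied TC avg {implied_tc_avg:.1f}%")
# Each statute's implied baseline works out to 40.0%, which suggests the
# "TC avg" here is a single flat estimate rather than a per-statute figure.
```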

Office Action

§102 §103
DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/24/2025 has been entered.
Response to Amendment
The amendment filed on 11/25/2025 has been considered by Examiner.
Claim Objections
Claims 1 and 18 are objected to because of the following informalities: Claims are missing period at the end of the statement. Appropriate correction is required.
Claim 20 is objected to because of the following informalities: The claim contains the limitation “even camera”; please change to --event camera--. Appropriate correction is required.
Claim 21 is objected to because of the following informalities: The claim contains the limitation “the neuromorphic sensor” on lines 2 and 3; please change to --a neuromorphic sensor--. Appropriate correction is required.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 4-6, 17-19, and 24-26 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kirchner et al (US 2019/0041979 A1).
Claim 1, Kirchner (Fig. 1-6C) discloses a system (100b; Fig. 3; Paragraph [0025]; wherein discloses a hybrid headtracking system) of tracking a position (h or (118); Fig. 2A) of a wearable worn (102; Fig. 1) by a user (104; Fig. 1 and 2A; wherein discloses a pilot) as an indication of a user's position in a space (106; Fig. 2A; Paragraph [0020]; wherein “The relative navigation problem involves the determination of an accurate relative position and orientation (pose) of the head of the user 104 relative to the aircraft 106”), said wearable (102; Fig. 1) comprising a plurality of spatially-oriented markers (Paragraph [0047]; wherein discloses “while the fiducial markers 110a-b are rigidly mounted to the helmet 102”), said system (100b; Fig. 3) comprising: an event camera (108; Fig. 2A and 3; wherein discloses a camera) mounted in a cockpit of an aircraft (106; Fig. 2A; Paragraph [0025]; wherein discloses “mounted inside the cockpit”), said event camera (108; Fig. 3) configured to operate asynchronously and transmit data (108 and 136; Fig. 3) only upon a change in relative movement (Paragraph [0036]) of said plurality of spatially-oriented markers (110a-110d; Fig. 3) with respect to said event camera (108; Fig. 3); a pre-mapped data set (140; Fig. 3; Fig. 4A; Paragraph [0037]; wherein discloses a constellation database of marker identifiers and 3D locations) representing at least a portion of said cockpit (106; Fig. 2A); and a processor (122; Fig.
3; wherein discloses a control processors) adapted to determine a spatial location and orientation a position (Paragraph [0024]; wherein discloses “the various reference frames of the hybrid headtracking system 100a may be defined in terms of poses, e.g., position translations (r) and orientational rotations (ϕ) relative to each other”) of said wearable (102; Fig. 1) relative to said cockpit (106; Fig. 1) based, at least in part (Fig. 3; wherein data from camera used to correct position and orientation data), on said data from said event camera (108; Fig. 3) of said spatially-oriented markers (110a-110d; Fig. 3) relative to said pre-mapped data (140; Fig. 3; Fig. 4A; Paragraph [0037]; wherein discloses a constellation database of marker identifiers and 3D locations).
Claim 4, Kirchner (Fig. 1-6C) discloses wherein said space (106; Fig. 1) is a cockpit of an airplane (Paragraph [0025]; wherein discloses a cockpit of an aircraft) and said event camera (108; Fig. 1) is mounted to a surface of said cockpit (Paragraph [0047]; wherein discloses “the hybrid headtracking system 100d may be associated with an “outside-in” configuration whereby the camera 108 is rigidly mounted to the aircraft 106”).
Claim 5, Kirchner (Fig. 1-6C) discloses wherein said user (104; Fig. 1) is a pilot (Paragraph [0020]; wherein discloses a pilot) of an aircraft (106; Fig. 1) and said event camera (108; Fig. 1) is mounted (Paragraph [0046]; wherein discloses “with respect to the hybrid headtracking system 100c, the camera 108 may be rigidly mounted to the helmet 102 worn by the user (104, FIG. 1) and the fiducial markers 110a-b may be rigidly mounted to the aircraft 106 (or other like mobile platform)”) on said pilot (104; Fig. 1).
Claim 6, Kirchner (Fig. 1-6C) discloses wherein said event camera (108; Fig. 1) is mounted (Paragraph [0046]; wherein discloses “with respect to the hybrid headtracking system 100c, the camera 108 may be rigidly mounted to the helmet 102 worn by the user (104, FIG. 1)”) on said wearable (102; Fig. 1).
Claim 17, Kirchner (Fig. 1-6C) discloses further comprising said wearable (102; Fig. 1).
Claim 18, Kirchner (Fig. 1-6C) discloses wherein said wearable (102; Fig. 1) is a helmet (Paragraph [0021]; wherein discloses a “helmet 102”).
Claim 19, Kirchner (Fig. 1-6C) discloses wherein said event camera (108; Fig. 2A) is configured with a certain perspective (C or 120; Fig. 2; Paragraph [0023]) relative to said plurality of spatially-oriented markers (110a and 110b; Fig. 2A).
Claim 24, Kirchner (Fig. 1-6C) discloses wherein said spatial location and orientation (Paragraph [0024]; wherein discloses “the various reference frames of the hybrid headtracking system 100a may be defined in terms of poses, e.g., position translations (r) and orientational rotations (ϕ) relative to each other”) of said wearable (102; Fig. 2B) is based on a two-dimensional image captured (Paragraph [0037]) by said event camera (108; Fig. 3B).
Claim 25, Kirchner (Fig. 1-6C) discloses wherein said spatial location and orientation (Paragraph [0024]; wherein discloses “the various reference frames of the hybrid headtracking system 100a may be defined in terms of poses, e.g., position translations (r) and orientational rotations (ϕ) relative to each other”) of said wearable (102; Fig. 2B) is in three dimensional space (Paragraph [0045]).
Claim 26, Kirchner (Fig.
1-6C) discloses wherein said event camera (Paragraph [0047]; wherein discloses “the hybrid headtracking system 100d may be associated with an “outside-in” configuration whereby the camera 108 is rigidly mounted to the aircraft 106”) is configured with a certain perspective (Paragraph [0047]) relative to said plurality of spatially-oriented markers (Paragraph [0047]; wherein discloses “the hybrid headtracking system 100d may be associated with an “outside-in” configuration whereby the camera 108 is rigidly mounted to the aircraft 106, while the fiducial markers 110a-b are rigidly mounted to the helmet 102.”).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Kirchner et al (US 2019/0041979 A1) in view of Tawada et al (JP 2009-109319 A).
Claim 7, Kirchner discloses the system of claim 1. Kirchner does not expressly disclose wherein spatially-oriented markers comprise active markers selected from the group consisting of light emitting diodes and OLEDs. Tawada (Fig. 1-7) discloses wherein spatially-oriented markers (7a-7e; Fig. 2) comprise active markers (Fig. 3) selected from the group consisting of light emitting diodes (wherein discloses “The front face member light emitting portions 7a to 7f include a micro lens 13d that emits infrared light, an LED (display element) 13c that emits infrared light, and a transparent electrode that supplies electricity to the LED 13c”) and OLEDs. Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Kirchner’s hybrid headtracking system by applying light emitting markers, as taught by Tawada, so to use a hybrid headtracking system with light emitting markers for providing a head motion tracker device that can measure the movement of the head of an observer even when an observer such as a pilot faces in all directions (See Tech-Problem in translation document).
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Kirchner et al (US 2019/0041979 A1) in view of Perbet et al (US 2011/0006984 A1).
Claim 8, Kirchner discloses the system of claim 1. Kirchner does not expressly disclose wherein spatially-oriented markers the visual markers comprise inactive markers selected from the group consisting of paint, stickers, reflectors, IR reflectors and UV reflectors. Perbet (Fig. 1-10) discloses wherein spatially-oriented markers (3; Fig. 3) the visual markers comprise inactive markers (32; Fig. 3; Paragraph [0005]; wherein discloses passive markers) selected from the group consisting of paint, stickers, reflectors, IR reflectors and UV reflectors (32; Fig.
5; Paragraph [0032]; wherein discloses “the contrast of the marker is obtained by the phosphorescent border excited by the UV light-emitting diodes near the CCD cameras”). Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Kirchner’s hybrid headtracking system by applying inactive markers, as taught by Perbet, so to use a hybrid headtracking system with inactive markers for providing the main advantages, compared with the prior art, of not requiring a power supply for the markers on the pilot's helmet, of being particularly simple and robust, and of giving signal/noise ratios that are always high irrespective of the illumination. It is therefore perfectly suited to the environment of aircraft cockpits (Paragraph [0006]).
Claims 20, 21, 27, and 28 are rejected under 35 U.S.C. 103 as being unpatentable over Kirchner et al (US 2019/0041979 A1) in view of To (DE 102019215691 A1).
Claim 20, Kirchner discloses the system of claim 1. Kirchner does not expressly disclose wherein said even camera is a neuromorphic camera. To (Fig. 1-2) discloses wherein said even camera is a neuromorphic camera (5; Fig. 2; wherein discloses “An image recording unit with a neuromorphic asynchronous image sensor”). Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Kirchner’s hybrid headtracking system by applying a neuromorphic asynchronous image sensor, as taught by To, so to use a hybrid headtracking system with a neuromorphic asynchronous image sensor for providing a method for the state recognition of a vehicle occupant, in particular a driver of a vehicle, which is improved in comparison (See object of the invention in translation).
Claim 21, Kirchner (Fig. 1-6C) discloses wherein said camera (108; Fig. 3) generates a model (Fig. 4B) of the position of the helmet (102; Fig. 4B) based on one or more pixels indicating a light change (Paragraph [0036]; wherein discloses pixels) from each of the plurality of spatially-oriented markers (110a and 110b; Fig. 4B) on the helmet (102; Fig. 1; Paragraph [0047]; wherein discloses “the hybrid headtracking system 100d may be associated with an “outside-in” configuration whereby the camera 108 is rigidly mounted to the aircraft 106, while the fiducial markers 110a-b are rigidly mounted to the helmet 102.”). To (Fig. 1-2) discloses wherein said neuromorphic camera (5; Fig. 2; wherein discloses “An image recording unit with a neuromorphic asynchronous image sensor”) generates a model of the position of the head based on one or more pixels (wherein discloses “An imaging unit 5 in the form of a camera with a neuromorphic asynchronous image sensor, the one Has a plurality of individually readable pixels, a person (not shown), for example the driver of a vehicle, detects this. The camera is arranged in such a way that when the person is in a predetermined position, for example a driver located in the driver's seat of a vehicle, the person's face and in particular the eyes are in the detection area”) of the neuromorphic sensor (5; Fig.
2; wherein discloses “An image recording unit with a neuromorphic asynchronous image sensor”) indicating a light change from the head (wherein discloses “The processing unit 6th processes the from the image acquisition unit 5 generated event values by using level transition detectors for each of the pixels to detect when the intensity values exceed or fall below a previously defined threshold value and then generates an asynchronous event value for this pixel”). Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Kirchner’s hybrid headtracking system by applying a neuromorphic asynchronous image sensor, as taught by To, so to use a hybrid headtracking system with a neuromorphic asynchronous image sensor for providing a method for the state recognition of a vehicle occupant, in particular a driver of a vehicle, which is improved in comparison (See object of the invention in translation).
Claim 27, Kirchner discloses the system of claim 1. Kirchner does not expressly disclose wherein said event camera comprises an array of pixels, each pixel configured to operate independently and asynchronously to generate an event only when a change in brightness at that pixel exceeds a threshold. To (Fig. 1-2) discloses wherein said event camera (5; Fig. 2) comprises an array of pixels (Page 4 of translation discloses “An imaging unit 5 in the form of a camera with a neuromorphic asynchronous image sensor, the one Has a plurality of individually readable pixels”), each pixel configured to operate independently and asynchronously to generate an event (Page 4 of translation discloses “The processing unit 6th processes the from the image acquisition unit 5 generated event values by using level transition detectors for each of the pixels to detect when the intensity values exceed or fall below a previously defined threshold value and then generates an asynchronous event value for this pixel”) only when a change in brightness at that pixel exceeds a threshold (Page 4 of translation discloses “The processing unit 6th processes the from the image acquisition unit 5 generated event values by using level transition detectors for each of the pixels to detect when the intensity values exceed or fall below a previously defined threshold value and then generates an asynchronous event value for this pixel”). Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Kirchner’s hybrid headtracking system by applying a neuromorphic asynchronous image sensor, as taught by To, so to use a hybrid headtracking system with a neuromorphic asynchronous image sensor for providing a method for the state recognition of a vehicle occupant, in particular a driver of a vehicle, which is improved in comparison (See object of the invention in translation).
Claim 28, To (Fig. 1-2) discloses wherein said event camera (5; Fig.
2) outputs an asynchronous stream of events (Page 4 of translation discloses “The processing unit 6th processes the from the image acquisition unit 5 generated event values by using level transition detectors for each of the pixels to detect when the intensity values exceed or fall below a previously defined threshold value and then generates an asynchronous event value for this pixel”) corresponding to changes in scene illumination (Page 3 of translation discloses “An image recording unit with a neuromorphic asynchronous image sensor, which records the illuminated part of the vehicle interior and generates an asynchronous event value for the individual pixels if, based on a previously recorded intensity value of a pixel, the currently recorded intensity value of this pixel exceeds or falls below a predefined threshold value”). Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Kirchner’s hybrid headtracking system by applying a neuromorphic asynchronous image sensor, as taught by To, so to use a hybrid headtracking system with a neuromorphic asynchronous image sensor for providing a method for the state recognition of a vehicle occupant, in particular a driver of a vehicle, which is improved in comparison (See object of the invention in translation).
Claim 22 is rejected under 35 U.S.C. 103 as being unpatentable over Kirchner et al (US 2019/0041979 A1) in view of Roggendorf et al (US 9,892,489 B1).
Claim 22, Kirchner discloses the system of claim 1. Kirchner does not expressly disclose wherein spatial location and orientation of said wearable is used to determine a positioning of AR content within a FOV of the wearable to cause the user to perceive the AR content as representing the AR content at geospatial location prescribed in a virtual environment mapped to correspond with a real environment in which the cockpit is positioned. Roggendorf (Fig. 1-7) discloses wherein spatial location and orientation (Col. 2, Lines 62-67; wherein discloses “Processor 68 can utilize a pattern recognition software module to identify the fixed patterns associated with markers 72 and 74 to determine both position and orientation of the head based upon relative location and distortion of the fixed pattern as perceived by each of cameras 262, 264 and 266”) of said wearable (63; Fig. 2) is used to determine a positioning of AR content (112 and 116; Fig. 6) within a FOV (Fig. 4) of the wearable (63; Fig. 2) to cause the user (wherein discloses a pilot; Col. 3, Lines 48-65) to perceive the AR content (112 and 116; Fig. 6) as representing the AR content at geospatial location (72 and 74; Fig. 5) prescribed in a virtual environment (Fig. 4) mapped to correspond with a real environment in which the cockpit is positioned (Fig. 6; wherein figure shows augmented reality content being mapped to positions within the cockpit). Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Kirchner’s hybrid headtracking system by applying AR content to cockpit, as taught by Roggendorf, so to use a hybrid headtracking system with AR content to cockpit for providing the virtual head down display in an appropriate virtual location in response to the head orientation (Col. 2, Lines 15-22).
Claim 23 is rejected under 35 U.S.C. 103 as being unpatentable over Kirchner et al (US 2019/0041979 A1) in view of Knebel et al (US 2017/0115730 A1).
Claim 23, Kirchner discloses the system of claim 1. Kirchner does not expressly disclose wherein said pre-mapping is based on a CAD description of said cockpit. Knebel (Fig. 1) discloses wherein said pre-mapping is based on a CAD description (Paragraph [0013]; wherein discloses “A static object can be, for example, an instrument cluster, a display, a trim panel, a window frame, etc. These objects can be represented in a 3-D CAD model and serve for posture determination”) of said cockpit (Paragraph [0021]; wherein discloses “a 3D CAD model of the vehicle's interior and, in particular, of the cockpit having the steering wheel, is communicated to the head-mounted display 1”). Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Kirchner’s hybrid headtracking system by applying a CAD model, as taught by Knebel, so to use a hybrid headtracking system with a CAD model for providing a method with which the posture of the head-mounted display can be determined and, based thereon, information can be displayed in contact analog fashion (Paragraph [0006]).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ADAM J SNYDER whose telephone number is (571)270-3460. The examiner can normally be reached Monday-Friday 8am-4:30pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chanh D Nguyen can be reached at (571)272-7772. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Adam J Snyder/
Primary Examiner, Art Unit 2623
12/16/2025
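
For readers less familiar with the sensor at issue: claims 27 and 28, as characterized in the rejection above, describe the defining behavior of an event (neuromorphic) camera, namely that each pixel operates independently and emits an event only when its brightness change exceeds a threshold. A toy Python sketch of that per-pixel thresholding follows (illustrative only; it is not the claimed system or the cited art's implementation, and the frame-based driver loop and names are assumptions of this note):

```python
import numpy as np

def generate_events(reference, frame, threshold=0.15):
    """Per-pixel, threshold-based event generation.

    Each pixel compares the current brightness against its own stored
    reference level and fires an event only when the change exceeds
    `threshold`; silent pixels produce no output, which is what keeps an
    event camera's data stream sparse and asynchronous.
    Returns (events, updated_reference); events are (row, col, polarity).
    """
    diff = frame - reference
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    events = [(int(r), int(c), int(np.sign(diff[r, c]))) for r, c in zip(rows, cols)]
    updated = reference.copy()
    updated[rows, cols] = frame[rows, cols]  # only firing pixels reset their reference
    return events, updated

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reference = rng.random((4, 4))        # stored brightness level per pixel
    frame = reference.copy()
    frame[1, 2] += 0.4                    # e.g. a marker LED entering this pixel
    frame[3, 0] -= 0.3                    # e.g. a marker leaving this pixel
    events, reference = generate_events(reference, frame)
    print(events)                         # [(1, 2, 1), (3, 0, -1)]
```

In a head-tracking pipeline of the kind claimed, sparse events like these from the spatially-oriented markers would feed a pose solver against the pre-mapped cockpit data; that step is beyond this sketch.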

Prosecution Timeline

Aug 08, 2023: Application Filed
Dec 13, 2024: Non-Final Rejection — §102, §103
Mar 18, 2025: Response Filed
May 21, 2025: Final Rejection — §102, §103
Nov 24, 2025: Request for Continued Examination
Dec 02, 2025: Response after Non-Final Action
Dec 16, 2025: Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602108
SYSTEMS AND METHODS OF MINIMIZING AND MAXIMIZING DISPLAY OF THREE-DIMENSIONAL OBJECTS
2y 5m to grant — Granted Apr 14, 2026
Patent 12603042
SHIFT REGISTER UNIT, GATE DRIVING CIRCUIT AND DISPLAY PANEL WITH PULL-UP VOLTAGE STABILIZING CIRCUITS
2y 5m to grant — Granted Apr 14, 2026
Patent 12602759
VERIFICATION OF CRITICAL DISPLAY FRAME PORTIONS FOR MULTIPLE DISPLAYS IN A VIRTUAL MACHINE ENVIRONMENT
2y 5m to grant — Granted Apr 14, 2026
Patent 12597400
DISPLAY PANEL AND DISPLAY DEVICE
2y 5m to grant — Granted Apr 07, 2026
Patent 12586546
DISPLAY PANEL INCLUDING PRE-CHARGING CONTROL MODULE AND DISPLAY DEVICE
2y 5m to grant — Granted Mar 24, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 69%
With Interview: 88% (+18.8%)
Median Time to Grant: 2y 7m
PTA Risk: High
Based on 896 resolved cases by this examiner. Grant probability derived from career allow rate.
