DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-18 and 20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Osterhout et al. [US 2021/0173480].
Claim 1. A computer-implemented method (see Figs. 24, 37, abstract), comprising: receiving, by a virtual reality headset, sensor data that reflects characteristics of a user wearing the virtual reality headset (the user's head-mounted eyepiece, headset, AR glasses or goggles 100; see abstract, Figs. 1, 10, para [0008, 0014, 0082, 0084]);
based on the sensor data, determining, by the virtual reality headset, that the user is likely experiencing motion sickness (the wearer of the headset/eyepiece/AR glasses may experience shock, such as motion sickness, motion-unstable situations, or a g-force head shake from a proximate explosion affecting the user's health or causing injuries; see Figs. 1, 28, para [0418, 0456, 0557-0561, 0711, 0813, 0819]); and
based on determining that the user is likely experiencing motion sickness, activating, by the virtual reality headset, a vibration device that is configured to provide vibration feedback to a body of the user (the eyepiece's ability to interpret patterns of motion across a surface may allow for projecting reference content in order to give the user something to point at and provide the user with visual, audio, tactile and/or vibration feedback; see para [0447, 0475]. The control aspects of the eyepiece may include combinations of user action capture inputs/devices and feedback to the user related to external devices and applications, such as an activity determination system with tonal output, sound warning, or vibration. The soldier may have access to the activity determination system through the eyepiece 100 to monitor and determine the soldier's state of activity, such as in extreme activity, at rest, bored, anxious, in exercise, and the like, and the eyepiece may provide forms of tonal output, sound, tactile, or vibration warning when conditions go out of limits in any way, such as pre-set, learned, as typical, and the like. For instance, the soldier may be monitored for current state of health during combat, and the soldier and/or another individual (e.g., a medic, hospital personnel, another member of the soldier's team, a command center, and the like) are provided an audible signal when health conditions enter a dangerous level, such as indicating that the soldier has been hurt in battle. As such, others may be alerted to the soldier's injuries and would be able to attend to the injuries in a more time-effective manner; see para [0813, 0819]).
Claim 2. The method of claim 1, wherein activating the vibration device comprises: based on the sensor data, generating, by the virtual reality headset, an electrical signal (as cited with respect to claim 1 above, and including electrical signals; see para [0068, 0069, 0454]); and providing, by the virtual reality headset and to the vibration device, the electrical signal (see para [0712]).
Claim 3. The method of claim 2, wherein the electrical signal is an audio signal (see para [0024, 0068, 0447]).
Claim 4. The method of claim 1, wherein the sensor data is received from a head position sensor (see para [0297, 0413]), a gaze direction sensor, an eye tracking sensor (see para [0087, 0454, 0459-0461, 0475]), a gravity sensor (see para [0350]), a skin conductance sensor (see para [0022]), an accelerometer (see para [0351, 0356, 0366, 0413]), a proximity sensor (see para [0425, 0441]), a light sensor (see para [0387, 0401]), a camera (see para [0037-0041, 0055-0057, 0351]), and a microphone (see para [0295, 0362, 0409]).
Claim 5. The method of claim 1, wherein determining that the user is likely experiencing motion sickness comprises: providing, to a model that is trained using previous sensor data and previous data indicating that previous users are experiencing motion sickness, the sensor data (as cited with respect to claim 1 above, and including the learning and training for the soldier or user; see para [0803, 0805-0807, 0839-0841]); and receiving, from the model, data indicating that the user is likely experiencing motion sickness (as cited with respect to claim 1 above; see Figs. 15D, 34B, para [0179, 0260]).
Claim 6. The method of claim 1, comprising: receiving, by the virtual reality headset and from the user, data indicating whether the user is experiencing motion sickness (as cited with respect to claims 1 and 5 above); and providing, for output by the virtual reality headset, the data indicating whether the user is experiencing motion sickness, the sensor data, and data indicating that the user is likely experiencing motion sickness (as cited with respect to claims 1 and 5 above, such as the soldier's or user's experience of shock, motion sickness, motion-unstable situations, or a g-force head shake from a proximate explosion affecting the user's health or causing injuries).
Claim 7. The method of claim 1, wherein the virtual reality headset is executing driving simulation software (the eyepiece may be able to detect a plurality of movements of the hand, ranging from simple motions normally associated with computer mouse motion to more highly complex motion, such as interpretation of complex hand motions in a simulation application by the computer software; see para [0437, 0771, 0784]).
Claim 8. The method of claim 1, wherein determining that the user is likely experiencing motion sickness is further based on graphics being displayed on a screen of the virtual reality headset (as cited with respect to claim 1 above, and including the graphic display; see TABLE-US-00002, para [0340, 0362, 0368, 0436, 0467, 0761, 0764, 0820]).
Claim 9. The method of claim 1, wherein determining that the user is likely experiencing motion sickness is further based on a field of view or a narrowing or widening in the field of view of graphics being displayed on a screen of the virtual reality headset (as cited with respect to claim 1 above, and including a small display and/or a narrow or wide view; see Fig. 2, para [0228, 0314, 0453]).
Claim 10. The method of claim 1, comprising: after activating the vibration device, receiving, by the virtual reality headset, additional sensor data that reflects the characteristics of the user wearing the virtual reality headset; based on the additional sensor data, determining, by the virtual reality headset, that the user is continuing to likely experience motion sickness; and based on determining that the user is continuing to likely experience motion sickness: adjusting, by the virtual reality headset, a vibration level of the vibration device; and activating, by the virtual reality headset, an additional vibration device that is configured to provide additional vibration feedback to the body of the user (as cited with respect to claim 1 above, and including additional sensor data from two or more of the head position sensor (see para [0297, 0413]), a gaze direction sensor, an eye tracking sensor (see para [0087, 0454, 0459-0461, 0475]), a gravity sensor (see para [0350]), a skin conductance sensor (see para [0022]), an accelerometer (see para [0351, 0356, 0366, 0413]), a proximity sensor (see para [0425, 0441]), a light sensor (see para [0387, 0401]), a camera (see para [0037-0041, 0055-0057, 0351]), and a microphone (see para [0295, 0362, 0409]); and the head-mounted eyepiece 100 can control and adjust the focus of the lens, change the focal length of the lens, or adjust the gain or other parameters of the corrections; see para [0344, 0345, 0354, 0356]).
Claim 11. The method of claim 1, comprising: based on determining that the user is likely experiencing motion sickness, adjusting, by the virtual reality headset, a field of view of graphics being displayed on a screen of the virtual reality headset (as cited with respect to claims 8-10 above).
Claim 12. The method of claim 1, comprising: receiving, by the virtual reality headset and from the user, data indicating whether the user is experiencing motion sickness, wherein determining that the user is likely experiencing motion sickness is further based on the data indicating whether the user is experiencing motion sickness (as cited with respect to claim 6 above).
Claim 13. A driving simulator, comprising: an input device that is configured to receive input from a user (the cameras or other imagers used for contactless imaging may be mounted on opposite sides of one set of augmented reality glasses for one person; for example, a two-camera version is shown in Fig. 8F, with two cameras 870 mounted on frame 864, and the user may initiate a data-gathering session by pressing a touch pad on the glasses or by giving a voice command; see Fig. 8F, para [0743, 0746]);
a feedback device that is configured to provide feedback to the user (the sensors provide feedback to the user; see para [0353, 0356, 0418]);
a virtual reality headset that is configured to: receive data indicating a type of disability of the user (the wearer of the headset/eyepiece/AR glasses may experience shock, such as motion sickness, motion-unstable situations, or a g-force head shake from a proximate explosion affecting the user's health, causing injuries, or rendering the user unconscious; see Figs. 1, 28, para [0418, 0456, 0557-0561, 0711, 0813, 0819]);
based on the type of disability of the user, select a driving scenario (selecting when the soldier, user, or wearer of the eyepiece is walking or driving a vehicle, by applications as mentioned above for searching for providers of goods and services; see Figs. 18-20A, 72, para [0024, 0302, 0407, 0455, 0522, 0526, 0746, 0783, 0786, 0809, 0810]);
execute the driving scenario; and while executing the driving scenario: receive, via the input device, input from the user; and based on the input from the user and the type of disability of the user, provide, to the feedback device, instructions to provide feedback to the user (as cited above, and while the wearer of the eyepiece is walking or driving, viewing the driving environment scenario/information, including weather conditions, off-road conditions, mud, heavy rain, snow, current status, and friendly forces' position and strength; see Figs. 18-20A, para [0526, 0542, 0786, 0809]).
Claim 14. The driving simulator of claim 13, wherein the input device comprises a steering device or a hand control for a throttle or a brake input (as cited with respect to claim 13 above, and including the brake and steering wheel; see para [0548, 0809, 0810]).
Claim 15. The driving simulator of claim 13, wherein the feedback device is a vibration device that is configured to provide vibration feedback to a body of the user (as cited with respect to claim 1 above).
Claim 16. The driving simulator of claim 13, wherein the virtual reality headset is configured to: after executing the driving scenario, determine whether to repeat the driving scenario based on the type of disability of the user (as cited with respect to claim 13 above, and including swiping past at different speeds with repeated motions, movement, and the like; see para [0416]).
Claim 17. A controller that is configured to communicate with a driving simulator, comprising: a first input device that is configured to receive a first type of input from a user (the user's first hand is used on the virtual keyboard; see Fig. 14, para [0014, 0015, 0393]);
a second input device that is configured to receive a second type of input from the user (the user's second hand is used on the virtual keyboard by swipe, tap, touch, press, click, roll of a rollerball, and the like; see Fig. 14, para [0015, 0017, 0393, 0429, 0430]);
a processor that is configured to: receive data indicating a type of disability of the user (the processor controls receiving feedback to the injured user's hand through haptic feedback such as acceleration, vibration, force, pressure, electrical impulse, temperature, electric field sensing, and the like; see para [0436, 0551, 0840]); and
based on the type of disability of the user, activate the first input device and deactivate the second input device (this reads on the soldier having access to the activity determination system through the eyepiece to monitor and determine the soldier's state of activity, such as in extreme activity, at rest, bored, anxious, in exercise, hurt, injured, and the like (as a type of disability), where the eyepiece may provide forms of tonal output or sound warning when conditions go out of limits in any way, such as pre-set, learned, as typical, and the like. For instance, the soldier may be monitored for current state of health during combat, and the soldier and/or another individual (e.g., a medic, hospital personnel, another member of the soldier's team, a command center/input, and the like) are provided an audible signal when health conditions enter a dangerous level, such as indicating that the soldier has been hurt in battle. As such, others may be alerted to the soldier's injuries and would be able to attend to the injuries in a more time-effective manner. In embodiments, other user action capture inputs and/or devices, feedback related to external devices and/or external applications, and the like, as described herein, may also be applied; see para [0813-0819]).
Claim 18. The controller of claim 17, wherein the first input device and the second input device are selected from a hand or foot control for a brake input or throttle, a steering device, and a control for a vehicle accessory (as cited with respect to claim 14 above).
Claim 20. The controller of claim 17, comprising: a first feedback device that is configured to provide a first type of feedback to the user (the feedback from the external device to the eyepiece; see para [0078, 0084, 0086]); and
a second feedback device that is configured to provide a second type of feedback to the user (the command and control interface may present feedback from the eyepiece applications in the eyepiece; see para [0085]),
wherein the processor is configured to: facilitate communication between the first feedback device and the driving simulator, based on the type of disability of the user (the feedback including viewing of an imaged object, vibration, haptics, medical simulation, and combat simulations to the soldier, user, or wearer; see Figs. 15, 15A, para [0039, 0068, 0414, 0415, 0436, 0784]); and block communication between the second feedback device and the driving simulator, based on the type of disability of the user (as cited with respect to claim 1 above, and including different types of eyepiece wearers or various users, such as an injured soldier, a high-stress user, medical situations, and/or a patient; see para [0784, 0836, 0849, 0859]).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 16 is rejected, in the alternative, under 35 U.S.C. 103 as being unpatentable over Osterhout et al. [US 2021/0173480].
Claim 16. Osterhout et al. fails to explicitly disclose wherein the virtual reality headset is configured to: after executing the driving scenario, determine whether to repeat the driving scenario based on the type of disability of the user.
However, Osterhout et al. teaches that the sense device may include an optical sensor or optical transmitter as a way for movement to be interpreted as a command. For instance, a sense device may include an optical sensor mounted on the hand of the wearer, and the eyepiece housing may include an optical transmitter, such that when a user moves their hand past the optical transmitter on the eyepiece, the motions may be interpreted as commands. A motion detected through an optical sensor may include swiping past at different speeds, with repeated motions, combinations of dwelling and movement, and the like (see para [0416]).
The soldier may be able to control the vehicle dashboard device through voice controls to the eyepiece. The military vehicle may be immersed in a very loud acoustic environment (see para [0810]).
The soldier may be monitored for current state of health during combat, and the soldier and/or another individual (e.g., a medic, hospital personnel, another member of the soldier's team, a command center, and the like) are provided an audible signal when health conditions enter a dangerous level, such as indicating that the soldier has been hurt in battle. As such, others may be alerted to the soldier's injuries and would be able to attend to the injuries in a more time-effective manner (see para [0810, 0813]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to recognize that the soldier, user, or wearer of the eyepiece is capable of repeating the driving or motion in order to view and/or attend to an injured or disabled soldier along the battle route/road environment as necessary to save a soldier's life.
Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Osterhout et al. [US 2021/0173480] in view of Forsland et al. [US 2021/0223864].
Claim 19. Osterhout et al. fails to disclose wherein the first input device and the second input device are configured to connect to a wheelchair.
However, Osterhout et al. teaches that the user's second hand is used on the virtual keyboard by swipe, tap, touch, press, click, roll of a rollerball, and the like (see Fig. 14, para [0015, 0017, 0393, 0429, 0430]).
Forsland et al. suggests that the bio-signal data is collected from the sensors on or connected to the headset, input into the printed circuit board on the headset, processed on the headset, and then output to transducers including but not limited to visual, auditory, and haptic transducers. The senses that may be stimulated as biofeedback may include, e.g., output commands sent to inflatable bags for pressure, temperature for increasing therapeutic sensation, electrical stimulation, or even a command to an external device or system such as a prosthetic hand/arm/leg or wheelchair for controlled movement. The headset is a unique design that consolidates more processing power into a smaller package than conventional BCI headsets. The portability factor may make a significant impact on individuals who want to have this experience in locations that are away from modern conveniences, as well as for people who are disabled (see Fig. 1, para [0036-0039]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to add the entering or inputting of commands to a wheelchair, as taught by Forsland et al., to the second user input of Osterhout et al. in order to provide a convenient and easy means for a patient, soldier, or user who is unable to walk or is injured and uses a wheelchair for transportation to control the wheelchair and communicate with the eyepiece and a remote caregiver or other people for assistance.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Shams et al. discloses an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. The displayed content is an interactive keyboard control element. The keyboard control element is associated with an input path analyzer, a word matching search facility, and a keyboard input interface, wherein the user inputs text by sliding a pointing device across character keys of the keyboard input interface in a sliding motion through an approximate sequence of a word the user would like to input as text, wherein the input path analyzer determines the characters contacted in the input path, and the word matching facility finds a best word match to the sequence of characters contacted and inputs the best word match as input text. [US 2011/0225536]
Choeng et al. discloses a user-worn head-mounted device (HMD) for monitoring and viewing the surrounding environment, which can be used by a person with a disability who moves by wheelchair. [US 2017/0011210]
Any inquiry concerning this communication or earlier communications from the examiner should be directed to primary examiner Van Trieu, whose telephone number is (571) 272-2972. The examiner can normally be reached Mon-Fri from 8:00 AM to 3:00 PM. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Mr. Wang Quan-Zhen, can be reached at (571) 272-3114.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/VAN T TRIEU/
Primary Examiner, Art Unit 2685
01/16/2026