Prosecution Insights
Last updated: April 19, 2026
Application No. 18/913,870

SYSTEM FOR ASSISTED OPERATOR SAFETY USING AN HMD

Status: Non-Final OA (§103)
Filed: Oct 11, 2024
Examiner: KHAN, IBRAHIM A
Art Unit: 2628
Tech Center: 2600 — Communications
Assignee: Mentor Acquisition One LLC
OA Round: 3 (Non-Final)

Grant Probability: 82% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 2m
Grant Probability With Interview: 94%

Examiner Intelligence

Career Allow Rate: 82% (above average); 447 granted / 546 resolved (+19.9% vs TC avg)
Interview Lift: +12.0% (moderate), across resolved cases with interview
Typical Timeline: 2y 2m average prosecution; 17 applications currently pending
Career History: 563 total applications across all art units
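The headline figures above can be reproduced from the raw counts on this page. A minimal sketch, assuming the dashboard simply divides granted by resolved cases; the Tech Center average is back-derived from the stated "+19.9% vs TC avg" delta rather than taken from an independent source:

```python
# Counts shown on the page: 447 granted out of 546 resolved cases.
granted, resolved = 447, 546

allow_rate = granted / resolved    # career allow rate
tc_avg = allow_rate - 0.199        # implied Tech Center average (derived, not sourced)

print(f"Career allow rate: {allow_rate:.1%}")   # ~81.9%, displayed as 82%
print(f"Implied TC average: {tc_avg:.1%}")      # ~62.0%
```

The same arithmetic reproduces the "above average" label: the examiner's rate sits roughly 20 points over the implied Tech Center baseline.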

Statute-Specific Performance

§101: 2.7% (-37.3% vs TC avg)
§103: 66.5% (+26.5% vs TC avg)
§102: 10.7% (-29.3% vs TC avg)
§112: 11.1% (-28.9% vs TC avg)
Tech Center averages are estimates. Based on career data from 546 resolved cases.
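Each statute row pairs the examiner's rate with a delta against the Tech Center average. A small sketch back-deriving the implied baselines from the figures above (the rates and deltas come from the table; the baselines are computed, not independently sourced):

```python
# (examiner rate, delta vs Tech Center average) per statute, from the table above.
stats = {
    "§101": (0.027, -0.373),
    "§103": (0.665, +0.265),
    "§102": (0.107, -0.293),
    "§112": (0.111, -0.289),
}

for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta  # implied Tech Center baseline
    print(f"{statute}: examiner {rate:.1%}, implied TC avg {tc_avg:.1%}")
```

Notably, every row implies the same Tech Center baseline of about 40%, which suggests the deltas were all computed against a single estimated average.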

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

In the response to this office action, the Examiner respectfully requests that support be shown for language added to any original claims on amendment and any new claims. That is, indicate support for newly added claim language by specifically pointing to page(s) and line numbers in the specification and/or drawing figure(s). This will assist the Examiner in prosecuting this application.

CONTINUED EXAMINATION UNDER 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/08/2026 has been entered.

RESPONSE TO AMENDMENT

Acknowledgment is made of the amendment filed 10/09/2025, in which: claims 1 and 11 are amended; and the rejections of the claims are traversed. Claims 1-20 are currently pending, and an Office Action on the merits follows.

CLAIM REJECTIONS - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

1. Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Gieseke et al. (US 20140336876) in view of Baillot (US 20120070) and further in view of Miyoshi et al. (US 20070009137).

Consider claim 1. Gieseke discloses a method of selectively presenting information to a user of a wearable head device (heads-up glasses, e.g., Google Glass eyeglasses, [0004], [0015]), the method comprising: detecting, via a camera of the wearable head device, a user is looking at a view target of a vehicle ([0023], detect viewing direction of person wearing the wearable head device; [0025], user looking at a non-transparent portion of the vehicle); determining, based on a relationship between the view target and a field of view of the camera, whether a gaze direction of the user is toward the view target ([0025], driver's gaze is determined to be towards a non-transparent portion of the vehicle); in accordance with a determination that the gaze direction of the user is toward the view target, presenting a first content via a see-through display of the wearable head device ([0025], in response to gaze being towards the non-transparent portion of the vehicle, a virtual display depicts a scene of the exterior of the car; [0026], the virtual image may be displayed at or by the glass worn by the driver while the driver is operating the vehicle); and in accordance with the determination that the gaze direction of the user is away from the view target, forgoing presenting the first content via the see-through display ([0026], when the display of the glasses is not activated, the driver views through the lenses of the glasses; [0030], the display is activated when the driver views non-transparent portions that hide or obstruct outside view), wherein said presenting the first content comprises presenting the first content at a location in the vehicle determined based on a location of the view target ([0025], [0026], [0030]).

Gieseke does not explicitly disclose detecting the view target using the camera, the view target comprising a light emitter; the camera is configured to detect light of a first wavelength outside a visible spectrum; the light emitter is configured to emit light of the first wavelength; and further configured to detect light emitted from the light emitter of the view target of the vehicle. Baillot, however, discloses detecting the view target using the camera ([0020], the tracking mechanism is a camera; [0023], camera 18), the view target comprising a light emitter ([0019], active marker is attached to object; the active marker has a source that produces IR or UV light and uses the light to create a pattern; [0021], marker may refer to one or more LEDs or patterns, see marker's constellation); the camera is configured to detect light of a first wavelength outside a visible spectrum; the light emitter is configured to emit light of the first wavelength ([0019]-[0020], the emitter emits IR or UV light, and the camera captures the IR or UV light, which cannot be seen by a human); further configured to detect light emitted from the light emitter of the view target of the vehicle ([0019]-[0020], active marker is attached to object; the emitter emits IR or UV light, and the camera captures the IR or UV light, which cannot be seen by a human). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the wearable head device of Gieseke to include detecting the view target using the camera, the view target comprising a light emitter; the camera is configured to detect light of a first wavelength outside a visible spectrum; the light emitter is configured to emit light of the first wavelength; further configured to detect light emitted from the light emitter of the view target of the vehicle, as taught by Baillot, to provide precise and continuous position and orientation tracking ([0011]).

Gieseke as modified by Baillot does not disclose a focus distance of the first content is determined based on a location of an object external to the vehicle. Miyoshi, however, discloses a focus distance of the first content is determined based on a location of an object external to the vehicle (fig. 4; [0147], change displays of a viewpoint-converted image of a zone having a high probability of collision by different hues based on the distance and relative speed between the owner's vehicle and a second body, and the probability of collision calculated based on the aforementioned pieces of information; [0086], change color or hue based on distance or probability of collision; [0138]-[0139], describing an embodiment which uses an HMD). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the wearable head device of Gieseke as modified by Baillot to include a focus distance of the first content is determined based on a location of an object external to the vehicle, as taught by Miyoshi, to enable a driver or a pedestrian to recognize a risk of collision more intuitively, thereby assisting to accomplish safe driving or walking ([0149]).

Consider claim 2. Gieseke as modified by Baillot discloses the method of claim 1, wherein the light of the first wavelength comprises infrared light (Baillot [0019]-[0020]). Motivation to combine is similar to motivation in claim 1.

Consider claim 3. Gieseke as modified by Baillot and Miyoshi discloses the method of claim 1, further comprising determining a light characteristic from the light emitter (Baillot [0019], active marker is attached to object; the active marker has a source that produces IR or UV light and uses the light to create a pattern; [0021], marker may refer to one or more LEDs or patterns, see marker's constellation), wherein the first content is determined according to the determined light characteristic (Gieseke: the content is determined based on the object the user is looking at, [0025], [0026], [0030]; Baillot: the visible light characteristic of the object is determined based on the one or more LED pattern). Motivation to combine is similar to motivation in claim 1.

Consider claim 4. Gieseke as modified by Baillot and Miyoshi discloses the method of claim 3, wherein the visible light characteristic comprises a blinking rate of the light (Baillot [0021], the light sources have to activate in order to emit light). Motivation to combine is similar to motivation in claim 1.

Consider claim 5. Gieseke as modified by Baillot and Miyoshi discloses the method of claim 3, wherein the light characteristic comprises one or more of a light geometry and a light pattern (Baillot [0021], pattern or marker constellation). Motivation to combine is similar to motivation in claim 1.

Consider claim 6. Gieseke as modified by Baillot and Miyoshi discloses the method of claim 1, wherein said determining whether the gaze direction of the user is toward the view target comprises detecting, via the camera of the wearable head device, one or more objects in the vehicle (Gieseke [0023], [0025], [0026], [0030]).

Consider claim 7. Gieseke as modified by Baillot and Miyoshi discloses the method of claim 1, wherein the vehicle is a moving vehicle and wherein the user is seated in a passenger seat of the vehicle (Gieseke [0026], the virtual image may be displayed at or by the glass worn by the driver while the driver is operating the vehicle; [0030], but could be a passenger in the passenger seat).

Consider claim 8. Gieseke as modified by Baillot and Miyoshi discloses the method of claim 1, wherein the relationship between the view target and the field of view of the camera comprises an angle between the gaze direction and light from the light emitter (Gieseke [0023], detect viewing direction of person wearing the wearable head device; [0025], user looking at a non-transparent portion of the vehicle; also see Baillot [0019]-[0021]).

Consider claim 9. Gieseke as modified by Baillot and Miyoshi discloses the method of claim 1, further comprising, in accordance with the determination that the gaze direction of the user is toward the view target: determining that the gaze direction of the user is no longer toward the view target; in accordance with the determination that the gaze direction of the user is no longer toward the view target, ceasing the presentation of the first content (Gieseke [0027], display based on driving condition or triggering event includes system determining if the driver is viewing a particular direction; displayed views may also be based on driver's gaze; [0035], if one content is being displayed, it can be stopped according to driving condition).

Consider claim 10. Gieseke as modified by Baillot and Miyoshi discloses the method of claim 1, further comprising: detecting, via the camera, a second view target of the vehicle (Gieseke [0025], the first view target can be the hood of the car and the second view object can be a side door of the vehicle), the second view target comprising a second light emitter (Baillot [0019], active marker is attached to object; the active marker has a source that produces IR or UV light and uses the light to create a pattern; [0021], marker may refer to one or more LEDs or patterns, see marker's constellation); determining, based on a relationship between the second view target and the field of view of the camera, whether the gaze direction of the user is toward the second view target (Gieseke [0025], driver's gaze is determined to be towards a non-transparent portion of the vehicle, e.g., a side door); in accordance with a determination that the gaze direction of the user is toward the second view target, presenting a second content via the see-through display (Gieseke [0025], in response to gaze being towards the non-transparent portion of the vehicle, a virtual display depicts a scene of the exterior of the car; [0026], the virtual image may be displayed at or by the glass worn by the driver while the driver is operating the vehicle); and in accordance with a determination that the gaze direction of the user is away from the second view target, forgoing presenting the second content via the see-through display (Gieseke [0026], when the display of the glasses is not activated, the driver views through the lenses of the glasses; [0030], the display is activated when the driver views non-transparent portions that hide or obstruct outside view). Motivation to combine is similar to motivation in claim 1.

Claim 11 is rejected for similar reasons to claim 1, where the processor is a wearable computer or processor ([0004]). Claims 12-20 are rejected for similar reasons to claims 2-10, respectively.

RESPONSE TO ARGUMENTS

Applicant's arguments filed have been fully considered but are not persuasive. Applicant argues (pages 6-7) that the cited references fail to disclose "a focus distance of the first content is determined based on a location of an object external to the vehicle." The Office agrees and has accordingly updated the rejection to include the Miyoshi reference, which clearly shows that the distances at which different objects, e.g., bodies A, B, and C, are displayed and highlighted are relative to their distance to the vehicle (see rejection above for further details).

CONCLUSION

Any inquiry concerning this communication or earlier communications from the examiner should be directed to IBRAHIM A KHAN, whose telephone number is (571) 270-7998. The examiner can normally be reached 10am-6pm. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Nitin Patel, can be reached at 571-272-7677. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

IBRAHIM A. KHAN
Primary Examiner, Art Unit 2628
/IBRAHIM A KHAN/
03/05/2026

Prosecution Timeline

Oct 11, 2024: Application Filed
May 30, 2025: Non-Final Rejection (§103)
Aug 26, 2025: Response Filed
Oct 07, 2025: Final Rejection (§103)
Nov 11, 2025: Response after Non-Final Action
Jan 08, 2026: Request for Continued Examination
Jan 24, 2026: Response after Non-Final Action
Mar 06, 2026: Non-Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602017: WRISTWATCH AND WRISTWATCH-TYPE DISPLAY DEVICE (2y 5m to grant; granted Apr 14, 2026)
Patent 12603067: Displaying Image Data based on Ambient Light (2y 5m to grant; granted Apr 14, 2026)
Patent 12573152: OVERLAY TECHNOLOGY FOR ENHANCING CONNECTIVITY AND REALISM IN INTERACTING SIMULATIONS (2y 5m to grant; granted Mar 10, 2026)
Patent 12572211: VIRTUAL REALITY INTERACTION (2y 5m to grant; granted Mar 10, 2026)
Patent 12557706: PIXEL PACKAGE AND MANUFACTURING METHOD THEREOF (2y 5m to grant; granted Feb 17, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 82%
With Interview: 94% (+12.0%)
Median Time to Grant: 2y 2m
PTA Risk: High
Based on 546 resolved cases by this examiner. Grant probability derived from career allow rate.
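The with-interview figure appears to follow from simple additive arithmetic: the base grant probability plus the examiner's interview lift. A minimal sketch, with the caveat that the additive model is an assumption inferred from the displayed numbers (82% + 12% = 94%), not a documented methodology:

```python
# Figures from the projections panel above.
base_probability = 0.82   # grant probability from career allow rate
interview_lift = 0.12     # examiner's interview lift

# Assumed additive model: lift is added directly to the base rate.
with_interview = base_probability + interview_lift
print(f"With interview: {with_interview:.0%}")   # 94%
```

A multiplicative or odds-based adjustment would give a slightly different result, so treat this as a plausible reconstruction rather than the tool's actual formula.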
