Prosecution Insights
Last updated: April 19, 2026
Application No. 18/649,971

Electronic Device With Inner Display and Externally Accessible Input-Output Device

Status: Non-Final Office Action (§103), OA Round 4
Filed: Apr 29, 2024
Examiner: WILSON, DOUGLAS M
Art Unit: 2622
Tech Center: 2600 — Communications
Assignee: Apple Inc.

Predictions:
Grant probability: 75% (favorable)
Expected OA rounds: 4-5
Expected time to grant: 2y 9m
Grant probability with interview: 91%

Examiner Intelligence

Career allowance rate: 75%, above average (320 granted / 427 resolved; +12.9% vs Tech Center average)
Interview lift: +16.1% for resolved cases with interview
Typical timeline: 2y 9m average prosecution; 25 applications currently pending
Career history: 452 total applications across all art units

Statute-Specific Performance

§101: 1.9% (-38.1% vs TC avg)
§103: 56.5% (+16.5% vs TC avg)
§102: 22.5% (-17.5% vs TC avg)
§112: 14.4% (-25.6% vs TC avg)

Based on career data from 427 resolved cases; Tech Center averages are estimates.

Office Action (§103)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1-20 are pending.

Response to Arguments

Applicant's arguments regarding Claim 16, filed 02 December, have been fully considered but they are not persuasive.

Regarding Claim 16, Applicant argues that the disclosure of Sako (US 2018/0003983), which explicitly teaches [0142] user control of internal electronic equipment while the inner display is off, is prohibited from combination with prior art references that teach wirelessly controlling external electronic equipment, because Sako does not contemplate expanding the wireless control of electronic equipment to include wireless control of external electronic equipment while the inner display is off.

The Examiner respectfully disagrees with Applicant's argument. MPEP 2143.01 provides two conditions which prohibit the modification of a prior art reference with another prior art reference: (1) the proposed modification cannot render the prior art unsatisfactory for its intended purpose, and (2) the proposed modification cannot change the principle of operation of a reference. Modifying Sako to wirelessly control external as well as internal electrical equipment while the inner display is turned off expands Sako but does not render Sako unsatisfactory for control of internal electrical equipment or change Sako's principle of operation.

The remainder of Applicant's arguments have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2 are rejected under 35 U.S.C. 103 as being unpatentable over Sako (US 2018/0003983) in view of Wong (US 2017/0262150) and Katz (US 2016/0253044). All reference is to Sako unless indicated otherwise.

Regarding Claim 1 (Currently Amended), Sako teaches a head-mounted device configured to operate in an environment [fig. 1], the head-mounted device comprising: an inner display [fig. 2 @Internal Display Unit] facing an eye box [fig. 2 @eye position] configured to display first images [¶0090, "The image processing unit 508 includes an internal image generating unit 508-1 that generates an internal image on the basis of an image signal output from the control unit 501"]; a lens [fig. 2 @Virtual Image Optical Unit] through which the first images [¶0092, "A virtual image optical unit 513 is disposed in front of the display screen of the internal image display panel 511.
The virtual image optical unit 513 enlarges and projects the display image of the internal image display panel 511, which is viewed by the user as an enlarged virtual image"] are viewable from an eye box [fig. 2 @eye position]; an outer display [fig. 2 @External Display Unit] configured to display second [¶0090, "in the case of externally displaying the same image as the internal image, the external image generating unit 508-2 is omitted"] images [¶0090, "an external image generating unit 508-2 that generates an external image on the basis of an image signal output from the control unit 501"]; and an image sensor [¶0081, "The environmental information acquisition unit 504 acquires information related to the environment outside the head-mounted image display device 1, and outputs to the control unit 501 … the environmental information acquisition unit 504 may also be equipped … an image sensor (camera)", ¶0082, "the control unit 501 controls display operation for displaying … an external image seen from the outside of the relevant device according to environmental information acquired by the environmental information acquisition unit 504"] configured to gather user input while the inner display is off [¶0139, "although omitted from illustration in FIG. 23 … the state transitions to the Only External Image On state", ¶0142, "In … the Only External Image On state in which the external image is turned on, the same image as the internal image or an external-only image that differs from the internal image is displayed ... Furthermore, in response to an instruction from the user given via the input operating unit 502 … the information to display as the external image may be changed"], wherein an image on an external surface in the environment is adjusted [¶0093, "The projection optical unit 514 enlarges and projects a real image of the external image displayed on the external image display panel 512 onto a wall or the like near the head-mounted image display device 1", ¶0142 teaches changing the external image based on user input when the inner display is off].

Sako does not teach the image sensor faces away from the eye box and is configured to gather user air gesture input; and an image projected on an external surface is adjusted based on a tracked location at which the air gesture input overlaps the image on the external surface.

Wong teaches an image sensor facing away from the eye box [¶0044, "Wearable computing device 10 may also include a camera 26 that is configured to capture images of the environment of wearable computing device 10 from a particular point-of-view. The images could be either video images or still images. The point-of-view of camera 26 may correspond to the direction where HMD 20 is facing"] and configured to gather user air gesture input [¶0049, "Processor 22 may analyze still images or video images obtained by camera 26 to identify any gesture that corresponds to a control instruction … a gesture that does not involve physical contact with the target device, such as a movement of the wearer's finger, hand, or an object held in the wearer's hand, toward the target device or in the vicinity of the target device, could be recognized as a control instruction"].

Before the application was filed it would have been obvious to one of ordinary skill in the art to incorporate the concept of detecting air gestures with an image sensor, as taught by Wong, into the head-mounted device taught by Sako in order to control the head-mounted device without requiring a user to physically contact or manipulate an input device.

Sako in view of Wong does not teach an image projected on an external surface is adjusted based on a tracked location at which the air gesture input overlaps the image on the external surface.

Katz teaches an image [fig. 1A @140] projected on an external surface [¶0030, "Embodiments may also include a display 130 … Display 130 may include … a projector and surface upon which images are projected … or any other electronic device for outputting visual information and/or creating a perception of user 102 of a presentation of visual information"] is adjusted [fig. 1A illustrates activated keys as taught by ¶0044] based on a tracked location [construed as air gesture finger position overlapping keys of virtual keyboard taught by ¶0039] at which an air gesture input [¶0022, "System 100 may detect touch-free gestures, poses, and movement from one or more finger(s) 106 and/or one or both hand(s) 104 of a user 102"] overlaps the image on the external surface [fig. 1A @140, ¶0039, "processor 122 may receive image data from sensor 110 of user 102's hand 104, including five fingers 106 identified by processor 122 as fingers "A," "B," "C," "D," and "E," as illustrated in FIG. 1A. Processor 122 may determine at least one of the changes in movement, position, and orientation for some or all of fingers A-E, and of user 102's hand 104, in real-time as sensor 110 captures data such as image data, in order to track one or more hands 104 and one or more fingers 106 in air at a distance from the displayed keyboard image", ¶0044, "The predefined gestures may involve a pattern of movement indicative of manipulating an activatable object, such as typing a keyboard key, clicking a mouse button, or moving a mouse housing. As used herein, an "activatable object" may include any displayed visual representation that, when selected or manipulated, results in data input or performance of a function. In some embodiments, a visual representation may include displayed image item or portion of a displayed image such as keyboard image 140, a virtual key, a virtual button, a virtual icon, a virtual knob, a virtual switch, and a virtual slider"].

Before the application was filed it would have been obvious to one of ordinary skill in the art to incorporate the concept of detecting the positions of an air gesture on a projected image and adjusting the projected image to indicate the overlap positions, as taught by Katz, into the head-mounted device taught by Sako in view of Wong in order to provide visual feedback of a control gesture that does not comprise tactile feedback.

Regarding Claim 2 (Original), Sako in view of Wong and Katz teaches the head-mounted device defined in Claim 1 wherein the air gesture input comprises air gesture input from a user's hand that is not contacting the head-mounted device [Katz: fig. 1A @108, ¶0039 and ¶0044].

Claims 11-13 are rejected under 35 U.S.C.
103 as being unpatentable over Sako (US 2018/0003983) in view of Lyons (US 2017/0255019). All reference is to Sako unless indicated otherwise.

Regarding Claim 11 (Previously Presented), Sako teaches a head-mounted device, comprising: a head-mounted housing [fig. 1 @frame]; an inner display [fig. 2 @internal display unit] supported by the head-mounted housing, wherein the inner display comprises left and right displays [¶0069, "FIG. 2 illustrates the overhead appearance of a head-mounted image display device 1 of the opaque type. As illustrated in the drawing, the head-mounted image display device 1 has internal display units for the left and right eyes on the internal side, that is, the side that faces the user's face"] that are separated by an adjustable distance [¶0069, "the head-mounted image display device 1 is equipped with an interpupillary adjustment mechanism that adjusts the interpupillary width between the right-eye display unit and the left-eye display unit"]; a lens through which images on the inner display [fig. 2 @left and right virtual image optical unit] are viewable from an eye box [fig. 2 @area in front of user's eyes]; an outer display [fig. 2 @external display unit] supported by the head-mounted housing, wherein the outer display [¶0070, "In the head-mounted image display device 1 of the example illustrated in FIGS. 1 and 2, the external display units are disposed at positions in a front-and-back relationship with the internal display units … although the head-mounted image display device 1 is equipped with a pair of left and right external display units, a single external display unit … may also be provided"; replacing the fig. 2 left and right external displays with a single display unit overlaps the left and right internal displays in fig. 2] overlaps the left and right displays.

Sako does not teach a head-mounted display configured to store a removable handheld electronic device that is non-overlapping with the outer display; and wireless communications circuitry configured to receive user input from the removable handheld electronic device.

Lyons teaches a head-mounted display configured to store a removable handheld electronic device [figs. 1 and 2 @30] that is non-overlapping with the outer display [figs. 1 and 2 @50; fig. 1 @50 is construed as the single external display device taught by ¶0070 of Sako] and wireless communications circuitry configured to receive user input from the removable handheld electronic device [¶0090, "the remote controller 30 as illustrated in FIG. 7a, FIG. 7b, FIG. 8a, and FIG. 8b, receives input from the user 70 (not shown) and communicates the input to the mobile computing device 50 (not shown) … wireless communication is preferred"].

Before the application was filed it would have been obvious to one of ordinary skill in the art to incorporate the concept of incorporating a single external display for the right and left eyes, as taught by Lyons, into the head-mounted device taught by Sako in order to provide an external display device with self-contained image and communications processing that can function as a standalone smart phone, reducing the costs of a user's electronics.

Regarding Claim 12 (Original), Sako in view of Lyons teaches the head-mounted device defined in Claim 11, wherein the head-mounted housing [Lyons: fig. 17A @518] comprises a recess [Lyons: fig. 17A illustrates sidewall 518 built up near the 512 attachment point, where 512 fits in the recess of the built-up area] configured to receive the wireless handheld controller [Lyons: fig. 17A @512].
Regarding Claim 13 (Original), Sako in view of Lyons teaches the head-mounted device defined in Claim 12 further comprising magnetic structures [Lyons: ¶0083, "the method of attachment can be accomplished by any of various other methods of attachment, such as … Velcro® magnetics"] in the recess [fig. 17A illustrates 512 fitted in the recess] configured to couple the wireless handheld controller [Lyons: fig. 17A @512] to the head-mounted housing.

Claims 16-17 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Sako (US 2018/0003983) in view of Rochford (US 2017/0068500). All reference is to Sako unless indicated otherwise.

Regarding Claim 16 (Currently Amended), Sako teaches a head-mounted device configured to operate in an environment with external electronic equipment [¶0085], the head-mounted device comprising: an inner display [fig. 2 @Internal Display Unit and fig. 5 @511] configured to display images [¶0090, "The image processing unit 508 includes an internal image generating unit 508-1 that generates an internal image on the basis of an image signal output from the control unit 501"]; a lens [fig. 2 @Virtual Image Optical Unit] through which the images [¶0092, "A virtual image optical unit 513 is disposed in front of the display screen of the internal image display panel 511. The virtual image optical unit 513 enlarges and projects the display image of the internal image display panel 511, which is viewed by the user as an enlarged virtual image"] are viewable from an eye box [fig. 2 @eye position]; an outer display [fig. 2 @External Display Unit] configured to display additional [¶0090, "in the case of externally displaying the same image as the internal image, the external image generating unit 508-2 is omitted"] images [¶0090, "an external image generating unit 508-2 that generates an external image on the basis of an image signal output from the control unit 501"]; and an input device configured to gather user input [¶0079, "The input operating unit 502 is equipped with one or more operating elements on which the user performs an input operation, such as keys, buttons, and switches, receives user instructions via the operating elements, and outputs to the control unit 501"], wherein the head-mounted device is operable in: a first mode [initial state] in which the user input [fig. 5 @502] is configured to control the head-mounted device [turn external image on] while the inner display [fig. 2 @Internal Display Unit and fig. 5 @511] is displaying the images [¶0138, "In the Initial state, the head-mounted image display device 1 or 3 turns on the internal image and turns off the external image. At this point, if an instruction to display the external image is given via the input operating unit 502, the external image is turned on"], and a second mode [Only External Image On state] in which the user input is configured to control the internal [fig. 6 @3] electronic equipment [¶0183, "according to an input operation by the user on the input operating unit, turns display of the internal image or the external image on/off, conducts a color adjustment of the internal image or the external image, conducts a brightness adjustment of the internal image or the external image, changes a display size of the external image, or moves a display area of the external image"] while the inner display is off [¶0139, "although omitted from illustration in FIG. 23, if an instruction to turn off the display of the internal image is given while in the Both Images On state, the internal image is turned off, and the state transitions to the Only External Image On state"].

Sako does not teach the input device is on an external surface of the head-mounted device; and user input is configured to wirelessly control the external electronic equipment.

Rochford teaches an input device [¶0018, "A touch sensitive device (TSD) 116 may be mounted to the headpiece 102 in a position that permits the person other than the user 104 to activate or otherwise use the TSD 116. While the TSD 116 is shown mounted to a front side of the external display 110, it will be understood that in other embodiments the TSD 116 may be mounted in any suitable position on the headpiece 102"] is on an external surface of the head-mounted device [fig. 1 @116] and user input is configured to wirelessly control the external electronic equipment [¶0024, "… The user 104 may activate controls … on a host system in communication with the HMD 100 (also not shown in FIG. 1) to select the predetermined information to be included on the external display 110", ¶0037, "… controller 218 is communicatively coupled to HMD elements internal display 208, external display 210, optional internal sensor 212, optional external sensor 214, and optional touch sensitive device 216", ¶0038, "the controller 218 may be communicatively coupled to an external device 206. Communication between the controller 218 and the external device 206 may via one or both of a wireless communication link 228", ¶0039, "the controller 218 is communicatively coupled to a host system 204 via one or both of a wireless communication link 220 and/or a wired communication link 222. The host system 204 is further communicatively coupled to the external device 206 via one or both of a wireless communication link 224 and/or a wired communication link 226"].

Before the application was filed it would have been obvious to one of ordinary skill in the art to incorporate the concept of wirelessly controlling external electronic equipment, as taught by Rochford, into the head-mounted device taught by Sako in order to control image processing performed by devices having greater computing power than the internal HMD processor without restricting the mobility of the user wearing the HMD.

Regarding Claim 17 (Original), Sako in view of Rochford teaches the head-mounted device defined in Claim 16 wherein the input device comprises a touch sensor [Rochford: fig. 1 @116] and the user input comprises touch input [Rochford: ¶0018, "While the TSD 116 is shown mounted to a front side of the external display 110, it will be understood that in other embodiments the TSD 116 may be mounted in any suitable position on the headpiece 102"].

Regarding Claim 20 (Original), Sako in view of Rochford teaches the head-mounted device defined in Claim 16 wherein the user input [¶0142, "In … the Only External Image On state in which the external image is turned on, the same image as the internal image or an external-only image that differs from the internal image is displayed ... Furthermore, in response to an instruction from the user given via the input operating unit 502 (including blink operations and eyeball movement detected with a myoelectric sensor or oculo-electric sensor), the information to display as the external image may be changed"] is configured to select on-screen options [construed as external display content options] that are displayed on the outer display [Only External Image On state].

Claims 3-5 are rejected under 35 U.S.C. 103 as being unpatentable over Sako in view of Wong, Katz, and Short (US 2014/0139717). All reference is to Sako unless indicated otherwise.
Regarding Claim 3 (Original), Sako in view of Wong and Katz teaches the head-mounted device defined in Claim 1 wherein the air gesture input comprises air gesture input from an accessory that is not contacting the head-mounted device [Wong: ¶0049, "a gesture that does not involve physical contact with the target device, such as a movement of the wearer's finger, hand, or an object held in the wearer's hand, toward the target device or in the vicinity of the target device, could be recognized as a control instruction"].

Sako in view of Wong does not teach the accessory is selected from the group consisting of: a stylus, a finger-mounted device, and a handheld wireless controller.

Short teaches the air gesture accessory is selected from the group consisting of: a stylus, a finger-mounted device, and a handheld wireless controller [¶0028, "In the example shown in FIG. 4, user input device 26 includes an infrared digital stylus 28 and an infrared camera 30 for detecting stylus 28 in workspace 12. Although any suitable user input device may be used, a digital stylus has the advantage of allowing input in three dimensions", ¶0043].

Before the application was filed it would have been obvious to one of ordinary skill in the art to incorporate the concept of holding a digital stylus when providing user input by an air gesture, as taught by Short, into the head-mounted device taught by Sako in view of Wong and Katz in order to generate three-dimensional gestures using a handheld device that is similar to a writing instrument, thereby making the movement corresponding to a gesture more intuitive.

Regarding Claim 4 (Original), Sako in view of Wong and Katz teaches the head-mounted device defined in Claim 1. Sako in view of Wong and Katz does not teach the image sensor comprises a three-dimensional image sensor.

Short teaches an image sensor that comprises a three-dimensional image sensor [¶0043, "As noted above, a touch-free infrared digital stylus has the advantage of allowing input in three dimensions, including along work surface 24, without a sensing pad or other special surface"].

Before the application was filed it would have been obvious to one of ordinary skill in the art to incorporate the concept of a 3D image sensor, as taught by Short, into the head-mounted device taught by Sako in view of Wong and Katz in order to recognize three-dimensional non-contact gestures.

Regarding Claim 5 (Original), Sako in view of Wong and Katz teaches the head-mounted device defined in Claim 1. Sako in view of Wong and Katz does not teach the image sensor comprises a depth sensor.

Short teaches an image sensor that comprises a depth sensor [¶0043, "… Depth cameras using structured light, time-of-flight, disturbed light pattern, or stereoscopic vision might also be used to enable in-air gesturing or limited touch and touch gesture detection without a touch pad"].

Before the application was filed it would have been obvious to one of ordinary skill in the art to incorporate the concept of a depth sensor, as taught by Short, into the head-mounted device taught by Sako in view of Wong and Katz in order to recognize three-dimensional non-contact gestures.

Claims 6-8 are rejected under 35 U.S.C. 103 as being unpatentable over Sako in view of Wong, Katz, and Lyons (US 2017/0255019). All reference is to Sako unless indicated otherwise.
Regarding Claim 6 (Previously Presented), Sako in view of Wong and Katz teaches the head-mounted device defined in Claim 1 further comprising a head-mounted housing that supports the inner and outer displays [¶0005, "Since the user may view an image both the front side and the back side of the device housing, a double-sided display device can be a good information providing tool"].

Sako in view of Wong and Katz does not teach the head-mounted housing configured to store a wireless handheld controller for the head-mounted device.

Lyons teaches a head-mounted housing [fig. 1 @10] configured to store a wireless handheld controller [fig. 1 @30] for the head-mounted device [¶0092].

Before the application was filed it would have been obvious to one of ordinary skill in the art to incorporate a wireless handheld controller removably attached to the HMD frame, as taught by Lyons, into the head-mounted device taught by Sako in view of Wong and Katz in order to provide a convenient device with distinctly shaped buttons allowing the user to select a button based on feel without looking (Lyons: ¶0092).

Regarding Claim 7 (Original), Sako in view of Wong, Katz, and Lyons teaches the head-mounted device defined in Claim 6 wherein the head-mounted housing [Lyons: fig. 17A @518] comprises a recess [Lyons: fig. 17A illustrates sidewall 518 built up near the 512 attachment point, where 512 fits in the recess of the built-up area] configured to receive the wireless handheld controller [Lyons: fig. 17A @512].

Regarding Claim 8 (Original), Sako in view of Wong, Katz, and Lyons teaches the head-mounted device defined in Claim 6 further comprising magnetic structures configured to couple the wireless handheld controller [Lyons: fig. 1 @30] to the head-mounted housing [Lyons: ¶0083, "the method of attachment can be accomplished by any of various other methods of attachment, such as … Velcro® magnetics"].

Claim 9 is rejected under 35 U.S.C.
103 as being unpatentable over Sako in view of Wong, Katz, Lyons, and Kim (US 2016/0224176). All reference is to Sako unless otherwise indicated.

Regarding Claim 9 (Original), Sako in view of Wong, Katz, and Lyons teaches the head-mounted device defined in Claim 8 wherein the wireless handheld controller [Lyons: fig. 17A @512] is coupled to the head-mounted housing. Sako in view of Wong, Katz, and Lyons does not teach the head-mounted device is configured to transfer wireless power to the wireless handheld controller.

Kim teaches the head-mounted device is configured to transfer wireless power [¶0280, "The PMIC may have a wired and/or wireless charging scheme … and an additional circuit for wireless charging, such as a coil loop circuit, a resonance circuit, a rectifier circuit, and the like may be added"].

Before the application was filed it would have been obvious to one of ordinary skill in the art to incorporate the concept of wireless charging, as taught by Kim, into the head-mounted device, taught by Sako in view of Wong, Katz, and Lyons, in order to provide a convenient method of recharging the device that does not require connecting or disconnecting electrical wiring.

Sako in view of Wong, Katz, Lyons, and Kim does not teach wirelessly charging the wireless handheld controller. One of ordinary skill in the art would understand the advantages of wirelessly recharging a portable remote control integrated into the HMD system, and how to incorporate the additional circuit required for wireless charging, taught by Kim, in order to wirelessly charge the main battery in the head-mounted display and the battery on the remote control device with a reasonable expectation of success.

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Sako in view of Wong, Katz, and Rochford (US 2017/0068500). All reference is to Sako unless indicated otherwise.
Regarding Claim 10 (Original), Sako in view of Wong and Katz teaches the head-mounted device defined in Claim 1. Sako in view of Wong and Katz does not teach the outer display is configured to receive user input for controlling external electrical equipment.

Rochford teaches the outer display [fig. 1 @110] is configured to receive user input [¶0018, "A touch sensitive device (TSD) 116 may be mounted to the headpiece 102 in a position that permits the person other than the user 104 to activate or otherwise use the TSD 116. While the TSD 116 is shown mounted to a front side of the external display 110"] for controlling [¶0038, "the controller 218 has sufficient processing power to execute a visualization program and provide all functionality of a system … the controller 218 may be communicatively coupled to an external device 206"] external electrical equipment [fig. 2 @206 via 218 and 204].

Before the application was filed it would have been obvious to one of ordinary skill in the art to incorporate the concept of controlling external electronic equipment by user input on the head-mounted device, as taught by Rochford, into the head-mounted display, taught by Sako in view of Wong and Katz, in order to wirelessly control functionality that cannot be incorporated into the head-mounted display form factor.

Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Sako in view of Lyons and Kim (US 2016/0224176).
All reference is to Sako unless indicated otherwise.

Regarding Claim 14 (Original), Sako in view of Lyons teaches the head-mounted device defined in Claim 13. Sako in view of Lyons does not teach the head-mounted device is configured to transfer wireless power to the head-mounted housing.

Kim teaches a head-mounted device is configured to transfer wireless power to the head-mounted housing [¶0280, "The PMIC may have a wired and/or wireless charging scheme … and an additional circuit for wireless charging, such as a coil loop circuit, a resonance circuit, a rectifier circuit, and the like may be added"].

Before the application was filed it would have been obvious to one of ordinary skill in the art to incorporate the concept of wirelessly charging a head-mounted device, as taught by Kim, into the head-mounted display, taught by Sako in view of Lyons, in order to conveniently recharge head-mounted device batteries without the need for a charging cable.

Sako in view of Lyons and Kim does not teach incorporating the additional circuitry necessary to wirelessly charge the wireless handheld controller when coupled. Before the application was filed one of ordinary skill in the art would have recognized the convenience of recharging the integrated wireless remote control while mounted on the HMD system. One of ordinary skill in the art would also understand how to incorporate the circuitry required for wireless charging, as taught by Sako in view of Lyons and Kim, in order to wirelessly charge the head-mounted display main battery as well as the battery powering a wireless remote control with a reasonable expectation of success.

Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Sako in view of Lyons and Rochford (US 2017/0068500). All reference is to Sako unless indicated otherwise.
Regarding Claim 15 (Original), Sako in view of Lyons teaches the head-mounted device defined in Claim 11, wherein user input is received from the input device [¶0142, "in response to an instruction from the user given via the input operating unit 502, the information to display as the external image may be changed"] while the inner display is off [Sako: Only External Image on State], and the user input device is a removable handheld electronic device [Lyons: fig. 1 @30]. Sako in view of Lyons does not teach wireless communications circuitry configured to receive the user input, where the input device is a removable handheld electronic device. Rochford teaches wireless communications circuitry [¶0038, "… the controller 218 may be communicatively coupled to an external device 206"; fig. 2 @206 via 218 and 204] configured to receive user input [touchscreen (fig. 1 @116 on external display 110)]. Before the application was filed, it would have been obvious to one of ordinary skill in the art to incorporate the concept of transmitting user input via wireless communications, as taught by Rochford, into the head-mounted device, taught by Sako in view of Lyons, in order to control electronic devices not connected to the head-mounted device with input from the user wearing the head-mounted device.

Claim 18 is rejected under 35 U.S.C. 103 as being unpatentable over Sako in view of Rochford and Wong. All reference is to Sako unless indicated otherwise.
Regarding Claim 18 (Original), Sako in view of Rochford teaches the head-mounted device defined in Claim 16. Sako in view of Rochford does not teach that the input device comprises an image sensor and the user input comprises air gesture input from at least one of: a hand and a wireless handheld controller. Wong teaches the input device comprises an image sensor [¶0049, "processor 22 may analyze still images or video images obtained by camera 26 to identify any gesture that corresponds to a control instruction"] and the user input comprises air gesture input from at least one of: a hand [¶0049] and a wireless handheld controller [alternate limitation not addressed]. Before the application was filed, it would have been obvious to one of ordinary skill in the art to incorporate the concept of detecting air gestures with an image sensor, as taught by Wong, into the head-mounted device, taught by Sako in view of Rochford, in order to control the head-mounted device without requiring a user to physically contact or manipulate an input device.

Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Sako in view of Rochford, Wong, and Short (US 2014/0139717). All reference is to Sako unless indicated otherwise.

Regarding Claim 19 (Original), Sako in view of Rochford and Wong teaches the head-mounted device defined in Claim 18. Sako in view of Rochford and Wong does not teach that the image sensor comprises an infrared three-dimensional image sensor. Short teaches an image sensor comprising an infrared three-dimensional image sensor [¶0043, "As noted above, a touch-free infrared digital stylus has the advantage of allowing input in three dimensions"]. Before the application was filed, it would have been obvious to one of ordinary skill in the art to incorporate the concept of an infrared three-dimensional image sensor, as taught by Short, into the head-mounted device, taught by Sako in view of Rochford and Wong, in order to receive user input in three dimensions (Short: ¶0028).
Conclusion

Any inquiry concerning this communication or earlier communications from the Examiner should be directed to Douglas Wilson, whose telephone number is (571) 272-5640. The Examiner can normally be reached 1000-1700 EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the Examiner by telephone are unsuccessful, the Examiner's supervisor, Patrick Edouard, can be reached at 571-272-7603. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Douglas Wilson/
Primary Examiner, Art Unit 2622

Prosecution Timeline

Apr 29, 2024
Application Filed
Jan 24, 2025
Non-Final Rejection — §103
Mar 14, 2025
Examiner Interview Summary
Mar 14, 2025
Applicant Interview (Telephonic)
Mar 31, 2025
Response Filed
Jun 03, 2025
Final Rejection — §103
Aug 05, 2025
Examiner Interview Summary
Aug 05, 2025
Applicant Interview (Telephonic)
Sep 05, 2025
Request for Continued Examination
Sep 09, 2025
Response after Non-Final Action
Sep 24, 2025
Non-Final Rejection — §103
Nov 21, 2025
Examiner Interview Summary
Nov 21, 2025
Applicant Interview (Telephonic)
Dec 02, 2025
Response Filed
Jan 29, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596431
VIRTUAL REALITY CONTENT DISPLAY SYSTEM AND VIRTUAL REALITY CONTENT DISPLAY METHOD
2y 5m to grant Granted Apr 07, 2026
Patent 12596279
ACTIVE MATRIX SUBSTRATE AND A LIQUID CRYSTAL DISPLAY
2y 5m to grant Granted Apr 07, 2026
Patent 12583317
INPUT DEVICE FOR A VEHICLE
2y 5m to grant Granted Mar 24, 2026
Patent 12585480
USE OF GAZE TECHNOLOGY FOR HIGHLIGHTING AND SELECTING DIFFERENT ITEMS ON A VEHICLE DISPLAY
2y 5m to grant Granted Mar 24, 2026
Patent 12579947
DISPLAY DEVICE
2y 5m to grant Granted Mar 17, 2026


Prosecution Projections

4-5
Expected OA Rounds
75%
Grant Probability
91%
With Interview (+16.1%)
2y 9m
Median Time to Grant
High
PTA Risk
Based on 427 resolved cases by this examiner. Grant probability derived from career allow rate.
