DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Drawings
The drawings were received on 12/11/2023. These drawings are acceptable.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-2, 4, and 8-10 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Zhang et al. US Patent 11,307,654 B1 (of record, see IDS dated 12/11/2023, hereinafter, “Zhang”).
Regarding independent claim 1, Zhang discloses a head-mounted device (refer to col. 1, lines 6-7, disclosing a head-mounted display, and see at least Fig. 1 showing a block diagram of a near-eye display system 120, also referred to as a head-mounted display, of artificial reality system environment 100, col. 7, lines 23-26, and see Fig. 2 depicting an example of a near-eye display system in the form of head-mounted display device 200, col. 14, lines 58-61) comprising:
a frame (Fig. 2, head-mounted display device 200 includes body 220, col. 14, lines 64-65, equivalent to a frame, and see Fig. 3 depicting another example of a near-eye display system 300 that includes frame 305, col. 15, lines 50-58);
a lens coupled to the frame (Fig. 3, near-eye display system 300 has display 310 in frame 305, col. 15, lines 56-58, where display 310 is equivalent to a lens when system 300 is in the form of a pair of glasses, and Fig. 4 depicts an optical see-through augmented reality system 400 that includes substrate 420, where substrate 420 may be in the form of a lens of a pair of eyeglasses, col. 16, line 35 to col. 17, line 12), the lens being configured to reflect infrared light from an interior side of the lens and pass visible light (Fig. 4, system 400 includes combiner 415 comprised of substrate 420, input coupler 430, and output coupler 440 that may transmit visible light, i.e., wavelengths from about 400 nm to 650 nm, and reflect infrared light, i.e., wavelengths from 800 nm to 1000 nm, col. 16, line 62 to col. 17, line 3, and Fig. 1, near-eye display system 120 may include eye-tracking system 130, col. 8, lines 3-5, and an example of eye illumination system 800 shown in Fig. 8 includes substrate 820 with shortwave-pass filter 830 on one surface of substrate 820, where filter 830 may reflect infrared light while allowing wavelengths shorter than infrared light to pass through, col. 23, lines 33-46, where Examiner understands wavelengths shorter than infrared light can refer to visible light, as Zhang defines visible light as wavelengths between 380 nm and 750 nm, col. 6, lines 23-24); and
a camera configured to capture the infrared light reflected from the interior side of the lens and to capture the visible light passing through the lens (Fig. 5 depicts an example eye-tracking system 510 that includes camera 514, which may include a complementary metal-oxide semiconductor (CMOS) pixel array to detect light having a wavelength less than about 750 nm, col. 19, lines 20-34, where Zhang also teaches a complementary metal-oxide semiconductor is a suitable infrared sensor, col. 6, lines 26-33, therefore Zhang discloses a camera configured to capture both infrared and visible light).
Regarding dependent claim 2, Zhang discloses the head-mounted device of claim 1, wherein the lens comprises a hot mirror (a shortwave-pass filter may include a hot mirror, col. 6, lines 6-8, where Zhang teaches a hot mirror reflects infrared light and allows visible light to pass through, col. 6, lines 34-48, and shortwave-pass filter 830 includes a hot mirror to reflect wavelengths greater than 750 nm, col. 24, lines 1-3).
Regarding dependent claim 4, Zhang discloses the head-mounted device of claim 1, further comprising a processor configured to determine a gaze of an eye based on the infrared light captured by the camera (Fig. 1, artificial reality system environment 100 includes near-eye display system 120 that is coupled to an input/output console 110 that includes a processor, col. 12, lines 58-60, and Zhang teaches infrared spectrum, e.g., wavelengths between 750-2500 nm, of the ambient light can be used to illuminate the user's eyes for eye tracking, col. 1, lines 40-43, refer also to col. 4, lines 47-53, and Zhang discloses an eye-tracking system in a display device may include an infrared camera, col. 2, lines 46-51, and tracking the eye may include tracking the position and/or shape of the pupil and/or the cornea of the eye, and determining the rotational position or gaze direction of the eye, col. 4, lines 59-62, and techniques such as centroiding algorithms may be used to determine the locations of the glints on the eye in the captured image, col. 5, lines 10-12, and the gaze direction of the eye may then be determined, col. 5, lines 14-18, where Examiner understands a centroiding algorithm requires a processor to run, and eye-tracking system 130 may be configured to estimate the orientation of the user's eye to determine the direction of the user's gaze within near-eye display system 120, col. 11, lines 3-6, and col. 12, lines 7-15, refer also to col. 19, line 35 to col. 20, line 58, where Zhang teaches the interpolation-based techniques may use certain mapping functions to map eye features to the gaze direction).
Regarding dependent claim 8, Zhang discloses the head-mounted device of claim 1, further comprising an infrared light source configured to reflect infrared light off of the lens and onto an eye of a user wearing the head-mounted device (Zhang teaches infrared spectrum, e.g., wavelengths between 750-2500 nm, of the ambient light can be used to illuminate the user's eyes for eye tracking, col. 1, lines 40-43, refer also to col. 4, lines 47-53, and Zhang discloses an eye-tracking system in a display device may include an infrared camera, col. 2, lines 46-51).
Regarding dependent claim 9, Zhang discloses the head-mounted device of claim 1, wherein the interior side of the lens includes a concave shape (near-eye display system 120 includes display optics 124, where optics 124 may include a concave lens, col. 8, lines 46-52).
Regarding dependent claim 10, Zhang discloses the head-mounted device of claim 1, wherein the head-mounted device does not include an external camera on the frame (Fig. 2, head-mounted display device 200 does not have an external camera on body 220, col. 14, lines 58-67).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Zhang as applied to claim 1 above, in view of Miao et al. US Patent 8,767,306 B1 (hereinafter, “Miao”).
Regarding dependent claim 3, Zhang discloses the head-mounted device of claim 1, wherein the system comprises a filter array with alternating infrared-pass filters and visible-pass filters, the infrared-pass filters passing infrared light and blocking visible light, the visible-pass filters passing visible light and blocking infrared light (near-eye display system may include a shortwave-pass filter with an array of windows or apertures which may allow infrared light to pass through, and the filter may allow visible wavelengths in the ambient light to pass through at any area of the filter and may only allow infrared light in the ambient light to pass through the array, col. 23, lines 18-32, and Zhang discloses combiner 415 may transmit at least 50% of light in a first wavelength range, such as visible light, and reflect at least 25% of light in a second wavelength range, such as infrared light, col. 16, lines 64-66, therefore Zhang discloses a filter array that can pass and block both visible and infrared light).
Zhang does not explicitly disclose the camera comprises a filter array, because combiner 415 comprises substrate 420 which may be in the form of a lens of a pair of eyeglasses (col. 16 line 35 to col. 17 line 12).
In a related field of invention, Miao discloses a display system with an infrared camera 202 that may filter and detect infrared light (refer to at least col. 8, lines 23-27 and claim 9 thereof). Therefore, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to have applied the teachings of Miao to the disclosure of Zhang and included a filter array, such as that of camera 202, to optimize the infrared light received by the camera for imaging (Miao, col. 8, lines 16-36).
Claims 5-7 are rejected under 35 U.S.C. 103 as being unpatentable over Zhang as applied to claim 1 above, in view of Chan et al. US PGPub 2023/0210365 A1 (hereinafter, “Chan”).
Regarding dependent claim 5, Zhang discloses the head-mounted device of claim 1, further comprising a processor (Fig. 1, artificial reality system environment 100 includes near-eye display system 120 that is coupled to an input/output console 110 that includes a processor, col. 12, lines 58-60) configured to determine an orientation of the head-mounted device (near-eye display system 120 may include an inertial measurement unit 132, col. 8, lines 3-8, where an inertial measurement unit combines an accelerometer and a gyroscope, col. 30, lines 39-48, thus allowing the inertial measurement unit to determine orientation).
Zhang does not disclose the processor determines orientation of the head-mounted device based on the visible light captured by the camera (because Zhang discloses the determination of orientation by the inertial measurement unit 132).
In the same field of invention, Chan discloses a wearable system 200, shown in at least Fig. 2 thereof, which includes an outward-facing imaging system 464, shown in at least Fig. 4 thereof (refer to at least par. [0072] thereof), that images a portion of the world 470 (par. [0107] thereof). Chan discloses images obtained from outward-facing imaging system 464 can be used to detect objects in world 470 (pars. [0082], [0107]), and further discloses objects present in the environment may be detected by computer vision techniques so that the display system may analyze the images acquired by the imaging system to perform motion estimation (par. [0328] thereof). Therefore, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to have applied the teachings of Chan to the disclosure of Zhang and included an outward-facing imaging system, such as system 464, in the near-eye display system of Zhang to detect and determine properties of objects in the ambient environment detected by sensors and cameras (Chan, pars. [0328]-[0330]). As a result, the prior art combination teaches and renders obvious the use of images captured by an outward-facing imaging system to determine motion of the near-eye display system through the use of images of the ambient environment in which the near-eye display system is functioning.
Regarding dependent claim 6, Zhang discloses the head-mounted device of claim 1, further comprising a processor configured to: determine a gaze of an eye based on the infrared light captured by the camera (Fig. 1, artificial reality system environment 100 includes near-eye display system 120 that is coupled to an input/output console 110 that includes a processor, col. 12, lines 58-60, and Zhang teaches the infrared portion of the ambient light can be used to illuminate the user's eyes for eye tracking, col. 1, lines 40-43, refer also to col. 4, lines 47-53, and Zhang discloses an eye-tracking system in a display device may include an infrared camera, col. 2, lines 46-51, and tracking the eye may include tracking the position and/or shape of the pupil and/or the cornea of the eye, and determining the rotational position or gaze direction of the eye, col. 4, lines 59-62); and determine an orientation of the head-mounted device (near-eye display system 120 may include an inertial measurement unit 132, col. 8, lines 3-8, where an inertial measurement unit combines an accelerometer and a gyroscope, col. 30, lines 39-48, thus allowing the inertial measurement unit to determine orientation).
Zhang does not disclose the processor determines orientation of the head-mounted device based on the visible light captured by the camera (because Zhang discloses the determination of orientation by the inertial measurement unit 132).
In the same field of invention, Chan discloses a wearable system 200, shown in at least Fig. 2 thereof, which includes an outward-facing imaging system 464, shown in at least Fig. 4 thereof (refer to at least par. [0072] thereof), that images a portion of the world 470 (par. [0107] thereof). Chan discloses images obtained from outward-facing imaging system 464 can be used to detect objects in world 470 (pars. [0082], [0107]), and further discloses objects present in the environment may be detected by computer vision techniques so that the display system may analyze the images acquired by the imaging system to perform motion estimation (par. [0328] thereof).
Therefore, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to have applied the teachings of Chan to the disclosure of Zhang and included an outward-facing imaging system, such as system 464, in the near-eye display system of Zhang to detect and determine properties of objects in the ambient environment detected by sensors and cameras (Chan, pars. [0328]-[0330]). As a result, the prior art combination teaches and renders obvious the use of images captured by an outward-facing imaging system to determine motion of the near-eye display system through the use of images of the ambient environment in which the near-eye display system is functioning.
Regarding dependent claim 7, Zhang discloses the head-mounted device of claim 1, further comprising: an inertial measurement unit (Fig. 1, near-eye display system 120 includes an inertial measurement unit 132, col. 8, lines 3-8), wherein the head-mounted device is configured to determine motion of the head-mounted device based on data detected by the inertial measurement unit (Fig. 1, console 110 may identify locators 126 in images captured by external imaging device 150 to determine the artificial reality headset's position, orientation, or both, col. 9, lines 19-22, where Examiner understands a time rate-of-change of position and/or orientation determines the motion of the device).
Zhang does not disclose the head-mounted device is configured to determine motion of the head-mounted device based on the visible light.
In the same field of invention, Chan discloses a wearable system 200, shown in at least Fig. 2 thereof, which includes an outward-facing imaging system 464, shown in at least Fig. 4 thereof (refer to at least par. [0072] thereof), that images a portion of the world 470 (par. [0107] thereof). Chan discloses images obtained from outward-facing imaging system 464 can be used to detect objects in world 470 (pars. [0082], [0107]), and further discloses objects present in the environment may be detected by computer vision techniques so that the display system may analyze the images acquired by the imaging system to perform motion estimation (par. [0328] thereof). Therefore, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to have applied the teachings of Chan to the disclosure of Zhang and included an outward-facing imaging system, such as system 464, in the near-eye display system of Zhang to detect and determine properties of objects in the ambient environment detected by sensors and cameras (Chan, pars. [0328]-[0330]). As a result, the prior art combination teaches and renders obvious the use of images captured by an outward-facing imaging system to determine motion of the near-eye display system through the use of images of the ambient environment in which the near-eye display system is functioning.
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Zhang as applied to claim 1 above, in view of Wheelwright US Patent 11,619,808 B1 (hereinafter, “Wheelwright”).
Regarding dependent claim 11, Zhang discloses the head-mounted device of claim 1, wherein the camera is configured to adjust a focus distance from a first distance to a second distance (calibration data may be communicated from external imaging device 150 to console 110, and external imaging device 150 may receive one or more calibration parameters from console 110 to adjust one or more imaging parameters, e.g., focal length and/or focus, col. 9, lines 51-56).
Zhang does not disclose the camera is configured to adjust from a first focus distance while capturing infrared light to a second focus distance while capturing visible light, and does not disclose the second distance is greater than the first distance.
In a related field of invention, Wheelwright discloses a head-mounted display 100, shown in at least Fig. 1 thereof, where example optical assembly 830, shown in Figs. 8A-8C, provides a color-selective effective focal length, since the effective focal length of optical assembly 830 is dependent on the wavelength of light propagating through the optical assembly (col. 9, lines 25-29 thereof). Wheelwright teaches a longer focal length of 50 mm for rays 845 of first display light and a shorter focal length of 25 mm for rays 847 of second display light (col. 9, lines 4-11 and col. 11, lines 23-54 thereof). It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to have applied the teachings of Wheelwright to the disclosure of Zhang and provided the near-eye display system of Zhang with a color-selective effective focal length to optimally focus the various wavelengths of light reaching the camera (Wheelwright, col. 9, lines 12-29) and to provide increased optical efficiency with less electrical power required (Wheelwright, col. 2, lines 42-45). As a result, the prior art combination teaches and renders obvious the limitation of adjusting from a first focus distance while capturing infrared light to a second focus distance while capturing visible light, because Zhang teaches the use of an infrared camera for eye-tracking (refer to at least claim 15 of Zhang) and Wheelwright teaches an optical assembly with a color-selective (i.e., wavelength-dependent) focal length (refer to at least col. 2 of Wheelwright). The prior art combination therefore teaches a shorter focal length for a camera focusing on a user's eye illuminated in infrared light and a longer focal length for a camera focusing on objects in the environment illuminated by visible light, such that the limitation “the second distance is greater than the first distance” is also rendered obvious by the prior art combination.
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Zhang as applied to claim 1, in view of Yamasaki US PGPub 2005/0174470 A1 (hereinafter, “Yamasaki”).
Regarding dependent claim 12, Zhang discloses the head-mounted device of claim 1, but Zhang does not disclose wherein the head-mounted device further comprises a temple arm hingedly coupled to the frame (Fig. 3, example near-eye display system 300 is in the form of a pair of glasses with a temple arm coupled to frame 305, but no hinge is depicted or disclosed), and Zhang does not disclose the camera is coupled to the temple arm.
In a related field of invention, Yamasaki discloses a head-mounted camera, shown in at least Figs. 2, 3, and 4, where head-mounted unit 2 includes hinges 24 and 25 to allow temples 12 to fold against front portion 11 (par. [0081] thereof), and head-mounted unit 2 includes a first image pickup device 30 and a first photographing optical system 31 that are attached at temple 12 (par. [0082] thereof). It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to have applied the teachings of Yamasaki to the disclosure of Zhang and secured a camera to the temple of the near-eye display device disclosed by Zhang, to provide a head-mounted camera which allows a user to easily perform a photographing operation without feeling bothered (Yamasaki, par. [0011]).
Claims 13-15 and 17-19 are rejected under 35 U.S.C. 103 as being unpatentable over Zhang in view of Chan.
Regarding independent claim 13, Zhang discloses a method performed by a head-mounted device, the method comprising:
capturing, by a camera included in the head-mounted device, infrared light reflected off of a lens included in the head-mounted device, the infrared light including an image of an eye (Fig. 6, flow chart 600 lists operations performed by eye-tracking system 130 or eye-tracking system 510, col. 20, lines 59-63, such that at block 610 one or more light sources may illuminate the user’s eye, col. 20, lines 64-65, and at block 620, an imaging device, e.g., a camera, may collect light reflected by the user's eye and generate one or more images of the user's eye, col. 21 lines 5-7, and Zhang discloses an infrared spectrum of ambient light can be used to illuminate the user’s eyes for eye tracking, col. 23, lines 18-20);
determining, by a processor included in the head-mounted device based on the image of the eye, a direction of a gaze of the eye (Fig. 6, at block 650, the position of the cornea of the user's eye may be determined based on the locations of the glints in the captured image of the user's eye as shown in Fig. 7A, col. 21, lines 39-41, and at block 670, the gaze direction of the user's eye may be determined, col. 22, lines 22-24, and Zhang discloses in Fig. 1 artificial reality system environment 100 with near-eye display system 120 coupled to an input/output console 110 that includes a processor, col. 12, lines 58-60, and Zhang discloses modules of console 110 described in conjunction with Fig. 1 may be encoded as instructions in a non-transitory computer-readable storage medium that, when executed by the processor, cause the processor to perform the functions described, col. 12, line 58 to col. 13, line 3);
capturing, by a camera included in the head-mounted device, visible light passing through the lens included in the head-mounted device (Fig. 5 depicts an example eye-tracking system 510 that includes camera 514 which may include a complementary metal-oxide semiconductor (CMOS) pixel array to detect light having a wavelength less than about 750 nm, i.e., visible light, col. 19, lines 20-34); and
determining motion of the head-mounted device (Zhang discloses a position sensor may generate one or more measurement signals in response to motion of the HMD device, col. 30 lines 53-55, and position sensors 128 may generate one or more measurement signals in response to motion of near-eye display system 120, col. 9, lines 57-59, and HMD device 200 may include motion sensors, col. 15, lines 30-32, and near-eye display system 300 may include sensors 350 that may include motion sensors, col. 15, line 65 to col. 16 line 2, thus Zhang discloses methods for determining motion of the head-mounted display device disclosed therein).
Zhang does not explicitly disclose the visible light including an image of an object beyond the lens included in the head-mounted device, nor does Zhang disclose determining motion of the head-mounted device based on the image of the object (while Zhang does disclose cameras in the near-eye display system, the cameras are not disclosed as being used to explicitly image objects for the goal of determining motion, as Zhang discloses motion sensors specifically for motion determination).
In the same field of invention, Chan discloses a wearable system 200, shown in at least Fig. 2 thereof, which includes an outward-facing imaging system 464, shown in at least Fig. 4 thereof (refer to at least par. [0072] thereof), that images a portion of the world 470 (par. [0107] thereof). Chan discloses images obtained from outward-facing imaging system 464 can be used to detect objects in world 470 (pars. [0082], [0107]), and further discloses objects present in the environment may be detected by computer vision techniques so that the display system may analyze the images acquired by the imaging system to perform motion estimation (par. [0328] thereof). Therefore, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to have applied the teachings of Chan to the disclosure of Zhang and included an outward-facing imaging system, such as system 464, in the near-eye display system of Zhang to detect and determine properties of objects in the ambient environment detected by sensors and cameras (Chan, pars. [0328]-[0330]). As a result, the prior art combination teaches and renders obvious the use of images captured by an outward-facing imaging system to determine motion of the near-eye display system through the use of images of the ambient environment in which the near-eye display system is functioning.
Regarding dependent claim 14, Zhang in view of Chan (hereinafter, “modified Zhang”) discloses the method of claim 13, and Zhang discloses the method further comprising transmitting the infrared light onto the lens (Fig. 4, optical see-through augmented reality system 400 has optical combiner 415 that transmits visible light and reflects infrared light, col. 16, lines 64-66).
Regarding dependent claim 15, modified Zhang discloses the method of claim 13, and Zhang further discloses the method wherein the determining motion of the head-mounted device is based on the inertial measurement data detected by an inertial measurement unit included in the head-mounted device (Zhang, near-eye display system 120 may include an inertial measurement unit 132, col. 8, lines 3-8, where an inertial measurement unit combines an accelerometer and a gyroscope, col. 30, lines 39-48, thus allowing the inertial measurement unit to determine motion and orientation of the device), and Chan discloses a wearable system 200, shown in at least Fig. 2 thereof, which includes an outward-facing imaging system 464, shown in at least Fig. 4 thereof (refer to at least par. [0072] thereof), that images a portion of the world 470 (par. [0107] thereof). Chan discloses images obtained from outward-facing imaging system 464 can be used to detect objects in world 470 (pars. [0082], [0107]), and further discloses objects present in the environment may be detected by computer vision techniques so that the display system may analyze the images acquired by the imaging system to perform motion estimation (par. [0328] thereof). Therefore, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to have applied the teachings of Chan to the disclosure of Zhang and included an outward-facing imaging system, such as system 464, in the near-eye display system of Zhang to detect and determine properties of objects in the ambient environment detected by sensors and cameras (Chan, pars. [0328]-[0330]). As a result, the prior art combination teaches and renders obvious the use of images captured by an outward-facing imaging system to determine motion of the near-eye display system through the use of images of the ambient environment in which the near-eye display system is functioning.
Regarding independent claim 17, Zhang discloses a non-transitory computer-readable storage medium comprising instructions stored thereon (Fig. 1, console 110 may include a processor and a non-transitory computer-readable storage medium storing instructions executable by the processor, col. 12, lines 58-60) that, when executed by at least one processor, are configured to cause a head-mounted device to:
capture, by a camera included in the head-mounted device, infrared light reflected off of a lens included in the head-mounted device, the infrared light including an image of an eye (Fig. 5 depicts an example eye-tracking system 510 that includes camera 514 which may include a complementary metal-oxide semiconductor pixel array, col. 19, lines 20-34, where Zhang teaches a complementary metal-oxide semiconductor is a suitable infrared sensor, col. 6, lines 26-33, and Zhang teaches infrared spectrum, e.g., wavelengths between 750-2500 nm, of the ambient light can be used to illuminate the user's eyes for eye tracking, col. 1, lines 40-43, refer also to col. 4, lines 47-53, and Zhang discloses an eye-tracking system in a display device may include an infrared camera, col. 2, lines 46-51);
determine, based on the image of the eye, a direction of a gaze of the eye (Zhang discloses an eye-tracking system in a display device may include an infrared camera, col. 2, lines 46-51, and tracking the eye may include tracking the position and/or shape of the pupil and/or the cornea of the eye, and determining the rotational position or gaze direction of the eye, col. 4, lines 59-62, and techniques such as centroiding algorithms may be used to determine the locations of the glints on the eye in the captured image, col. 5, lines 10-12, and the gaze direction of the eye may then be determined, col. 5, lines 14-18);
capture, by the camera included in the head-mounted device, visible light passing through the lens included in the head-mounted device (Fig. 5 depicts an example eye-tracking system 510 that includes camera 514 which may include a complementary metal-oxide semiconductor (CMOS) pixel array to detect light having a wavelength less than about 750 nm, col. 19, lines 20-34); and
determine motion of the head-mounted device (Fig. 2, head-mounted display device 200 may include motion sensors, col. 15, lines 30-32, and Fig. 3, near-eye display system 300 may include various sensors 350 including motion sensors, col. 15 line 65 to col. 16 line 2).
Zhang does not explicitly disclose the visible light including an image of an object beyond the lens included in the head-mounted device, nor does Zhang disclose the system can determine motion of the head-mounted device based on the image of the object (while Zhang does disclose cameras in the near-eye display system, the cameras are not disclosed as being used to explicitly image objects for the goal of determining motion, as Zhang discloses motion sensors specifically for motion determination).
In the same field of invention, Chan discloses a wearable system 200, shown in at least Fig. 2 thereof, which includes an outward-facing imaging system 464, shown in at least Fig. 4 thereof (refer to at least par. [0072] thereof), that images a portion of the world 470 (par. [0107] thereof). Chan discloses images obtained from outward-facing imaging system 464 can be used to detect objects in world 470 (pars. [0082], [0107]), and further discloses objects present in the environment may be detected by computer vision techniques so that the display system may analyze the images acquired by the imaging system to perform motion estimation (par. [0328] thereof). Therefore, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to have applied the teachings of Chan to the disclosure of Zhang and included an outward-facing imaging system, such as system 464, in the near-eye display system of Zhang to detect and determine properties of objects in the ambient environment detected by sensors and cameras (Chan, pars. [0328]-[0330]). As a result, the prior art combination teaches and renders obvious the use of images captured by an outward-facing imaging system to determine motion of the near-eye display system through the use of images of the ambient environment in which the near-eye display system is functioning.
Regarding dependent claim 18, modified Zhang discloses the non-transitory computer-readable storage medium of claim 17, and Zhang further discloses wherein the instructions are further configured to cause the head-mounted device to transmit the infrared light onto the lens (Zhang Fig. 4, optical see-through augmented reality system 400 has optical combiner 415 that transmits infrared and visible light, col. 16, lines 64-66).
Regarding dependent claim 19, modified Zhang discloses the non-transitory computer-readable storage medium of claim 17, and Zhang further discloses wherein the determining motion of the head-mounted device is based on the inertial measurement data detected by an inertial measurement unit included in the head-mounted device (Zhang near-eye display system 120 may include an inertial measurement unit 132, col. 8, lines 3-8, where an inertial measurement unit combines an accelerometer and a gyroscope, col. 30, lines 39-48, thus allowing the inertial measurement unit to determine orientation). Further, Chan discloses a wearable system 200, shown in at least Fig. 2 thereof, which includes an outward-facing imaging system 464, shown in at least Fig. 4 thereof (refer to at least par. [0072] thereof), that images a portion of the world 470 (par. [0107] thereof). Chan discloses that images obtained from outward-facing imaging system 464 can be used to detect objects in world 470 (pars. [0082], [0107]), and further discloses that objects present in the environment may be detected by computer vision techniques so that the display system may analyze the images acquired by the imaging system to perform motion estimation (par. [0328] thereof). Therefore, it would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to have applied the teachings of Chan to the disclosure of Zhang and included an outward-facing imaging system, such as system 464, in the near-eye display system of Zhang to detect and determine properties of objects detected by sensors and cameras in the ambient environment (Chan, pars. [0328]-[0330]). As a result, the prior art combination teaches and renders obvious the use of images captured by an outward-facing imaging system to determine motion of the near-eye display system through the use of images of the ambient environment in which the near-eye display system operates.
Claims 16 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Zhang in view of Chan as applied to claims 13 and 17, respectively, above, and further in view of Wheelwright.
Regarding dependent claim 16, modified Zhang discloses the method of claim 13, and Zhang discloses the method further comprising adjusting a focus distance of the camera from a first distance to a second distance (calibration data may be communicated from external imaging device 150 to console 110, and external imaging device 150 may receive one or more calibration parameters from console 110 to adjust one or more imaging parameters, e.g., focal length and/or focus, col. 9, lines 51-56).
Zhang does not disclose that the camera is configured to adjust from a first focus distance while capturing infrared light to a second focus distance while capturing visible light, and does not disclose that the second distance is greater than the first distance.
In a related field of invention, Wheelwright discloses a head-mounted display 100, shown in at least Fig. 1 thereof, where example optical assembly 830, shown in Figs. 8A-8C, provides a color-selective effective focal length, since the effective focal length of optical assembly 830 is dependent on the wavelength of light propagating through the optical assembly (col. 9, lines 25-29 thereof). Wheelwright teaches a longer focal length of 50 mm for rays 845 of first display light and a shorter focal length of 25 mm for rays 847 of second display light (col. 9, lines 4-11 and col. 11, lines 23-54 thereof). It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to have applied the teachings of Wheelwright to the disclosure of Zhang and provided the near-eye display system of Zhang with a color-selective effective focal length to optimally focus the various wavelengths of light reaching the camera (Wheelwright, col. 9, lines 12-29) and to provide increased optical efficiency with less electrical power required (Wheelwright, col. 2, lines 42-45). As a result, the prior art combination teaches and renders obvious the limitation of adjusting a first focus distance while capturing infrared light to a second focus distance while capturing visible light, because Zhang teaches the use of an infrared camera for eye-tracking (refer to at least claim 15 of Zhang) and Wheelwright teaches an optical assembly with color-selective (i.e., wavelength-dependent) focal lengths (refer to at least col. 2 of Wheelwright). The prior art combination therefore teaches a shorter focal length for a camera focusing on a user's eye illuminated with infrared light and a longer focal length for a camera focusing on objects in the environment illuminated by visible light, such that the limitation "the second distance is greater than the first distance" is also rendered obvious by the prior art combination.
Regarding dependent claim 20, modified Zhang discloses the non-transitory computer-readable storage medium of claim 17, and Zhang discloses wherein the instructions are further configured to cause the head-mounted device to adjust a focus distance of the camera from a first distance to a second distance (calibration data may be communicated from external imaging device 150 to console 110, and external imaging device 150 may receive one or more calibration parameters from console 110 to adjust one or more imaging parameters, e.g., focal length and/or focus, col. 9, lines 51-56).
Zhang does not disclose that the camera is configured to adjust from a first focus distance while capturing infrared light to a second focus distance while capturing visible light, and does not disclose that the second distance is greater than the first distance.
In a related field of invention, Wheelwright discloses a head-mounted display 100, shown in at least Fig. 1 thereof, where example optical assembly 830, shown in Figs. 8A-8C, provides a color-selective effective focal length, since the effective focal length of optical assembly 830 is dependent on the wavelength of light propagating through the optical assembly (col. 9, lines 25-29 thereof). Wheelwright teaches a longer focal length of 50 mm for rays 845 of first display light and a shorter focal length of 25 mm for rays 847 of second display light (col. 9, lines 4-11 and col. 11, lines 23-54 thereof). It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to have applied the teachings of Wheelwright to the disclosure of Zhang and provided the near-eye display system of Zhang with a color-selective effective focal length to optimally focus the various wavelengths of light reaching the camera (Wheelwright, col. 9, lines 12-29) and to provide increased optical efficiency with less electrical power required (Wheelwright, col. 2, lines 42-45). As a result, the prior art combination teaches and renders obvious the limitation of adjusting a first focus distance while capturing infrared light to a second focus distance while capturing visible light, because Zhang teaches the use of an infrared camera for eye-tracking (refer to at least claim 15 of Zhang) and Wheelwright teaches an optical assembly with color-selective (i.e., wavelength-dependent) focal lengths (refer to at least col. 2 of Wheelwright). The prior art combination therefore teaches a shorter focal length for a camera focusing on a user's eye illuminated with infrared light and a longer focal length for a camera focusing on objects in the environment illuminated by visible light, such that the limitation "the second distance is greater than the first distance" is also rendered obvious by the prior art combination.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Lanman, US PGPub 2019/0361249 A1, discloses a head-mounted display with eye tracking via infrared light, among other features, and thus discloses many of the limitations of at least the independent claims of the instant application.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Justin W Hustoft whose telephone number is (571)272-4519. The examiner can normally be reached Monday - Friday 8:30 AM - 5:30 PM Eastern Time.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thomas Pham can be reached at (571)272-3689. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JUSTIN W. HUSTOFT/Examiner, Art Unit 2872
/THOMAS K PHAM/Supervisory Patent Examiner, Art Unit 2872