Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments filed 12/01/2025 have been fully considered, but they are not persuasive. Regarding claim 1, Applicant argues (pg. 13 of the Remarks) that “Wu at most discloses that the tracking controller 130 controls the alternate activation of two cameras and their corresponding infrared light sources, which does not involve the sending of image capture instructions, let alone the transmission mode of image capture instructions (specifically, simultaneous transmission or separate transmission at different times)” and therefore fails to teach the amended limitations. Examiner respectfully disagrees. Wu teaches (¶0067) that the tracking controller 130 is configured to turn the left infrared light source 121 and the right infrared light source 122 on and off, so that the two sources are alternately turned on in sequence according to odd and even numbers, and to control the left tracking camera 111 and the right tracking camera 112 to correspondingly shoot (i.e., image acquisition) the turned-on left infrared light source 121 or right infrared light source 122 to form turned-on odd-frame tracking images and turned-on even-frame tracking images (i.e., acquisition separately at different times). Wu further teaches (¶0074 and claim 12) that modules or operations of the embodiments of the disclosure may be implemented with program codes (i.e., instructions) executable by a computing device.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-4 and 8-11 are rejected under 35 U.S.C. 103 as being unpatentable over Wu (US 20220394168).
Regarding claim 1, “A method of acquiring an eye image, comprising” Wu teaches (¶0043) an eyeball tracking system/method based on turning light sources on and off; (¶0064, ¶0074, claim 12, and Fig. 2) the system is implemented with program codes stored in memory and executable by a computing device.
As to “sending an image acquisition instruction to a first camera and a second camera, wherein the image acquisition instruction comprises an acquisition time, an acquisition time of the first camera and an acquisition time of the second camera are different; and a sending way of the image acquisition instruction comprises: sending the image acquisition instruction to the first camera and the second camera simultaneously, or sending the image acquisition instruction to the first camera and the second camera separately at different times; according to the acquisition time of the first camera, empowering at least one first infrared light source and controlling the first camera to acquire a first eye image; and according to the acquisition time of the second camera, empowering at least one second infrared light source and controlling the second camera to acquire a second eye image.” Wu teaches (¶0047, ¶0051, and Fig. 1) that a left tracking camera is set to correspond to a left infrared light source and a right tracking camera is set to correspond to a right infrared light source; (¶0048) the left infrared light source and the right infrared light source are alternately turned on in sequence according to odd and even numbers, and the left tracking camera and the right tracking camera are controlled to correspondingly shoot the turned-on left infrared light source or the turned-on right infrared light source to form turned-on odd-frame tracking images and turned-on even-frame tracking images; (¶0067) the tracking controller 130 is configured to turn the left infrared light source 121 and the right infrared light source 122 on and off, so that the two sources are alternately turned on in sequence according to odd and even numbers, and to control the left tracking camera 111 and the right tracking camera 112 to correspondingly shoot (i.e., image acquisition) the turned-on left infrared light source 121 or right infrared light source 122 to form turned-on odd-frame tracking images and turned-on even-frame tracking images (i.e., acquisition separately at different times). Examiner note: this process of turning the sources on and off inherently involves an acquisition time, as further demonstrated by Wu's use of the terms shooting frame rate/frequency (see ¶0014 and ¶0015), which are time-dependent quantities that specify how often events occur, i.e., events spaced in time in a predictable way. A delay is also inherent in any camera system between the moment a processor/controller/host instructs a camera to acquire an image and the moment the camera actually captures it (i.e., shutter lag / trigger-to-capture latency; see Wu ¶0032). Wu's alternate turning on and off of the light sources and alternate shooting of the images indicates that the acquisition time and delay occur at different times for each camera. Wu further teaches (¶0056, ¶0058, and ¶0063) that the sources are alternately turned on (i.e., alternate capture) to solve/avoid the problem of mutual interference; (¶0074 and claim 12) modules or operations of the embodiments of the disclosure may be implemented with program codes (i.e., instructions) executable by a computing device. Examiner holds that these teachings read on the claimed limitations.
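Examiner note (illustrative only): the alternating odd/even activation scheme Wu describes at ¶0048 and ¶0067 can be sketched in the following pseudo-code. The code is not drawn from Wu's disclosure; all names are hypothetical and the sketch serves only to show that alternate activation implies acquisition at different times for each camera.

```python
# Hypothetical sketch of Wu's alternate activation (names not from Wu):
# on odd frames the left IR source is on and the left camera shoots;
# on even frames the right IR source is on and the right camera shoots.
def shoot_frame(frame_number: int) -> str:
    """Return which eye is captured on this frame (odd -> left, even -> right)."""
    if frame_number % 2 == 1:   # odd frame: left IR source on, right off
        return "left"
    else:                       # even frame: right IR source on, left off
        return "right"

# Frames 1..4 alternate left/right, so the two cameras acquire at different
# times and the two light sources never interfere within a single frame.
sequence = [shoot_frame(n) for n in range(1, 5)]
print(sequence)  # ['left', 'right', 'left', 'right']
```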
Regarding claim 2, “The method of claim 1, wherein sending the image acquisition instruction to the first camera and the second camera comprises: sending a first image acquisition instruction to the first camera and the second camera in parallel, wherein the first image acquisition instruction comprises a first acquisition time of the first camera and a second acquisition time of the second camera, and the first acquisition time of the first camera and the second acquisition time of the second camera are different.” Wu teaches (¶0051, ¶0057-¶0058) that the left infrared light source corresponding to the left tracking camera is turned on while the right infrared light source corresponding to the right tracking camera is turned off (i.e., control of the two sides happens in parallel); the left infrared light source and the right infrared light source are alternately turned on in sequence according to odd and even numbers, and the left tracking camera and the right tracking camera are controlled to correspondingly shoot the turned-on left infrared light source or the turned-on right infrared light source to form turned-on odd-frame tracking images and turned-on even-frame tracking images. Thus, only one infrared light source is turned on in each frame, and the problem of mutual interference between the two light sources is avoided. Examiner note: this process of turning the sources on and off inherently involves an acquisition time, as further demonstrated by Wu's use of the terms shooting frame rate/frequency (see ¶0014 and ¶0015), which are time-dependent quantities that specify how often events occur, i.e., events spaced in time in a predictable way. A delay is also inherent in any camera system between the moment a processor/controller/host instructs a camera to acquire an image and the moment the camera actually captures it (i.e., shutter lag / trigger-to-capture latency; see Wu ¶0032). Wu's alternate turning on and off of the light sources and alternate shooting of the images indicates that the acquisition time and delay occur at different times for each camera. Wu further teaches (¶0056, ¶0058, and ¶0063) that the sources are alternately turned on (i.e., alternate capture) to solve/avoid the problem of mutual interference.
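Examiner note (illustrative only): the claimed parallel-sending arrangement, as mapped above onto Wu's alternate odd/even activation, can be sketched as follows. The sketch is hypothetical (names and the frame period are not drawn from Wu); it merely shows one instruction carrying two distinct acquisition times being delivered to both cameras in parallel.

```python
# Hypothetical sketch: a single instruction sent to both cameras in parallel
# carries two different acquisition times, one per camera, so the cameras
# expose at different moments (cf. Wu's odd vs. even frames).
def build_instruction(start_ms: int, frame_period_ms: int) -> dict:
    """Build one acquisition instruction holding both cameras' times."""
    return {
        "first_camera_time_ms": start_ms,                     # e.g., odd frame
        "second_camera_time_ms": start_ms + frame_period_ms,  # e.g., even frame
    }

instruction = build_instruction(start_ms=0, frame_period_ms=10)
# Both cameras receive the same instruction, but their acquisition times differ.
assert instruction["first_camera_time_ms"] != instruction["second_camera_time_ms"]
```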
Regarding claim 3, “The method of claim 1, wherein sending the image acquisition instruction to the first camera and the second camera comprises: according to a predetermined instruction sending rule, sending a second image acquisition instruction to the first camera and sending a third image acquisition instruction to the second camera, wherein the second image acquisition instruction and the third image acquisition instruction carry the same acquisition time.” Wu teaches (¶0058) that the process is repeated for multiple frames: tracking is performed on the left eye on odd frames (for example, the 1st and 3rd frames) and on the right eye on even frames (for example, the 2nd and 4th frames); (¶0014-¶0015) a shooting frame rate/frequency; (¶0010 and ¶0032) the sources are alternately turned on/off in sequence; (¶0056) in a process in which the left infrared light source and the right infrared light source are simultaneously turned on, and the left infrared light source is tracked through the left tracking camera and the right infrared light source is tracked through the right tracking camera to form initial frame tracking images, the left tracking camera and the right tracking camera are low-exposure cameras. Examiner note: a delay is also inherent in any camera system between the moment a processor/controller/host instructs a camera to acquire an image and the moment the camera actually captures it (i.e., shutter lag / trigger-to-capture latency; see Wu ¶0032).
Regarding claim 4, “The method of claim 3, wherein the predetermined instruction sending rule comprises: sending the second image acquisition instruction to the first camera at a first moment, and sending the third image acquisition instruction to the second camera at a second moment; or sending the second image acquisition instruction to the first camera at the second moment, and sending the third image acquisition instruction to the second camera at the first moment, wherein the first moment precedes the second moment.” Wu teaches (¶0058) that the process is repeated for multiple frames: tracking is performed on the left eye on odd frames (for example, the 1st and 3rd frames) and on the right eye on even frames (for example, the 2nd and 4th frames); (¶0014-¶0015) a shooting frame rate/frequency; (¶0010 and ¶0032) the sources are alternately turned on/off in sequence.
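Examiner note (illustrative only): the predetermined sending rule recited in claims 3-4, as mapped above, can be sketched as follows. The sketch is hypothetical (rule names and moment values are not drawn from Wu); it shows two separate instructions that carry the same acquisition time but are dispatched at two different moments, in either order, with the first moment preceding the second.

```python
# Hypothetical sketch of the predetermined instruction sending rule:
# two instructions carry the SAME acquisition time but are dispatched
# at different moments; the first moment precedes the second moment.
FIRST_MOMENT_MS = 0   # first moment (precedes the second)
SECOND_MOMENT_MS = 5  # second moment

def dispatch(rule: str, acquisition_time_ms: int) -> list:
    """Return (camera, send_moment_ms, acquisition_time_ms) tuples per the rule."""
    if rule == "first-then-second":
        sends = [("camera1", FIRST_MOMENT_MS), ("camera2", SECOND_MOMENT_MS)]
    else:  # "second-then-first": the order of dispatch is swapped
        sends = [("camera1", SECOND_MOMENT_MS), ("camera2", FIRST_MOMENT_MS)]
    # Both instructions carry the same acquisition time.
    return [(cam, moment, acquisition_time_ms) for cam, moment in sends]

plan = dispatch("first-then-second", acquisition_time_ms=100)
print(plan)  # [('camera1', 0, 100), ('camera2', 5, 100)]
```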
Regarding claim 8, it is rejected under a rationale similar to that set forth for claim 1 above.
Regarding claim 9, it is rejected under a rationale similar to that set forth for claim 1 above.
Regarding claim 10, “A computer-readable storage medium, used for storing a computer program, wherein the computer program enables a computer to execute the method for acquiring the eye image according to claim 1.” Wu teaches (¶0064, ¶0074, claim 12, and Fig. 2) the system is implemented with program codes stored in memory and executable by a computing device.
Regarding claim 11, “A computer program product comprising a program instruction, wherein when the program instruction runs in an electronic device, the program instruction enables the electronic device to execute the method for acquiring the eye image according to claim 1.” Wu teaches (¶0064, ¶0074, claim 12, and Fig. 2) the system is implemented with program codes stored in memory and executable by a computing device.
Claim Rejections - 35 USC § 103
Claims 5-7 and 12-20 are rejected under 35 U.S.C. 103 as being unpatentable over Wu in view of Wilson et al. (US 20170262703, hereinafter Wilson).
Regarding claim 5, Wu does not go into detail about the structure of the virtual reality device and therefore does not teach “The method of claim 1, wherein the first camera is arranged on a first lens cone, and the second camera is arranged on a second lens cone; or, the first camera is arranged on the second lens cone, and the second camera is arranged on the first lens cone.” However, Wilson teaches (Fig. 1 and ¶0025) a head mounted display (HMD); (Figs. 2-3, ¶0028-¶0029) a lens holding unit/cone; (¶0030, Fig. 2, Fig. 6A) a plurality of infrared light sources 103 disposed around the lens holding unit; (Fig. 3 and ¶0041-¶0042) an infrared camera to capture light emitted from the cornea of the user; (¶0032 and ¶0044) for the left and right sides. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the virtual reality system as taught by Wu with the structure as taught by Wilson for the benefit of blocking external light from entering the headset/improved optics.
Regarding claim 6, “The method of claim 5, wherein the first lens cone is a left lens cone, and the second lens cone is a right lens cone; the first lens cone is the right lens cone, and the second lens cone is the left lens cone.” Wilson further teaches (Fig. 1 and ¶0025) a head mounted display (HMD); (Figs. 2-3, ¶0028-¶0029) a lens holding unit/cone; (¶0030, Fig. 2, Fig. 6A) a plurality of infrared light sources 103 disposed around the lens holding unit; (Fig. 3 and ¶0041-¶0042) an infrared camera to capture light emitted from the cornea of the user; (¶0032 and ¶0044) for left and right side.
Regarding claim 7, “The method of claim 5, wherein the at least one first infrared light source comprise a plurality of first infrared light sources, and the plurality of first infrared light sources are arranged around the first lens cone uniformly; or the at least one second infrared light source comprise a plurality of second infrared light sources, and the plurality of second infrared light sources are arranged around the second lens cone uniformly.” Wilson further teaches (Fig. 1 and ¶0025) a head mounted display (HMD); (Figs. 2-3, ¶0028-¶0029) a lens holding unit/cone; (¶0030, Fig. 2, Fig. 6A) a plurality of infrared light sources 103 disposed around the lens holding unit; (Fig. 3 and ¶0041-¶0042) an infrared camera to capture light emitted from the cornea of the user; (¶0032 and ¶0044) for left and right side.
Regarding claim 12, Wu does not teach “The method of claim 2, wherein the first camera is arranged on a first lens cone, and the second camera is arranged on a second lens cone; or, the first camera is arranged on the second lens cone, and the second camera is arranged on the first lens cone.” However, Wilson teaches (Fig. 1 and ¶0025) a head mounted display (HMD); (Figs. 2-3, ¶0028-¶0029) a lens holding unit/cone; (¶0030, Fig. 2, Fig. 6A) a plurality of infrared light sources 103 disposed around the lens holding unit; (Fig. 3 and ¶0041-¶0042) an infrared camera to capture light emitted from the cornea of the user; (¶0032 and ¶0044) for left and right side. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the virtual reality system as taught by Wu with the structure as taught by Wilson for the benefit of blocking external light from entering the headset/improved optics.
Regarding claim 13, Wu does not teach “The method of claim 3, wherein the first camera is arranged on a first lens cone, and the second camera is arranged on a second lens cone; or, the first camera is arranged on the second lens cone, and the second camera is arranged on the first lens cone.” However, Wilson teaches (Fig. 1 and ¶0025) a head mounted display (HMD); (Figs. 2-3, ¶0028-¶0029) a lens holding unit/cone; (¶0030, Fig. 2, Fig. 6A) a plurality of infrared light sources 103 disposed around the lens holding unit; (Fig. 3 and ¶0041-¶0042) an infrared camera to capture light emitted from the cornea of the user; (¶0032 and ¶0044) for left and right side. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the virtual reality system as taught by Wu with the structure as taught by Wilson for the benefit of blocking external light from entering the headset/improved optics.
Regarding claim 14, Wu does not teach “The method of claim 4, wherein the first camera is arranged on a first lens cone, and the second camera is arranged on a second lens cone; or, the first camera is arranged on the second lens cone, and the second camera is arranged on the first lens cone.” However, Wilson teaches (Fig. 1 and ¶0025) a head mounted display (HMD); (Figs. 2-3, ¶0028-¶0029) a lens holding unit/cone; (¶0030, Fig. 2, Fig. 6A) a plurality of infrared light sources 103 disposed around the lens holding unit; (Fig. 3 and ¶0041-¶0042) an infrared camera to capture light emitted from the cornea of the user; (¶0032 and ¶0044) for left and right side. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the virtual reality system as taught by Wu with the structure as taught by Wilson for the benefit of blocking external light from entering the headset/improved optics.
Regarding claim 15, “The method of claim 12, wherein the at least one first infrared light source comprise a plurality of first infrared light sources, and the plurality of first infrared light sources are arranged around the first lens cone uniformly; or the at least one second infrared light source comprise a plurality of second infrared light sources, and the plurality of second infrared light sources are arranged around the second lens cone uniformly.” Wilson further teaches (Fig. 1 and ¶0025) a head mounted display (HMD); (Figs. 2-3, ¶0028-¶0029) a lens holding unit/cone; (¶0030, Fig. 2, Fig. 6A) a plurality of infrared light sources 103 disposed around the lens holding unit; (Fig. 3 and ¶0041-¶0042) an infrared camera to capture light emitted from the cornea of the user; (¶0032 and ¶0044) for left and right side.
Regarding claim 16, “The method of claim 13, wherein the at least one first infrared light source comprise a plurality of first infrared light sources, and the plurality of first infrared light sources are arranged around the first lens cone uniformly; or the at least one second infrared light source comprise a plurality of second infrared light sources, and the plurality of second infrared light sources are arranged around the second lens cone uniformly.” Wilson further teaches (Fig. 1 and ¶0025) a head mounted display (HMD); (Figs. 2-3, ¶0028-¶0029) a lens holding unit/cone; (¶0030, Fig. 2, Fig. 6A) a plurality of infrared light sources 103 disposed around the lens holding unit; (Fig. 3 and ¶0041-¶0042) an infrared camera to capture light emitted from the cornea of the user; (¶0032 and ¶0044) for left and right side.
Regarding claim 17, “The method of claim 14, wherein the at least one first infrared light source comprise a plurality of first infrared light sources, and the plurality of first infrared light sources are arranged around the first lens cone uniformly; or the at least one second infrared light source comprise a plurality of second infrared light sources, and the plurality of second infrared light sources are arranged around the second lens cone uniformly.” Wilson further teaches (Fig. 1 and ¶0025) a head mounted display (HMD); (Figs. 2-3, ¶0028-¶0029) a lens holding unit/cone; (¶0030, Fig. 2, Fig. 6A) a plurality of infrared light sources 103 disposed around the lens holding unit; (Fig. 3 and ¶0041-¶0042) an infrared camera to capture light emitted from the cornea of the user; (¶0032 and ¶0044) for left and right side.
Regarding claim 18, “The method of claim 12, wherein the first lens cone is a left lens cone, and the second lens cone is a right lens cone; or, the first lens cone is the right lens cone, and the second lens cone is the left lens cone.” Wilson further teaches (Fig. 1 and ¶0025) a head mounted display (HMD); (Figs. 2-3, ¶0028-¶0029) a lens holding unit/cone; (¶0030, Fig. 2, Fig. 6A) a plurality of infrared light sources 103 disposed around the lens holding unit; (Fig. 3 and ¶0041-¶0042) an infrared camera to capture light emitted from the cornea of the user; (¶0032 and ¶0044) for left and right side.
Regarding claim 19, “The method of claim 13, wherein the first lens cone is a left lens cone, and the second lens cone is a right lens cone; or, the first lens cone is the right lens cone, and the second lens cone is the left lens cone.” Wilson further teaches (Fig. 1 and ¶0025) a head mounted display (HMD); (Figs. 2-3, ¶0028-¶0029) a lens holding unit/cone; (¶0030, Fig. 2, Fig. 6A) a plurality of infrared light sources 103 disposed around the lens holding unit; (Fig. 3 and ¶0041-¶0042) an infrared camera to capture light emitted from the cornea of the user; (¶0032 and ¶0044) for left and right side.
Regarding claim 20, “The method of claim 14, wherein the first lens cone is a left lens cone, and the second lens cone is a right lens cone; or, the first lens cone is the right lens cone, and the second lens cone is the left lens cone.” Wilson further teaches (Fig. 1 and ¶0025) a head mounted display (HMD); (Figs. 2-3, ¶0028-¶0029) a lens holding unit/cone; (¶0030, Fig. 2, Fig. 6A) a plurality of infrared light sources 103 disposed around the lens holding unit; (Fig. 3 and ¶0041-¶0042) an infrared camera to capture light emitted from the cornea of the user; (¶0032 and ¶0044) for left and right side.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Hong et al. (US 20190272028) – (¶0038) The processes and systems provided in this disclosure allow a client device 106-115 or the server 104 to track the eyes of a user while the user is viewing a display. In certain embodiments, client devices 106-115 display VR content while the client devices 106-115 or the server 104 derive the gaze direction of the user. In certain embodiments, client devices 106-115 or the server 104 capture one eye and then the other eye of the user in a repetitive manner, rather than capturing both eyes simultaneously or nearly simultaneously. Capturing one eye and then the other eye of a user repetitively simulates a monocular eye tracking system rather than a binocular eye tracking system. A monocular eye tracking system captures only one eye of the user to derive the eye focus of the user, whereas a binocular eye tracking system captures both eyes of the user simultaneously. During binocular eye tracking, the camera image sensor captures both eyes at a similar time, and the captured data is transmitted via a communication interface to the processor for processing.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to FRANK J JOHNSON whose telephone number is (571)272-9629. The examiner can normally be reached 9:00AM-5:00PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Brian T. Pendleton can be reached on 571-272-7527. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Frank Johnson/Primary Examiner, Art Unit 2425