DETAILED ACTION
Information Disclosure Statement
The information disclosure statement(s) filed on 10/30/2024 is/are in compliance with the provisions of 37 CFR 1.97 and is/are being considered by the Examiner.
Priority
Acknowledgment is made of applicant’s claim for priority under 35 U.S.C. 371 for the U.S. National Stage application filed on 07/30/2021. Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55(f)(2).
Drawings
The drawings are objected to under 37 CFR 1.83(a). The drawings must show every feature of the invention specified in the claims. Therefore,
A. the head-mounted device including a camera, a human eye gazing point on the head-mounted device, and the screen display must be shown or the feature(s) canceled from the claim(s) 1-8.
B. a reference picture in a display area must be shown or the feature(s) canceled from the claim(s) 5.
C. an AR head-mounted device, a stepper motor, an AR optical machine, and an AR lens must be shown or the feature(s) canceled from the claim(s) 3.
D. a plurality of infrared light emitting diodes on a border thereof must be shown or the feature(s) canceled from the claim(s) 2.
E. a wearing state must be shown or the feature(s) canceled from the claim(s) 7.
F. a memory and a processor must be shown or the feature(s) canceled from the claim(s) 9-10.
No new matter should be entered.
Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
Claims 1-10 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
1. Claim 1 recites the limitation: “a position of a human eye gazing point on the head-mounted device”. It is unclear where the position of a human eye gazing point is located on the device, especially in light of the fact that none of the instant drawings depict such a position on the device (see accompanying Drawings objection supra). The as-filed specification also fails to elucidate this limitation beyond a generic ipsis verbis appearance of the claim language, such that the claimed positional relationship is not sufficiently defined. See MPEP § 2173.05(b), Section II, citing Ex parte Miyazaki, 89 USPQ2d 1207 (Bd. Pat. App. & Inter. 2008) (precedential) and Ex parte Brummer, 12 USPQ2d 1653 (Bd. Pat. App. & Inter. 1989). For the purposes of examination, the limitation will be treated as inherent.
2. Claim 2 recites the limitation: “wherein the head-mounted device comprises a plurality of infrared light emitting diodes on a border thereof”. It is unclear where the border is located on the device, especially in light of the fact that none of the instant drawings depict a border or the light emitting diodes thereon (see accompanying Drawings objection supra). The as-filed specification also fails to elucidate this limitation beyond a generic ipsis verbis appearance of the claim language, such that the claimed positional relationship is not sufficiently defined. See MPEP § 2173.05(b), Section II, citing Ex parte Miyazaki, 89 USPQ2d 1207 (Bd. Pat. App. & Inter. 2008) (precedential) and Ex parte Brummer, 12 USPQ2d 1653 (Bd. Pat. App. & Inter. 1989). For the purposes of examination, the limitation will be treated as: “wherein the head-mounted device comprises a plurality of infrared light emitting diodes”.
Claims 2-10 inherit the deficiencies of Claim 1, and are thus rejected under 35 U.S.C. 112(b).
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-2, 5-6 and 8-10 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Choubey et al. (US 2020/0394830 A1).
Regarding Claim 1, as best understood, Choubey discloses: A display calibration method for a head-mounted device (¶0033: system 100 may be configured as an HMD system), comprising:
controlling a camera (104) to capture a first eye image of a human eye of a user wearing the head-mounted device; determining a human eye gazing position according to the first eye image, wherein the human eye gazing position comprises a position of a human eye gazing point on the head-mounted device (¶0040-41: Sensors 104 can be eye tracking sensors 104 that provide eye tracking data 148, such as data corresponding to at least one of a position or an orientation of one or both eyes of the user. Sensors 104 can be oriented in a direction towards the eyes of the user…sensors 104 output images of the eyes of the user, which can be processed to detect an eye position or gaze direction (e.g., first gaze direction) of the eyes; ¶0043: Eye tracker 118 can determine gaze vector 136 that can include position data such as at least one of a position or an orientation of each of one or more eyes of the user.);
retrieving a preset gazing position of the head-mounted device and calculating positional deviation information by comparing the preset gazing position to the human eye gazing position (¶0045: Processing circuitry 116 includes tile generator 122 which is configured to receive gaze vector 136 from eye tracker 118 and gaze error 126 from error manager 120…tile generator 122 is configured to define one or more tiles 128 (e.g., tiles 602 shown in FIGS. 6-15 and 21), superpixels, collection of pixels, render areas, resolution areas, etc., for image renderer 124. Tile generator 122 generates tiles 128 based on gaze vector 136; ¶0098: tile generator 122 stores multiple layouts of tiles 602 that are predetermined or predefined in a database for multiple gaze positions/locations 402); and
adjusting a screen display position of the head-mounted device according to the positional deviation information (¶0077: Error manager 120 can receive eye tracking data 148 in real-time, determine gaze error 126, translate gaze error 126 (e.g., the error of angle θ.sub.1 and/or angle θ.sub.2) to a range of positions/locations on display 164 (shown as error 604), and provide error 604 and/or gaze error 126 to tile generator 122; ¶0076, 0097: tile generator 122 uses gaze error 126 to adjust tiles 602…or redefines all tiles 602 that are used to render images on display 164).
Regarding Claim 2, as best understood, Choubey discloses the display calibration method for a head-mounted device according to Claim 1, as above. Choubey further discloses: wherein the head-mounted device comprises a plurality of infrared light emitting diodes on a border thereof; wherein the determining a human eye gazing position according to the first eye image comprises: determining a gazing direction according to a relative position of a projection of the infrared light emitting diodes in the human eye according to the first eye image with respect to a pupil of the human eye; and calculating the human eye gazing position according to the gazing direction and the human eye position (¶0036: Sensors 104a . . . n (generally referred herein as sensors 104) can include infrared cameras; ¶0043: eye tracker 118 can identify, using eye tracking data 148, gaze vector 136 based on pixels corresponding to light (e.g., light from light sources/light emitting diodes/actuators of sensors 104 [infrared light emitting diodes], such as infrared or near-infrared light from actuators of sensors 104, such as 850 nm light eye tracking) reflected by the one or more eyes of the user; ¶0041: sensors 104 optically measure eye motion, such as by emitting light (e.g., infrared light) towards the eyes and detecting reflections of the emitted light; ¶0040: sensors 104 can include at least one fourth sensor 104d to detect sensor data regarding the eyes of the user; see FIG. 2 showing plurality of infrared light emitting diodes 104d on a border thereof).
Regarding Claim 5, Choubey discloses the display calibration method for a head-mounted device according to Claim 1, as above. Choubey further discloses: wherein before the capturing the first eye image the method further comprises: displaying a reference picture in a display area of the head-mounted device, wherein the reference picture comprises a plurality of feature points symmetrically distributed with respect to the preset gazing position (see FIGS. 6-12 showing plurality of feature points symmetrically distributed with respect to the preset gazing position 402; ¶0048: Image renderer 124 determines, computes, or calculates the pixel values of the display or image data to be rendered to provide the desired or predetermined 3D image(s); ¶0077: the positions, locations, sizes, etc., of tiles 602 [feature points] are predefined or predetermined; ¶0098: tile generator 122 stores multiple layouts of tiles 602 that are predetermined or predefined in a database for multiple gaze positions/locations 402).
Regarding Claim 6, Choubey discloses the display calibration method for a head-mounted device according to Claim 1, as above. Choubey further discloses: wherein after the adjusting the screen display position of the head-mounted device according to the positional deviation information, the method further comprises: controlling the camera to capture a second eye image, and judging whether the positional deviation information of the human eye gazing position corresponding to the second eye image compared to the preset gazing position meets a preset condition; if the preset condition is met, outputting a prompt message indicating that a calibration is completed; if the preset condition is not met, adjusting the screen display position of the head-mounted device according to the positional deviation information (¶0046: tile generator 122 redefines tiles 128 periodically or dynamically based on updated or new gaze error 126 and/or gaze vector 136; ¶0077: If error 604 increases, a corresponding number of tiles 602 that the user's gaze may be directed towards also increases, according to some embodiments. Error manager 120 can receive eye tracking data 148 in real-time, determine gaze error 126, translate gaze error 126 (e.g., the error of angle θ.sub.1 and/or angle θ.sub.2) to a range of positions/locations on display 164 (shown as error 604), and provide error 604 and/or gaze error 126 to tile generator 122. Tile generator 122 defines the size, area, number of pixels, location, resolution, etc., of each tile 602 based on gaze error 126 (e.g., based on error 604); ¶0054, 0135-136: processing the received images to calibrate an eye tracking operation…Process 1800 includes updating the tiling by performing process 1700 in response to the difference being greater than (or greater than or equal to) the threshold value (step 1808), if the difference is greater than (or greater than or equal to) the threshold value (step 1804, “YES”), process proceeds).
Regarding Claim 8, Choubey discloses: A display calibrating apparatus for a head-mounted device (¶0033: system 100 may be configured as an HMD system), comprising:
an image capture module configured for controlling a camera to capture a first eye image of a human eye of a user upon wearing the head-mounted device (¶0040-41: Sensors 104 can be eye tracking sensors 104 that provide eye tracking data 148, such as data corresponding to at least one of a position or an orientation of one or both eyes of the user. Sensors 104 can be oriented in a direction towards the eyes of the user…sensors 104 output images of the eyes of the user, which can be processed to detect an eye position or gaze direction (e.g., first gaze direction) of the eyes; ¶0043: Eye tracker 118 can determine gaze vector 136 that can include position data such as at least one of a position or an orientation of each of one or more eyes of the user);
a gazing position detecting module configured for determining a human eye gazing position according to the first eye image, wherein the human eye gazing position comprises a position of a human eye gazing point on the head-mounted device; a deviation determining module configured for retrieving a preset gazing position of the head-mounted device and calculating positional deviation information by comparing the preset gazing position to the human eye gazing position; and (¶0045: Processing circuitry 116 includes tile generator 122 which is configured to receive gaze vector 136 from eye tracker 118 and gaze error 126 from error manager 120…tile generator 122 is configured to define one or more tiles 128 (e.g., tiles 602 shown in FIGS. 6-15 and 21), superpixels, collection of pixels, render areas, resolution areas, etc., for image renderer 124. Tile generator 122 generates tiles 128 based on gaze vector 136; ¶0098: tile generator 122 stores multiple layouts of tiles 602 that are predetermined or predefined in a database for multiple gaze positions/locations 402);
a display calibrating module configured for adjusting a screen display position of the head-mounted device according to the positional deviation information (¶0077: Error manager 120 can receive eye tracking data 148 in real-time, determine gaze error 126, translate gaze error 126 (e.g., the error of angle θ.sub.1 and/or angle θ.sub.2) to a range of positions/locations on display 164 (shown as error 604), and provide error 604 and/or gaze error 126 to tile generator 122; ¶0076, 0097: tile generator 122 uses gaze error 126 to adjust tiles 602…or redefines all tiles 602 that are used to render images on display 164).
Regarding Claim 9, Choubey discloses the display calibration method for a head-mounted device according to Claim 1, as above. Choubey further discloses: A head-mounted device (100) comprising a memory and a processor, wherein the memory stores a computer program therein, and the processor is configured for implementing the display calibration method for a head-mounted device when the computer program in the memory is called (FIGS. 1, 20; ¶0034: Processing circuitry 116 can include any type and form of executable instructions executable by processors or hardware components. The executable instructions may be of any type including applications, programs, services, tasks, scripts, libraries processes and/or firmware. Any of eye tracker 118, error manager 120, tile generator 122, an image renderer 124 may be any combination or arrangement of circuitry and executable instructions to perform their respective functions and operations; ¶0144: Local storage 2006 can be physically or logically divided into various subunits such as a system memory…and can store one or more software programs to be executed by processing unit(s) 2004, such as an operating system and/or programs implementing various server functions such as functions of the system 100 [HMD]).
Regarding Claim 10, Choubey discloses the display calibration method for a head-mounted device according to Claim 1, as above. Choubey further discloses: A storage medium, for storing a computer executable instruction therein, wherein the computer executable instruction is configured to be loaded and executed by a processor to implement the display calibration method for a head-mounted device (FIGS. 1, 20; ¶0034; ¶0144-45: storage 2006 can store one or more software programs to be executed by processing unit(s) 2004, such as an operating system and/or programs implementing various server functions such as functions of the system 100).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 3-4 and 7 are rejected under 35 U.S.C. 103 as being unpatentable over Choubey et al. (US 2020/0394830 A1) in view of Salter et al. (US 2014/0152558 A1).
Regarding Claim 3, Choubey discloses the display calibration method for a head-mounted device according to Claim 1, as above. Choubey further discloses: wherein the head-mounted device is an AR head-mounted device (¶0072: augmented reality images of display of a HMD virtual reality system), and the adjusting a screen display position of the head-mounted device according to the positional deviation information, comprises: generating a control command according to the positional deviation information (FIG. 16; ¶0114: process 1600 includes determining if the location/position on the display is off-center of a corresponding tile or is near a tile border (step 1608); ¶0110, 0158: processing power of processing circuitry that performs process 1600… processing components include controller to perform functions).
Choubey does not appear to explicitly disclose: adjusting the screen display position of the head-mounted device by sending the control command to a stepper motor to adjust relative positions of an AR optical machine and an AR lens.
Salter is related to Choubey with respect to a display calibration method for a head-mounted device, characterized by comprising: controlling a camera to capture a first eye image, determining a human eye gazing position according to the image, calculating positional deviation information; and adjusting a screen display position of the head-mounted device accordingly (FIG. 2; ¶0035-37, 0040-44, 0051), and Salter teaches: adjusting the screen display position of the head-mounted device by sending the control command to a stepper motor to adjust relative positions of an AR optical machine and an AR lens (¶0078: augmented reality environments in which a virtual pointer may be displayed to an end user of an HMD; ¶0051: a display adjustment mechanism comprises one or more motors 203; ¶0055-56: microdisplay assembly 173 comprises light processing elements and a variable focus adjuster 135… which changes the displacement between one or more light processing elements in the optical path or an optical power of an element…An example of a light processing element include one or more optical elements such as one or more lenses of a lens system 122; ¶0059: the adjuster 135 may be an actuator such as a piezoelectric motor. Other technologies for the actuator may also be used).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the display calibration method of Choubey in view of Salter to satisfy the claimed condition because such a motor is known and would be utilized to move the display in any of the three dimensions, as taught in ¶0061 of Salter.
Regarding Claim 4, Choubey discloses the display calibration method for a head-mounted device according to Claim 3, as above. Choubey further discloses: wherein the generating a control command according to the positional deviation information, comprises: determining a coordinate offset of the screen display according to the positional deviation information, and generating the control command according to the coordinate offset of the screen display (¶0114: process 1600 includes determining if the location/position on the display is off-center of a corresponding tile or is near a tile border (step 1608)… step 1608 includes comparing the location/position on the display (e.g., the location/position of gaze location 402) to a corresponding location/position of tile 602 that the location/position is within. the location/position on the display deviating from the location/position of the center of the corresponding tile indicates that the user is directing their gaze towards a different location on display 164. step 1608 is performed by tile generator 122; FIG. 16).
Regarding Claim 7, Choubey discloses the display calibration method for a head-mounted device according to Claim 1, as above. Choubey does not appear to explicitly disclose: further comprising: when the head-mounted device is detected to be in a wearing state, judging whether a facial feature of the user is the same as that of a previous user; if not, repeating the controlling a camera to capture a first eye image.
Salter is related to Choubey with respect to a display calibration method for a head-mounted device, characterized by comprising: controlling a camera to capture a first eye image, determining a human eye gazing position according to the image, calculating positional deviation information; and adjusting a screen display position of the head-mounted device accordingly (FIG. 2; ¶0035-37, 0040-44, 0051), and Salter teaches: further comprising: when the head-mounted device is detected to be in a wearing state, judging whether a facial feature of the user is the same as that of a previous user; if not, repeating the controlling a camera to capture a first eye image (¶0073: image and audio processing engine 194 may apply object recognition and facial recognition techniques…facial recognition may be used to detect the face of a particular person… The particular faces, voices, sounds, and objects to be detected may be stored in one or more memories contained in memory unit 192. Processing unit 191 may execute computer readable instructions stored in memory unit 192 in order to perform processes).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the display calibration method of Choubey in view of Salter to satisfy the claimed condition because such a facial recognition feature is known and would be utilized to perform calibration of the display quickly, as taught in ¶0042-43 and ¶0073 of Salter.
Other Relevant Documents Considered
Prior art made of record and not relied upon is considered pertinent to Applicant’s disclosure: Thunstrom (US 2018/0224935 A1) and Cho et al. (US 9,619,021 B2) disclose a display calibration method for a head-mounted device, characterized by comprising: controlling a camera to capture a first eye image, determining a human eye gazing position according to the image, calculating positional deviation information; and adjusting a screen display position of the head-mounted device accordingly, and further satisfying some of the additional conditions as claimed.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SAMANVITHA SRIDHAR whose telephone number is (571)270-0082. The examiner can normally be reached M-F 9:30-18:00 (EST).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, BUMSUK WON can be reached at 571-272-2713. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SAMANVITHA SRIDHAR/Examiner, Art Unit 2872
/BUMSUK WON/Supervisory Patent Examiner, Art Unit 2872