DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Examiner has carefully considered Applicant’s Remarks dated October 27, 2025.
Regarding Applicant’s argument that the amendment to independent claim 1 overcomes the cited art (Remarks, page 8), a secondary reference is introduced in this Office Action in view of the amendment to claim 1.
Accordingly, amended independent claim 1 remains rejected. The similarly amended independent claims 16 and 20 remain rejected as well. The dependent claims also remain rejected.
Claim Objections
Claim 16 is objected to because of the following informalities: there is no antecedent basis for “the electronic device”. Appropriate correction is required.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-4 and 6-20 are rejected under 35 U.S.C. 103 as being unpatentable over Pandey (US 2014/0132484 A1) in view of Greenebaum (US 2020/0105226 A1).
Instant Claim 1: A method comprising: at a head-mounted device (HMD) having a processor and a display: (“Means for performing method 300 (fig 3) include one or more: computerized devices, cameras, head-mounted displays, and power sources. Means for performing method 300 may include one or more of the modules of system 100 (fig 1). Means for performing method 300 may include one or more processors.” (Pandey, paragraph 58) The head-mounted display of Pandey corresponds to the head-mounted device of the claim.)
presenting a first view of an extended reality (XR) environment on the display while the HMD is worn by a user; (“A head-mounted display (HMD) which may be part of an augmented reality (AR) device, such as augmented reality glasses, may be used to superimpose virtual objects over a real-world scene being viewed by a user.” (Pandey, paragraph 24) The display described by Pandey corresponds to the extended reality environment of the claim.)
determining a viewing state based at least in part on a first brightness characteristic of the first view, (“At block 420 (fig 4), the brightness of real-world objects present in a region of a real-world scene on which a virtual object is currently superimposed or is going to be superimposed may be determined” (Pandey, paragraph 72) The brightness of objects to be viewed by the user corresponds to the viewing state of the claim.)
determining a second brightness characteristic for at least a portion of a second view of the XR environment based on the viewing state; and presenting the second view of the XR environment. (“At block 440 (fig 4), the virtual object may be displayed to the user using a brightness level based on the brightness level of the region of the real-world scene determined at block 420” (Pandey, paragraph 75) The brightness level of the virtual object of Pandey corresponds to the second brightness characteristic of the claim. The environment including the virtual object corresponds to the second view of the claim.)
Pandey does not explicitly teach the following limitation of this claim:
wherein the viewing state corresponds to adaptation of an eye of the user to an illumination level;
In the same field of endeavor, however, Greenebaum does explicitly teach the adaptation of the user’s eye to an illumination level when the user is wearing a head-mounted display device.
wherein the viewing state corresponds to adaptation of an eye of the user to an illumination level; (“One real-world example occurs in relation to a human's perception of colors, such as levels of black, which occurs because the human eye is more sensitive to gradations of darker regions (e.g., black, etc.) than to gradations of brighter regions (e.g., white, etc.). Due to this difference in sensitivity, it may, in some scenarios, be more efficient to control the gamma function using a scale that mimics human perception. This may, for example, include using more codes for the gamma function in an area of higher human perception (e.g., black regions, etc.) and fewer codes in an area of lower human perception (e.g., white regions, regions that are brighter than black regions, etc.).” (Greenebaum, paragraph 85))
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine the head-mounted display taught by Pandey, wherein the device controls the brightness of objects to be displayed, with the head-mounted display device taught by Greenebaum, wherein the device controls the brightness of the display based on the adaptation of the user’s eye to such brightness levels. Such a combination incorporates a known feature (Greenebaum) into a known device to yield the predictable result of using the adaptation of the user’s eye to a particular brightness level to make adjustments to the display.
Instant Claim 2: The method of claim 1, wherein the first brightness characteristic is an average brightness level of the first view. (“To determine the brightness, the image captured at block 410 (Fig 4) may be used. An average brightness across the region or the brightest measurement within the region may be used. The brighter a region is determined to be, the greater the brightness of a virtual object may need to be for sufficient visibility to the user.” (Pandey, paragraph 72))
Instant Claim 3: The method of claim 1, wherein: the first view is presented using a first range of brightness values; and determining the second brightness characteristic comprises selecting a second range of brightness values different than the first range for presenting the second view. (“The brightness and color of virtual object 250-2 (fig 2B) may be determined based on properties of floor 245 and wall 235.” (Pandey, paragraph 47) Therefore, the brightness of different objects in Pandey is determined based on different properties.)
Instant Claim 4: The method of claim 1, wherein the viewing state comprises an eye perception state of the user determined based on an average brightness level of the first view. (“The system may [include] a camera, configured to monitor a focus of a user's eyes. The controller may be further configured to decrease a brightness level of the virtual object when the user's eyes are focused away from the virtual object.” (Pandey, paragraph 5))
Instant Claim 6: The method of claim 1 further comprising identifying virtual content for inclusion in the second view, wherein determining the second brightness characteristic comprises selecting a range of brightness values for the virtual content based on the viewing state. (“At block 440 (fig 4), the virtual object may be displayed to the user using a brightness level based on the brightness level of the region of the real-world scene determined at block 420” (Pandey, paragraph 75))
Instant Claim 7: The method of claim 1 further comprising identifying virtual content for inclusion in the second view, wherein the second brightness characteristic is based on the first brightness characteristic of the first view. (“At block 440 (fig 4), the virtual object may be displayed to the user using a brightness level based on the brightness level of the region of the real-world scene determined at block 420” (Pandey, paragraph 75))
Instant Claim 8: The method of claim 1 further comprising identifying virtual content for inclusion in the second view, wherein the second brightness characteristic comprises selecting a range of brightness values or an average brightness for the virtual content based on a brightness range or an average brightness of the first view. (“To determine the brightness, the image captured at block 410 (fig 4) may be used. An average brightness across the region or the brightest measurement within the region may be used. The brighter a region is determined to be, the greater the brightness of a virtual object may need to be for sufficient visibility to the user.” (Pandey, paragraph 72))
Instant Claim 9: The method of claim 1, wherein the second brightness characteristic is based on a requirement for the second view. (“At block 440 (fig 4), the virtual object may be displayed to the user using a brightness level based on the brightness level of the region of the real-world scene determined at block 420” (Pandey, paragraph 75) The brightness level of the region of the real-world scene of Pandey corresponds to the requirement of the claim.)
Instant Claim 10: The method of claim 9, wherein the requirement is determined based on an intended viewing state for content to be included in the second view of the XR environment. (“At block 440 (fig 4), the virtual object may be displayed to the user using a brightness level based on the brightness level of the region of the real-world scene determined at block 420” (Pandey, paragraph 75))
Instant Claim 11: The method of claim 9, wherein the requirement is determined based on: a virtual content priority; a reality priority; or a collaboration priority. (“At block 440 (fig 4), the virtual object may be displayed to the user using a brightness level based on the brightness level of the region of the real-world scene determined at block 420” (Pandey, paragraph 75) Determining the brightness level of the region of the real-world scene of Pandey is based on a reality priority.)
Instant Claim 12: The method of claim 1 further comprising adjusting virtual content to be added to passthrough video to provide the second view, wherein the virtual content is adjusted based on the second brightness characteristic during a tone-mapping process prior to being combined with the passthrough video. (“At block 440 (fig 4), the virtual object may be displayed to the user using a brightness level based on the brightness level of the region of the real-world scene determined at block 420” (Pandey, paragraph 75) The determining of the brightness level to display the virtual object of Pandey corresponds to the tone-mapping process of the claim. The video of the real-world scene corresponds to the passthrough video of the claim.)
Instant Claim 13: The method of claim 1 further comprising adjusting virtual content to be added to passthrough video to provide the second view, wherein the virtual content is adjusted based on the second brightness characteristic during a tone-mapping process after combination with the passthrough video such that the passthrough video and virtual content utilize a common compositional space. (“At block 440 (fig 4), the virtual object may be displayed to the user using a brightness level based on the brightness level of the region of the real-world scene determined at block 420” (Pandey, paragraph 75) The determining the brightness level to display the virtual object of Pandey occurs after combining with the real-world scene.)
Instant Claim 14: The method of claim 1, wherein the XR environment comprises a passthrough view of a physical environment from a viewpoint position within the physical environment, (“A head-mounted display (HMD) which may be part of an augmented reality (AR) device, such as augmented reality glasses, may be used to superimpose virtual objects over a real-world scene being viewed by a user.” (Pandey, paragraph 24))
wherein an average brightness of the first view of the XR environment is different than an average brightness of a view of the physical environment from the viewpoint position. (The brightness of an environment differs based on the viewpoint position; for example, the brightness of a forest is different when standing in the middle of the forest than when viewing the forest from an airplane.)
Instant Claim 15: The method of claim 1, wherein first view of the XR environment comprises depictions of a physical environment and depictions of a virtual content. (“A head-mounted display (HMD) which may be part of an augmented reality (AR) device, such as augmented reality glasses, may be used to superimpose virtual objects over a real-world scene being viewed by a user.” (Pandey, paragraph 24))
Instant Claim 16: a non-transitory computer-readable storage medium; …wherein the non-transitory computer-readable storage medium comprises program instructions that, when executed on the one or more processors, cause the electronic device to perform operations (“In some embodiments, a computer program product residing on a non-transitory processor-readable medium for controlling an augmented reality display is presented. The computer program product may comprise processor-readable instructions configured to cause a processor to cause, via a head-mounted display, a virtual field of view to be displayed comprising a virtual object superimposed on a real-world scene.” (Pandey, paragraph 10) The non-transitory processor-readable medium of Pandey corresponds to the non-transitory computer-readable storage medium of the claim.)
wherein the display produces substantially all of the light that is visible to an eye of the user; (“Display module 130 (fig 1) may be a head mounted display (HMD). For instance, display module 130 may include a projector that either projects light directly into one or both eyes of the user or projects the light onto a reflective surface that the user views.” (Pandey, paragraph 37))
Apparatus claim 16 and method claim 1 are related as apparatus and the method of using same, with each claimed element’s function corresponding to the claimed method step. Accordingly, the remainder of claim 16 is similarly rejected under the same rationale as applied above with respect to method claim 1.
Instant Claim 17: (Claim 17 is substantially identical to claim 4, and thus, is rejected under similar rationale.)
Instant Claim 18: (Claim 18 is substantially identical to claim 6, and thus, is rejected under similar rationale.)
Instant Claim 19: (Claim 19 is substantially identical to claim 8, and thus, is rejected under similar rationale.)
Instant Claim 20: (Claim 20 is substantially identical to claim 16, and thus, is rejected under similar rationale.)
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Pandey, in view of Greenebaum, and further in view of Grundhoefer (US 2021/0157143 A1).
Instant Claim 5: The method of claim 1, wherein the viewing state comprises an eye perception state of the user determined based on determining a pupil dilation of the user. (Pandey, in view of Greenebaum, teaches the method in accordance with claim 1, but does not disclose pupil dilation of the user. However, in the same field of endeavor, Grundhoefer teaches such a feature for an electronic device: “In a variety of implementations, the user attributes include a combination of pupil dilation of a user, how long the user has been in the physical environment, where the user is located within the physical environment, eye gaze (e.g., eye focus) of the user and duration of the eye gaze at a particular location within the physical environment, and/or the like.” (Grundhoefer, paragraph 58) Therefore, as Pandey does teach monitoring the direction of focus of the user’s eyes, it would have been obvious for Pandey to also include determining the pupil dilation of the user.)
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine the head-mounted display taught by Pandey/Greenebaum, wherein the device monitors the direction of focus of the user’s eyes, with the electronic device taught by Grundhoefer, wherein the device determines the pupil dilation of the user in addition to monitoring the direction of focus of the user’s eyes. Such a combination incorporates a known feature (Grundhoefer) into a known device to yield the predictable result of obtaining even more specific information on the user’s interest and attention.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Yaron Cohen whose telephone number is (571)270-7995. The examiner can normally be reached Monday - Friday 8:30 AM to 5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Temesghen Ghebretinsae can be reached at 571-272-3017. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/YARON COHEN/Examiner, Art Unit 2626