DETAILED ACTION
This action is in response to communications: the Preliminary Amendment filed July 1, 2024.
Claims 1-15 are pending in this case. Claims 1-15 have been amended. No claims have been newly added or cancelled. This action is made Non-Final.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Drawings
The drawings were received on July 1, 2024. These drawings are accepted.
Specification
The disclosure is objected to because of the following informalities:
The specification uses “artifacts” and “artefacts” interchangeably when referring to ghosting, e.g. in paragraphs [0003], [0005], [0009], [0024], [0025], [0029], [0037], [0039], [0070], and [0071] of the present application’s publication (US 2026/0003190). It is suggested to amend “artefacts” to “artifacts” to maintain consistency in the language throughout.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3, 5, 8-10, 14, and 15 are rejected under 35 U.S.C. 103 as being unpatentable over VLACHOS (US 2022/0130297).
As to claim 9, VLACHOS discloses a system for compensating for ghosting artifacts in a display apparatus (Figure 1, system 100, e.g. as a head-mounted display (HMD), where [0021] notes the system utilizes a leak value in a light leak compensation operation to generate an updated color value for presentation by a pixel, [0023] notes compensating or correcting light leakage to eliminate ghosting artifacts), the display apparatus comprising a display panel configured to display an image (Figures 2-4, [0034] notes pixel 200 may be one of several pixels of a display panel of display system(s) 114 (e.g. an LCD panel), such that pixel 200 may be configured to present different color values to form an image (e.g. in combination with other pixels)) and a lens unit configured to focus light from the display panel to a user’s eye, the lens unit being characterized by an optical axis (Figure 3, optics 302, [0032] notes display system(s) 114 may comprise any type of hardware device that uses one or more color filters to form images for viewing by an image detector or sensor (e.g. a camera and/or an eye of a user), [0033], [0045] further notes display system(s) 114 may include optics 116 that may be configured to refract light emitted by display panels of the display system(s) 114), the system comprising a processor (processor(s) 102) configured to (perform processes of Figures 2-7): receive an input image intended for display on the display panel within the display apparatus (step 602, [0091] notes capturing a test image of a pixel displayed by a display device, e.g. 
test image 232 of Figure 2 or test image 312 of Figure 3, captured by image detector 230, where [0094] further notes the test image may include a plurality of pixels); analyze the input image to identify pixel regions susceptible to ghosting based on proximity to the optical axis of the lens unit in the display apparatus (step 604, [0092] notes detecting, within the test image, a representation of the pixel displayed by the display device that includes an image color value, where [0094] further notes the representation of the plurality of pixels displayed by the display device and captured by the test image may include a plurality of image color values, where at least some of the plurality of image color values differ from corresponding component values for at least some of the plurality of pixels displayed by the display device, where Figure 3, [0045]-[0048] note the light transmitted through color filters 206 diffracts through the optics 302 toward image detector 230, and because some of the color filters 206 transmit multiple colors, light received by the optics 302 from one color filter may diffract differently through the optics, [0046] notes such diffraction may cause light leakage through the color filters 206 to result in image ghosting artifacts that are observable by users viewing pixel 200 while the pixel displays the color value 216, e.g. the representation 314 of the pixel 200 displaying the color value 216, as captured/observed within the test image 312, includes ghosting artifacts represented by multiple offset circles representing different portions of the light emitted by the pixel 200); calculate a compensatory color value for each pixel in the identified pixel regions using a convolution process (step 606, [0093] notes generating a light leak value based on the image color value, e.g. 
light leak value 402 of Figure 4, [0094] notes the light leak value may be based on an average of at least some of the plurality of image color values, where [0028] further notes processes, e.g. generating the light leak value, may be performed by convolutional neural networks), wherein the compensatory color value incorporates effects of light leakage from one or more surrounding pixels on the corresponding pixel (Figure 4, [0054], [0055] notes a light leak value 402 is determined based on a test image of a pixel 200 (or more than one pixel) without any optics intervening between a pixel 200 and the image detector 230 that captures the pixel 200, [0056] notes a light leak value 402 is determined based on a test image of a pixel 200 (or more than one pixel) with optics 302 intervening between a pixel 200 and the image detector 230 that captures the pixel 200, e.g. a plurality of pixels of a display panel may be configured to display a test pattern (e.g. a series of parallel lines of a single color) with all other color values for other pixels set to zero (e.g. black), the image detector may observe or capture a test image, and an analysis of the test image may be performed to determine the existence of image pixels with nonzero image color values in regions of the test image that deviate from expected diffraction patterns (e.g. based on the properties of the optics for diffracting the particular color of the parallel lines of the test pattern), the image color values for one or more pixels in such regions may be quantified (e.g. by averaging or other techniques), and these image color values may be used as a basis for determining or refining a light leak value); apply the compensatory color values to the corresponding pixels in the identified pixel regions to generate an adjusted image (step 506 (e.g. after step 504, obtaining a light leak value associated with the pixel, where [0087] notes the obtained light leak value may be generated, e.g. 
as in step 606 noted above), [0088] and [0089] note generating an updated color value, e.g. updated color value 406 of Figure 4, for the pixel by applying a light leak compensation operation to the color value for the pixel, e.g. using the light leak value 402); and display the adjusted image on the display panel of the display apparatus (step 508, [0090] notes triggering display of the pixel on the display device using the updated color value)(please reference at least Figure 4 for additional details).
As noted above, VLACHOS describes that a pixel 200 may be one of several pixels of a display panel of display system(s) 114, e.g. an LCD panel, and that the pixel 200 may be configured to present different color values to form an image, e.g. in combination with other pixels. VLACHOS further describes generating a light leak value for a pixel, e.g. pixel 200, or for a plurality of pixels displayed by the display system(s). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to recognize that the multiple pixels may be considered “pixel regions” and further may be considered “surrounding pixels” to pixel 200, yielding predictable results, without changing the scope of the invention.
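For illustration only, the claimed convolution-based compensation mapped above can be sketched as follows. This sketch is not drawn from VLACHOS or from the claims; the function name, the leak kernel, and the subtract-and-clamp behavior are assumptions made solely to illustrate how leakage from surrounding pixels might be estimated by convolution and removed from each pixel's color value.

```python
import numpy as np

def compensate_ghosting(image, leak_kernel, strength=1.0):
    """Illustrative sketch: estimate light leaked into each pixel from its
    neighbors via a convolution, then subtract that estimate so the
    displayed result approximates the intended input image.

    image: 2D array of single-channel pixel intensities in [0, 1].
    leak_kernel: 2D kernel giving each neighbor's assumed leak
    contribution (center entry zero, so a pixel does not leak into itself).
    """
    h, w = image.shape
    kh, kw = leak_kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode="edge")
    leak = np.zeros_like(image, dtype=float)
    for y in range(h):
        for x in range(w):
            region = padded[y:y + kh, x:x + kw]
            leak[y, x] = np.sum(region * leak_kernel)  # leakage from neighbors
    # Subtract the estimated leakage, clamped to the displayable range.
    return np.clip(image - strength * leak, 0.0, 1.0)
```

For example, a uniform 0.5 image with a kernel whose off-center entries sum to 0.1 yields a compensated value of 0.45 at every pixel: each pixel's value is pre-reduced so that, after leakage from its neighbors, the perceived brightness approximates the input.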
Claim 1 is similar in scope to claim 9, and is therefore rejected under similar rationale.
As to claims 3 and 10, VLACHOS discloses the processor is further configured to refine the adjusted image by repeating the convolution process and compensatory color value calculation steps for multiple iterations, wherein the iterations are performed until a loss function between consecutive iterations falls below a predefined threshold ([0072] notes multiple test images may be analyzed to determine a light leak value, e.g. captured sequentially or iteratively, [0073] notes the test images may be assessed to determine a particular test light leak value that resulted in a test image that had a lowest amount of color tinting or image ghosting, [0074] further notes other iterative approaches for determining a light leak value, e.g. repeating process until light leakage artifacts are determined to be minimized (e.g. incremental changes in test light leak values fail to provide an improvement to detected artifacts)).
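The iterative refinement recited in claims 3 and 10 can be sketched generically as follows. This is a hypothetical illustration only; the function names, the convergence loop, and the iteration cap are assumptions, not limitations of the claims or the VLACHOS disclosure.

```python
def refine_until_converged(image, adjust_step, loss_fn,
                           threshold=1e-4, max_iters=50):
    """Illustrative sketch: repeatedly apply an adjustment step and stop
    when the loss between consecutive iterations falls below a threshold.

    adjust_step: callable producing the next refined image.
    loss_fn: callable measuring the change between consecutive iterations.
    """
    prev = image
    for _ in range(max_iters):
        cur = adjust_step(prev)
        if loss_fn(prev, cur) < threshold:
            return cur  # incremental change no longer improves the result
        prev = cur
    return prev
```

This mirrors the cited rationale in VLACHOS [0074], where the process repeats until incremental changes in test light leak values fail to provide an improvement to detected artifacts.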
As to claim 5, VLACHOS discloses masking non-visible areas of the display panel to black ([0056] notes a plurality of pixels of a display panel may be configured to display a test pattern (e.g. a series of parallel lines of a single color) with all other color values for other pixels set to zero (e.g. black)).
As to claims 8 and 14, VLACHOS discloses the processor is further configured to implement a neural network configured to process each pixel series independently to approximate the compensatory color value for each pixel in the identified pixel regions ([0028] notes processor(s) may comprise and/or utilize hardware components or computer-executable instructions operable to carry out function blocks and/or processing layers configured in the form of, by way of non-limiting example, single-layer neural networks, feed forward neural networks, radial basis function networks, deep feed-forward networks, recurrent neural networks, long-short term memory (LSTM) networks, gated recurrent units, autoencoder neural networks, variational autoencoders, denoising autoencoders, sparse autoencoders, Markov chains, Hopfield neural networks, Boltzmann machine networks, restricted Boltzmann machine networks, deep belief networks, deep convolutional networks (or convolutional neural networks), deconvolutional neural networks, deep convolutional inverse graphics networks, generative adversarial networks, liquid state machines, extreme learning machines, echo state networks, deep residual networks, Kohonen networks, support vector machines, neural Turing machines, and/or others).
As to claim 15, VLACHOS discloses the display apparatus is a head-mounted display (HMD) as part of an extended reality (XR) system (Figure 1, [0025] notes system 100 as a head-mounted display (HMD), and may comprise a virtual reality (VR) system or any other type of HMD).
Claim(s) 4 and 11 is/are rejected under 35 U.S.C. 103 as being unpatentable over VLACHOS (US 2022/0130297) as applied to claims 1 and 9 above, and further in view of Melax et al. (US 2025/0104580).
As to claims 4 and 11, VLACHOS does not disclose, but Melax et al. disclose the processor is further configured to prioritize calculating and applying of the compensatory color values to the pixels corresponding to a user’s gaze over peripheral regions ([0078] notes a dynamic mitigation technique using the user’s gaze to determine if and how to apply mitigations or to otherwise enhance a view to provide a better user experience, e.g. gaze-based dimming of peripheral portions of a view to mitigate optical artifacts, see also [0079] thru [0084], where Figure 8, [0086], compared to Figure 7, [0085], illustrates an enhancement applied to alter the appearance of the portions of the view 705 outside of the foveal gaze zone (FGZ) 725 that are in the user’s periphery, e.g. dimmed or modified by a contrast flattening technique).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify VLACHOS’ system and method of compensating for ghosting artifacts in a head-mounted display (HMD) with Melax et al.’s method of applying compensatory color values to pixels corresponding to a user’s gaze to enhance a user’s view when using the HMD, thus providing a better user experience ([0078] of Melax et al.).
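The gaze-based prioritization cited from Melax et al. can be sketched as a weighting function. This is purely illustrative; the function name, the linear falloff, and the 0.25 peripheral floor are assumptions for illustration and do not appear in either reference.

```python
import math

def gaze_weight(px, py, gaze_x, gaze_y, foveal_radius):
    """Illustrative sketch: weight for applying compensatory color values,
    full strength inside the foveal gaze zone and reduced in the periphery.

    Returns 1.0 within the foveal radius of the gaze point, then falls off
    linearly over one additional foveal radius, floored at 0.25.
    """
    d = math.hypot(px - gaze_x, py - gaze_y)
    if d <= foveal_radius:
        return 1.0  # prioritize pixels at the user's gaze
    return max(0.25, 1.0 - (d - foveal_radius) / foveal_radius)
```

Under this sketch, compensation would be applied at full strength to pixels corresponding to the user's gaze and at reduced strength in peripheral regions, consistent with the gaze-based mitigation described in Melax et al. [0078].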
Allowable Subject Matter
Claims 2, 6, 7, 12, and 13 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter:
Regarding claim 2, the prior art of record fails to teach or suggest, singly or combined, the limitations of the claim as recited.
Regarding claims 6 and 12, the prior art of record fails to teach or suggest, singly or combined, the limitations of the claims as recited.
Regarding claims 7 and 13, the prior art of record fails to teach or suggest, singly or combined, the limitations of the claims as recited.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JACINTA M CRAWFORD whose telephone number is (571)270-1539. The examiner can normally be reached 8:30 a.m. to 4:30 p.m.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, King Y. Poon can be reached at (571)272-7440. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JACINTA M CRAWFORD/Primary Examiner, Art Unit 2617