DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 12/03/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Drawings
The drawings received on 1/18/2024 are accepted by the Examiner.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 7, 8, 15, 16, 19 and 20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding claims 7, 8, 15, 16, 19 and 20, the term "slightly different" is a relative term which renders the claim indefinite. The term "slightly different" is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. For purposes of examination, the limitation “slightly different” is being interpreted as “different” in view of the prior art.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 2, 9 and 10 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Rao et al. (US 2021/0070176).
Regarding claim 1, Rao teaches a system for generating a floating image for a passenger within a vehicle (refer to US 20210070176), comprising:
a passenger monitoring system (Eye tracking module 216, [0032]; See-through display creating overlaying images as floating image for user and passenger; overlays images in a projected space that occupies up to two planes, e.g., a near plane and a far plane, [0004]) adapted to monitor the position of the passenger’s head and eyes (Eye tracking module 216 as passenger monitoring system, tracking the movement of eyes and focal point moves/gaze tracking/head motion as position of head, [0022]-[0024]; Eye tracking module 216 may include and/or be in communication with one or more image sensors to track movement of eyes of a user; the eye tracking module 216 may include and/or may be in communication with one or more processing modules, e.g., local modules within the vehicle, configured to analyze the captured images of the eyes and determine the gaze direction and/or other eye tracking details …, [0032]; receiving occupant monitoring data from one or more occupant monitoring sensors and controlling an output of the image light based on the occupant monitoring data, wherein the occupant monitoring data includes an occupant identifier and/or an occupant location, [0053-0054]);
a compute engine (controller 214, 616, as compute engine, see at least [0024], [0050], Figs. 2A; 6) in communication with the passenger monitoring system (Eye tracking 216, Fig. 2A; controller 214 to control mechanism of display 208 include an eye tracking module 216, [0031]; eye tracking module 216 may include and/or may be in communication with one or more processing modules, [0032]; receiving occupant monitoring data from one or more occupant monitoring sensors and controlling an output of the image light based on the occupant monitoring data, wherein the occupant monitoring data includes an occupant identifier and/or an occupant location, [0053-0054]) and adapted to calculate a holographic image and encode the holographic image to a display of a picture generating unit (PGU) hologram generator (display 208; display configuration may include lasers that create multiple holographic sheets, [0010]; micro lens array 210a and/or 210b, and a 3D element, e.g., 3D element 212, [0024]; display configuration may include lasers that create multiple holographic sheets, [0046]; display controller 214 and display controller 616 as compute engine in communication with eye tracking module 216 performs eye tracking and adjusting position of displayed data to align with eye gaze direction into micro lens arrays 210a/210b forming display unit 207 as picture generating unit creating multiple holographic sheets; controlling a display based on eye tracking, [0035]; computing system may receive information from one or more sensors of the vehicle or in communication with the vehicle, to determine the location of the occupant (e.g., using visual tracking of the occupant), [0044]; see at least [0022]-[0024], [0035], [0044]-[0046], [0049]); and
a display screen positioned for viewing by the passenger (display system 102 includes a light source with display unit 106, which may project light in a controlled manner to form virtual images … may project light toward a fold mirror 108, which may be planar or aspherical, and which reflects received light toward a rotatable mirror 110, which may be aspherical. The rotatable mirror may direct light toward a glare trap 112 and light trap 114, usable to control the light to appear in a position that is viewable through a windshield 116 to appear at a virtual location 118, [0022]; Fig. 1; FIG. 2A may be included inside a vehicle 204 and configured to project light onto and/or through a windshield 206. The display configuration may include a display 208, one or more micro lens arrays 210a and 210b, and a three-dimensional element 212 positioned between the display 208 and micro lens array 210a … a selected portion of the windshield serves as a display surface, e.g., a transparent plane onto which three-dimensional display images are projected, [0023], Fig. 2A),
the display screen adapted to selectively switch between a first mode, wherein the display screen is adapted to display images for viewing by the passenger (display 604 ... through a windshield 608 to produce an image at a first depth of field indicated at 610a, [0049]), and
a second mode, wherein the display screen is adapted to function as a beam steering device (fold mirror as a beam steering device reflects the light it receives, [0022]; responsive to moving the movable optic to position 602b, image light from the display 604 may be transmitted through the windshield 608 to produce the image at a second depth of field indicated at 610b, [0049]; see Figs. 1, 2A and 6; paragraph [0023] and Fig. 1 show fold mirror 108 and rotatable mirror 110; per paragraph [0049], the movable optic 602 may be incorporated into any suitable display configuration; see Fig. 6, movable optic 602, as beam steering device, receives eyebox information from display controller 214/616 and eye-tracking module 216; selectively switch because lens actuator 612 may include one or more components to physically move the movable optic 602 to change a physical location of the movable optic within the display configuration; movable optic 602 may be moved in any suitable manner, [0050]),
wherein, when the display screen is operating in the second mode, the display is adapted to project the holographic image to the display screen and the display screen is adapted to re-direct the projected holographic image to the eyes of the passenger, based on the information received from the passenger monitoring system (the display system 102 includes a light source with display unit 106, which may project light in a controlled manner to form virtual images. The display unit 106 may project light toward a fold mirror 108, which may be planar or aspherical, and which reflects received light toward a rotatable mirror 110, which may be aspherical. The rotatable mirror may direct light toward a glare trap 112 and light trap 114, usable to control the light to appear in a position that is viewable through a windshield 116 to appear at a virtual location 118. The virtual location 118 may be controlled to be within an optical path 120 that originates from a head motion and eyebox 122 of the user 104 and represents at least a portion of a viewable range of the user 104 [0022]; Fig. 2A shows display configuration may include a display 208, one or more micro lens arrays 210a and 210b, and a three-dimensional element 212 positioned between the display 208 and micro lens array 210a. The three-dimensional element 212 may include a parallaxial or lenticular element (e.g., film) that generates auto-stereoscopic images from the output of display 208, [0023]. display configuration may include lasers that create multiple holographic sheets to produce the multiple image plane effect., [0046]; disclosure provides for a three-dimensional augmented reality display system that includes an optical element to split the left and right eye image for every pixel of the image and then, using eye tracking, ties the left and right eye image back to where a user's eyes are focused. 
In this way, the disclosure provides a dynamic way of changing the focal point of where the image is tied together to create a continuous plane wherever the user is focusing, creating a stereoscopic image, [0004]).
Regarding claim 2, Rao teaches the system according to claim 1 (see above), wherein the compute engine is further adapted to encode a lens function into the holographic image based on information received from the passenger monitoring system (light adjusting element 512 and/or one or more other optical elements (e.g., a lens system) may be used to shift planes of the light. display configuration may include lasers that create multiple holographic sheets to produce the multiple image plane effect, [0046]).
Regarding claim 9, Rao teaches the system according to claim 1 (see above), wherein the display screen is mounted within the vehicle adjacent to or above passenger seating that is opposite the passenger (see Figs. 1, 2C).
Regarding claim 10, Rao teaches a method of generating a floating image for a passenger within a vehicle (refer to US 20210070176), comprising:
monitoring, with a passenger monitoring system, the position of the passenger’s head and eyes (Eye tracking module 216, [0032]; See-through display creating overlaying images as floating image for user and passenger; overlays images in a projected space that occupies up to two planes, e.g., a near plane and a far plane, [0004]; Eye tracking module 216 as passenger monitoring system, tracking the movement of eyes and focal point moves/gaze tracking/head motion as position of head, [0022]-[0024]);
calculating, with a compute engine in communication with the passenger monitoring system, a holographic image (controller 214, 616, as compute engine, see at least [0024], [0050], Figs. 2A; 6; Eye tracking 216, Fig. 2A; controller 214 to control mechanism of display 208 include an eye tracking module 216, [0031]; eye tracking module 216 may include and/or may be in communication with one or more processing modules, [0032]; display 208; display configuration may include lasers that create multiple holographic sheets, [0010]; computing system may receive information from one or more sensors of the vehicle or in communication with the vehicle, to determine the location of the occupant, e.g., using visual tracking of the occupant, [0044]; see at least [0022]-[0024], [0035], [0044]-[0046], [0049]); encoding, with the compute engine (controller 214, 616, as compute engine, see at least [0024], [0050], Figs. 2A; 6), a lens function into the holographic image based on information received from the passenger monitoring system (The display configuration may include a display 208, one or more micro lens arrays 210a and 210b, and a three-dimensional element 212 positioned between the display 208 and micro lens array 210a … a selected portion of the windshield serves as a display surface, e.g., a transparent plane onto which three-dimensional display images are projected, [0023], Fig. 2A; display configuration may include lasers that create multiple holographic sheets to produce the multiple image plane effect, [0046]); encoding the holographic image to a display of a picture generating unit (PGU) hologram generator (see paragraphs [0023]-[0024], [0031]-[0035], [0046] & [0050], display controller 214 and display controller 616 as compute engine [Figs. 2A and 6] in communication with eye tracking module 216 performs eye tracking and adjusting position of displayed data to align with eye gaze direction into micro lens arrays 210a/210b [Fig. 
6; lens actuator 612 to physically move the movable optic 602 to change a physical location of the movable optic] forming display unit 207 as picture generating unit [Fig. 2A], display configuration create multiple holographic sheets to produce the multiple image plane effect, [0046]); projecting, with the display, the holographic image to a display screen (a display configuration may include lasers that create multiple holographic sheets to produce the multiple image plane effect, [0046]) that is positioned for viewing by the passenger (see paragraphs [0022]-[0024], [0031]-[0035], [0044]-[0045] & [0049], Figs. 2A-2G; display controller 214/616 controls display of overlapping images and directs beams through fold mirror 108, rotatable mirror 110, and movable optic 602, to generate autostereoscopic images upon windshield) and adapted to selectively switch between a first mode, wherein the display screen is adapted to display images for viewing by the passenger, and a second mode, wherein the display screen is adapted to function as a beam steering device (display 604 ... 
through a windshield 608 to produce an image at a first depth of field indicated at 610a, [0049]); and when the display screen is operating in the second mode, re-directing, with the display screen, the projected holographic image to the eyes of the passenger, based on the information received from the passenger monitoring system (fold mirror as a beam steering device reflects the light it receives, [0022]; responsive to moving the movable optic to position 602b, image light from the display 604 may be transmitted through the windshield 608 to produce the image at a second depth of field indicated at 610b, [0049]; see Figs. 1, 2A and 6; paragraph [0023] and Fig. 1 show fold mirror 108 and rotatable mirror 110; per paragraph [0049], the movable optic 602 may be incorporated into any suitable display configuration; see Fig. 6, movable optic 602, as beam steering device, receives eyebox information from display controller 214/616 and eye-tracking module 216; selectively switch because lens actuator 612 may include one or more components to physically move the movable optic 602 to change a physical location of the movable optic within the display configuration; movable optic 602 may be moved in any suitable manner, [0050]).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 3 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Rao as applied to claim 1 above, and further in view of Vincent et al. (US 2003/0071780).
Regarding claim 3, Rao teaches the system according to claim 2 (see above), including the display screen (see above).
Rao does not explicitly teach that the display screen includes a selectively reversible electromagnetic coating, wherein, when the display screen is operating in the first mode, the reversible electromagnetic coating is substantially transparent, and when the display screen is operating in a second mode, the reversible electromagnetic coating is reflective.
Rao and Vincent are related as optical display systems.
Vincent teaches the display screen includes a selectively reversible electromagnetic coating, wherein, when the display screen is operating in the first mode, the reversible electromagnetic coating is substantially transparent, and when the display screen is operating in a second mode, the reversible electromagnetic coating is reflective (display device including at least one writeable imaging stratum forming a pixel array of a bi-modal molecular colorant, and an addressing device mounted for selectively switching colorant molecules of the imaging stratum, [0016]; "optical switch" involves changes in the electromagnetic properties of the molecules, both within and outside that detectable by the human eye; optical switching includes changes in properties such as absorption, reflection, refraction, diffraction, and diffuse scattering of electromagnetic radiation, [0053]; layer 401 is an addressable pixel array that employs electrical field switchable, reconfigurable, bi-modal molecules … switchable between an image color state (e.g. black) and transparent state, [0060]). It would have been obvious to one of ordinary skill in the art at the time the application was filed to modify the system of Rao to include a selectively reversible electromagnetic coating, wherein, when the display screen is operating in the first mode, the reversible electromagnetic coating is substantially transparent, and when the display screen is operating in a second mode, the reversible electromagnetic coating is reflective, as taught by Vincent, for the predictable advantage of providing cost-efficient, erasable, reusable, high-contrast, high-resolution displays, as taught by Vincent in [0014].
Regarding claim 11, Rao teaches the method according to claim 10 (see above), including the display screen (see above). Rao does not explicitly teach that the display screen includes a selectively reversible electromagnetic coating, wherein, when the display screen is operating in the first mode, the reversible electromagnetic coating is substantially transparent, and when the display screen is operating in a second mode, the reversible electromagnetic coating is reflective, the method further including actuating the selectively reversible electromagnetic coating to cause the display screen to operate in the second mode.
Rao and Vincent are related as optical display systems. Vincent teaches the display screen includes a selectively reversible electromagnetic coating, wherein, when the display screen is operating in the first mode, the reversible electromagnetic coating is substantially transparent, and when the display screen is operating in a second mode, the reversible electromagnetic coating is reflective, the method further including actuating the selectively reversible electromagnetic coating to cause the display screen to operate in the second mode (display device including at least one writeable imaging stratum forming a pixel array of a bi-modal molecular colorant, and an addressing device mounted for selectively switching colorant molecules of the imaging stratum, [0016]; "optical switch" involves changes in the electromagnetic properties of the molecules, both within and outside that detectable by the human eye; optical switching includes changes in properties such as absorption, reflection, refraction, diffraction, and diffuse scattering of electromagnetic radiation, [0053]; layer 401 is an addressable pixel array that employs electrical field switchable, reconfigurable, bi-modal molecules … switchable between an image color state (e.g. black) and transparent state, [0060]). It would have been obvious to one of ordinary skill in the art at the time the application was filed to modify the method of Rao to include a selectively reversible electromagnetic coating, wherein, when the display screen is operating in the first mode, the reversible electromagnetic coating is substantially transparent, and when the display screen is operating in a second mode, the reversible electromagnetic coating is reflective, as taught by Vincent, for the predictable advantage of providing cost-efficient, erasable, reusable, high-contrast, high-resolution displays, as taught by Vincent in [0014].
Claims 4, 5, 8, 12, 13, 17 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Rao as applied to claim 1 above, and further in view of Rabolt et al. (US 2003/0071216).
Regarding claim 4, Rao teaches the system according to claim 2 (see above), and the compute engine, passenger monitoring system, and holographic image (see claim 1 above). Rao does not explicitly teach the system adapted to calculate and encode an adjustable diffraction grating into the holographic image, wherein the diffraction grating is adapted to selectively adjust the angle of the projected holographic image from the display based on feedback from the passenger monitoring system.
Rao and Rabolt are related as optical systems.
Rabolt discloses the diffraction grating is adapted to selectively adjust the angle of the projected holographic image from the display based on feedback from the passenger monitoring system (a diffraction grating, optically dispersive element 350, may be adjustable with respect to an angle of incidence between its surface and incident light which is projected onto the surface, [0065] and [claim 36]). It would have been obvious to one of ordinary skill in the art at the time the application was filed to modify the system of Rao to include wherein the diffraction grating is adapted to selectively adjust the angle of the projected holographic image from the display based on feedback from the passenger monitoring system, as taught by Rabolt, for the predictable advantage that the method may also be used in various industrial applications to measure and detect the thickness, either in transmission or reflection mode, and the chemical structure and orientation of coatings/films, without the use of moving parts or calculation-intensive Fourier Transform interferometric techniques, as taught by Rabolt in [0037] and [0045].
Regarding claim 5, the modified Rao teaches the system according to claim 4 (see above), wherein the holographic image comprises a single two-dimensional holographic image, and the display screen, when operating in the second mode, is adapted to re-direct the single two-dimensional holographic image directly to both a right eye of the passenger and a left eye of the passenger simultaneously, wherein the passenger perceives the two-dimensional holographic image floating within the vehicle in front of the passenger (heads-up displays often provide a two-dimensional display and/or augmented reality experience that overlays images in a projected space that occupies up to two planes .. includes an optical element to split the left and right eye image for every pixel of the image and then, using eye tracking, ties the left and right eye image back to where a user's eyes are focused. In this way, the disclosure provides a dynamic way of changing the focal point of where the image is tied together to create a continuous plane wherever the user is focusing, [0004]; display configuration may include lasers that create multiple holographic sheets to produce the multiple image plane effect, [0046], see [0054]).
Regarding claims 8 and 20, Rao teaches the system/vehicle according to claim 2/17 (see above), wherein: the holographic image includes a right-eye image and a left-eye image; the display comprises a right-eye display and a left-eye display; the compute engine is adapted to calculate the right-eye image and a first adjustable diffraction grating, the left-eye image and a second adjustable diffraction grating, and to simultaneously encode the first diffraction grating into the right-eye image and encode the right-eye image to the right-eye display and encode the second diffraction grating into the left-eye image and encode the left-eye image to the left-eye display; the right-eye display and the left-eye display are adapted to project, simultaneously, the right-eye image to the display screen and the left-eye image to the display screen (display system that includes an optical element to split the left and right eye image for every pixel of the image and then, using eye tracking, ties the left and right eye image back to where a user's eyes are focused. 
In this way, the disclosure provides a dynamic way of changing the focal point of where the image is tied together to create a continuous plane wherever the user is focusing, [0004]; a display configuration may include lasers that create multiple holographic sheets to produce the multiple image plane effect, [0046]; the optical element is additionally or alternatively configured to split each pixel of a selected image to create a left eye image and a right eye image, the left eye image and the right eye image forming the generated three-dimensional image, [0054]), and wherein, the first diffraction grating is adapted to adjust the angle of the projected right-eye image from the right-eye display based on feedback from the passenger monitoring system, such that the display screen, when in the second mode, re-directs the right-eye image directly to the right eye of the passenger, and, simultaneously, the second diffraction grating is adapted to adjust the angle of the projected left-eye image from the left-eye display based on feedback from the passenger monitoring system, such that the display screen, when in the second mode, re-directs the left-eye image directly to the left eye of the passenger, wherein the right-eye image and the left-eye image are slightly different perspectives of a single image such that when the right eye of the passenger receives the right-eye image and the left eye of the passenger receives the left-eye image, the passenger perceives a three-dimensional image floating within the vehicle in front of the passenger (FIG. 4 shows an example flow chart of a method for adjusting a display based on occupant monitoring, [0017]; User/vehicle input may include occupant monitoring, such that information from vehicle systems may be used to control display personalization based on an occupant. FIG. 4 shows an example method 400 for adjusting a display based on occupant monitoring. For example, method 400 may be performed in order to provide the adjustments at 326 of method 300. 
Method 400 may be performed by an in-vehicle computing system, which may include a display controller, such as display controller 214 of FIG. 2, [0041]; display appearance preferences, user interface preferences, display content preferences, etc., which may be learned via occupant monitoring (e.g., identifying display usage patterns of the occupant and/or identifying reactions of the occupant to different display configurations) and/or input directly from the occupant).
Rao does not explicitly teach that the diffraction grating is adapted to adjust the angle of the projected right-eye image from the right-eye display based on feedback from the passenger monitoring system.
Rao and Rabolt are related as optical systems. Rabolt discloses the diffraction grating is adapted to adjust the angle based on feedback from the passenger monitoring system (a diffraction grating, optically dispersive element 350, may be adjustable with respect to an angle of incidence between its surface and incident light which is projected onto the surface, [0065] and [claim 36]). It would have been obvious to one of ordinary skill in the art at the time the application was filed to modify the system of Rao to include the diffraction grating is adapted to adjust the angle based on feedback from the passenger monitoring system, as taught by Rabolt, for the predictable advantage that the method may also be used in various industrial applications to measure and detect the thickness, either in transmission or reflection mode, and the chemical structure and orientation of coatings/films, without the use of moving parts or calculation-intensive Fourier Transform interferometric techniques, as taught by Rabolt in [0037] and [0045].
Regarding claim 12, Rao teaches the method according to claim 10 (see above), including the projecting, with the display, the holographic image to the display screen that is positioned for viewing by the passenger (display system 102 of a vehicle 103 is controlled to project virtual images into an environment of a user 104, [0021]; a display configuration may include lasers that create multiple holographic sheets to produce the multiple image plane effect, [0046]), and the compute engine (controller 214, 616, as compute engine, see at least [0024], [0050], Figs. 2A; 6).
Rao does not explicitly teach encoding an adjustable diffraction grating into the holographic image and adjusting, with the adjustable diffraction grating, the angle of the projected holographic image from the display based on feedback from the passenger monitoring system.
Rao and Rabolt are related as optical systems. Rabolt discloses an adjustable diffraction grating and adjusting, with the adjustable diffraction grating, the angle of the projected holographic image from the display based on feedback from the passenger monitoring system (a diffraction grating, optically dispersive element 350, may be adjustable with respect to an angle of incidence between its surface and incident light which is projected onto the surface, [0065] and [claim 36]). It would have been obvious to one of ordinary skill in the art at the time the application was filed to modify the system of Rao to include an adjustable diffraction grating encoded into the holographic image and adjusting, with the adjustable diffraction grating, the angle of the projected holographic image from the display based on feedback from the passenger monitoring system, as taught by Rabolt, for the predictable advantage that the method may also be used in various industrial applications to measure and detect the thickness, either in transmission or reflection mode, and the chemical structure and orientation of coatings/films, without the use of moving parts or calculation-intensive Fourier Transform interferometric techniques, as taught by Rabolt in [0037] and [0045].
Regarding claim 13, the modified Rao teaches the method according to claim 12 (see above), wherein the holographic image comprises a single two-dimensional holographic image, and the display screen, when operating in the second mode, is adapted to re-direct the single two-dimensional holographic image directly to both a right eye of the passenger and a left eye of the passenger simultaneously, wherein the passenger perceives the two-dimensional holographic image floating within the vehicle in front of the passenger (heads-up displays often provide a two-dimensional display and/or augmented reality experience that overlays images in a projected space that occupies up to two planes .. includes an optical element to split the left and right eye image for every pixel of the image and then, using eye tracking, ties the left and right eye image back to where a user's eyes are focused. In this way, the disclosure provides a dynamic way of changing the focal point of where the image is tied together to create a continuous plane wherever the user is focusing, [0004]; display configuration may include lasers that create multiple holographic sheets to produce the multiple image plane effect, [0046], see [0054]).
Regarding claim 17, Rao teaches a vehicle having a system for generating a floating image for a passenger within the vehicle, the system (refer to US 20210070176) comprising: a passenger monitoring system (Eye tracking module 216, [0032]; see-through display creating overlaying images as a floating image for the user and passenger; overlays images in a projected space that occupies up to two planes, e.g., a near plane and a far plane, [0004]) adapted to monitor the position of the passenger’s head and eyes (Eye tracking module 216 as passenger monitoring system, tracking the movement of eyes and focal point moves/gaze tracking/head motion as position of head, [0022]-[0024]; Eye tracking module 216 … within the vehicle, configured to analyze the captured images of the eyes and determine the gaze direction and/or other eye tracking details …, [0032]; receiving occupant monitoring data from one or more occupant monitoring sensors and controlling an output of the image light based on the occupant monitoring data, wherein the occupant monitoring data includes an occupant identifier and/or an occupant location, [0053]-[0054]); a compute engine (controller 214, 616, as compute engine, see at least [0024], [0050], Figs. 2A and 6) in communication with the passenger monitoring system (Eye tracking 216, Fig.
2A; controller 214 to control mechanism of display 208 includes an eye tracking module 216, [0031]; eye tracking module 216 may include and/or may be in communication with one or more processing modules, [0032]) and adapted to: calculate a holographic image; encode a lens function into the holographic image based on information received from the passenger monitoring system (display 208; display configuration may include lasers that create multiple holographic sheets, [0010]; see [0023]-[0024], [0031]-[0035], [0046] and [0050]; display controller 214 and display controller 616 as compute engine in communication with eye tracking module 216 performs eye tracking and adjusting position of displayed data to align with eye gaze direction into micro lens arrays 210a/210b forming display unit 207 [Fig. 6] as picture generating unit creating multiple holographic sheets); encode the holographic image to a display of a picture generating unit (PGU) hologram generator (micro lens array 210a and/or 210b, and a 3D element, e.g., 3D element 212, [0024]; display configuration that creates multiple holographic sheets, [0046]; display controller 214 and display controller 616 as compute engine in communication with eye tracking module 216 performs eye tracking and adjusting position of displayed data to align with eye gaze direction into micro lens arrays 210a/210b forming display unit 207 as picture generating unit creating multiple holographic sheets; controlling a display based on eye tracking, [0035]; computing system may receive information from one or more sensors of the vehicle, or in communication with the vehicle, to determine the location of the occupant (e.g., using visual tracking of the occupant), [0044]; see at least [0022]-[0024], [0035], [0044]-[0046], [0049]); and a display screen positioned for viewing by the passenger (display system 102 includes a light source with display unit 106, which may project light in a controlled manner to form virtual images …
may project light toward a fold mirror 108, which may be planar or aspherical, and which reflects received light toward a rotatable mirror 110, which may be aspherical. The rotatable mirror may direct light toward a glare trap 112 and light trap 114, usable to control the light to appear in a position that is viewable through a windshield 116 to appear at a virtual location 118, [0022]), the display screen including a reversible electromagnetic coating adapted to selectively switch (selective switching because lens actuator 612 may include one or more components to physically move the movable optic 602 to change a physical location of the movable optic within the display configuration; movable optic 602 may be moved in any suitable manner, [0050]) between a first mode, wherein the reversible electromagnetic coating is substantially transparent, and a second mode, wherein the reversible electromagnetic coating is reflective and the display screen is adapted to function as a beam steering device (micro lens array 210a and/or 210b, and a 3D element, e.g., 3D element 212, [0024]; display configuration that creates multiple holographic sheets, [0046]; display controller 214 and display controller 616 as compute engine in communication with eye tracking module 216 performs eye tracking and adjusting position of displayed data to align with eye gaze direction into micro lens arrays 210a/210b forming display unit 207 as picture generating unit creating multiple holographic sheets; controlling a display based on eye tracking, [0035]; computing system may receive information from one or more sensors of the vehicle, or in communication with the vehicle, to determine the location of the occupant (e.g., using visual tracking of the occupant), [0044]; see at least [0022]-[0024], [0035], [0044]-[0046], [0049]); and wherein the display is adapted to project the holographic image (micro lens array 210a and/or 210b, and a 3D element, e.g., 3D element 212, [0024]; display configuration that creates multiple holographic sheets,
[0046]), to the display screen, and the display screen is adapted to re-direct the projected holographic image to the eyes of the passenger based on the information received from the passenger monitoring system (the display system 102 includes a light source with display unit 106, which may project light in a controlled manner to form virtual images; the display unit 106 may project light toward a fold mirror 108, which may be planar or aspherical, and which reflects received light toward a rotatable mirror 110, which may be aspherical; the rotatable mirror may direct light toward a glare trap 112 and light trap 114, usable to control the light to appear in a position that is viewable through a windshield 116 to appear at a virtual location 118; the virtual location 118 may be controlled to be within an optical path 120 that originates from a head motion and eyebox 122 of the user 104 and represents at least a portion of a viewable range of the user 104, [0022]; Fig. 2A shows a display configuration that may include a display 208, one or more micro lens arrays 210a and 210b, and a three-dimensional element 212 positioned between the display 208 and micro lens array 210a; the three-dimensional element 212 may include a parallaxial or lenticular element (e.g., film) that generates auto-stereoscopic images from the output of display 208, [0023]; display configuration may include lasers that create multiple holographic sheets to produce the multiple image plane effect, [0046]; the disclosure provides for a three-dimensional augmented reality display system that includes an optical element to split the left and right eye image for every pixel of the image and then, using eye tracking, ties the left and right eye image back to where a user's eyes are focused; in this way, the disclosure provides a dynamic way of changing the focal point of where the image is tied together to create a continuous plane wherever the user is focusing, creating a stereoscopic image, [0004]).
Rao does not explicitly teach a diffraction grating adapted to selectively adjust the angle based on feedback from the passenger monitoring system, or encoding the diffraction grating into the holographic image.
Rao and Rabolt are related as optical systems. Rabolt discloses a diffraction grating adapted to selectively adjust the angle based on feedback from the passenger monitoring system, and encoding the diffraction grating into the holographic image (a diffraction grating, optically dispersive element 350, may be adjustable with respect to an angle of incidence between its surface and incident light which is projected onto the surface, [0065] and [claim 36]). It would have been obvious to one of ordinary skill in the art at the time the application was filed to modify the system of Rao such that the diffraction grating is adapted to selectively adjust the angle based on feedback from the passenger monitoring system, as taught by Rabolt, for the predictable advantage that the method may also be used in various industrial applications to measure and detect the thickness, either in transmission or reflection mode, and the chemical structure and orientation of coatings/films, without the use of moving parts or calculation-intensive Fourier Transform interferometric techniques, as taught by Rabolt in [0037] and [0045].
Allowable Subject Matter
Claims 6-7, 14-16 and 18-19 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten to overcome the rejection(s) under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, set forth in this Office action and to include all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to RAHMAN ABDUR, whose telephone number is (571) 270-0438. The examiner can normally be reached from 8:30 am to 5:30 pm PST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Bumsuk Won can be reached at (571) 272-2713. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/R.A/Examiner, Art Unit 2872
/BUMSUK WON/Supervisory Patent Examiner, Art Unit 2872