DETAILED ACTION
This Office action is in response to the communication filed on December 23, 2025. Claims 1 and 4-9 remain pending; claims 2-3 have been cancelled. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d), based on an application filed in Japan on April 12, 2023, has been acknowledged and considered by the Examiner. Receipt is acknowledged of papers submitted under 35 U.S.C. 119(a)-(d), which are placed on record in the application file.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Response to Arguments
Applicant’s arguments with respect to amended claims 1, 8, and 9 in the Remarks section (pages 6-10) have been fully considered but are not persuasive.
Applicant argues that the sensor of Kim detects user gaze rather than user position, and that Duan teaches, in some embodiments, that each user sees only the image rendered specifically for that user.
However, Kim clearly teaches in paragraph [0011] that the display apparatus includes a transparent display unit configured to display content including a first part and a second part on the basis of the direction in which the user is located. The user's gaze therefore corresponds to the user's location. Kim is the primary reference, and Duan is relied upon for how transparent displays are implemented: specifically, a naked-eye (non-projection) transparent 3D display including a pair of transparent substrates and a polymer dispersed liquid crystal layer held between the pair of transparent substrates, wherein the display device is viewable from one side of the transparent substrates and from another side of the transparent substrates, and wherein a gaze sensor is an image capturing device. In Duan, position information of multiple eyes of a plurality of users relative to a target screen in space is obtained, the position information including spatial coordinate information and angle information; a camera collects the information about the eyes of the users relative to the target screen, and an image of the target screen can be collected by the camera. The processing of display content taught by Kim, combined with the structure of the transparent 3D display taught by Duan, meets the claim limitations. In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).
Applicant's arguments with respect to claims 4-7 in the Remarks section (pages 9-10) have been fully considered, but they are not persuasive because those claims depend upon the features recited in the amended independent claims.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 4-6 and 8-9 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Publication 2015/0235346 A1 by Kim et al. (“Kim”) in view of U.S. Patent Publication 2024/0062341 A1 by Duan.
Regarding claim 1, Kim teaches a display system (Fig. 1, display apparatus 100) comprising: a transparent display device (Fig. 1, transparent display unit 110); and a control device which controls display of images in the display device (Kim Fig. 1, controller 120; [0045], The display apparatus 100 may include a transparent display unit 110, a controller 120), wherein the display device is transparent to a background on one side when viewed from another side and is transparent to a background on the other side when viewed from the one side (Kim Fig. 2; [0046] and [0070], The transparent display unit 110 may display content including a first part and a second part on the basis of a first direction in which a first user is located, where the first user 10 and the second user 20, positioned on opposite sides, view the same content, and the second parts 32b are further viewable against the backgrounds),
the control device detects a position of a person with respect to the display device based on a first image containing the person obtained by a device which captures the person ([0015], The sensor unit may include a gaze sensor configured to sense gaze of the first user), and displays a second image containing an object on the display device based on the detected position ([0046], The transparent display unit 110 may display content… on the basis of a first direction in which a first user is located),
wherein the position of the person with respect to the display device includes an angle at which the person is positioned with respect to the display device (Figs. 2-4), and
the second image is generated from the data of the object based on the angle containing an object directed in accordance with the angle at which the person is positioned with respect to the display device (Fig. 4; [0046]-[0047] and [0071], The transparent display unit 110 may display content including a first part and a second part on the basis of a first direction in which a first user is located. The gaze sensor 133 may receive gaze information of the first user 10 and gaze information of the second user 20. The controller of the display apparatus may determine a display direction of each second part based on input gaze information. That is, the controller may maintain a display direction of the text “habitat” 34 to which the gaze of the first user 10 is directed at an angle toward the text. The controller may implement mirror flipping on the text “whale” 35 a to which the gaze of the second user 20 is directed at an angle.)
However, Kim does not teach how the transparent display is implemented, i.e., a display including a pair of transparent substrates and a polymer dispersed liquid crystal layer held between the pair of transparent substrates, wherein the display device is viewed from one side of the transparent substrates and viewed from another side of the transparent substrates, nor that a gaze sensor is an image capturing device. Kim also does not teach that the control device includes a storage device which stores three-dimensional image data of an object, or that the second image is generated from the three-dimensional image data.
In the analogous art of multi-user large-scale transparent display devices, Duan teaches that a naked-eye 3D screen can be configured to display a 3D image based on a light barrier 3D technology. An implementation method of the light barrier 3D technology can include using a switch liquid crystal display, a polarization film, and a polymer liquid crystal layer, and producing perpendicular stripes with a direction of 90° using the liquid crystal layer and the polarization film. A switching liquid crystal display technology in the prior art switched a scattering state of a polymer dispersed liquid crystal between a first substrate and a second substrate, as in U.S. Patent Publication 2023/0350257 A1 by Muramoto, paragraph [0003]. The stripes can be several tens of micrometers wide. Light passing through the stripes can form a perpendicular grating pattern, which can be referred to as a parallax barrier. In this technology, by using the parallax barrier arranged between the backlight module and the LCD panel, in a 3D display mode, when the image that should be seen by the left eye is displayed on the liquid crystal screen, a non-transparent stripe can block the right eye. Similarly, when the image that should be seen by the right eye is displayed on the liquid crystal screen, the non-transparent stripe can block the left eye. By separating the viewable images of the left eye and the right eye, the user can see the 3D image (Duan [0003] and [0057]).
Thus, in some embodiments, when the target screen is the naked-eye 3D screen, each user can only see the rendered image corresponding to that user on the target screen. Each user can therefore see only the augmented reality content or mixed image related to that user and need not be concerned with the augmented reality content or the mixed images of the other users. Position information of multiple eyes of a plurality of users relative to a target screen in space is obtained, the position information including spatial coordinate information and angle information; a camera collects the information about the eyes of the users relative to the target screen, and an image of the target screen can be collected by the camera (Duan [0018], [0023], and [0057]). It would have been obvious before the effective filing date of the invention to have presented three-dimensional content to multiple users using polymer liquid crystal shutters. One having ordinary skill in the art would have been motivated so that each user sees only the augmented reality content or mixed image related to that user and need not be concerned with the augmented reality content or the mixed images of the other users in large-scale holographic transparent screens (Duan [0003] and [0057]).
Regarding claim 4, Kim of the combination of references further teaches the display system of claim 1, wherein the control device displays on the display device a second image expressing a state in which the object is viewed from the detected position (Fig. 4; [0046]-[0047] and [0071], The transparent display unit 110 may display content including a first part and a second part on the basis of a first direction in which a first user is located. The gaze sensor 133 may receive gaze information of the first user 10 and gaze information of the second user 20. The controller of the display apparatus may determine a display direction of each second part based on input gaze information. That is, the controller may maintain a non-mirrored display direction of the text “habitat” 34 to which the gaze of the first user 10 is directed at an angle toward the text. The controller may implement a mirror-flipped state on the text “whale” 35 a to which the gaze of the second user 20 is directed at an angle.)
Regarding claim 5, Kim of the combination of references further teaches the display system of claim 1, wherein the control device displays on the display device a second image expressing a state in which the object is facing a direction of the detected position (Fig. 4; [0046]-[0047] and [0071], The transparent display unit 110 may display content including a first part and a second part on the basis of a first direction in which a first user is located. The gaze sensor 133 may receive gaze information of the first user 10 and gaze information of the second user 20. The controller of the display apparatus may determine a display direction of each second part based on input gaze information. That is, the controller may maintain a non-mirrored display direction of the text “habitat” 34 to which the gaze of the first user 10 is directed at an angle toward the text, the text facing the first user. The controller may implement a mirror-flipped state on the text “whale” 35 a to which the gaze of the second user 20 is directed at an angle.)
Regarding claim 6, Kim of the combination of references further teaches the display system of claim 1, wherein the capturing device acquires a first image containing the person by capturing the person in front of or behind the display device (Fig. 4; [0068], Referring to FIG. 4(1), the sensor unit of the display apparatus may include a gaze sensor 133 to sense the gaze of the first user 10 and the gaze of the second user 20. In this case, the gaze sensor 133 may include a first sensor to sense the gaze of the first user 10 located in a first direction and a second sensor to sense the gaze of the second user 20 located in a second direction opposite to the first direction);
the control device displays a second image containing a first side of the object when the first image contains the person present in front of the display device, and displays a second image containing a second side of the object when the first image contains the person present behind the display device, the second side being a side opposite to the first side (Fig. 4; [0046]-[0047] and [0071], The transparent display unit 110 may display content including a first part and a second part on the basis of a first direction in which a first user is located. The gaze sensor 133 may receive gaze information of the first user 10 and gaze information of the second user 20. The controller of the display apparatus may determine a display direction of each second part based on input gaze information. That is, the controller may maintain a display direction of the text “habitat” 34 to which the gaze of the first user 10 is directed at an angle toward the text. The controller may implement mirror flipping on the text “whale” 35 a to which the gaze of the second user 20 is directed at an angle.)
Kim does not teach that the image capturing device acquires a first image.
In the analogous art of multi-user large-scale transparent display devices, Duan teaches that a naked-eye 3D screen can be configured to display a 3D image based on a light barrier 3D technology. An implementation method of the light barrier 3D technology can include using a switch liquid crystal display, a polarization film, and a polymer liquid crystal layer, and producing perpendicular stripes with a direction of 90° using the liquid crystal layer and the polarization film. A switching liquid crystal display technology in the prior art switched a scattering state of a polymer dispersed liquid crystal between a first substrate and a second substrate, as in U.S. Patent Publication 2023/0350257 A1 by Muramoto, paragraph [0003]. The stripes can be several tens of micrometers wide. Light passing through the stripes can form a perpendicular grating pattern, which can be referred to as a parallax barrier. In this technology, by using the parallax barrier arranged between the backlight module and the LCD panel, in a 3D display mode, when the image that should be seen by the left eye is displayed on the liquid crystal screen, a non-transparent stripe can block the right eye. Similarly, when the image that should be seen by the right eye is displayed on the liquid crystal screen, the non-transparent stripe can block the left eye. By separating the viewable images of the left eye and the right eye, the user can see the 3D image (Duan [0003] and [0057]).
Thus, in some embodiments, when the target screen is the naked-eye 3D screen, each user can only see the rendered image corresponding to that user on the target screen. Each user can therefore see only the augmented reality content or mixed image related to that user and need not be concerned with the augmented reality content or the mixed images of the other users. Position information of multiple eyes of a plurality of users relative to a target screen in space is obtained, the position information including spatial coordinate information and angle information; a camera collects the information about the eyes of the users relative to the target screen, and an image of the target screen can be collected by the camera (Duan [0018], [0023], and [0057]). It would have been obvious before the effective filing date of the invention to have presented three-dimensional content to multiple users using polymer liquid crystal shutters. One having ordinary skill in the art would have been motivated so that each user sees only the augmented reality content or mixed image related to that user and need not be concerned with the augmented reality content or the mixed images of the other users in large-scale holographic transparent screens (Duan [0003] and [0057]).
Regarding claim 8, Kim teaches a display device (Fig. 1, transparent display unit 110 of display apparatus 100); and a control device which controls display of images in the display device (Kim Fig. 1, controller 120; [0045], The display apparatus 100 may include a transparent display unit 110, a controller 120), wherein the display device is transparent to a background on one side when viewed from another side and is transparent to a background on the other side when viewed from the one side (Kim Fig. 2; [0046] and [0070], The transparent display unit 110 may display content including a first part and a second part on the basis of a first direction in which a first user is located, where the first user 10 and the second user 20, positioned on opposite sides, view the same content, and the second parts 32b are further viewable against the backgrounds), and
the control device detects a position of a person with respect to the display device based on a first image containing the person obtained by a device which captures the person ([0015], The sensor unit may include a gaze sensor configured to sense gaze of the first user), and displays a second image containing an object on the display device based on the detected position ([0046], The transparent display unit 110 may display content… on the basis of a first direction in which a first user is located), wherein the position of the person with respect to the display device includes an angle at which the person is positioned with respect to the display device (Figs. 2-4), and
the second image is generated from the data of the object based on the angle containing an object directed in accordance with the angle at which the person is positioned with respect to the display device (Fig. 4; [0046]-[0047] and [0071], The transparent display unit 110 may display content including a first part and a second part on the basis of a first direction in which a first user is located. The gaze sensor 133 may receive gaze information of the first user 10 and gaze information of the second user 20. The controller of the display apparatus may determine a display direction of each second part based on input gaze information. That is, the controller may maintain a display direction of the text “habitat” 34 to which the gaze of the first user 10 is directed at an angle toward the text. The controller may implement mirror flipping on the text “whale” 35 a to which the gaze of the second user 20 is directed at an angle.)
However, Kim does not teach how the transparent display is implemented, i.e., a display including a pair of transparent substrates and a polymer dispersed liquid crystal layer held between the pair of transparent substrates, wherein the display device is viewed from one side of the transparent substrates and viewed from another side of the transparent substrates, nor that a gaze sensor is an image capturing device. Kim also does not teach that the control device includes a storage device which stores three-dimensional image data of an object, or that the second image is generated from the three-dimensional image data.
In the analogous art of multi-user large-scale transparent display devices, Duan teaches that a naked-eye 3D screen can be configured to display a 3D image based on a light barrier 3D technology. An implementation method of the light barrier 3D technology can include using a switch liquid crystal display, a polarization film, and a polymer liquid crystal layer, and producing perpendicular stripes with a direction of 90° using the liquid crystal layer and the polarization film. A switching liquid crystal display technology in the prior art switched a scattering state of a polymer dispersed liquid crystal between a first substrate and a second substrate, as in U.S. Patent Publication 2023/0350257 A1 by Muramoto, paragraph [0003]. The stripes can be several tens of micrometers wide. Light passing through the stripes can form a perpendicular grating pattern, which can be referred to as a parallax barrier. In this technology, by using the parallax barrier arranged between the backlight module and the LCD panel, in a 3D display mode, when the image that should be seen by the left eye is displayed on the liquid crystal screen, a non-transparent stripe can block the right eye. Similarly, when the image that should be seen by the right eye is displayed on the liquid crystal screen, the non-transparent stripe can block the left eye. By separating the viewable images of the left eye and the right eye, the user can see the 3D image (Duan [0003] and [0057]).
Thus, in some embodiments, when the target screen is the naked-eye 3D screen, each user can only see the rendered image corresponding to that user on the target screen. Each user can therefore see only the augmented reality content or mixed image related to that user and need not be concerned with the augmented reality content or the mixed images of the other users. Position information of multiple eyes of a plurality of users relative to a target screen in space is obtained, the position information including spatial coordinate information and angle information; a camera collects the information about the eyes of the users relative to the target screen, and an image of the target screen can be collected by the camera (Duan [0018], [0023], and [0057]). It would have been obvious before the effective filing date of the invention to have presented three-dimensional content to multiple users using polymer liquid crystal shutters. One having ordinary skill in the art would have been motivated so that each user sees only the augmented reality content or mixed image related to that user and need not be concerned with the augmented reality content or the mixed images of the other users in large-scale holographic transparent screens (Duan [0003] and [0057]).
Regarding claim 9, the above rejection of the display system in claim 1 stands for the corresponding method claimed.
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Publication 2015/0235346 A1 by Kim et al. (“Kim”) in view of U.S. Patent Publication 2024/0062341 A1 by Duan, and further in view of U.S. Patent Publication 2013/0265232 A1 by Yun et al. (“Yun”).
Regarding claim 7, Kim in view of Duan does not teach the display system of claim 1, wherein, when the first image contains a plurality of persons, the control device detects a position with respect to the display device of the one of the plurality of persons whose distance from the display device is closest among the plurality of persons.
However, in the analogous art of a transparent display apparatus and a method for displaying information thereon, Yun teaches sensing a position of an object, sensing a position of a user, determining an area of the transparent display through which the object is viewable by the user, and displaying the information on the transparent display based on the area (Yun Abstract). If two people are determined to view the transparent display apparatus 100, the closest viewer may be prioritized, or viewers within an effective space of the transparent display apparatus 100 may be prioritized (Yun [0224]). It would have been obvious before the effective filing date of the invention to have prioritized closer users for the orienting of the second part as modified by Duan. One having ordinary skill in the art would have been motivated to prioritize a user and to utilize an effective space at which information is viewable based on the size of a font, or a space in which both the information and a product behind the transparent display apparatus 100 may be accurately viewed (Yun [0224]).
Conclusion
THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MAHEEN I JAVED whose telephone number is (571)272-0825. The examiner can normally be reached on Mon-Fri 9:00 am-5:00 pm ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, AMR AWAD can be reached on 571-272-7764. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MAHEEN I JAVED/Examiner, Art Unit 2621
/AMR A AWAD/Supervisory Patent Examiner, Art Unit 2621