Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over SINGH (US 2022/0392170 A1) in view of CHOI et al. (US 2013/0314344 A1) and LEE et al. (US 2019/0212966 A1).
As per claim 1, Singh teaches the claimed “method for handling a display device in an extended reality (XR) environment,” comprising: “obtaining, by an XR device, at least one media of a scene comprising a plurality of display devices” (Singh); “determining, by the XR device, at least one pixel group comprising locations of the plurality of display devices in the at least one media” (Singh, [0049] - In step 503, the computing device may process the images received in step 502 to detect one or more portions of one or more display devices. The images received in step 502 might capture images of one or more display devices in a physical environment of a user. For example, the images might capture all or portions of the first monitor 401, the second monitor 402, the television 403, and the laptop screen 404. In this manner, the images might indicate a location, with respect to the XR device 202, of a display device; [0051] - Detecting the one or more portions of the one or more display devices may comprise detecting, based on the one or more images, a bounding box corresponding to a location, in the one or more images, of the display device. A bounding box may correspond to an indication of the boundaries of a display device. For example, a bounding box may be drawn over a display device such that the four corners of the bounding box substantially correspond to the four corners of the display device); “determining, by the XR device, at least one parameter based on the at least one determined pixel group” (Singh, [0050] - the images received in step 502 to detect the one or more portions of the one or more display devices, a location of the one or more display devices may be determined… this location information may allow the computing device to determine, for example, which display device, of a plurality of display devices, input may be intended for. For example, by comparing the gaze of a user with the location information of a display, the computing device may be able to determine whether a user is looking at a particular display). It is noted that Singh does not explicitly teach “wherein the at least one parameter comprises at least one of a frequency and a duty cycle; extracting, by the XR device, a pulse width modulation (PWM) signal corresponding to the display device from the plurality of display devices based on the at least one parameter”; however, Singh’s identification of a selected display device (Singh, [0050] - the images received in step 502 to detect the one or more portions of the one or more display devices, a location of the one or more display devices may be determined… this location information may allow the computing device to determine, for example, which display device, of a plurality of display devices, input may be intended for. 
For example, by comparing the gaze of a user with the location information of a display, the computing device may be able to determine whether a user is looking at a particular display; [0056] - the XR device 202 may modify the frame rate of the cameras 203d to substantially match that of the display devices to avoid the appearance of flicker) suggests that any conventional means of recognizing the selected display device among a plurality of display devices can be used; furthermore, Choi teaches that a display device can be identified based on its pulse width modulation (PWM) signal, which is associated with its display frequency or duty cycle, or the XR device can “recognize, by the XR device, the display device from the plurality of display devices by correlating the extracted PWM signal with an identifier (ID) of the display device stored in at least one memory of the XR device” (see Choi, [0083] - a PWM signal applied from a backlight driver to drive a panel has a predetermined duty ratio and a predetermined frequency. The luminance sensor 220 measures only a part of backlight emitted from a BLU through the panel. The luminance sensor 220 converts the emitted backlight to an electric signal corresponding to the PWM signal; [0046] - The plurality of display units are respectively supplied with backlight through pulse width modulation (PWM) signals having different frequencies. The input apparatus 200 measures a luminance signal of light emitted from a panel of the display unit into which the user command is input and compares a frequency of the measured luminance signal with the frequencies of the PWM signals of the other display units in order to distinguish the display unit of the plurality of display units into which the user command is input, from the other display units; [0061] - In detail, the backlight driver 135 applies a driving signal (e.g., a PWM signal) having a particular frequency and a particular duty ratio to the BLU 136 under control of the controller 150; [0063] - Each of the plurality of display units 130 may be realized like the display unit 131 described above. However, frequencies of driving signals applied to BLUs of the display units 130 may be differently set to distinguish the display units 130 from one another) (see also Lee, [0073]-[0074] - the screen 120 is a screen recognized by an electronic apparatus when the display 100 outputs the screen 110 at a rate in which the display 100 blinks 100 times per second. In an embodiment, the screen 130 is a screen recognized by an electronic apparatus when the display 100 outputs the screen 110 at a rate in which the display 100 blinks 120 times per second. In an embodiment, the screen 140 is a screen recognized by an electronic apparatus when the display 100 outputs the screen 110 at a rate in which the display 100 blinks 240 times per second. As in the screen 120, screen 130 and screen 140 illustrated in FIG. 1, VLC data having data values that vary according to the number of blinks (or blink rate) of the LED array included in the display 100 may be transmitted or received; [0241] - Referring to FIG. 13, when a plurality of external displays, for example external display 1320, external display 1330, and external display 1340 exist around the electronic apparatus 750, the electronic apparatus 750 may receive VLC data output by any one of external display 1320, external display 1330, and external display 1340, for example external display 1330. 
The electronic apparatus 750 may easily identify one external display from among the external display 1320, external display 1330, and external display 1340, and may perform pairing with the identified external display; [0242] - In an embodiment, the electronic apparatus 750 may allow the receiver 410 included therein to recognize a screen of any one of the external display 1320, external display 1330, and external display 1340, for example external display 1330, thereby obtaining the identification information about the external display 1330 quickly and conveniently. The electronic apparatus 750 may, based on the obtained identification information, display a user interface screen 1310, which is a user interface for performing the pairing operation. Accordingly, the user, by referring to the identification information 1360 included in the user interface screen 1310, may conveniently identify whether the external display 1330 is an external display with which the pairing operation is to be performed). Thus, it would have been obvious, in view of Choi and Lee, to configure Singh’s method as claimed by using a PWM signal to recognize a specific desired display unit. The motivation is to allow the user of the XR system to interact with the corresponding display device.
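For illustration only, the following minimal Python sketch shows one way the claimed correlation step could work: a measured backlight flicker frequency is compared against a pre-stored frequency-to-ID table, in the spirit of Choi's pre-stored frequency information ([0100]). The table contents, tolerance, and function names are assumptions introduced for this sketch, not details taken from Singh, Choi, or Lee.

# Hypothetical sketch: match an extracted PWM frequency against stored display IDs.
# Stored mapping of backlight PWM frequency (Hz) to a display identifier; values
# are illustrative assumptions only.
STORED_IDS = {
    240.0: "monitor_401",
    360.0: "monitor_402",
    480.0: "television_403",
}

def identify_display(measured_freq_hz: float, tolerance_hz: float = 5.0) -> str | None:
    """Return the ID whose stored PWM frequency is closest to the measured one,
    or None if nothing falls within the tolerance."""
    best_id, best_err = None, tolerance_hz
    for freq, display_id in STORED_IDS.items():
        err = abs(freq - measured_freq_hz)
        if err <= best_err:
            best_id, best_err = display_id, err
    return best_id

print(identify_display(361.7))  # -> "monitor_402"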
Claim 2 adds into claim 1 “performing, by the XR device, at least one action based on at least one of a user input and a defined input, wherein the at least one action comprises at least one of controlling the display device and interacting with the display device” (Singh, [0058] - Detecting input in the XR environment might comprise detecting any form of interaction by a user. Such interaction might be captured by the motion sensitive devices 203c, the cameras 203d, the position tracking elements 203e, and/or the input/output 203f of the XR device 202. For example, the motion sensitive devices 203c might track movement of a user's arms and/or hands, such that the interaction might comprise a gesture, a pointing motion, or the like. As another example, the position tracking elements 203e might track movement of a user in three-dimensional space, such that the user moving from one position in a room to another might be input by the user. As another example, the cameras 203d might capture motion of the limbs of the user while those limbs are in frame, such as hand and/or arm gestures made by the user might be captured by the cameras 203d; [0061] - The computing device may then present, in the XR environment and based on the one or more user interface elements, one or more virtual interface elements. For example, buttons might be represented in the XR environment as three-dimensional large push-buttons which the user might be able to push. As another example, an input field might be represented in the XR environment as a typewriter with a large keyboard such that the user might be able to push the keys of the keyboard to input text. As yet another example, sliders might be represented in the XR environment as large physical sliders such that the user might grab and pull the slider in various directions. The computing device may then receive, in the XR environment and via the one or more virtual interface elements, the input. In this manner, the user might be more easily interact with user interface elements in a real-world display).
Claim 3 adds into claim 2 “wherein the performing, by the XR device, the at least one action comprises: fetching, by the XR device, a mapping between the ID and the display device from the plurality of display devices stored in at least one memory; querying, by the XR device, the at least one memory using the ID to retrieve information of the display device from the plurality of display devices; retrieving, by the XR device, at least one of a device name, a manufacturer information, and a metadata for processing to identify the display device from the plurality of display devices based on the ID” (Choi, [0091]-[0092] - The storage unit 250 stores patterns engraved on display units, a map table in which coordinates corresponding to the patterns are recorded, and frequency information of a driving signal applied to each of the display units… Therefore, the controller 230 compares patterns captured by the image pickup unit 210 with the map table to recognize a coordinate corresponding to a captured area. Also, the controller 230 compares a frequency of a luminance signal measured by the luminance sensor 220 with a frequency of the driving signal applied to each of the display units to identify one of the display units; [0100] - the input apparatus 200 may compare the received frequency information with a frequency of a PWM signal of each of the plurality of display units 130 pre-stored in a storage unit to identify one of the plurality of display units 130); and “performing, by the XR device, the at least one action based on at least one of the device name, the manufacturer information, and the metadata” (Singh, [0064] - Because the input received in the XR environment might be gesture input, the input might need to be translated into corresponding input for the content displayed by a particular display device; [0070] - FIG. 7 depicts a table in a database 700 that indicates correlations between gestures in an XR environment and inputs for different applications). Thus, it would have been obvious, in view of Choi and Lee, to configure Singh’s method as claimed by using a prestored ID to recognize a specific desired display unit. The motivation is to allow the user of the XR system to interact with the corresponding desired display unit.
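As a hedged illustration of the claimed querying step, the sketch below retrieves a device name, manufacturer information, and metadata from a memory-resident table keyed by the recognized ID. All field names and values are hypothetical and are not drawn from Singh, Choi, or Lee.

# Illustrative sketch only: a mapping from a recognized display ID to stored
# device metadata; the fields are hypothetical.
DEVICE_TABLE = {
    "monitor_402": {
        "device_name": "Office Monitor 2",
        "manufacturer": "ExampleCorp",  # hypothetical metadata
        "metadata": {"input": "gesture", "protocol": "hdmi-cec"},
    },
}

def query_device(display_id: str) -> dict | None:
    """Query the stored mapping with the ID recovered from the PWM signal."""
    return DEVICE_TABLE.get(display_id)

info = query_device("monitor_402")
print(info["device_name"] if info else "unknown display")  # -> "Office Monitor 2"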
Claim 16 adds into claim 2 “wherein the performing, by the XR device, the at least one action based on at least one of a user input and a defined input comprises: presenting, by the XR device, a virtual control interface to the user to provide a user input; and performing, by the XR device, the at least one action based on the user input” (Singh, [0058] - Detecting input in the XR environment might comprise detecting any form of interaction by a user. Such interaction might be captured by the motion sensitive devices 203c, the cameras 203d, the position tracking elements 203e, and/or the input/output 203f of the XR device 202. For example, the motion sensitive devices 203c might track movement of a user's arms and/or hands, such that the interaction might comprise a gesture, a pointing motion, or the like. As another example, the position tracking elements 203e might track movement of a user in three-dimensional space, such that the user moving from one position in a room to another might be input by the user. As another example, the cameras 203d might capture motion of the limbs of the user while those limbs are in frame, such as hand and/or arm gestures made by the user might be captured by the cameras 203d; [0061] - The computing device may then present, in the XR environment and based on the one or more user interface elements, one or more virtual interface elements. For example, buttons might be represented in the XR environment as three-dimensional large push-buttons which the user might be able to push. As another example, an input field might be represented in the XR environment as a typewriter with a large keyboard such that the user might be able to push the keys of the keyboard to input text. As yet another example, sliders might be represented in the XR environment as large physical sliders such that the user might grab and pull the slider in various directions. The computing device may then receive, in the XR environment and via the one or more virtual interface elements, the input. In this manner, the user might be more easily interact with user interface elements in a real-world display).
Claim 17 adds into claim 16 “providing, by the XR device, a visual and auditory feedback to the user confirming the user action” (Singh, [0033] - The audio devices 203b may be any devices which may receive and/or output audio associated with an XR environment. For example, the audio devices 203b may comprise speakers which direct audio towards the ears of a user... The audio devices 203b may be used to provide an audio-based XR environment to a user of the XR device 202; [0059] - The computing device may determine, based on a motion property of the user gesture, the input. Such motion properties might be a direction of the user gesture, a speed of the user gesture, an orientation of the user gesture, or the like. For example, for a volume control gesture, movement upward might signify turning volume up, whereas movement downward might signify turning volume down. As another example, a quick swipe by a user to the left might signify going back in a web browser, whereas a slow swipe by a user to the left might signify scrolling horizontally to the left).
Claim 4 adds into claim 1 “wherein the determining, by the XR device, the at least one pixel group representing the display device from the plurality the locations in the at least one media comprises: identifying, by the XR device, a region of interest (RoI) corresponding to the display device by analyzing a temporal variation in a pixel intensity of the at least one media” (Singh, [0051] - Detecting the one or more portions of the one or more display devices may comprise detecting, based on the one or more images, a bounding box corresponding to a location, in the one or more images, of the display device. A bounding box may correspond to an indication of the boundaries of a display device. For example, a bounding box may be drawn over a display device such that the four corners of the bounding box substantially correspond to the four corners of the display device; Choi, [0046] - The input apparatus 200 measures a luminance signal of light emitted from a panel of the display unit into which the user command is input and compares a frequency of the measured luminance signal with the frequencies of the PWM signals of the other display units in order to distinguish the display unit of the plurality of display units into which the user command is input, from the other display units) (Noted: Singh’s bounding box showing the boundaries of a display device allows a measurement of PWM signal (i.e., frequency of the measured luminance signal, or a temporal variation in a pixel intensity)); and “determining, by the XR device, the at least one pixel group representing the display device from the plurality the locations in the at least one media” (Choi, [0046] - The plurality of display units are respectively supplied with backlight through pulse width modulation (PWM) signals having different frequencies. The input apparatus 200 measures a luminance signal of light emitted from a panel of the display unit into which the user command is input and compares a frequency of the measured luminance signal with the frequencies of the PWM signals of the other display units in order to distinguish the display unit of the plurality of display units into which the user command is input, from the other display units; [0061] - In detail, the backlight driver 135 applies a driving signal (e.g., a PWM signal) having a particular frequency and a particular duty ratio to the BLU 136 under control of the controller 150; [0063] - Each of the plurality of display units 130 may be realized like the display unit 131 described above. However, frequencies of driving signals applied to BLUs of the display units 130 may be differently set to distinguish the display units 130 from one another). Thus, it would have been obvious, in view of Choi and Lee, to configure Singh’s method as claimed by using a PWM signal to recognize a specific desired display unit. The motivation is to allow the user of the XR system to interact with the corresponding display device.
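For illustration, a minimal sketch of identifying a region of interest by analyzing temporal variation in pixel intensity follows; the frame shapes and the variance threshold are assumptions, and the approach is only one plausible reading of the cited bounding-box and luminance teachings.

# Minimal sketch: locate a flickering display region by the temporal variation
# of pixel intensity across captured frames. Shapes and threshold are assumed.
import numpy as np

def find_flicker_roi(frames: np.ndarray, var_threshold: float = 50.0) -> np.ndarray:
    """frames: (T, H, W) grayscale stack from the XR device's camera.
    Returns a boolean (H, W) mask of pixels whose intensity varies over time,
    i.e., candidate pixel groups for a PWM-driven backlight."""
    temporal_var = frames.astype(np.float64).var(axis=0)  # per-pixel variance over time
    return temporal_var > var_threshold

# Example: a synthetic 60-frame stack where a 20x20 patch blinks.
frames = np.full((60, 100, 100), 128.0)
frames[::2, 40:60, 40:60] = 200.0  # alternate-frame flicker in the patch
mask = find_flicker_roi(frames)
print(mask.sum())  # -> 400 pixels flagged as the display region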
Claim 5 adds into claim 1 “wherein the extracting, by the XR device, the PWM signal comprises extracting, by the XR device, the PWM signal from the at least one pixel group by: processing the at least one obtained media” (Singh, [0051] - Detecting the one or more portions of the one or more display devices may comprise detecting, based on the one or more images, a bounding box corresponding to a location, in the one or more images, of the display device. A bounding box may correspond to an indication of the boundaries of a display device. For example, a bounding box may be drawn over a display device such that the four corners of the bounding box substantially correspond to the four corners of the display device; Choi, [0046] - The input apparatus 200 measures a luminance signal of light emitted from a panel of the display unit into which the user command is input and compares a frequency of the measured luminance signal with the frequencies of the PWM signals of the other display units in order to distinguish the display unit of the plurality of display units into which the user command is input, from the other display units) (Noted: Singh’s bounding box showing the boundaries of a display device allows a measurement of PWM signal (i.e., frequency of the measured luminance signal, or a temporal variation in a pixel intensity)); “mapping the at least one processed media onto a grid to spatially organize the display device present in a field of view (FOV)” (Choi, [0059] - In detail, the panel 134 includes a plurality of pixels, and the panel driver 133 respectively drives the pixels to display the pixels in response to the signal-processed image in order to display the signal-processed image) (Noted: A grid of pixels on an electronic display is a matrix of tiny, addressable light-emitting or light-modulating elements (pixels) arranged in rows and columns to form images, text, and video. These displays, including LCDs, OLEDs, and LEDs, use active or passive matrix technology to control the brightness and color of each pixel—often composed of red, green, and blue (RGB) subpixels—using a grid of wires) and “analyzing the mapped grid for locating the pixels to extract the PWM signal” (Choi, [0046] - The plurality of display units are respectively supplied with backlight through pulse width modulation (PWM) signals having different frequencies. The input apparatus 200 measures a luminance signal of light emitted from a panel of the display unit into which the user command is input and compares a frequency of the measured luminance signal with the frequencies of the PWM signals of the other display units in order to distinguish the display unit of the plurality of display units into which the user command is input, from the other display units; [0061] - In detail, the backlight driver 135 applies a driving signal (e.g., a PWM signal) having a particular frequency and a particular duty ratio to the BLU 136 under control of the controller 150; [0063] - Each of the plurality of display units 130 may be realized like the display unit 131 described above. However, frequencies of driving signals applied to BLUs of the display units 130 may be differently set to distinguish the display units 130 from one another). Thus, it would have been obvious, in view of Choi and Lee, to configure Singh’s method as claimed by mapping the media onto a pixel grid to spatially organize the display device present in a field of view (FOV). 
The motivation is to allow the user of the XR system to interact with the corresponding display device.
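The following sketch illustrates, under stated assumptions, how a processed flicker mask might be mapped onto a coarse grid to spatially organize display devices within the field of view; the grid dimensions are arbitrary illustration values.

# Illustrative sketch: collapse an (H, W) boolean flicker mask into a coarse
# occupancy grid; a cell is marked if any pixel inside it flickers.
import numpy as np

def map_to_grid(flicker_mask: np.ndarray, rows: int = 8, cols: int = 8) -> np.ndarray:
    H, W = flicker_mask.shape
    grid = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            cell = flicker_mask[r * H // rows:(r + 1) * H // rows,
                                c * W // cols:(c + 1) * W // cols]
            grid[r, c] = cell.any()
    return grid

mask = np.zeros((100, 100), dtype=bool)
mask[40:60, 40:60] = True            # one display occupies the center
print(map_to_grid(mask).astype(int))  # 8x8 grid with the occupied cells set to 1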
Claim 6 adds into claim 1 “wherein the extracting, by the XR device, the PWM signal comprises extracting, by the XR device, the PWM signal by analyzing temporal variations in pixel intensity in the at least one pixel group” (Choi, [0046] - The input apparatus 200 measures a luminance signal of light emitted from a panel of the display unit into which the user command is input and compares a frequency of the measured luminance signal with the frequencies of the PWM signals of the other display units in order to distinguish the display unit of the plurality of display units into which the user command is input, from the other display units) (Noted: Choi’s measurement of PWM signal (i.e., frequency of the measured luminance signal) implies a temporal variation in a pixel intensity). Thus, it would have been obvious, in view of Choi and Lee, to configure Singh’s method as claimed by using a PWM signal to recognize a specific desired display unit. The motivation is to allow the user of the XR system to interact with the corresponding display device.
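As an illustrative sketch of extracting a flicker frequency from temporal variations in pixel intensity, the code below takes the mean intensity of a pixel group per frame and returns the dominant non-DC frequency via an FFT. It assumes a capture rate high enough to sample the PWM flicker above the Nyquist rate, which is an assumption of this sketch rather than a detail from the cited references.

# Sketch: recover the dominant flicker (PWM-related) frequency of a pixel group.
import numpy as np

def dominant_frequency(intensity: np.ndarray, fps: float) -> float:
    """intensity: 1-D mean intensity of the pixel group per frame.
    Returns the strongest non-DC frequency component in Hz."""
    centered = intensity - intensity.mean()          # drop the DC term
    spectrum = np.abs(np.fft.rfft(centered))
    freqs = np.fft.rfftfreq(len(centered), d=1.0 / fps)
    return float(freqs[spectrum.argmax()])

fps = 960.0                                  # assumed high-speed capture rate
t = np.arange(960) / fps
signal = (np.sin(2 * np.pi * 120.0 * t) > 0).astype(float)  # 120 Hz square wave
print(dominant_frequency(signal, fps))       # -> 120.0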
Claim 7 adds into claim 1 “wherein the extracting, by the XR device, the PWM signal comprises extracting, by the XR device, the PWM signal after applying a noise filtering technique on the PWM signal,” which is well known in the art of signal processing, where noise filtering removes unwanted high-frequency electrical interference and prevents disruptive electromagnetic noise from reaching the XR device, enhancing the performance of detecting the PWM signal. The motivation to use a noise filter is to enhance the PWM signal for recognizing the selected display device.
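For illustration of a conventional noise-filtering technique of the kind this reasoning relies on, the sketch below applies a Butterworth band-pass filter (via SciPy) around a plausible backlight-flicker band; the passband and sampling rate are assumptions of this sketch.

# Sketch: one conventional noise-filtering step before PWM analysis.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_pwm(raw: np.ndarray, fps: float,
                 low_hz: float = 50.0, high_hz: float = 400.0) -> np.ndarray:
    """Suppress slow lighting drift and high-frequency sensor noise so that
    only plausible PWM flicker frequencies remain."""
    nyquist = fps / 2.0
    b, a = butter(N=4, Wn=[low_hz / nyquist, high_hz / nyquist], btype="band")
    return filtfilt(b, a, raw)  # zero-phase filtering preserves edge timing

fps = 960.0
t = np.arange(960) / fps
noisy = np.sin(2 * np.pi * 120.0 * t) + np.random.default_rng(0).normal(scale=0.5, size=960)
clean = bandpass_pwm(noisy, fps)  # the 120 Hz component survives, drift/noise is attenuated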
Claim 8 adds into claim 1 “wherein the determining, by the XR device, the at least one parameter comprises determining, by the XR device, the at least one parameter by measuring a time interval between high and low states of the PWM signal” (Choi, Figure 6, [0083] - a PWM signal applied from a backlight driver to drive a panel has a predetermined duty ratio and a predetermined frequency) (Noted: Choi’s PWM frequency is determined by the period showing the time interval between high and low PWM states). Thus, it would have been obvious, in view of Choi and Lee, to configure Singh’s method as claimed by using a PWM signal to recognize a specific desired display unit. The motivation is to allow the user of the XR system to interact with the corresponding display device.
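The following minimal sketch measures the time interval between high and low states of a recovered PWM waveform to derive its frequency and duty cycle; the binarization threshold and sampling rate are illustrative assumptions.

# Sketch: derive frequency and duty cycle from high/low state intervals.
import numpy as np

def pwm_parameters(signal: np.ndarray, fps: float):
    """Binarize the signal, then measure one period as the span between
    consecutive rising edges; duty cycle is the high-time within that period."""
    high = signal > signal.mean()
    rising = np.flatnonzero(~high[:-1] & high[1:]) + 1
    if len(rising) < 2:
        return None
    period_samples = rising[1] - rising[0]
    high_samples = high[rising[0]:rising[1]].sum()
    return fps / period_samples, high_samples / period_samples

fps = 960.0
t = np.arange(960) / fps
sig = ((t * 120.0) % 1.0 < 0.25).astype(float)  # 120 Hz, 25% duty cycle
print(pwm_parameters(sig, fps))                 # -> (120.0, 0.25)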
Claim 9 adds into claim 1 “wherein the ID of the display device from the plurality of display devices is stored in the at least one memory by at least one of a frequency value and a hash value representation” (Choi, [0046] - The plurality of display units are respectively supplied with backlight through pulse width modulation (PWM) signals having different frequencies. The input apparatus 200 measures a luminance signal of light emitted from a panel of the display unit into which the user command is input and compares a frequency of the measured luminance signal with the frequencies of the PWM signals of the other display units in order to distinguish the display unit of the plurality of display units into which the user command is input, from the other display units; [0061] - In detail, the backlight driver 135 applies a driving signal (e.g., a PWM signal) having a particular frequency and a particular duty ratio to the BLU 136 under control of the controller 150; [0063] - Each of the plurality of display units 130 may be realized like the display unit 131 described above. However, frequencies of driving signals applied to BLUs of the display units 130 may be differently set to distinguish the display units 130 from one another; [0091]-[0092] - The storage unit 250 stores patterns engraved on display units, a map table in which coordinates corresponding to the patterns are recorded, and frequency information of a driving signal applied to each of the display units… Therefore, the controller 230 compares patterns captured by the image pickup unit 210 with the map table to recognize a coordinate corresponding to a captured area. Also, the controller 230 compares a frequency of a luminance signal measured by the luminance sensor 220 with a frequency of the driving signal applied to each of the display units to identify one of the display units; [0100] - the input apparatus 200 may compare the received frequency information with a frequency of a PWM signal of each of the plurality of display units 130 pre-stored in a storage unit to identify one of the plurality of display units 130). Thus, it would have been obvious, in view of Choi and Lee, to configure Singh’s method as claimed by using a pre-stored ID to recognize a specific desired display unit. The motivation is to allow the user of the XR system to interact with the corresponding display device.
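As a hedged sketch of storing the ID by “at least one of a frequency value and a hash value representation,” the code below quantizes the measured frequency and duty cycle and hashes them into a lookup key; the quantization scheme is an assumption for illustration only.

# Sketch: store a display ID under a hash of its quantized PWM parameters so
# the memory lookup is an exact key match rather than a floating-point compare.
import hashlib

def freq_key(freq_hz: float, duty: float) -> str:
    token = f"{round(freq_hz)}:{round(duty, 2)}"
    return hashlib.sha256(token.encode()).hexdigest()[:16]

id_table = {freq_key(240.0, 0.50): "monitor_401",
            freq_key(360.0, 0.25): "monitor_402"}

measured = freq_key(360.2, 0.251)   # quantizes to the same bucket as 360 Hz / 0.25
print(id_table.get(measured))       # -> "monitor_402"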
Claim 10 adds into claim 1 “wherein the obtaining at least one media of the scene comprising the display device from the plurality of display devices comprises: generating the PWM signal unique to the display device from the plurality of display devices by at least one state of a lighting device” (Choi, [0091]-[0092] - The storage unit 250 stores patterns engraved on display units, a map table in which coordinates corresponding to the patterns are recorded, and frequency information of a driving signal applied to each of the display units… Therefore, the controller 230 compares patterns captured by the image pickup unit 210 with the map table to recognize a coordinate corresponding to a captured area. Also, the controller 230 compares a frequency of a luminance signal measured by the luminance sensor 220 with a frequency of the driving signal applied to each of the display units to identify one of the display units; [0100] - the input apparatus 200 may compare the received frequency information with a frequency of a PWM signal of each of the plurality of display units 130 pre-stored in a storage unit to identify one of the plurality of display units 130). Thus, it would have been obvious, in view of Choi and Lee, to configure Singh’s method as claimed by using a PWM signal to recognize a specific desired display unit. The motivation is to allow the user of the XR system to interact with the corresponding display device.
Claim 11 adds into claim 1 “wherein before the extracting, by the XR device, the PWM signal corresponding to the display device comprises: monitoring, by the XR device, the display device after activating a pass-through mode on the XR device” (Singh, [0054]-[0055] - the XR device 202 may allow the user to see through a portion of the display to view the one or more display devices… In some circumstances (e.g., AR environments), portions of real-world environments may be displayed as part of the XR environment; Choi, [0044] - The display apparatus 100 includes a plurality of display units (e.g., displays)) (Noted: The pass-through mode of Singh’s XR device allows the system to analyze the contents on the display units as in Choi to derive the PWM signals associated with the display devices respectively). Thus, it would have been obvious, in view of Choi and Lee, to configure Singh’s method as claimed by using a PWM signal to recognize a specific desired display unit appearing on a see-through XR display. The motivation is to allow the user of the XR system to interact with the corresponding display device.
Claim 12 adds into claim 11 “wherein, in the pass-through mode, the XR device is configured to collect an identifier (ID) of the display device from the plurality of display devices by using the extracted PWM signal” (Choi, Figure 6, [0083] - a PWM signal applied from a backlight driver to drive a panel has a predetermined duty ratio and a predetermined frequency) (Noted: the uniqueness of Choi’s PWM signal to each type of display device is used to identify the display device). Thus, it would have been obvious, in view of Choi and Lee, to configure Singh’s method as claimed by using a pre-stored ID to recognize a specific desired display unit. The motivation is to allow the user of the XR system to interact with the corresponding display device.
Claim 13 adds into claim 1 “wherein obtaining, by an XR device, at least one media of a scene comprising a plurality of display devices, comprises: scanning, by an XR device, an environment in which a display device from a plurality of display devices is located to emit a pulse width modulation (PWM) signal” (Choi, [0046] - The plurality of display units are respectively supplied with backlight through pulse width modulation (PWM) signals having different frequencies. The input apparatus 200 measures a luminance signal of light emitted from a panel of the display unit into which the user command is input and compares a frequency of the measured luminance signal with the frequencies of the PWM signals of the other display units in order to distinguish the display unit of the plurality of display units into which the user command is input, from the other display units; [0061] - In detail, the backlight driver 135 applies a driving signal (e.g., a PWM signal) having a particular frequency and a particular duty ratio to the BLU 136 under control of the controller 150; [0063] - Each of the plurality of display units 130 may be realized like the display unit 131 described above. However, frequencies of driving signals applied to BLUs of the display units 130 may be differently set to distinguish the display units 130 from one another), “when a user of the XR device approaches an area in which the display device is located” (Singh, [0041] - FIG. 4 depicts a physical environment around the XR device 202. Depicted in FIG. 4 are four different display devices: a first monitor 401, a second monitor 402, a television 403, and a laptop screen 404. All such display devices may be referred to as physical display devices or real display devices in that they exist in a real-world physical environment about the XR device 202, and might not necessarily be displayed in any sort of XR environment). Thus, it would have been obvious, in view of Choi and Lee, to configure Singh’s method as claimed by using a PWM signal to recognize a specific desired display unit. The motivation is to allow the user of the XR system to interact with the corresponding display device.
Claim 14 adds into claim 13 “detecting, by the XR device, the PWM signal emitted by the display device based on the scanning of the environment” (Choi, [0046] - The plurality of display units are respectively supplied with backlight through pulse width modulation (PWM) signals having different frequencies. The input apparatus 200 measures a luminance signal of light emitted from a panel of the display unit into which the user command is input and compares a frequency of the measured luminance signal with the frequencies of the PWM signals of the other display units in order to distinguish the display unit of the plurality of display units into which the user command is input, from the other display units; [0061] - In detail, the backlight driver 135 applies a driving signal (e.g., a PWM signal) having a particular frequency and a particular duty ratio to the BLU 136 under control of the controller 150; [0063] - Each of the plurality of display units 130 may be realized like the display unit 131 described above. However, frequencies of driving signals applied to BLUs of the display units 130 may be differently set to distinguish the display units 130 from one another). Thus, it would have been obvious, in view of Choi and Lee, to configure Singh’s method as claimed by using a PWM signal to recognize a specific desired display unit. The motivation is to allow the user of the XR system to interact with the corresponding display device.
Claim 15 adds into claim 1 “wherein the recognizing, by the XR device, the display device from the plurality of display devices comprises: determining an identifier (ID) corresponding to the display device from the plurality of display devices by comparing, by the XR device, the detected PWM signal with a stored PWM signal in at least one memory of the XR device” (Choi, [0046] - The plurality of display units are respectively supplied with backlight through pulse width modulation (PWM) signals having different frequencies. The input apparatus 200 measures a luminance signal of light emitted from a panel of the display unit into which the user command is input and compares a frequency of the measured luminance signal with the frequencies of the PWM signals of the other display units in order to distinguish the display unit of the plurality of display units into which the user command is input, from the other display units; [0061] - In detail, the backlight driver 135 applies a driving signal (e.g., a PWM signal) having a particular frequency and a particular duty ratio to the BLU 136 under control of the controller 150; [0063] - Each of the plurality of display units 130 may be realized like the display unit 131 described above. However, frequencies of driving signals applied to BLUs of the display units 130 may be differently set to distinguish the display units 130 from one another); and “performing, by the XR device, at least one action comprising at least one of controlling the display device and interacting with the display device” (Singh, [0058] - Detecting input in the XR environment might comprise detecting any form of interaction by a user. Such interaction might be captured by the motion sensitive devices 203c, the cameras 203d, the position tracking elements 203e, and/or the input/output 203f of the XR device 202. For example, the motion sensitive devices 203c might track movement of a user's arms and/or hands, such that the interaction might comprise a gesture, a pointing motion, or the like. As another example, the position tracking elements 203e might track movement of a user in three-dimensional space, such that the user moving from one position in a room to another might be input by the user. As another example, the cameras 203d might capture motion of the limbs of the user while those limbs are in frame, such as hand and/or arm gestures made by the user might be captured by the cameras 203d; [0061] - The computing device may then present, in the XR environment and based on the one or more user interface elements, one or more virtual interface elements. For example, buttons might be represented in the XR environment as three-dimensional large push-buttons which the user might be able to push. As another example, an input field might be represented in the XR environment as a typewriter with a large keyboard such that the user might be able to push the keys of the keyboard to input text. As yet another example, sliders might be represented in the XR environment as large physical sliders such that the user might grab and pull the slider in various directions. The computing device may then receive, in the XR environment and via the one or more virtual interface elements, the input. In this manner, the user might be more easily interact with user interface elements in a real-world display). Thus, it would have been obvious, in view of Choi and Lee, to configure Singh’s method as claimed by using a pre-stored ID to recognize a specific desired display unit. 
The motivation is to allow the user of the XR system to interact with the corresponding display device.
Claims 18-20 recite an extended reality (XR) device and a non-transitory computer-readable storage medium storing instructions based on the methods of claims 1-17; therefore, they are rejected under a similar rationale.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PHU K NGUYEN whose telephone number is (571)272-7645. The examiner can normally be reached M-F 8-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Daniel F. Hajnik can be reached at (571) 272-7642. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PHU K NGUYEN/Primary Examiner, Art Unit 2616