Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claim 13 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter.
Claim 13 describes a computer program per se.
Computer programs claimed as computer listings per se, i.e., the descriptions or expressions of the programs, are not physical "things." They are neither computer components nor statutory processes, as they are not "acts" being performed. Such claimed computer programs do not define any structural and functional interrelationships between the computer program and other claimed elements of a computer which permit the computer program's functionality to be realized. In contrast, a claimed non-transitory computer-readable medium encoded with a computer program is a computer element which defines structural and functional interrelationships between the computer program and the rest of the computer which permit the computer program's functionality to be realized, and is thus statutory. See Lowry, 32 F.3d at 1583-84, 32 USPQ2d at 1035.
As an additional note, a computer program claimed as stored on a non-transitory computer-readable medium having executable programming instructions stored thereon is considered statutory.
For the sake of further prosecution, the Examiner will treat the “VR presence enhancement program” of claim 13 as a computer program claimed as stored on a non-transitory computer-readable medium having executable programming instructions stored thereon.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-4, 6-11, and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Ishikawa (US20210286432A1) and Kyoungsoo (US10561374B2).
Regarding claim 1, Ishikawa teaches a VR presence enhancement system, comprising: a VR image generator that generates VR image data including sensory temperature information, based on VR space data in which the sensory temperature information is embedded (Fig. 1 shows a VR headset (20e) and, ¶0045, describes the head mounted display (HMD) displaying virtual reality (VR); ¶0057, virtual objects are associated with attribute information stored in an object DB; ¶0065, the output control unit controls display of a picture including a virtual object by the HMD; ¶0061, the system specifies original sensory stimulation information for a virtual object and touch context; ¶0075, converting the original temperature to an actual temperature for output; ¶0079-80 describe the output control unit controlling display of an additional image, which can include an indication of the original temperature, superimposed on the displayed picture; and Fig. 5 shows the user interacting with a hot virtual object and the resultant image with the temperature information displayed. This teaches VR image data that includes sensory temperature information, based on VR space/object data with embedded temperature-related attributes.); an XR goggles controller that controls XR goggles to display a VR image based on the VR image data (Fig. 1 shows a VR headset (20e) and, ¶0045, describes the head mounted display (HMD) displaying virtual reality (VR)); and a cooling/heating device controller that controls a cooling/heating device based on the sensory temperature information extracted by the temperature detector, the cooling/heating device adjusting a sensory temperature of a VR experiencer wearing the XR goggles (¶0032 describes the stimulation output unit outputting tactile stimulation including thermal sensation, and that it can generate heat; ¶0036 describes using an air conditioner (cooling); ¶0066, the output control unit controls output of sensory stimulation by the stimulation output unit; and Fig. 4 and ¶0075, converting the original temperature to an actual temperature for thermal output control. This teaches a cooling/heating device controller that operates based on sensory temperature information to adjust the sensory temperature of the VR experiencer.)
However, Ishikawa does not explicitly disclose a temperature detector that extracts the sensory temperature information from the VR image data generated by the VR image generator.
Kyoungsoo discloses (Fig. 44 and pg. 106, col. 69 lines 12-24) that the content reproduction device reproduces multimedia content, including (col. 69 lines 27-30) VR/AR applications (S1), and obtains thermal feedback information according to reproduction of the multimedia content (S2) by decoding thermal feedback data included with the content (lines 46-52), and that (lines 65-67) the thermal feedback information includes the thermal feedback type (hot/cold). This teaches extracting sensory temperature information from the reproduced VR content data. It would have been obvious to one of ordinary skill in the art, before the effective filing date, to modify Ishikawa’s VR thermal feedback system to incorporate Kyoungsoo’s approach of embedding thermal feedback data within the multimedia content and extracting it during reproduction, in order to improve user immersion by ensuring that the thermal sensations correspond to the displayed virtual environment.
Regarding claim 2, Ishikawa in view of Kyoungsoo teaches the VR presence enhancement system according to claim 1, wherein the VR image generator generates the VR image data in which the sensory temperature information is superimposed on the VR image as graphics (Ishikawa; ¶0076, describes controlling display of an “additional image corresponding to the output control information of sensory stimulation”, and, Fig. 5, shows the temperature information superimposed over the image. This teaches VR image data that includes a graphical overlay conveying sensory temperature information superimposed on the image.)
Regarding claim 3, Ishikawa in view of Kyoungsoo teaches the VR presence enhancement system according to claim 2, wherein the VR image generator superimposes, based on attribute information of a part constituting a VR space, graphics indicating the sensory temperature information of the part on the VR image (Ishikawa; ¶0057-58 describe that virtual objects in the VR space are associated with attribute information, such as texture information for individual faces/parts, stored in an object DB; ¶0061-62, the system specifies original tactile stimulation, including thermal stimulation/temperature, for a touched face/part based on the attribute information of that face/part and a table/DB; ¶0077, the output control unit controls display of an additional image associated with the virtual image and contact position; ¶0079-80, the additional image is superimposed on the displayed VR image and includes an indication of the original temperature (see also Fig. 5). This teaches superimposing, based on attribute information of a part constituting the VR space, graphics indicating the sensory temperature information of that part.)
Regarding claim 4, Ishikawa in view of Kyoungsoo teaches the VR presence enhancement system according to claim 2, wherein the graphics indicate the sensory temperature information by a number (Ishikawa; Fig. 5 shows the graphics indicating the temperature as a number, and, ¶0080, describes indicating the degree of temperature using a character string (“180° C”). This teaches graphics indicating the sensory temperature information by a number.)
Regarding claim 6, Ishikawa in view of Kyoungsoo teaches the VR presence enhancement system according to claim 1, wherein the VR image data includes the sensory temperature information associated with an area that is set in a VR space (Kyoungsoo; pg. 117, col. 91 lines 22-32, further describes a virtual space that includes a “flame area” and a “glacier area” with hot- and cold-temperature properties, respectively. This teaches the sensory temperature information being associated with areas in a VR space.)
Regarding claim 7, Ishikawa in view of Kyoungsoo teaches the VR presence enhancement system according to claim 6, wherein the sensory temperature information is set so that a sensory temperature within the area is constant (Kyoungsoo; pg. 117, col. 91 lines 22-26, describe a virtual space that includes a “flame area” and a “glacier area” with hot- and cold-temperature properties, respectively (see claim 6), and, lines 39-54, describe the thermal feedback being determined by the temperature property assigned to the corresponding area when the user enters it. This teaches that the sensory temperature information set for the area is constant.)
Regarding claim 8, Ishikawa in view of Kyoungsoo teaches the VR presence enhancement system according to claim 6, wherein the area is constituted by a 3D object for setting a sensory temperature, the 3D object allowing entry into the 3D object, and the sensory temperature information is set as an attribute of the 3D object (Kyoungsoo; pg. 117, col. 92 lines 29-45, describe a hot spring as a sub-area of the glacier area (an enterable area in a VR space implemented as a 3D volume/object that the user can enter) that has a temperature property assigned to provide hot thermal feedback; when the user leaves the hot spring area (exit/entry), cold feedback is output. This reads on an area constituted by a 3D object for setting a sensory temperature (a user in a cold area enters the hot spring and is provided with hot thermal feedback), with the sensory temperature information set as an attribute of the object (the hot spring).)
Regarding claim 9, Ishikawa in view of Kyoungsoo teaches the VR presence enhancement system according to claim 8, wherein the VR image generator reads out the sensory temperature information of the 3D object when an avatar of the VR experiencer enters the 3D object in the VR space, and superimposes the sensory temperature information on the VR image as graphics (Kyoungsoo; pg. 117, col. 91 lines 39-54, describe the thermal feedback being determined by the temperature property assigned to the corresponding area when entered by a user (see claim 7). This reads on the sensory temperature information (temperature property/attribute) being read out upon entry into the 3D object/area. Ishikawa; ¶0079-80 describe the output control unit controlling display of an additional image, which can include an indication of the original temperature, superimposed on the displayed picture, and Fig. 5 shows the user interacting with a hot virtual object and the resultant image with the temperature information displayed (see claim 1). The combination teaches reading out temperature information upon entry and, based on that entry-triggered readout, superimposing the corresponding temperature information as graphics.)
It would have been obvious to one of ordinary skill in the art, before the effective filing date, to modify Ishikawa’s temperature overlay graphics with Kyoungsoo’s entry-triggered temperature determination in order to enhance user immersion.
Regarding claim 10, Ishikawa in view of Kyoungsoo teaches the VR presence enhancement system according to claim 9, wherein the VR image generator acquires an operation state of a cooling/heating device from the cooling/heating device controller, and generates the VR image data in which the sensory temperature information is superimposed on the VR image as graphics when the cooling/heating device is in operation (Kyoungsoo; pg. 116, col. 90 lines 30-48, describes that the feedback controller transmits a thermal feedback report signal to the content reproduction device to report the operating status of the heat output module and that the report signal may include whether thermal feedback is being output and the type/intensity of the thermal feedback being currently output. Ishikawa; Fig. 5 and ¶0079-80, describes generating/displaying an additional image including a numerical temperature (“180° C”) and superimposing it on the VR image. Together this teaches acquiring the cooling/heating device’s operation state and generating VR image data with the sensory temperature information superimposed as graphics when the cooling/heating device is in operation.)
It would have been obvious to one of ordinary skill in the art, before the effective filing date, to modify Ishikawa’s temperature display with Kyoungsoo’s operational status reporting in order to display the temperature graphics only when the device is operating, providing more accurate user feedback.
Regarding claim 11, Ishikawa in view of Kyoungsoo teaches the VR presence enhancement system according to claim 2, wherein the VR image generator superimposes environmental information other than the sensory temperature information on the VR image as graphics (Ishikawa; ¶0076 describes that the output control unit controls display of an additional image that can include indications of original sensory stimulations; ¶0032, the original sensory stimulation includes not only thermal sensation, but also tactile stimulation such as vibration and pressure sensation. This teaches superimposing environmental information other than temperature, such as pressure or vibration information, on the VR image as graphics.) and controls an environment adjustment device adjusting an environment of the VR experiencer, based on the environmental information other than the sensory temperature information detected from the graphics (Ishikawa; ¶0076 describes that the output control unit controls display of an additional image corresponding to sensory stimulation output control information; ¶0032, the output control unit controls stimulation output units to output tactile stimulation including vibration and pressure sensation; ¶0033-35, the tactile stimulation units include devices such as actuators worn by the user that can output vibrations. Together, this teaches detecting the non-temperature environmental information from the graphics and controlling the environment adjustment device based on that detected information.)
Claim 13 recites limitations similar to those of claim 1, except that it is a CRM claim; it is therefore rejected under the same rationale as claim 1. Ishikawa, ¶0109-112, describes a hardware configuration including a CPU with RAM, ROM, and a storage device that stores the program to be executed by the CPU.
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Ishikawa (US20210286432A1), Kyoungsoo (US10561374B2), and Skidmore (US20190244431A1).
Regarding claim 5, Ishikawa in view of Kyoungsoo fails to teach, but Skidmore teaches, the VR presence enhancement system according to claim 1, wherein the sensory temperature information is superimposed on the VR image as a digital watermark (Skidmore; ¶0010 describes that virtual markers are added to image/frame content, incorporated into image/video data, and may involve an alteration to pixel information; ¶0011, the marker can be visually blended and “a virtual marker may be a watermark”. In the combination, this teaches superimposing (embedding into the image data) the sensory temperature information on the VR image as a digital watermark.) It would have been obvious to one of ordinary skill in the art, before the effective filing date, to modify the VR thermal feedback system of Ishikawa in view of Kyoungsoo to incorporate Skidmore’s technique of embedding information as watermarks, in order to avoid disrupting the user’s view and to increase immersion.
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Ishikawa (US20210286432A1), Kyoungsoo (US10561374B2), and Baumbach (US20210018977A1).
Regarding claim 12, Ishikawa in view of Kyoungsoo teaches the VR presence enhancement system according to claim 1, further comprising a VR generator that places an object in an area of a VR space for setting a sensory temperature when creating a VR space (Kyoungsoo; pg. 117, col. 91 lines 22-32, describes a virtual space partitioned into areas/regions to which temperature properties are assigned (a flame area assigned a hot temperature property and a glacier area assigned a cold temperature property) and, lines 27-47, the user can enter and move through the areas. This teaches creating/placing an enterable zone/area in the VR space for setting a sensory temperature.) and sets, to the VR space data, an operation of displaying the sensory temperature information as graphics in a field of view of the XR goggles when the avatar enters the object, based on the sensory temperature information set as an attribute of the object (as previously discussed, Kyoungsoo; pg. 117, col. 91 lines 22-32 and 27-47, describes that temperature properties are assigned to areas/regions and that, when the player character enters an area, the controller determines thermal feedback information according to the temperature property assigned to the entered area with reference to an area temperature property table (lines 8-17). Ishikawa; Fig. 5 and ¶0079-80 describe superimposing an additional image including a numerical temperature indication (“180° C”) on the displayed VR image. Together, this teaches that the VR space data/metadata (the assigned area temperature property and table) defines an entry-triggered operation to display the corresponding sensory temperature information as graphics in the headset field of view upon entry.)
It would have been obvious to one of ordinary skill in the art, before the effective filing date, to modify Ishikawa’s thermal feedback system with Kyoungsoo’s temperature-assigned areas to provide the benefit of enhanced realism when users enter different locations.
However, Ishikawa in view of Kyoungsoo fails to teach, but Baumbach teaches, the object being transparent and not interrupting an action of an avatar (Baumbach; ¶0042 describes adjusting rendering of a virtual element to increase visibility of an underlying real-world element by increasing transparency, including, ¶0045, rendering the virtual element partially or fully transparent, and, ¶0037, describes that a user may attempt to place an arm/head into a virtual window, but it is the real-world wall that blocks physical movement, not the rendered virtual object. This teaches a rendered virtual object/zone that can be transparent and that does not itself interrupt user/avatar action.) It would have been obvious to one of ordinary skill in the art, before the effective filing date, to modify the VR thermal feedback system of Ishikawa in view of Kyoungsoo to incorporate Baumbach’s technique of rendering a transparent, non-obstructive virtual zone/object, allowing a user to enter a temperature-setting zone/object without interruption for better immersion.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAN F KALHORI whose telephone number is (571)272-5475. The examiner can normally be reached Mon-Fri 8:30-5:30 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Devona Faulk can be reached at (571) 272-7515. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DAN F KALHORI/Examiner, Art Unit 2618
/DEVONA E FAULK/Supervisory Patent Examiner, Art Unit 2618