DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments with respect to claims 1, 3-10, 12-13 and 15-23 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claims 1, 3, 12-13 and 15 have been amended; claims 2, 11 and 14 have been canceled; and claims 22-23 have been newly added.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3-10, 12-13 and 15-23 are rejected under 35 U.S.C. 103 as being unpatentable over Laaksonen (US 2019/0180509) in view of Ekpar (US 2004/0169724).
Regarding claim 1, Laaksonen discloses an information exchange method, comprising:
receiving composite video configuration information, wherein the composite video configuration information comprises virtual reality space information (providing signaling information and information identifying the location of an object of interest in the VR/AR content; 0127-0128, 0150, 0156-0157, 0169, 0184, 0204 and 0260), a plurality of virtual reality subspace information corresponding to the virtual reality space information (sub-space information; 0133-0136, 0143 and 0149-0156), and video configuration information corresponding to the virtual reality subspace information (signaling information; see at least paragraphs 0128 and 0176);
determining a target virtual reality subspace in which a user is located in a virtual reality space (sub-space information; 0133-0136, 0143 and 0149-0156); and
determining, based on the composite video configuration information, video configuration information corresponding to the target virtual reality subspace, and presenting video content in the target virtual reality subspace based on the determined video configuration information (based on the above information presenting VR/AR content to multiple users; see at least paragraphs 0105-0106, 0125, 0156, 0161-0162 and 0164-0169).
Laaksonen does not clearly disclose that the composite video configuration information comprises each of virtual reality space information, a plurality of pieces of virtual reality subspace information, and video configuration information; that a virtual reality space corresponding to the virtual reality space information comprises a plurality of virtual reality subspaces; that the plurality of virtual reality subspaces is in one-to-one correspondence with the plurality of pieces of virtual reality subspace information; or that the plurality of virtual reality subspaces provides different viewing angles for a same object.
Ekpar discloses composite video configuration information that comprises each of virtual reality space information, a plurality of pieces of virtual reality subspace information, and video configuration information; a virtual reality space corresponding to the virtual reality space information comprises a plurality of virtual reality subspaces; the plurality of virtual reality subspaces is in one-to-one correspondence with the plurality of pieces of virtual reality subspace information; and the plurality of virtual reality subspaces provides different viewing angles for a same object (creating an interactive virtual tour package from one or several spherical environment maps; the virtual tour package can be arranged as a series of interconnected or linked spherical environment maps, with information describing how the individual maps are interconnected or linked and additional information specifying multimedia content to be rendered in response to the activation of interactively defined regions of the maps; once the package has been created, it is exported to a viewing engine that provides a means for the user to interactively select a particular spherical environment map from the package and to specify viewing parameters such as a lateral angle; see at least paragraphs 0043-0044 and 0047).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Laaksonen with the teachings of Ekpar by incorporating the above limitations, so as to be able to create interactive tours; see at least the Abstract.
Regarding claim 3, Laaksonen in view of Ekpar discloses the method according to claim 1, wherein the viewing angle comprises one or more of the following: a stage-side viewing angle, a close-up viewing angle, and a long-shot viewing angle (Laaksonen; see at least paragraph 0114; and the viewing angle of Ekpar; see at least the rejection of claim 1).
Regarding claim 4, Laaksonen in view of Ekpar disclose the method according to claim 1, wherein a same virtual reality subspace corresponds to two or more than two pieces of video configuration information (Laaksonen; hybrid virtual space; see at least Fig. 6 and paragraphs 0139-0149 and Ekpar; see at least the rejection of claim 1).
Regarding claim 5, Laaksonen in view of Ekpar disclose the method according to claim 4, wherein the two or more than two pieces of video configuration information comprise video configuration information for presenting a 3D video image and video configuration information for presenting a 2D video image (Laaksonen; real world object and 3D objects; see at least paragraphs 0107-0108, 0110 and 0171 and Ekpar; see at least the rejection of claim 1).
Regarding claim 6, Laaksonen in view of Ekpar disclose the method according to claim 1, wherein different video presentation environments are presented in different virtual reality spaces, and the video presentation environment comprises one or more of the following elements: a stage, setting, lighting, props, special effect elements, and choreography (Laaksonen; VR/AR content; see at least paragraphs 0105-0106, 0109-0110 and 0161).
Regarding claim 7, Laaksonen in view of Ekpar disclose the method according to claim 1, wherein the virtual reality space information comprises a scene identifier (Laaksonen; see at least paragraphs 0103-0107).
Regarding claim 8, Laaksonen in view of Ekpar disclose the method according to claim 1, wherein the video configuration information comprises video presentation mode information, and the video presentation mode information comprises one or more pieces of the following: screen shape information, screen quantity information, video dimension type information, and virtual camera information (Laaksonen; see at least paragraphs 0117-0118, 0131-0134 and 0157).
Regarding claim 9, Laaksonen in view of Ekpar disclose the method according to claim 1, wherein the video configuration information comprises live-streaming phase information, and the live-streaming phase information comprises one or more of the following: a pre-live-streaming phase, an in-live-streaming phase, and a post-live-streaming phase (Laaksonen; see at least paragraphs 0104-0105, 0140 and 0161-0162).
Regarding claim 10, Laaksonen in view of Ekpar disclose the method according to claim 1, wherein the determining a target virtual reality subspace in which a user is located in a virtual reality space comprises:
determining, in response to an instruction triggered by the user to enable a virtual character controlled by the user to enter the target virtual reality subspace, the target virtual reality subspace in which the user is located in the virtual reality space (Laaksonen; see at least paragraphs 0217-0218).
Claims 12 and 13 are rejected on the same grounds as claim 1.
Claim 15 is rejected on the same grounds as claim 3.
Claim 16 is rejected on the same grounds as claim 4.
Claim 17 is rejected on the same grounds as claim 5.
Claim 18 is rejected on the same grounds as claim 6.
Claim 19 is rejected on the same grounds as claim 7.
Claim 20 is rejected on the same grounds as claim 8.
Claim 21 is rejected on the same grounds as claim 9.
Claim 22 is rejected on the same grounds as claim 10.
Claim 23 is rejected on the same grounds as claim 3.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to YASSIN ALATA whose telephone number is (571)270-5683. The examiner can normally be reached Mon-Fri 7-4 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Nasser Goodarzi can be reached at 571-272-4195. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/YASSIN ALATA/Primary Examiner, Art Unit 2426