DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1-2, 9-10, and 17-18 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Wang et al. (US Pub. 2022/0383634 A1).
Regarding Claim 1, Wang et al. teaches a method comprising: at a first electronic device (508d in Fig. 5B) in communication with one or more displays, one or more input devices, and a second electronic device (506a in Fig. 5B): presenting, via the one or more displays, a computer-generated environment (see Paragraph [0004] “Virtual reality (“VR”), augmented reality (“AR”), mixed reality (“MR”), and related technologies (collectively, “XR”) share an ability to present, to a user of an XR system, sensory information corresponding to a virtual environment represented by data in a computer system.”),
wherein the first electronic device is located at a first location relative to a first origin in a first physical environment (see Paragraph [0032] “FIG. 1A illustrates an example real environment 100 in which a user 110 uses a mixed reality system 112. … The real environment 100 shown includes a rectangular room 104A, in which user 110 is standing; and real objects 122A (a lamp), 124A (a table), 126A (a sofa), and 128A (a painting). Room 104A further includes a location coordinate 106, which may be considered an origin of the real environment 100.”) of a user of the first electronic device, and has a first orientation relative to the first origin in the first physical environment of the user of the first electronic device (see Figs. 5A-5C, 8; Paragraph [0068] “FIGS. 5A-5C illustrate an exemplary MR collaboration session, according to some embodiments. FIG. 5A illustrates an exemplary mixed reality collaboration session where users 508a, 508b, and 508c may be at a first location (e.g., a first room) together….”);
while presenting the computer-generated environment, detecting a request to display a portal through which to visually communicate with a user of the second electronic device, wherein the second electronic device is located at a second location, different from the first location, relative to a second origin in a second physical environment (see Paragraph [0032] “FIG. 1A illustrates an example real environment 100 in which a user 110 uses a mixed reality system 112. … The real environment 100 shown includes a rectangular room 104A, in which user 110 is standing; and real objects 122A (a lamp), 124A (a table), 126A (a sofa), and 128A (a painting). Room 104A further includes a location coordinate 106, which may be considered an origin of the real environment 100.”), different from the first physical environment, of the user of the second electronic device and has a second orientation, different from the first orientation, relative to the second origin in the second physical environment of the user of the second electronic device (see Figs. 5A-5C, 8; Paragraph [0068] “FIG. 5B illustrates an exemplary mixed reality collaboration session where users 508d and 508e may be at a second location (e.g., a second room) together. FIG. 5C illustrates an exemplary mixed reality collaboration session where a session handle has been moved.”);
(See Figs. 5A-5C, 8, Paragraph [0069] “In some embodiments, users 508a, 508b, 508c, 508d, and 508e may all be part of the same mixed reality collaboration session 500. In some embodiments, a collaboration session can include a session handle 502a (which may be a virtual object). Session handle 502a may serve as a local anchor for a session. For example, all session users in the same location (e.g., users 508a, 508b, and 508c may be considered in the same location if they share common persistent coordinate data) may be presented virtual content positioned relative to session handle 502a, which may give the virtual content the appearance of being located in a particular location and orientation in the real world, similar to a real/physical object.”); and
in response to detecting the request: displaying, via the one or more displays, a portal including a representation of the user of the second electronic device in the computer-generated environment, wherein a respective portion of the representation of the user of the second electronic device is oriented based on the second location and the second orientation (see Fig. 8, Paragraph [0092] “FIG. 8 illustrates an exemplary mixed reality collaboration session, according to some embodiments. In some embodiments, users 802 and 804 may use one or more MR systems (e.g., MR system 806, which can correspond to MR systems 112, 200) to collaborate on 3D virtual content. In some embodiments, users 802 and 804 may utilize a session to display and/or collaborate on virtual content. For example, virtual model 808 may be presented to users 802 and 804. In some embodiments, virtual model 808 can be presented to users 802 and 804 in the same position (e.g., location and/or orientation) ….”),
(See Fig. 8, Paragraph [0095] “In some embodiments, user 802 may be remote from user 804. For example, user 802 may be in a first room, and user 804 may be in a second room different than the first room. In some embodiments, users 802 and 804 may collaborate on virtual model 808 using a session instance. In some embodiments, user 802 may see virtual model 808 in the first room, and user 804 may see virtual model 808 in the second room. …”).
Regarding Claims 2, 10, and 18, Wang et al. teaches the method wherein the respective portion of the representation of the user of the second electronic device is oriented to face toward a viewpoint of the user of the first electronic device. (See Fig. 10; Paragraph [0101] “In some embodiments, a user may select location indicator 1008, which may be displayed on and/or near virtual objects 1002 and/or 1004. In some embodiments, selecting location indicator 1008 may cause virtual object 1010 to be displayed to a user. Virtual object 1010 can include a prism, which may include a virtual comment bubble. In some embodiments, virtual object 1010 can be displayed near location indicator 1008. In some embodiments, virtual object 1010 may not intersect with virtual object 1002 (e.g., because it may obscure a view of a virtual comment bubble). In some embodiments, virtual objects 1012 and/or 1010 may continually face a user as a user moves around an environment. In some embodiments, virtual objects 1012 and/or 1010 may reposition themselves if their view becomes obstructed (e.g., if a user moves such that virtual object 1002 obstructs a view from the user to virtual object 1010).”).
Regarding Claim 9, apparatus Claim 9 is rejected for the same reasons as method Claim 1, since the claim limitations are the same in both claims.
Regarding Claim 17, CRM Claim 17 is rejected for the same reasons as method Claim 1, since the claim limitations are the same in both claims (the non-transitory computer-readable storage medium of the CRM claim is shown in Paragraph [0105]).
Allowable Subject Matter
Claims 3-8, 11-16, and 19-24 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Examiner cites particular columns and line numbers in the references as applied to the claims above for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. It is respectfully requested that, in preparing responses, the applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the examiner.
It is noted that any citation to specific pages, columns, figures, or lines in the prior art references, and any interpretation of the references, should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. In re Heck, 699 F.2d 1331, 1332-33, 216 USPQ 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 USPQ 275, 277 (CCPA 1968)).
Examiner’s Note
Applicant is reminded that the Examiner is entitled to give the broadest reasonable interpretation to the language of the claims. Furthermore, the Examiner is not limited to any definition of Applicant's that is not specifically set forth in the claims.
In the case of amending the claimed invention, Applicant is respectfully requested to indicate the portion(s) of the specification which dictate(s) the structure relied on for proper interpretation and also to verify and ascertain the metes and bounds of the claimed invention.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to VIJAY SHANKAR, whose telephone number is (571) 272-7682. The examiner can normally be reached M-F, 9 am to 6 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Matthew Eason, can be reached at (571) 270-7230. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
VIJAY SHANKAR
Primary Examiner
Art Unit 2624
/VIJAY SHANKAR/Primary Examiner, Art Unit 2624