DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 10-18 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claims do not fall within at least one of the four categories of patent eligible subject matter because they are directed to products that do not have a physical or tangible form, such as information (often referred to as "data per se") or a computer program per se (often referred to as "software per se") claimed as a product without any structural recitations, and because a computer readable storage medium can be comprised of transitory matter (i.e., carrier waves).
Examiner suggests adding "non-transitory" to the claim as “Claim 10. A computer program product, the computer program product comprising a non-transitory computer readable storage medium having program instructions embodied therewith, ….” in order to overcome the rejection.
Variations of the term “storage”, for example in the term “computer readable storage medium”, are not considered to limit a media claim to non-transitory embodiments, because content may be considered to be stored on a signal during propagation and because many disclosures conflate storage media and signals. For example, US Patent 6,286,104 discloses: “the methods described herein may be implemented by a series of computer-executable instructions residing on a storage medium such as a carrier wave”. See the Board decision in Ex parte Mewherter (10/685,192), where the Board affirmed a § 101 rejection of a “machine readable storage medium” claim. The decision is precedential, and while even precedential Board decisions are not considered to be examining guidance, the decision can be cited in an examiner’s answer.
Note that the decision also refers to Official guidance in the form of training delivered to the Corps: U.S. Patent & Trademark Office, Evaluating Subject Matter Eligibility Under 35 USC § 101 (Aug. 2012 Update); pp. 11-14, available at http://www.uspto.gov/patents/law/exam/101_training_aug2012.pdf.
Please note that even if the transitory types of machine readable medium are removed from the examples of machine readable medium cited in the disclosure, the broadest reasonable interpretation of a machine readable medium would still include transitory types unless there is a closed definition excluding them in the disclosure.
A claim drawn to a computer readable medium that covers both transitory and non-transitory embodiments may be amended to narrow the claim to cover only statutory embodiments, and thereby avoid a rejection under 35 U.S.C. 101, by adding the limitation “non-transitory” or “tangible” to the recitation of the “computer readable storage medium” in the claim.
Furthermore, according to the "Subject Matter Eligibility of Computer Readable Media" memorandum dated January 2010, available at http://www.uspto.gov/patents/law/notices/101_crm_20100127.pdf:
“A claim drawn to a computer readable medium that covers both transitory and non-transitory embodiments may be amended to narrow the claim to cover only statutory embodiments to avoid a rejection under 35 U.S.C. 101 by adding the limitation ‘non-transitory’ or ‘tangible’ computer readable storage medium to the claim.”
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3, 7-12, and 16-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Wang et al. (US Pub. 2022/0383634 A1).
Regarding Claim 1, Wang et al. teaches a computer-implemented method, comprising: outputting a first virtual reality (VR) environment (Figs. 5A-5C, 8) representation for display on a plurality of VR devices associated with a VR collaboration session, wherein each of the VR devices is used by a different participant of the VR collaboration session (see Paragraph [0068] “FIGS. 5A-5C illustrate an exemplary collaboration session, according to some embodiments. FIG. 5A illustrates an exemplary mixed reality collaboration session where users 508a, 508b, and 508c may be at a first location (e.g., a first room) together. FIG. 5B illustrates an exemplary mixed reality collaboration session where users 508d and 508e may be at a second location (e.g., a second room) together. FIG. 5C illustrates an exemplary mixed reality collaboration session where a session handle has been moved.”);
analyzing first inputs received from the participants to determine whether to output a second VR environment representation (Figs. 5A-5C, 8) for display on the plurality of VR devices (see Paragraph [0068] “… FIG. 5B illustrates an exemplary mixed reality collaboration session where users 508d and 508e may be at a second location (a second room) together. ...”);
determining, based on the analysis, a second VR environment representation and a first transitional sequence for the VR devices to output while transitioning from displaying the first VR environment representation to displaying the second VR environment representation (see Paragraph [0095] “In some embodiments, user 802 may be remote from user 804. For example, user 802 may be in a first room, and user 804 may be in a second room different than the first room. In some embodiments, users 802 and 804 may collaborate on virtual model 808 using a session instance. In some embodiments, user 802 may see virtual model 808 in the first room, and user 804 may see virtual model 808 in the second room. In some embodiments, virtual model 808 may be presented relative to a first session handle for user 802, and virtual model 808 may be presented relative to a second session handle for user 804. In some embodiments, virtual annotations made by one user (e.g., user 802) may be visible to all session users (e.g., user 804).”);
outputting the second VR environment representation and the first transitional sequence for display on the VR devices; and in response to a determination that a first of the participants has performed a predetermined gesture (see Paragraph [0055] “In some examples, the depth cameras 444 can supply 3D imagery to a hand gesture tracker 411, which may be implemented in a processor of the wearable head device 400A.”),
outputting a second transitional sequence for the VR devices to output while transitioning from the second VR environment representation back to the first VR environment representation (see Paragraph [0093] “In some embodiments, user 802 may annotate virtual model 808 (e.g., by creating virtual markup and/or adding virtual comments). For example, user 802 may create virtual markup 812, which may indicate that a café could be placed at a location in virtual model 808. In some embodiments, user 804 may see virtual markup 812. In some embodiments, user 804 may see virtual markup 812 as user 802 is creating virtual markup 812... In some embodiments, a virtual comment can include location indicator 814 and/or comment bubble 816. … In some embodiments, comment bubble 816 can be presented in different positions for different users. For example, comment bubble 816 may be presented to user 802 as facing user 802, and comment bubble 816 may be presented to user 804 as facing user 804. In some embodiments, comment bubble 816 may be configured to continually face a user as a user looks in different locations. In some embodiments, comment bubble 816 can be presented in the same position for multiple users (e.g., all local users) of a session.”
Paragraph [0094] “… data corresponding to virtual annotations may be transmitted from a capability instance (e.g., capability instance 608c) to a session instance (e.g., session instance 606b). In some embodiments, data corresponding to virtual annotations may be transmitted from a capability instance to collaboration core 610b. In some embodiments, collaboration core 610b may transmit data corresponding to virtual annotations to one or more remote servers (e.g., one or more remote servers configured to handle data synchronization and/or synchronization conflicts). In some embodiments, one or more remote servers may transmit data corresponding to virtual annotations to other session users. In some embodiments, data corresponding to virtual annotations can be stored in a session instance. In some embodiments, a session instance can be closed and re-opened, and one or more capability instances (e.g., virtual model 808 and/or virtual markup 812) can be loaded and/or displayed to users.”).
Regarding Claims 2, 11, and 20, Wang et al. teaches the computer-implemented method wherein the first inputs are selected from the group consisting of: a gesture performed by one of the participants (see Paragraph [0055] “In some examples, the depth cameras 444 can supply 3D imagery to a hand gesture tracker 411, which may be implemented in a processor of the wearable head device 400A. The hand gesture tracker 411 can identify a user's hand gestures, for example by matching 3D imagery received from the depth cameras 444 to stored patterns representing hand gestures. Other suitable techniques of identifying a user's hand gestures will be apparent.”).
Regarding Claims 3 and 12, Wang et al. teaches the computer-implemented method wherein the second transitional sequence is selected from the group consisting of: a visual effect of an avatar associated with the first participant throwing away the second VR environment representation (see Paragraph [0007] “… Where a user of a two-dimensional screen may have to hunt through one of forty open tabs to re-open a desired application, a user of an XR system may be able to pinpoint a desired virtual object displayed on a desk (like picking up a real folder placed on a desk). Furthermore, XR systems may enable users to see virtual avatars of other users to simulate the live presence of other people. … It can therefore be desirable to develop systems and methods for enabling deep user collaboration on XR systems.”
Paragraph [0008] “XR systems can offer a uniquely heightened sense of immersion and realism by combining virtual visual and audio cues with real sights and sounds. Accordingly, it is desirable in some XR systems to present a virtual environment that enhances, improves, or alters a corresponding real environment. This disclosure relates to XR systems that enable consistent placement of virtual objects across multiple XR systems.”).
Regarding Claims 7 and 16, Wang et al. teaches the computer-implemented method wherein the first VR environment representation includes an avatar for each of the participants, and wherein the second VR environment representation includes the avatars (Figs. 5A-5C, 8).
Regarding Claims 8 and 17, Wang et al. teaches the computer-implemented method wherein the avatars are cartoon representations based on a physical appearance of the participants, and wherein the first VR environment representation is a real-world depiction of a geographical location (Figs. 5A-5C, 8).
Regarding Claims 9 and 18, Wang et al. teaches the computer-implemented method further comprising: receiving second inputs from the participants, wherein the second inputs define customized participant-specific inputs for initiating a currently displayed VR environment representation to be changed to a different VR environment representation; and building a dictionary of customized participant-specific inputs, wherein the analysis is, at least in part, based on the dictionary (see Paragraph [0094] “… data corresponding to virtual annotations may be transmitted from a capability instance (e.g., capability instance 608c) to a session instance (e.g., session instance 606b). In some embodiments, data corresponding to virtual annotations may be transmitted from a capability instance to collaboration core 610b. In some embodiments, collaboration core 610b may transmit data corresponding to virtual annotations to one or more remote servers (e.g., one or more remote servers configured to handle data synchronization and/or synchronization conflicts). In some embodiments, one or more remote servers may transmit data corresponding to virtual annotations to other session users. In some embodiments, data corresponding to virtual annotations can be stored in a session instance. In some embodiments, a session instance can be closed and re-opened, and one or more capability instances (e.g., virtual model 808 and/or virtual markup 812) can be loaded and/or displayed to users.”).
Regarding Claim 10, the computer readable medium (CRM) Claim 10 is rejected for the same reasons as method Claim 1, since the claim limitations are the same in both claims (the non-transitory computer readable storage medium is shown in Paragraph [0105]).
Regarding Claim 19, the apparatus Claim 19 is rejected for the same reasons as method Claim 1, since the claim limitations are the same in both claims.
Allowable Subject Matter
Claims 4-6 and 13-15 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is an examiner’s statement of reasons for allowance:
The prior art of record fails to teach the computer-implemented method wherein the first transitional sequence is selected from the group consisting of: an object with the second VR environment representation depicted therein traversing across the first VR environment representation; objects of the second VR environment representation gradually populating the first VR environment representation; and objects of the second VR environment representation appearing in the first VR environment representation and gradually increasing in size, as claimed in Claims 4 and 13.
Any comments considered necessary by applicant must be submitted no later than the payment of the issue fee and, to avoid processing delays, should preferably accompany the issue fee. Such submissions should be clearly labeled “Comments on Statement of Reasons for Allowance.”
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Chand et al. (US 20240104859 A1), Faaborg et al. (US 10101803 B2), McCracken et al. (US 20240096033 A1), and Baszucki et al. (US 20230334743 A1) teach VR devices each used by a different participant of a VR collaboration session.
Felman (US 20220132214 A1; US 11792485 B2) and Dedonato et al. (US 20220269333 A1) teach sensing tags relating to sight, touch, sound, taste, and smell being emitted by the VR devices.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to VIJAY SHANKAR, whose telephone number is (571) 272-7682. The examiner can normally be reached M-F, 9 am - 6 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Matthew Eason can be reached at 571-270-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
VIJAY SHANKAR
Primary Examiner
Art Unit 2624
/VIJAY SHANKAR/Primary Examiner, Art Unit 2624