Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
Claim 10 is objected to because of the following informalities:
Claim 10 depends upon claim 9, which is a canceled claim. Instead, claim 10 should be amended to depend upon claim 2. If claim 10 is amended to reflect this, claim 10 will be allowable as indicated below.
Appropriate correction is required.
Allowable Subject Matter
Claims 1-3, 5-8, 10-14, 16-19, 58 and 59 are allowed.
Claim 10 will be allowable provided the objection is resolved in the manner suggested in the Claim Objections section above.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 56 and 57 are rejected under 35 U.S.C. 103 as being unpatentable over Perkins (US 20200319702 A1) in view of Goldberg et al. (US 20200233502 A1), currently cited on an accompanying PTO-892.
Regarding claim 56, Perkins teaches a method for operating a supervisory device comprising (See abstract, “The present teaching relates to method, system, medium, and implementations for augmenting data via data crowd sourcing. Local data are acquired by one or more sensors deployed on a first device. Augmented data are obtained where such augmented data are generated based on the local data and remote data from at least one second device and are rendered on the first device.” The examiner notes that the first device may be considered the supervisory device, but other devices, such as the group master device 400 detailed in Fig. 4 and described in ¶57-¶58, may be considered to be the supervisory device as well. The examiner notes that all other recitations of the supervisory device should be interpreted as referring to any of the previously mentioned devices.):
determining positional data […] of the supervisory device (See abstract and the examiner's note regarding the supervisory device in the preceding limitation; the same citation and interpretation apply here.);
generating for display, on a display screen of the supervisory device, a plurality of XR device representations, wherein each of the plurality of XR device representations is based on a respective XR device of a plurality of XR devices (¶44, “In some embodiments, the augmented reality data may be generated by stitching data from different devices in a smart way to generate, e.g., a panoramic view of the scene that each of the different devices captured only a portion of. This is illustrated in FIG. 2E. Other alternative augmented reality data include 3D views which may be generated based on data captured from different perspectives of the same scene.” Fig. 2E shows a panoramic view stitched according to the spatial arrangement of the capturing devices.);
determining, based at least in part on the positional data […] of the supervisory device, that the supervisory device is oriented towards a selected XR device, wherein the selected XR device is from a plurality of XR devices (¶35, “Alternatively, the device may specify to select data streams from specific devices that are located in a certain physical locations specified. For example, the device may reside near the south gate of the sports stadium and it may specify to use the data stream from a first device near the North gate, a second device near the West gate, and a third device near the East gate of the same sports stadium. In this way, the device may acquire data from 4 different viewpoints of the stadium to generate an augmented reality data stream that captures the sports scene in a more complete manner.” ¶36, “The frameworks 100 or 200 may also be configured to perform centralized augmented reality data generation via crowd sourcing. In this scheme, data acquired by individual devices at different locations may be sent or made available to the augmented reality data generator 150. Augmented reality data are generated by the augmented reality data generator 150 based on e.g., what is requested either expressed in the subscription database 140 or via an on-demand request. The subscription may be connected to accounts associated with either individual devices (e.g., individual accounts) or groups of individual devices (e.g., group accounts). Associated with each account, the subscription data may include specification of desired data (e.g., video), selection criteria in terms of quality (e.g., audio with less noise), location (e.g., from the same locale), or volume (e.g., limited to two best sources from the crowd).”); […]
retrieving a replica stream of the selected XR device (¶51, “According to the present teaching, the augmented reality data generator 150 received data (or data streams) from the devices 110 and for some of the devices, augmented reality data need to be generated in accordance with their subscription of on-demand requests. To generate augmented reality data for each device (that either subscribed or sent on-demand request), data to be used for generating the requested augmented reality data may first be selected and then used for creating the augmented reality data. As shown, the filtered data streams from the stream data filter 300 may be sent to the stream data selector 305 so that data streams to be used for generating augmented reality data may then be selected for each request based on a corresponding data selection configuration. This is achieved by retrieving, at 314, a selection configuration for the next request, which is then used by the stream data selector 305 to determine, at 316, specific data streams to be used for generating the augmented reality data. As discussed herein, the restrictions on sources of data may be provided in the subscription or in an on-demand request. For instance, although there may be hundreds of devices that are connected to the augmented reality data generator 150, a specific subscription/request may specify that only three data streams should be selected to generate the augmented reality data based on certain specified criteria, e.g., three devices that are located at certain coordinates.”); and
generating for display, on the display screen, the replica stream of the selected XR device (¶51, as quoted in the preceding limitation; the same citation applies here), but does not explicitly disclose “and orientation data” or “based at least in part on the supervisory device being oriented towards the selected XR device.”
Goldberg, however, teaches “and orientation data” (See ¶4, orientation data) and “based at least in part on the supervisory device being oriented towards the selected XR device” (¶24, “When the user is aiming at the controllable device, the mobile computing device determines a coordinate corresponding to the location and a direction. For example, the position may be determined using a visual positioning module of the mobile computing device and the direction may be determined based on an orientation of the mobile computing device as determined using the visual positioning module or as measured using, for example, an inertial motion unit. Some implementations include a head-mounted display device and the user may aim at the device by looking at the controllable device.”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Perkins in view of Goldberg, as doing so enables more intuitive, context-aware, and efficient XR interactions, thus increasing the flexibility offered.
Regarding claim 57, Perkins teaches the method of claim 56, wherein: each of the plurality of XR device representations are arranged on the display screen based on a spatial arrangement of the plurality of XR devices (¶44, “In some embodiments, the augmented reality data may be generated by stitching data from different devices in a smart way to generate, e.g., a panoramic view of the scene that each of the different devices captured only a portion of. This is illustrated in FIG. 2E. Other alternative augmented reality data include 3D views which may be generated based on data captured from different perspectives of the same scene.” Fig. 2E shows a panoramic view stitched according to the spatial arrangement of the capturing devices.); and the spatial arrangement is based on positional data for each of the plurality of XR devices with respect to the positional data of the supervisory device (¶51, as quoted in the rejection of claim 56 above; the cited passage describes selecting specific data streams from devices located at specified coordinates).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERT J CRADDOCK whose telephone number is (571)270-7502. The examiner can normally be reached Monday - Friday 10:00 AM - 6:00 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Devona E Faulk can be reached at 571-272-7515. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ROBERT J CRADDOCK/Primary Examiner, Art Unit 2618