Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
2. Claims 1-19 are pending.
Claim Objections
3. Claim 11 is objected to because of the following informalities: the claim recites "said augmented reality system"; there is insufficient antecedent basis for this limitation in the claim. Appropriate correction is required.
Claims 15-16 and 18 are objected to because of the following informalities: these method claims are dependent upon the system of claim 10 instead of the method of claim 11. Appropriate correction is required.
Claim Rejections - 35 USC § 102
4. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-19 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bicanic et al. (US Patent Application Publication 2019/0333404), hereinafter referred to as Bicanic.
Regarding independent claim 1, Bicanic discloses an augmented reality system (100) (Figure 1 platform 100 utilizing augmented and virtual reality display device 106 as described in paragraphs [0042]-[0048].) for a spectator event (806) involving one or more moving participants (802+804) in a real space (US+UK) (Figure 8 references two real-world aircraft 802 and 804, each respectively flying/moving in geographically different real-world airspace (US and UK), each pilot using the disclosed system 100+106 and virtually (806) viewing the other (each in their own respective real space), as described in paragraphs [0106]-[0111]. The virtual space event 806 is detailed in figure 6 and paragraphs [0095]-[0104].), each participant (802+804) having a participant field of view of said real space (figure 7) through a screen (106) (Figure 7 and paragraph [0105] describe the augmented reality display view of the pilot using display/screen device 106.), said spectator event (806) also involving one or more cameras (210+220) (Figure 2 depicts first and second vehicles 208+218, described in paragraphs [0051]-[0052] to be aircraft flown by pilots, with sensors 210 and 220 respectively, described in paragraph [0069] to be cameras.), each camera (210+220) having a camera field of view of said real space (Paragraph [0069] describes imaging (a field of view of) the pilot in the real space inside the vehicle.), said augmented reality system (100) comprising:
a computer content presentation system (100) for generating virtual space (806) corresponding to said real space (US+UK) and at least one virtual object (802, 804, 708, or 710) in said virtual space (Figure 1 platform 100 is described in paragraph [0042] to further facilitate provisioning of a virtual experience. Figure 5 and paragraph [0088] describe that facilitating provision of a virtual experience includes method 500 of generating presentation data 510+512, which is presented on an augmented display as described in figure 6 and paragraphs [0095]-[0104].), said virtual object's location in said virtual space being independent of said moving participants in said real space (paragraph [0109]);
a participant display (106) configured to cause an image of a portion of said virtual space (806) corresponding to said participant field of view (Figure 7) of said real space (US+UK) to overlay said field of view (Figure 7 comprises virtual objects, paragraph [0105]) of said real space for each participant (paragraph [0110]); and
a spectator display (simulator) configured to cause an image of a portion of said virtual space (806) corresponding to said camera field of view of said real space to overlay said camera field of view of said real space (Figure 1 user 112 and paragraph [0049] describe that the user 112 may access platform 100 via a browser, desktop, or mobile app, each an example of a display. Figure 7 708 and paragraphs [0097], [0100], and [0105] describe virtual aircraft 708 to be flown by real people on the ground via simulators (which inherently comprise displays) displaying the augmented and virtual reality content.).
Regarding claim 2, Bicanic discloses the augmented reality system of claim 1, wherein said at least one virtual object (802, 804, 708, or 710) is an obstacle (Paragraphs [0105]-[0106] describe that 802 and 804 may be virtual representations of real aircraft (obstacles in a physical sense), which may be enemies (obstacles in the mental sense); 708 and 710 may be virtual representations of virtual aircraft (virtual obstacles in the mental sense).) or a boundary marker.
Regarding claim 3, Bicanic discloses the augmented reality system of claim 1, wherein said real space is real airspace (US+UK), and said virtual space is virtual airspace (806) (Figure 8 and paragraphs [0106]-[0111]).
Regarding claim 4, Bicanic discloses the augmented reality system of claim 3, wherein said moving participants are selected from aircraft (802+804 paragraph [0106]) or wingsuit racers.
Regarding claim 5, Bicanic discloses the augmented reality system of claim 1, wherein said at least one virtual object is a virtual moving participant (Figure 7 and paragraph [0105] describe 708 and 710 to be virtual aircraft. Further, from the perspective of real pilot 802's display, aircraft 804 is virtual; from the perspective of real pilot 804's display, aircraft 802 is virtual, paragraph [0109].).
Regarding claim 6, Bicanic discloses the augmented reality system of claim 1, wherein said screen is a see-through display (paragraph [0068]).
Regarding claim 7, Bicanic discloses the augmented reality system of claim 6, wherein said screen comprises at least one of a head-mounted display (HMD), eyeglasses, Head-Up Display (HUD), smart contact lenses, a virtual retinal display, an eye tap, a Primary Flight Display (PFD) or cockpit windshield (paragraph [0045]).
Regarding claim 8, Bicanic discloses the augmented reality system of claim 1, wherein said system comprises a helmet worn by each of said moving participants, said helmet comprising said display (paragraphs [0039] and [0045]).
Regarding claim 9, Bicanic discloses the augmented reality system of claim 6, further comprising a helmet position sensor system configured to determine a location and orientation of said helmet within said real space (Paragraph [0069]).
Regarding claim 10, Bicanic discloses the augmented reality system of claim 1, wherein at least one of said cameras is stationary (Paragraph [0069] describes the camera imaging the real space of the pilot in the vehicle/aircraft and in communication with the vehicle processor. This inherently describes the camera as mounted to the vehicle, and therefore stationary with respect to the vehicle (while the vehicle can move, the components of the vehicle are considered stationary relative to said vehicle).).
Regarding independent claim 11, Bicanic discloses a method (Figure 6) of delivering AR content (606+610) in a spectator event (Figure 8 806) involving one or more moving participants (802+804) in a real space (US+UK) (Figure 8 references two real-world aircraft 802 and 804, each respectively flying/moving in geographically different real-world airspace (US and UK), each pilot using the disclosed system 100+106 and virtually (806) viewing the other (each in their own respective real space), as described in paragraphs [0106]-[0111]. The virtual space event 806 is detailed in figure 6 and paragraphs [0095]-[0104].), each participant (802+804) having a participant field of view of said real space (figure 7) through a screen (106) (Figure 7 and paragraph [0105] describe the augmented reality display view of the pilot using display/screen device 106.), said spectator event (806) also involving one or more cameras (210+220) (Figure 2 depicts first and second vehicles 208+218, described in paragraphs [0051]-[0052] to be aircraft flown by pilots, with sensors 210 and 220 respectively, described in paragraph [0069] to be cameras.), each camera (210+220) having a camera field of view of said real space (Paragraph [0069] describes imaging (a field of view of) the pilot in the real space inside the vehicle.), said augmented reality system (100) comprising:
generating virtual space (806) corresponding to said real space (US+UK) and at least one virtual object (802, 804, 708, or 710) in said virtual space (Figure 1 platform 100 is described in paragraph [0042] to further facilitate provisioning of a virtual experience. Figure 5 and paragraph [0088] describe that facilitating provision of a virtual experience includes method 500 of generating presentation data 510+512, which is presented on an augmented display as described in figure 6 and paragraphs [0095]-[0104].), said virtual object's location in said virtual space being independent of said moving participants in said real space (paragraph [0109]);
causing an image of a portion of said virtual space (806) corresponding to said participant field of view (Figure 7) of said real space (US+UK) to overlay said field of view (Figure 7 comprises virtual objects, paragraph [0105]) of said real space for each participant (paragraph [0110]); and
capturing video (paragraph [0040] describes video) of said real space from said camera field of view (Paragraph [0069] describes imaging (a field of view of) the pilot in the real space inside the vehicle.); and
causing an image of a portion of said virtual space (806) corresponding to said camera field of view of said real space to overlay video (Figure 1 user 112 and paragraph [0049] describe that the user 112 may access platform 100 via a browser, desktop, or mobile app, each an example of a display. Figure 7 708 and paragraphs [0097], [0100], and [0105] describe virtual aircraft 708 to be flown by real people on the ground via simulators (which inherently comprise displays) displaying the augmented and virtual reality content.).
Regarding claim 12, Bicanic discloses the method of claim 11, wherein said at least one virtual object (802, 804, 708, or 710) is an obstacle (Paragraphs [0105]-[0106] describe that 802 and 804 may be virtual representations of real aircraft (obstacles in a physical sense), which may be enemies (obstacles in the mental sense); 708 and 710 may be virtual representations of virtual aircraft (virtual obstacles in the mental sense).) or a boundary marker.
Regarding claim 13, Bicanic discloses the method of claim 11, wherein said real space is real airspace (US+UK), and said virtual space is virtual airspace (806) (Figure 8 and paragraphs [0106]-[0111]).
Regarding claim 14, Bicanic discloses the method of claim 13, wherein said moving participants are selected from aircraft (802+804 paragraph [0106]) or wingsuit racers.
Regarding claim 15, Bicanic discloses the method of claim 10, wherein said at least one virtual object is a virtual moving participant (Figure 7 and paragraph [0105] describe 708 and 710 to be virtual aircraft. Further, from the perspective of real pilot 802's display, aircraft 804 is virtual; from the perspective of real pilot 804's display, aircraft 802 is virtual, paragraph [0109].).
Regarding claim 16, Bicanic discloses the method of claim 10, wherein said screen is a see-through display (paragraph [0068]).
Regarding claim 17, Bicanic discloses the method of claim 15, wherein said screen comprises at least one of a head-mounted display (HMD), eyeglasses, Head-Up Display (HUD), smart contact lenses, a virtual retinal display, an eye tap, a Primary Flight Display (PFD) or cockpit windshield (paragraph [0045]).
Regarding claim 18, Bicanic discloses the method of claim 10, wherein said system comprises a helmet worn by each of said moving participants, said helmet comprising said display (paragraphs [0039] and [0045]).
Regarding claim 19, Bicanic discloses the method of claim 17, further comprising a helmet position sensor system configured to determine a location and orientation of said helmet within said real space (Paragraph [0069]).
Conclusion
5. Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTOPHER E LEIBY whose telephone number is (571)270-3142. The examiner can normally be reached 11-7.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Amr Awad can be reached at 571-272-7764. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHRISTOPHER E LEIBY/ Primary Examiner, Art Unit 2621