DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
This Office action is in response to the amendment filed 9/29/2025.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-5, 9-14 and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Wallen et al. (US 11,556,169 B2, hereinafter Wallen) in view of Rowley (US 2021/0056750 A1).
Regarding claim 1, Wallen discloses a system as shown in figure 12 comprising: at least one processor (figure 12, 1202); and memory (figure 12, 1204) storing instructions that, when executed by the at least one processor, cause the system to perform a set of operations (col. 21 lines 11-13, memory 1204 includes main memory for storing instructions for processor 1202 to execute or data for processor 1202 to operate on), the set of operations comprising: display a content stream on a virtual reality device (figures 3, 8A-8B, col. 7 lines 9-19 and col. 13 lines 32-44, a VR display device may display a rendered VR environment); receive one or more interest indicators, wherein the one or more interest indicators relate to one or more of the content stream and/or a user (col. 8 line 27 through col. 9 line 8, detect a change in context of a first user); generate an overlay based on the one or more interest indicators, wherein the overlay includes one or more interactive elements (col. 8 line 27 through col. 9 line 8, receive an input by the user selecting an application that is associated with a change in the form factor and/or pose of the personal UI, or detect a change from one application to another); display the content stream with the overlay on the virtual reality device (col. 7 lines 9-19, the user may view the VR environment or a passthrough view of the real-world environment); receive a selection of an interactive element (col. 7 lines 20-44, the user may use the personal UI to execute one or more applications); and perform a workflow associated with the selected interactive element (col. 7 lines 20-44, the personal UI may be a feature of the VR operating system (VROS) associated with the virtual reality system). Although Wallen teaches the VR display device to execute one or more tasks, such as gaming applications (col.
7, lines 9-44), Wallen differs from the claimed invention in not specifically teaching utilizing the VR display device to execute a betting game, wherein an interest indicator related to the user is based upon a betting history of the user. However, Rowley teaches a method for creating and distributing low-latency interactive addressable virtual content on betting (see figure 10), wherein an interactive input engine may capture dynamic characteristics including one or more of current location, sports team affiliations, gambling habits, browsing habits, betting estimates, browsing estimates, and/or any other data or patterns that may allow predictive analysis, so that the user may also be presented, via live content, with their recent betting history, their results, and the results of other fans, of other regions, and other such related information to help them wager ([0091]), in order to create a symbiotic relationship between consumer and consumed content. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Wallen such that an interest indicator related to the user is based upon a betting history of the user, as per the teaching of Rowley, to create a symbiotic relationship between consumer and consumed content.
Regarding claim 2, Wallen discloses that displaying the content stream with the overlay further comprises: receiving an indication that the overlay is stale; generating a subsequent overlay based on the one or more interest indicators; and displaying the content stream with the subsequent overlay (figure 6 and col. 9 lines 19-54 and col. 10 line 38 through col. 11 line 36, the personal UI may adapt to a new form factor and/or pose with respect to the user in the VR environment, i.e., if the user has changed from a first pose to a second pose, the virtual reality system may adapt the personal UI to the user's second pose).
Regarding claim 3, Wallen discloses that receiving the indication that the overlay is stale comprises one or more of an expiration of a timer, an indication from the user that they are not interested in the overlay, and an occurrence of an action in the content stream indicating the overlay is stale (col. 8 line 27 through col. 9 line 18, when the virtual reality system determines the user is within the threshold distance of the object and the user is facing the object, the virtual reality system may detect the user has changed from a first pose to a second pose in the VR environment, such that the system may detect a change in context occurring over a period of time).
Regarding claim 4, Wallen discloses that generating the overlay further comprises personalizing the overlay for the user based on the one or more interest indicators (col. 8 lines 27-51 and col. 9 lines 19-54, the virtual reality system may detect a change in a context of the first user with respect to the VR environment).
Regarding claim 5, Wallen discloses that the one or more interest indicators comprise one or more of a content stream information, user profile information, user history information, user eye gaze history, product information, and common user characteristic information (col. 4 lines 44-45 and col. 18 line 67 through col. 19 line 11, an internal camera for capturing the user's eye gaze for eye-tracking purposes, and VR system may include one or more user-profile stores for storing user profiles, for example, biographic information, demographic information, behavioral information, social information, or other types of descriptive information, such as work experience, educational history, hobbies or preferences, interests, affinities, or location. Interest information may include interests related to one or more categories).
Regarding claim 9, Wallen discloses that performing the workflow further comprises performing one or more of the following operations: place a bet, purchase an item, interact with an interactive element of the overlay, transition to another aspect of the overlay, and select a viewing angle (col. 9 lines 19-54 and col. 17 lines 49-59, the virtual reality system receiving an input by the user selecting a particular application on the personal UI from the plurality of applications and VR system may belong, events or calendar entries in which a user might be interested, computer-based applications that a user may use, transactions that allow users to buy or sell items via the service, interactions with advertisements that a user may perform, or other suitable items or objects).
Regarding claim 10, the limitations of the claim are rejected for the same reasons as set forth in claim 1.
Regarding claim 11, the limitations of the claim are rejected for the same reasons as set forth in claim 2.
Regarding claim 12, the limitations of the claim are rejected for the same reasons as set forth in claim 3.
Regarding claim 13, the limitations of the claim are rejected for the same reasons as set forth in claim 4.
Regarding claim 14, the limitations of the claim are rejected for the same reasons as set forth in claim 5.
Regarding claim 18, the limitations of the claim are rejected for the same reasons as set forth in claim 9.
Regarding claim 19, the limitations of the claim are rejected for the same reasons as set forth in claim 1.
Regarding claim 20, the limitations of the claim are rejected for the same reasons as set forth in claim 2.
Claims 6-7 and 15-16 are rejected under 35 U.S.C. 103 as being unpatentable over Wallen et al. (US 11,556,169 B2, hereinafter Wallen) in view of Rowley (US 2021/0056750 A1) as applied to claims above, and further in view of Seidel (US 11,308,764 B1).
Regarding claims 6-7, the combination of Wallen and Rowley differs from the claimed invention in not specifically teaching that the set of operations further comprises: geolocate the virtual reality device, wherein geolocating the virtual reality device further comprises: requesting a current location of the virtual reality device; determining if the current location of the virtual reality device is an unknown location or a boundary location; and when the current location is not an unknown location or a boundary location, confirming the current location of the virtual reality device and performing a workflow associated with the request. However, Seidel teaches determining client computing device location using geolocation services, so that a server may determine a first physical location of the first client computing device sent by the first client computing device and confirm that the first physical location is a valid geographic location located within a particular geofenced area (col. 20 line 62 through col. 21 line 23), in order to allow participation in both free and wagered parlays for those players in geographical regions where this form of gambling is legal.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Wallen and Rowley to geolocate the virtual reality device, wherein geolocating the virtual reality device further comprises: requesting a current location of the virtual reality device; determining if the current location of the virtual reality device is an unknown location or a boundary location; and when the current location is not an unknown location or a boundary location, confirming the current location of the virtual reality device and performing a workflow associated with the request, as per the teaching of Seidel, in order to allow participation in both free and wagered parlays for those players in geographical regions where this form of gambling is legal.
Regarding claims 15-16, the limitations of the claims are rejected for the same reasons as set forth in claims 6-7.
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Wallen et al. (US 11,556,169 B2, hereinafter Wallen) in view of Rowley (US 2021/0056750 A1) as applied to claim 1 above, and further in view of Todd (US 20210019982 A1).
Regarding claim 8, the combination of Wallen and Rowley differs from the claimed invention in not specifically teaching that receiving the selection of the interactive element further comprises receiving a gesture input or a physical input from a physical device. However, it is old and well known in the art that receiving the selection of an interactive element may comprise receiving a gesture input or a physical input from a physical device, i.e., utilizing a gesture input; for example, see Todd (figure 58 and [0596], detected gestures may be mapped as inputs to the content mixing and layering system, such as to allow a gesture to cause a mark on the screen, to highlight an element, to re-size a window, to write on a display, or otherwise configure any parameter of an output stream that appears on the display device) in order to allow a user to play video games with hands or body as an input ([0604]-[0605]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Wallen and Rowley such that receiving the selection of the interactive element further comprises receiving a gesture input or a physical input from a physical device, as per the teaching of Todd, in order to allow a user to play video games with hands or body as an input.
Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Wallen et al. (US 11,556,169 B2, hereinafter Wallen) in view of Rowley (US 2021/0056750 A1) and Seidel (US 11,308,764 B1) as applied to claim 16 above, and further in view of Todd (US 20210019982 A1).
Regarding claim 17, the combination of Wallen, Rowley and Seidel differs from the claimed invention in not specifically teaching that receiving the selection of the interactive element further comprises receiving a gesture input or a physical input from a physical device. However, it is old and well known in the art that receiving the selection of an interactive element may comprise receiving a gesture input or a physical input from a physical device, i.e., utilizing a gesture input; for example, see Todd (figure 58 and [0596], detected gestures may be mapped as inputs to the content mixing and layering system, such as to allow a gesture to cause a mark on the screen, to highlight an element, to re-size a window, to write on a display, or otherwise configure any parameter of an output stream that appears on the display device) in order to allow a user to play video games with hands or body as an input ([0604]-[0605]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Wallen, Rowley and Seidel such that receiving the selection of the interactive element further comprises receiving a gesture input or a physical input from a physical device, as per the teaching of Todd, in order to allow a user to play video games with hands or body as an input.
Response to Arguments
Applicant’s arguments with respect to claims 1-20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Jovanovic et al. (US 2023/0360323 A1) discloses a virtual reality (VR) device to expand viewing options of an event by combining multiple video streams of the event with interactive overlays (abstract and [0004] - [0005]).
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to GEORGE ENG whose telephone number is (571)272-7495. The examiner can normally be reached on a flexible schedule, Monday to Friday, 7 am to 3 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Alford Kindred, can be reached at 571-272-4037. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/GEORGE ENG/Supervisory Patent Examiner, Art Unit 2699