Prosecution Insights
Last updated: April 18, 2026
Application No. 19/042,804

SYSTEMS AND METHODS FOR GENERATING A SMART OVERLAY FOR AN INTERACTIVE DISPLAY

Non-Final OA: §102, §103, §DP
Filed: Jan 31, 2025
Examiner: SCHNURR, JOHN R
Art Unit: 2425
Tech Center: 2400 — Computer Networks
Assignee: Stats LLC
OA Round: 1 (Non-Final)
Grant Probability: 72% (Favorable)
OA Rounds: 1-2
To Grant: 2y 6m
With Interview: 83%

Examiner Intelligence

Career Allow Rate: 72% (above average; 678 granted / 943 resolved; +13.9% vs TC avg)
Interview Lift: +10.8% (moderate; based on resolved cases with interview)
Typical Timeline: 2y 6m avg prosecution (27 applications currently pending)
Career History: 970 total applications across all art units

Statute-Specific Performance

§101: 4.7% (-35.3% vs TC avg)
§103: 51.9% (+11.9% vs TC avg)
§102: 19.0% (-21.0% vs TC avg)
§112: 10.5% (-29.5% vs TC avg)
Tech Center averages are estimates • Based on career data from 943 resolved cases
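The statute-specific figures above are internally consistent: each examiner rate and its delta imply the same Tech Center baseline. A minimal sketch of that arithmetic, assuming the "vs TC avg" delta is simply the examiner rate minus the Tech Center average (all in percentage points):

```python
# Recover the implied Tech Center average from each (rate, delta) pair
# shown above. Assumption: delta = examiner_rate - tc_average.

stats = {
    "§101": (4.7, -35.3),
    "§103": (51.9, +11.9),
    "§102": (19.0, -21.0),
    "§112": (10.5, -29.5),
}

for statute, (rate, delta) in stats.items():
    tc_average = rate - delta  # implied Tech Center average, in percent
    print(f"{statute}: examiner {rate:.1f}%, implied TC average {tc_average:.1f}%")
```

All four pairs imply the same ~40% baseline, consistent with the deltas having been computed against a single Tech Center average.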

Office Action

§102 §103 §DP
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

This Office Action is in response to Application No. 19/042,804 filed 01/31/2025. Claims 1-20 are pending and have been examined. The information disclosure statement (IDS) submitted on 06/02/2025 was considered by the examiner.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159.
See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1-20 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 3-8, 10-15 and 17-20 of copending Application No. 19/043,021 in view of Evans (US 2019/0273954). This is a provisional nonstatutory double patenting rejection.

Claim 1 of Application No. 19/042,804: A computer-implemented method for generating a smart overlay in an interactive display, the method comprising: receiving, by one or more processors, a plurality of real-time event data comprising a plurality of real-time event actions; receiving, by the one or more processors, a plurality of user data comprising a plurality of user actions; capturing, by the one or more processors, one or more real-time user interactions with the interactive display; generating, by the one or more processors, a unique relevancy threshold using the plurality of real-time event actions, the plurality of user actions, and the one or more real-time user interactions, wherein the unique relevancy threshold is generated in real-time as the plurality of real-time event data and the one or more real-time user interactions are received; generating, by the one or more processors and in real-time, at least one unique smart overlay that has a relevancy that exceeds the unique relevancy threshold; and updating, by the one or more processors and in real-time, the interactive display with the at least one unique smart overlay.

Claim 1 of copending Application No. 19/043,021: A computer-implemented method for generating smart triggers in an interactive display, the method comprising: receiving, by one or more processors, a plurality of real-time event data comprising a plurality of real-time event actions; receiving, by the one or more processors, a plurality of user data comprising a plurality of user actions; capturing, by the one or more processors, one or more real-time user interactions with the interactive display; generating, by the one or more processors, a unique relevancy threshold using the plurality of real-time event actions, the plurality of user actions, and the one or more real-time user interactions, wherein the unique relevancy threshold is generated in real-time as the plurality of real-time event data and the one or more real-time user interactions are received; generating, by the one or more processors and in real-time, at least one unique smart trigger that has a relevancy that exceeds the unique relevancy threshold; and updating, by the one or more processors and in real-time, the interactive display with the at least one unique smart trigger.

However, the copending claims do not explicitly teach generating at least one unique smart overlay; and updating the interactive display with the at least one unique smart overlay. In an analogous art, Evans, which discloses a system for video distribution, clearly teaches generating at least one unique smart overlay (Fig. 3: Analyzer 160 selects the adjunct content to be provided to the user with the program content based on the adjusted selection criteria, [0034], [0043], [0046]) and updating the interactive display with the at least one unique smart overlay (Fig. 3: The broadcast engine 164 provides the program content with the selected adjunct content to the viewer devices 106, [0017], [0037], [0044], [0045]). Therefore, before the effective filing date of the claimed invention, it would have been obvious to one with ordinary skill in the art to modify the copending claims by generating at least one unique smart overlay and updating the interactive display with the at least one unique smart overlay, as taught by Evans, for the benefit of displaying relevant information to the user.

Claim 2 of the application corresponds to claim 1 of the copending application in view of Evans [0013], [0034], [0049]. Claims 3-8 of the application correspond to claims 3-8 of the copending application, respectively, in view of Evans.
Claim 9 of the application corresponds to claim 8 of the copending application in view of Evans [0013], [0034], [0049]. Claims 10-15 of the application correspond to claims 10-15 of the copending application, respectively, in view of Evans. Claim 16 of the application corresponds to claim 15 of the copending application in view of Evans [0013], [0034], [0049]. Claims 17-20 of the application correspond to claims 17-20 of the copending application, respectively, in view of Evans.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 2, 4-9, 11-16 and 18-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Evans (US 2019/0273954).

Consider claim 1, Evans clearly teaches a computer-implemented method for generating a smart overlay in an interactive display, (Figs. 3-6, [0047]) the method comprising: receiving, by one or more processors, a plurality of real-time event data comprising a plurality of real-time event actions; (Fig. 3: Analyzer 160 monitors real-time events occurring in a live event, [0032], [0033].) receiving, by the one or more processors, a plurality of user data comprising a plurality of user actions; (Fig. 3: Analyzer 160 analyzes user information collected about the user such as preferences and historical activity, [0029], [0035], [0039].) capturing, by the one or more processors, one or more real-time user interactions with the interactive display; (Fig. 3: Analyzer 160 analyzes real-time user engagement and actions, [0030], [0033], [0039].) generating, by the one or more processors, a unique relevancy threshold using the plurality of real-time event actions, the plurality of user actions, and the one or more real-time user interactions, wherein the unique relevancy threshold is generated in real-time as the plurality of real-time event data and the one or more real-time user interactions are received; (Fig. 3: In response to a triggering event analyzer 160 adjusts selection criteria based on the real-time events, user information and real-time user engagement and actions, [0034], [0043], [0046].)
generating, by the one or more processors and in real-time, at least one unique smart overlay that has a relevancy that exceeds the unique relevancy threshold; (Fig. 3: Analyzer 160 selects the adjunct content to be provided to the user with the program content based on the adjusted selection criteria, [0034], [0043], [0046].) and updating, by the one or more processors and in real-time, the interactive display with the at least one unique smart overlay. (Fig. 3: The broadcast engine 164 provides the program content with the selected adjunct content to the viewer devices 106, [0017], [0037], [0044], [0045].)

Consider claim 2, Evans clearly teaches the at least one unique smart overlay is an interactive interface overlaid on a live video stream on a user device. (Fig. 9: An interactive overlay 920 is inserted over program content 910, [0013], [0034], [0049].)

Consider claim 4, Evans clearly teaches the at least one unique smart overlay is generated and/or updated based on one or more of the plurality of real-time event actions, the plurality of user actions, and the one or more real-time user interactions. (Fig. 3: In response to a triggering event analyzer 160 adjusts selection criteria based on the real-time events, user information and real-time user engagement and actions and selects the adjunct content to be provided to the user with the program content based on the adjusted selection criteria, [0034], [0043], [0046].)

Consider claim 5, Evans clearly teaches the interactive display is displayed on a user mobile device. (Fig. 1: Viewer devices include mobile devices, [0017].)

Consider claim 6, Evans clearly teaches the plurality of real-time event actions include at least one of a scored goal, a completed pass, an interception, a goal conceded, or no action. (Real-time events include a goal, [0033], [0050].)
Consider claim 7, Evans clearly teaches one or more of the unique relevancy threshold, the plurality of real-time event actions, the plurality of user actions, and the one or more real-time user interactions is provided to one or more artificial intelligence models as input. (Analyzer 160 uses artificial intelligence to provide the adjunct content, [0049].)

Consider claim 8, Evans clearly teaches a system for generating a smart overlay in an interactive display, (Fig. 1) the system comprising: a memory storing instructions and a processor operatively connected to the memory and configured to execute the instructions to perform operations ([0020], [0047]) including: receiving, by one or more processors, a plurality of real-time event data comprising a plurality of real-time event actions; (Fig. 3: Analyzer 160 monitors real-time events occurring in a live event, [0032], [0033].) receiving, by the one or more processors, a plurality of user data comprising a plurality of user actions; (Fig. 3: Analyzer 160 analyzes user information collected about the user such as preferences and historical activity, [0029], [0035], [0039].) capturing, by the one or more processors, one or more real-time user interactions with the interactive display; (Fig. 3: Analyzer 160 analyzes real-time user engagement and actions, [0030], [0033], [0039].) generating, by the one or more processors, a unique relevancy threshold using the plurality of real-time event actions, the plurality of user actions, and the one or more real-time user interactions, wherein the unique relevancy threshold is generated in real-time as the plurality of real-time event data and the one or more real-time user interactions are received; (Fig. 3: In response to a triggering event analyzer 160 adjusts selection criteria based on the real-time events, user information and real-time user engagement and actions, [0034], [0043], [0046].)
generating, by the one or more processors and in real-time, at least one unique smart overlay that has a relevancy that exceeds the unique relevancy threshold; (Fig. 3: Analyzer 160 selects the adjunct content to be provided to the user with the program content based on the adjusted selection criteria, [0034], [0043], [0046].) and updating, by the one or more processors and in real-time, the interactive display with the at least one unique smart overlay. (Fig. 3: The broadcast engine 164 provides the program content with the selected adjunct content to the viewer devices 106, [0017], [0037], [0044], [0045].)

Consider claim 9, Evans clearly teaches the at least one unique smart overlay is an interactive interface overlaid on a live video stream on a user device. (Fig. 9: An interactive overlay 920 is inserted over program content 910, [0013], [0034], [0049].)

Consider claim 11, Evans clearly teaches the at least one unique smart overlay is generated and/or updated based on one or more of the plurality of real-time event actions, the plurality of user actions, and the one or more real-time user interactions. (Fig. 3: In response to a triggering event analyzer 160 adjusts selection criteria based on the real-time events, user information and real-time user engagement and actions and selects the adjunct content to be provided to the user with the program content based on the adjusted selection criteria, [0034], [0043], [0046].)

Consider claim 12, Evans clearly teaches the interactive display is displayed on a user mobile device. (Fig. 1: Viewer devices include mobile devices, [0017].)

Consider claim 13, Evans clearly teaches the plurality of real-time event actions include at least one of a scored goal, a completed pass, an interception, a goal conceded, or no action. (Real-time events include a goal, [0033], [0050].)
Consider claim 14, Evans clearly teaches one or more of the unique relevancy threshold, the plurality of real-time event actions, the plurality of user actions, and the one or more real-time user interactions is provided to one or more artificial intelligence models as input. (Analyzer 160 uses artificial intelligence to provide the adjunct content, [0049].)

Consider claim 15, Evans clearly teaches a non-transitory computer-readable medium storing instructions that, when executed by one or more processors, perform a method for generating a smart overlay in an interactive display, ([0020], [0047]) the method comprising: receiving, by one or more processors, a plurality of real-time event data comprising a plurality of real-time event actions; (Fig. 3: Analyzer 160 monitors real-time events occurring in a live event, [0032], [0033].) receiving, by the one or more processors, a plurality of user data comprising a plurality of user actions; (Fig. 3: Analyzer 160 analyzes user information collected about the user such as preferences and historical activity, [0029], [0035], [0039].) capturing, by the one or more processors, one or more real-time user interactions with the interactive display; (Fig. 3: Analyzer 160 analyzes real-time user engagement and actions, [0030], [0033], [0039].) generating, by the one or more processors, a unique relevancy threshold using the plurality of real-time event actions, the plurality of user actions, and the one or more real-time user interactions, wherein the unique relevancy threshold is generated in real-time as the plurality of real-time event data and the one or more real-time user interactions are received; (Fig. 3: In response to a triggering event analyzer 160 adjusts selection criteria based on the real-time events, user information and real-time user engagement and actions, [0034], [0043], [0046].)
generating, by the one or more processors and in real-time, at least one unique smart overlay that has a relevancy that exceeds the unique relevancy threshold; (Fig. 3: Analyzer 160 selects the adjunct content to be provided to the user with the program content based on the adjusted selection criteria, [0034], [0043], [0046].) and updating, by the one or more processors and in real-time, the interactive display with the at least one unique smart overlay. (Fig. 3: The broadcast engine 164 provides the program content with the selected adjunct content to the viewer devices 106, [0017], [0037], [0044], [0045].)

Consider claim 16, Evans clearly teaches the at least one unique smart overlay is an interactive interface overlaid on a live video stream on a user device. (Fig. 9: An interactive overlay 920 is inserted over program content 910, [0013], [0034], [0049].)

Consider claim 18, Evans clearly teaches the at least one unique smart overlay is generated and/or updated based on one or more of the plurality of real-time event actions, the plurality of user actions, and the one or more real-time user interactions. (Fig. 3: In response to a triggering event analyzer 160 adjusts selection criteria based on the real-time events, user information and real-time user engagement and actions and selects the adjunct content to be provided to the user with the program content based on the adjusted selection criteria, [0034], [0043], [0046].)

Consider claim 19, Evans clearly teaches the interactive display is displayed on a user mobile device. (Fig. 1: Viewer devices include mobile devices, [0017].)

Consider claim 20, Evans clearly teaches one or more of the unique relevancy threshold, the plurality of real-time event actions, the plurality of user actions, and the one or more real-time user interactions is provided to one or more artificial intelligence models as input. (Analyzer 160 uses artificial intelligence to provide the adjunct content, [0049].)
Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 3, 10 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Evans (US 2019/0273954) in view of Zhang et al. (US 2022/0368979), herein Zhang.
Consider claim 3, Evans clearly teaches the plurality of real-time event actions, the plurality of user actions, and the one or more real-time user interactions. However, Evans does not explicitly teach a position of the at least one unique smart overlay within the interactive display is automatically determined based on one or more of the plurality of real-time event actions, the plurality of user actions, the one or more real-time user interactions, and a camera angle of a live video stream. In an analogous art, Zhang, which discloses a system for video distribution, clearly teaches a position of the at least one unique smart overlay within the interactive display is automatically determined based on one or more of the plurality of real-time event actions, the plurality of user actions, the one or more real-time user interactions, and a camera angle of a live video stream. (Fig. 6: A location in the video is determined based on features in the video frames and the overlay is positioned at this location, [0082]-[0090].) Therefore, before the effective filing date of the claimed invention, it would have been obvious to one with ordinary skill in the art to modify the system of Evans so that a position of the at least one unique smart overlay within the interactive display is automatically determined based on one or more of the plurality of real-time event actions, the plurality of user actions, the one or more real-time user interactions, and a camera angle of a live video stream, as taught by Zhang, for the benefit of preventing the overlay from interfering with important portions of the video stream.

Consider claim 10, Evans combined with Zhang clearly teaches a position of the at least one unique smart overlay within the interactive display is automatically determined based on one or more of the plurality of real-time event actions, the plurality of user actions, the one or more real-time user interactions, and a camera angle of a live video stream. (Fig. 6: A location in the video is determined based on features in the video frames and the overlay is positioned at this location, Zhang [0082]-[0090].)

Consider claim 17, Evans combined with Zhang clearly teaches a position of the at least one unique smart overlay within the interactive display is automatically determined based on one or more of the plurality of real-time event actions, the plurality of user actions, the one or more real-time user interactions, and a camera angle of a live video stream. (Fig. 6: A location in the video is determined based on features in the video frames and the overlay is positioned at this location, Zhang [0082]-[0090].)

Conclusion

In the case of amending the claimed invention, applicant is respectfully requested to indicate the portion(s) of the specification which dictate(s) the structure relied on for proper interpretation and also to verify and ascertain the metes and bounds of the claimed invention.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOHN R SCHNURR whose telephone number is (571)270-1458. The examiner can normally be reached M-F 6a-4p. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Brian Pendleton, can be reached at (571)272-7527. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JOHN R SCHNURR/
Primary Examiner, Art Unit 2425

Prosecution Timeline

Jan 31, 2025
Application Filed
Mar 26, 2026
Non-Final Rejection — §102, §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593962
ENDOSCOPE SYSTEM AND COORDINATE SYSTEM CORRECTION METHOD
2y 5m to grant • Granted Apr 07, 2026
Patent 12598359
DISPLAY DEVICE
2y 5m to grant • Granted Apr 07, 2026
Patent 12587703
VIDEO DISPLAY SYSTEM, OBSERVATION DEVICE, INFORMATION PROCESSING METHOD, AND RECORDING MEDIUM
2y 5m to grant • Granted Mar 24, 2026
Patent 12587729
Method And System For A Trail Camera With Modular Fresnel Lenses
2y 5m to grant • Granted Mar 24, 2026
Patent 12579603
IMAGE PROJECTION DEVICE AND METHOD FOR OPERATING THE SAME
2y 5m to grant • Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 72%
With Interview: 83% (+10.8%)
Median Time to Grant: 2y 6m
PTA Risk: Low
Based on 943 resolved cases by this examiner. Grant probability derived from career allow rate.
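As the note says, the headline figures are derived rather than independently estimated. A minimal sketch of the arithmetic, assuming grant probability is taken directly from the career allow rate and the interview figure simply adds the observed lift:

```python
# Hypothetical reconstruction of the projection figures above from the
# examiner's career counts (678 granted of 943 resolved) and the
# observed +10.8 percentage-point interview lift.

granted, resolved = 678, 943
grant_probability = granted / resolved * 100   # career allow rate, in percent
with_interview = grant_probability + 10.8      # interview-adjusted estimate

print(f"Grant probability: {grant_probability:.0f}%")  # 72%
print(f"With interview: {with_interview:.0f}%")        # 83%
```

Both printed values round to the figures shown above (71.9% → 72%; 82.7% → 83%), so the projections are consistent with the stated career data.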
