Prosecution Insights
Last updated: April 19, 2026
Application No. 18/772,314

SYSTEMS METHODS AND COMPUTER STORAGE MEDIA FOR COLLECTING AND PROCESSING PROGRESSIVISTIC METADATA

Non-Final OA — §102, §112, §DP
Filed
Jul 15, 2024
Examiner
ANDERSEN, KRISTOPHER E
Art Unit
2159
Tech Center
2100 — Computer Architecture & Software
Assignee
Fantastic Athletes Corporation
OA Round
1 (Non-Final)
Grant Probability: 70% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 1m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 70% (250 granted / 358 resolved; +14.8% vs TC avg, above average)
Interview Lift: +40.2% among resolved cases with interview (strong)
Typical Timeline: 3y 1m average prosecution; 9 applications currently pending
Career History: 367 total applications across all art units

Statute-Specific Performance

§101: 21.1% (-18.9% vs TC avg)
§103: 41.4% (+1.4% vs TC avg)
§102: 11.3% (-28.7% vs TC avg)
§112: 19.9% (-20.1% vs TC avg)
Tech Center averages are estimates. Based on career data from 358 resolved cases.

Office Action

§102, §112, §DP
DETAILED ACTION

In response to claims filed 15 July 2024, this is the first Office action on the merits. Claims 1-20 are pending.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting, provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA.
A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b). The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines which form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-19 of U.S. Patent No. 11,113,332 B2. Although the claims at issue are not identical, they are not patentably distinct from each other, as shown in the following comparison.

Instant Application, claim 1:
A system for collecting and processing progressivistic metadata comprises: (a) at least one video-related module selected from the group consisting of: (I) an accessible video files database, configured for storing at least one source video file, wherein said video file comprising a plurality of particular frames and/or distinct segments, of a recording of a real-life event; (II) a timecode database, configured for storing a timecode file, uniquely identifying at least one member selected from the group consisting of: particular timeframes in said real-life event and distinct time-segments in said real-life event; (b) a server configured for generating a progressivistic metadata file, wherein events encoded into said progressivistic metadata file are respectively related to at least one member selected from the group consisting of: said particular frames and/or said distinct segments in said video file in said video files database and said particular timeframes and/or distinct time-segments in said timecode file, in said timecode database; (c) a progressivistic metadata database, configured for storing said progressivistic metadata file.

U.S. Patent No. 11,113,332 B2, claim 1:
A system for collecting and processing progressivistic metadata in sports comprises: a) a video file source, wherein said video file comprising a video file layer including a recording of a sport event; b) a video file database, configured for storing said video file therein, wherein said video file further comprising a timecode layer, configured for uniquely identifying particular frames or distinct segments in said video file layer; said system is characterized by: c) a sub-system for collecting and processing progressivistic metadata comprising: (I) a server configured for generating a progressivistic metadata layer, wherein events encoded into said progressivistic metadata layer are respectively related to particular instances in said timecode layer; (II) a database configurable for storing composite files, wherein said composite files further comprising said progressivistic metadata layer.

Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 11,429,666 B2. Although the claims at issue are not identical, they are not patentably distinct from each other, as shown in the following comparison.

Instant Application, claim 1:
A system for collecting and processing progressivistic metadata comprises: (a) at least one video-related module selected from the group consisting of: (I) an accessible video files database, configured for storing at least one source video file, wherein said video file comprising a plurality of particular frames and/or distinct segments, of a recording of a real-life event; (II) a timecode database, configured for storing a timecode file, uniquely identifying at least one member selected from the group consisting of: particular timeframes in said real-life event and distinct time-segments in said real-life event; (b) a server configured for generating a progressivistic metadata file, wherein events encoded into said progressivistic metadata file are respectively related to at least one member selected from the group consisting of: said particular frames and/or said distinct segments in said video file in said video files database and said particular timeframes and/or distinct time-segments in said timecode file, in said timecode database; (c) a progressivistic metadata database, configured for storing said progressivistic metadata file.

U.S. Patent No. 11,429,666 B2, claim 1:
A system for collecting and processing progressivistic metadata comprises: a) a video file database, configured for storing a source video file, wherein said video file comprising a recording of a real-life event; b) a timecode file database, configured for storing a timecode file, uniquely identifying at least one member selected from the group consisting of: particular frames in said video file and distinct segments in said video file; said system is characterized by: c) a sub-system for collecting and processing progressivistic metadata comprising: (I) a server configured for generating a progressivistic metadata file, wherein events encoded into said progressivistic metadata file are respectively related to particular instances in said timecode file; (II) a database configurable for storing said progressivistic metadata file.

Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 12,038,971 B2. Although the claims at issue are not identical, they are not patentably distinct from each other, as shown in the following comparison.

Instant Application, claim 1:
A system for collecting and processing progressivistic metadata comprises: (a) at least one video-related module selected from the group consisting of: (I) an accessible video files database, configured for storing at least one source video file, wherein said video file comprising a plurality of particular frames and/or distinct segments, of a recording of a real-life event; (II) a timecode database, configured for storing a timecode file, uniquely identifying at least one member selected from the group consisting of: particular timeframes in said real-life event and distinct time-segments in said real-life event; (b) a server configured for generating a progressivistic metadata file, wherein events encoded into said progressivistic metadata file are respectively related to at least one member selected from the group consisting of: said particular frames and/or said distinct segments in said video file in said video files database and said particular timeframes and/or distinct time-segments in said timecode file, in said timecode database; (c) a progressivistic metadata database, configured for storing said progressivistic metadata file.

U.S. Patent No. 12,038,971 B2, claim 1:
A system for collecting and processing progressivistic metadata comprises: a) a timecode database, configured for storing an absolute timecode file, uniquely identifying at least one member selected from the group consisting of: particular timeframes in a real-life event and distinct time-segments in said real-life event; b) an accessible video files database, configured for storing at least one source video file, wherein said video file comprising a recording of said real-life event, wherein said video file further comprises a relative timecode file; wherein said particular time-frames and/or distinct time-segments in said real-life event in said absolute timecode file in said timecode database are respectively related to particular frames and/or distinct segments in said relative timecode file of said at least one source video file in said video files database; c) a server configured for generating a progressivistic metadata file, wherein events encoded into said progressivistic metadata file are respectively related to said particular time-frames and/or distinct time-segments in said absolute timecode file, in said absolute timecode file database; d) a progressivistic metadata database, configured for storing said progressivistic metadata file.

Claim Objections

Claims 8-9, 16-17, and 19 are objected to because of the following informalities:

Claim 8: “said relative timecode file” and “said absolute timecode file” lack antecedent basis and should be --a relative timecode file-- and --an absolute timecode file--, respectively (lines 2-3).
Claim 9: “said relative timecode file” and “said absolute timecode file” lack antecedent basis and should be --a relative timecode file-- and --an absolute timecode file--, respectively (lines 2-3).
Claim 16: “said relative timecode file” and “said absolute timecode file” lack antecedent basis and should be --a relative timecode file-- and --an absolute timecode file--, respectively (line 3).
Claim 17: “said relative timecode file” and “said absolute timecode file” lack antecedent basis and should be --a relative timecode file-- and --an absolute timecode file--, respectively (lines 2-3).
Claim 19: “said video file in said video files database” lacks antecedent basis and should be --said video file-- (lines 15-16).
Claim 19: “said absolute timecode file” and “said absolute timecode file database” have antecedent basis in --said timecode file-- (lines 16-17).

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claims 1, 11, and 19 recite “at least one video-related module [constituent]” selected from the group consisting of an “accessible video files database” (“at least one source video file”) and a “timecode database” (“timecode file”). Only one of these elements is required by the claim. However, the antecedent basis established by the claim requires both elements. For example, the timecode database uniquely identifies “timeframes in said real-life event” or “distinct time-segments in said real-life event,” referencing the “recording of a real-life event” in the video files database.
It is therefore unclear whether the claim requires only “one” of the video-related modules/constituents or both. This ambiguity renders the claims indefinite. Claims 2-10, 12-18, and 20 are rejected because they inherit this deficiency. For the purpose of applying prior art, claims 1, 11, and 19 are interpreted as requiring both modules/constituents; i.e., limitation “(a)” is interpreted as requiring both modules/constituents (I) and (II).

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Shichman et al. (US 2018/0132011 A1).

Regarding claim 1, Shichman teaches a system for collecting and processing progressivistic metadata comprises: (a) at least one video-related module selected from the group consisting of: (I) an accessible video files database, configured for storing at least one source video file, wherein said video file comprising a plurality of particular frames and/or distinct segments, of a recording of a real-life event (see Shichman [0049], “video archive”); (II) a timecode database, configured for storing a timecode file, uniquely identifying at least one member selected from the group consisting of: particular timeframes in said real-life event and distinct time-segments in said real-life event (see Shichman [0055]-[0056], “database 220” and “time stamps that associate the metadata object with a segment of a video clip”); (b) a server configured for generating a progressivistic metadata file, wherein events encoded into said progressivistic metadata file are respectively related to at least
one member selected from the group consisting of: said particular frames and/or said distinct segments in said video file in said video files database and said particular timeframes and/or distinct time-segments in said timecode file, in said timecode database (see Shichman [0057]-[0058] and [0066], “create new events . . . identify an event that was previously unknown”); (c) a progressivistic metadata database, configured for storing said progressivistic metadata file (see Shichman [0057], “event object . . . file or entry . . . stored”).

Regarding claim 2, Shichman teaches further comprises a performance data module, configured for collecting and processing performance data, wherein events logged into a performance data file are related to events encoded into said progressivistic metadata file (see Shichman [0058], “information in an event object may be used for clipping a segment of a source video to a separate video file”).

Regarding claim 3, Shichman teaches further comprises at least one progressivistic metadata interface, configured for encoding events into said progressivistic metadata file (see Shichman [0074]-[0075], “user interface module . . . user and may modify or create events”).

Regarding claim 4, Shichman teaches further comprises at least one progressivistic metadata logging interface, configured for encoding events into said progressivistic metadata file, selected from the group consisting of: (a) a computer terminal comprising a human-machine interface, configured for manually logging events into said progressivistic metadata file, by a human operator; (b) an interface configured for logging relations between said events encoded into said progressivistic metadata file and said events encoded into said performance data file; (c) an interface configured for specifying an associative metric (see Shichman [0074]-[0075], “user interface module . . . user and may modify or create events”).
Regarding claim 5, Shichman teaches further comprises a controllable playback device, wherein encoding events into said progressivistic metadata file is performed by a human operator in a real-time regime or near real-time regime (see Shichman [0077], “User interface module 320 may present events . . . as they happen” and “confirm an event, remove an event”).

Regarding claim 6, Shichman teaches further comprises at least one automated progressivistic metadata logging module, comprising a machine learning device, configured for analyzing said at least one source video file and identifying progressivistic metadata events in said at least one source video file in automated manner and for encoding identified progressivistic metadata events into said progressivistic metadata file (see Shichman [0067], “image processing”; [0070], “Video processing”; [0071], “Replay Identification”; and/or [0072], “audio analysis”).

Regarding claim 7, Shichman teaches further comprises at least one automated progressivistic metadata quality assurance module, comprises: (a) a plurality of portable computing devices comprising a human-machine interface, configured for manually logging, by a plurality of watchers of said real-life event, in a real-time regime, preliminary progressivistic draft events, thereby collecting and generating preliminary progressivistic draft data (see Shichman [0078]-[0079], “enable users on any device to provide input”); (b) a machine learning device, configured for analyzing said preliminary progressivistic draft data and identifying selected preliminary progressivistic draft events in said preliminary progressivistic draft data that attain a predetermined quality threshold, as quality assured progressivistic events (see Shichman [0080], “rank events according to preconfigured rules, thresholds or criteria”); (c) at least one automated progressivistic metadata logging module configured for encoding said quality assured progressivistic events into said
progressivistic metadata file, in an automated manner (see Shichman [0079]-[0082] and [0057], “event object . . . stored”).

Regarding claim 8, Shichman teaches wherein at least one file selected from the group consisting of: said at least one source video file, said relative timecode file, said absolute timecode file, said progressivistic metadata file and a performance data file, are compiled into a singular file (see Shichman [0058], “information in an event object may be used for clipping a segment of a source video to a separate video file”).

Regarding claim 9, Shichman teaches wherein a timecode of said relative timecode file of said at least one source video file in said video files database is a timecode of said absolute timecode file in said timecode database or wherein said timecode of said relative timecode file of said at least one source video file in said video files database is synchronizable or linkable to said timecode of said absolute timecode file in said timecode database (see Shichman [0062], “start and end time may be an absolute value” and “start and end time may be a time offset”).

Regarding claim 10, Shichman teaches wherein said events encoded into said progressivistic metadata file comprise a predefined progressivistic parameter related to at least one member selected from the group consisting of: a particular player, particular gamer, particular performer, particular team member, particular pair of team members, particular group of team members (see Shichman [0069], “players”).
Regarding claim 11, Shichman teaches a method of collecting and processing progressivistic metadata comprises the steps of: (a) providing access to at least one video-related constituent selected from the group consisting of: (I) at least one source video file, wherein said video file comprising a plurality of particular frames and/or distinct segments, of a recording of a real-life event (see Shichman [0049], “video”); (II) a timecode file, uniquely identifying at least one member selected from the group consisting of: particular timeframes in said real-life event and distinct time-segments in said real-life event (see Shichman [0055]-[0056], “database 220” and “time stamps that associate the metadata object with a segment of a video clip”); (b) generating a progressivistic metadata file, wherein events encoded into said progressivistic metadata file are respectively related to at least one member selected from the group consisting of: said particular frames and/or said distinct segments in said video file and said particular timeframes and/or distinct time-segments in said timecode file (see Shichman [0057]-[0058] and [0066], “create new events . . . identify an event that was previously unknown”); (c) storing said progressivistic metadata file in a progressivistic metadata database (see Shichman [0057], “event object . . . stored in database”).

Regarding claim 12, Shichman teaches further comprises providing a performance data module, configured for collecting and processing performance data, wherein events logged into a performance data file are related to events encoded into said progressivistic metadata file (see Shichman [0058], “information in an event object may be used for clipping a segment of a source video to a separate video file”).
Regarding claim 13, Shichman teaches further comprises providing a controllable playback of said source video file and encoding events into said progressivistic metadata file is performed by a human operator in a real-time or near real time regime (see Shichman [0077], “User interface module 320 may present events . . . as they happen” and “confirm an event, remove an event”).

Regarding claim 14, Shichman teaches further comprises: (a) manually logging from a human-machine interface of a plurality of portable computing devices of a plurality of watchers of said real-life event, in a real-time regime, preliminary progressivistic draft events, thereby generating preliminary progressivistic draft data (see Shichman [0078]-[0079], “enable users on any device to provide input”); (b) analyzing said preliminary progressivistic draft data by a machine learning device and identifying selected preliminary progressivistic draft events in said preliminary progressivistic draft data that attain a predetermined quality threshold, as quality assured progressivistic events (see Shichman [0080], “rank events according to preconfigured rules, thresholds or criteria”); (c) encoding said quality assured progressivistic events into said progressivistic metadata file, in an automated manner (see Shichman [0079]-[0082] and [0057], “event object . . . stored”).

Regarding claim 15, Shichman teaches further comprises analyzing said at least one source video file and identifying progressivistic metadata events in said at least one source video file in automated manner and encoding identified progressivistic metadata events into said progressivistic metadata file, by at least one automated progressivistic metadata logging module, comprising a machine learning device (see Shichman [0067], “image processing”; [0070], “Video processing”; [0071], “Replay Identification”; and/or [0072], “audio analysis”).
Regarding claim 16, Shichman teaches further comprises compiling into a singular file at least one file selected from the group consisting of: said at least one source video file, said relative timecode file, said absolute timecode file, said progressivistic metadata file and a performance data file (see Shichman [0058], “information in an event object may be used for clipping a segment of a source video to a separate video file”).

Regarding claim 17, Shichman teaches wherein a timecode of said relative timecode file of said at least one source video file in said video files database is a timecode of said absolute timecode file in said timecode database or wherein said method further comprises synchronizing or linking said timecode of said relative timecode file of said at least one source video file in said video files database with said timecode of said absolute timecode file in said timecode database (see Shichman [0062], “start and end time may be an absolute value” and “start and end time may be a time offset”).

Regarding claim 18, Shichman teaches wherein said events encoded into said progressivistic metadata file comprise a predefined progressivistic parameter related to at least one member selected from the group consisting of: a particular player, particular gamer, particular performer, particular team member, particular pair of team members, particular group of team members (see Shichman [0069], “players”).
Regarding claim 19, Shichman teaches a non-transitory computer-readable storage medium, having computer-executable instructions stored thereon which, when executed by a computer micro-processor, causing said micro-processor collecting and processing progressivistic metadata (see Shichman [0027]), said computer-executable instructions comprise: (a) at least one set of instructions causing said micro-processor to obtain access to at least one video-related constituent selected from the group consisting of: (I) at least one source video file, wherein said video file comprising a plurality of particular frames and/or distinct segments, of a recording of a real-life event (see Shichman [0049], “video”); (II) a timecode file, uniquely identifying at least one member selected from the group consisting of: particular timeframes in said real-life event and distinct time-segments in said real-life event (see Shichman [0055]-[0056], “database 220” and “time stamps that associate the metadata object with a segment of a video clip”); (b) instructions causing said micro-processor generating a progressivistic metadata file, wherein events encoded into said progressivistic metadata file are respectively related to at least one member selected from the group consisting of: said particular frames and/or said distinct segments in said video file in said video files database and said particular timeframes and/or distinct time-segments in said absolute timecode file, in said absolute timecode file database (see Shichman [0057]-[0058] and [0066], “create new events . . . identify an event that was previously unknown”); (c) instructions causing said micro-processor storing said progressivistic metadata file in a progressivistic metadata database (see Shichman [0057], “event object . . . stored in database”).

Regarding claim 20, Shichman teaches further comprises said progressivistic metadata file (see Shichman [0057]).
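For readers less comfortable with claim language, the architecture recited in instant claim 1 (two source databases, a server that encodes events keyed to timecodes, and a metadata database) can be sketched as a minimal data model. This is an illustrative sketch only: every name below is hypothetical, and nothing here reflects the applicant's actual disclosure or the Shichman reference.

```python
from dataclasses import dataclass, field

@dataclass
class TimecodeEntry:
    """Claim 1(a)(II): uniquely identifies a timeframe or time-segment
    in the real-life event (hypothetical representation)."""
    entry_id: str
    start_s: float
    end_s: float  # equal to start_s for a single timeframe

@dataclass
class MetadataEvent:
    """An event encoded into the progressivistic metadata file, tied to
    an entry in the timecode file."""
    label: str
    timecode_ref: str  # refers to a TimecodeEntry.entry_id

@dataclass
class ProgressivisticMetadataFile:
    events: list = field(default_factory=list)

def generate_metadata_file(timecode_db, observed):
    """Server role per claim 1(b): encode events, each related to a
    known timecode entry; events without one are dropped here."""
    meta = ProgressivisticMetadataFile()
    for label, entry_id in observed:
        if entry_id in timecode_db:
            meta.events.append(MetadataEvent(label, entry_id))
    return meta

# Claim 1(a)(II): timecode database; claim 1(c): metadata database.
timecode_db = {"tc-001": TimecodeEntry("tc-001", 12.0, 15.5)}
meta = generate_metadata_file(timecode_db, [("goal", "tc-001"), ("foul", "tc-999")])
metadata_db = {"match-42": meta}
```

The sketch also makes the examiner's §112(b) point concrete: the model above needs the timecode entries to exist before events can reference them, even though the claim's Markush group nominally requires only one of the two modules.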
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Kristopher Andersen, whose telephone number is (571) 270-5743. The examiner can normally be reached 8:30 AM-5:00 PM ET, Monday-Friday.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ann Lo, can be reached at (571) 272-9767. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Kristopher Andersen/
Primary Examiner, Art Unit 2159

Prosecution Timeline

Jul 15, 2024
Application Filed
Oct 17, 2025
Non-Final Rejection — §102, §112, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602433
MESSAGE MANAGEMENT USING GRAPH-BASED MODELS
2y 5m to grant Granted Apr 14, 2026
Patent 12585636
RESOURCE EFFICIENT PARTIAL BOOTSTRAP
2y 5m to grant Granted Mar 24, 2026
Patent 12579153
RANKING SEARCH RESULTS BASED ON QUERY-SPECIFIC SELECTION RESULTS
2y 5m to grant Granted Mar 17, 2026
Patent 12579593
Systems and Methods for Entity, Relationship, and Timeline Generation from Complex Object Sets
2y 5m to grant Granted Mar 17, 2026
Patent 12561293
DYNAMIC DATABASE CONFIGURATION BASED ON APPLICATION PERFORMANCE METRICS
2y 5m to grant Granted Feb 24, 2026
Study what changed in these applications to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 70%
With Interview: 99% (+40.2%)
Median Time to Grant: 3y 1m
PTA Risk: Low
Based on 358 resolved cases by this examiner. Grant probability derived from career allow rate.
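The headline figures trace back to the examiner's career data in a straightforward way. The sketch below shows one plausible derivation; the interview adjustment (a simple multiplicative lift, capped at 100%) is an assumption on our part, and the fact that it yields roughly 98% rather than the dashboard's 99% suggests the page's internal model differs slightly.

```python
# Derive the dashboard's numbers from the examiner's career history.
granted, resolved = 250, 358        # granted / resolved cases
allow_rate = granted / resolved     # ~0.698, shown as 70%

# Assumed model: apply the +40.2% interview lift multiplicatively,
# capped at 100%. This is an illustration, not the page's actual formula.
interview_lift = 0.402
with_interview = min(allow_rate * (1 + interview_lift), 1.0)

print(f"Career allow rate: {allow_rate:.0%}")  # prints "Career allow rate: 70%"
print(f"With interview (assumed model): {with_interview:.0%}")
```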
