Prosecution Insights
Last updated: April 19, 2026
Application No. 18/273,916

FAILURE DIAGNOSIS SYSTEM, FAILURE DIAGNOSIS METHOD, AND NON-TRANSITORY STORAGE MEDIUM

Status: Final Rejection (§103)
Filed: Jul 24, 2023
Examiner: LINHARDT, LAURA E
Art Unit: 3663
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: NEC Corporation
OA Round: 2 (Final)
Grant Probability: 70% (Favorable)
OA Rounds: 3-4
To Grant: 3y 1m
With Interview: 92%

Examiner Intelligence

Career Allow Rate: 70% (155 granted / 223 resolved; +17.5% vs TC avg, above average)
Interview Lift: +22.7% on resolved cases with interview (strong)
Typical Timeline: 3y 1m avg prosecution; 51 currently pending
Career History: 274 total applications across all art units
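The headline stats above reduce to simple arithmetic on the examiner's career counts. A minimal sketch of that arithmetic (the 52.0% Tech Center baseline is an assumption inferred from the displayed +17.5% delta, not a figure from the source):

```python
# Career counts shown above: 155 granted of 223 resolved, 51 still pending.
GRANTED = 155
RESOLVED = 223
PENDING = 51
TC_AVG_ALLOW = 52.0  # assumed TC baseline, implied by the displayed +17.5% delta

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

rate = allow_rate(GRANTED, RESOLVED)   # ~69.5%, displayed rounded to 70%
total = RESOLVED + PENDING             # 274 total applications
delta = rate - TC_AVG_ALLOW            # ~+17.5 percentage points vs TC average
print(round(rate), total, round(delta, 1))  # 70 274 17.5
```

Pending cases are excluded from the denominator, which is why the 274 total applications reduce to a 223-case rate base.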

Statute-Specific Performance

§101: 5.4% (-34.6% vs TC avg)
§103: 72.8% (+32.8% vs TC avg)
§102: 5.4% (-34.6% vs TC avg)
§112: 14.4% (-25.6% vs TC avg)
TC averages are estimates. Based on career data from 223 resolved cases.
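Each "vs TC avg" figure is a plain difference between the examiner's per-statute rejection share and a Tech Center baseline. A sketch, assuming a single estimated baseline of 40.0% recovered from the displayed deltas (an assumption, not a figure stated in the source):

```python
# Examiner's per-statute rejection shares shown above.
examiner = {"101": 5.4, "103": 72.8, "102": 5.4, "112": 14.4}
TC_AVG = 40.0  # assumed baseline; every displayed delta is consistent with it

# Delta in percentage points vs the Tech Center average.
deltas = {statute: round(share - TC_AVG, 1) for statute, share in examiner.items()}
print(deltas)  # {'101': -34.6, '103': 32.8, '102': -34.6, '112': -25.6}
```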

Office Action (§103)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-11 are pending in this application. Claims 1 and 8-10 are amended. Claim 11 is newly added. Claims 1-11 are presented for examination.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 25 November 2025 is being considered by the examiner.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-3 and 6-10 are rejected under 35 U.S.C. 103 as being unpatentable over Takashi et al. (Foreign Reference JP 200405877 A) in view of Van Os et al. (US Patent 10,783,576 B1).

Regarding claim 1, Takashi teaches a failure diagnosis system comprising: at least one memory configured to store one or more instructions; and at least one processor configured to execute the one or more instructions (Takashi: Pg. 3, Lines 37-46; the information system includes an information terminal CPU, a driver information database for storing information on the driver, a diagnostic logic database for storing diagnostic logic, and a diagnostic method database for storing detailed information on diagnostic methods) to: acquire vehicle-related data related to a target vehicle (Takashi: Pg. 2, Lines 9-10; the in-vehicle terminal collects vehicle status information and transmits it to an information center); determine a malfunction that may be occurring in the target vehicle, based on the vehicle-related data (Takashi: Pg. 2, Line 11; the information center diagnoses the vehicle based on the transmitted information); ……….. ; notify a user of the determined area to be captured (Takashi: Pg. 8, Lines 8-11; a photographed image of the camera 2120 is displayed on the display 2130, and the portion to be photographed is indicated by an arrow 2131, thereby informing the driver 2000, as the photographer, of the photographing location).

Takashi does not explicitly teach determining an area to be captured in the target vehicle for failure diagnosis, based on the determined malfunction. However, Takashi is deemed to disclose an equivalent teaching. Takashi teaches a camera necessity flag that indicates whether image information is necessary for diagnosis. This flag causes the system to request that the driver take an image with a cellphone (Takashi: Pg. 6, Lines 12-14). In the example taught, the information center needs a photograph of a certain part in the vehicle's trunk for diagnosis. The image request from the information system is displayed on the cellphone, with the determined area in question noted by an arrow (Takashi: Pg. 8, Lines 5-11). The information center sending an image noting the area needing a picture with an arrow is an example of determining an area to be captured in the target vehicle for failure diagnosis, based on the determined malfunction. It would have been obvious to one of ordinary skill before the effective filing date to determine a target area to be imaged based on the determined malfunction taught in Takashi, with a reasonable expectation of success, because adding a driver-provided image of an area to the already submitted vehicle data creates a more accurate diagnosis (Takashi: Pg. 8, Lines 27-31).

Takashi does not explicitly teach providing the user with a sample image of the area to be captured of the vehicle. However, Van Os, solving the same problem, teaches providing the user with a sample image of the area to be captured of the vehicle. Van Os teaches the preview of an image so that a user can properly obtain a needed image (Van Os: Col. 63, Lines 2-6, Fig. 81). Van Os sends the image to a remote server to complete the process (Van Os: Col. 63, Lines 32-41). Takashi's system uses the user-taken photograph sent to the information center to supplement the electronic control unit's data and perform a more accurate diagnosis (Takashi: Pg. 2, Lines 9-10; Pg. 8, Lines 5-8; Pg. 8, Lines 27-30). It would have been obvious to one of ordinary skill in the art to provide a sample image to the user in order to obtain an accurate image of the vehicle's component for diagnostic purposes. It would have been obvious to one having ordinary skill in the art to modify the remote vehicle diagnostics (Takashi: Pg. 2, Lines 12-20) with the display of an image preview (Van Os: Col. 63, Lines 2-6, Fig. 81), with a reasonable expectation of success, because the preview instructs the user how to capture an image of the desired item before the image is transmitted to a remote server to complete the process (Van Os: Col. 63, Lines 32-41).

Regarding claim 2, Takashi teaches the failure diagnosis system according to claim 1, wherein the processor is further configured to execute the one or more instructions to: acquire, after the notification, an image of the area to be captured in the target vehicle, the image being captured from an external apparatus (Takashi: Pg. 8, Lines 17-21; the cellular phone receives the test mode switching signal and information regarding the type of the audio/image signal; a picture is taken or recorded, and the image or audio file is transmitted to the information center); and perform failure diagnosis of the target vehicle based on analyzing the image (Takashi: Pg. 8, Lines 27-30; for diagnostic items that cannot be physically diagnosed by the electronic control unit, image information or sound information is also added by using a camera or a mobile phone with a microphone, and a more accurate diagnosis is performed).
Regarding claim 3, Takashi teaches the failure diagnosis system according to claim 1, wherein the processor is further configured to execute the one or more instructions to: detect a data abnormality indicating a behavior different from a behavior under normal operation, based on the vehicle-related data, and determine the malfunction based on content of the detected data abnormality (Takashi: Pg. 2, Lines 12-20; transmitting the collected diagnostic data to the information center; receiving the result of the diagnosis and transmitting the result to the in-vehicle terminal).

Regarding claim 6, Takashi teaches the failure diagnosis system according to claim 1, wherein the processor is further configured to execute the one or more instructions to: generate guidance information providing guidance on at least one of an image capture angle and a distance to a subject for capturing the area to be captured (Takashi: Pg. 8, Lines 12-15; Fig. 22 shows a camera angle change request instruction in a camera-equipped mobile phone), and notify the user of the guidance information (Takashi: Pg. 8, Lines 12-15; a camera angle change request instruction in a camera-equipped mobile phone; when the place to be photographed is out of the photographing range of the camera, the information center can instruct the driver to move the camera).

Regarding claim 7, Takashi teaches the failure diagnosis system according to claim 6, wherein the processor is further configured to execute the one or more instructions to generate the guidance information varied for each area to be captured (Takashi: Pg. 8, Lines 5-11; a photographed image of the camera is displayed on the display, and the portion to be photographed is indicated by an arrow, thereby informing the driver, as the photographer, of the photographing location).

Regarding claim 8, Takashi teaches the failure diagnosis system according to claim 6, wherein the processor is further configured to execute the one or more instructions to generate the guidance information including the sample image (Takashi: Pg. 8, Lines 5-11; a photographed image of the camera is displayed on the display, and the portion to be photographed is indicated by an arrow, thereby informing the driver, as the photographer, of the photographing location).

Regarding claim 9, Takashi teaches a failure diagnosis method comprising, by a computer: acquiring vehicle-related data related to a target vehicle (Takashi: Pg. 2, Lines 9-10; the in-vehicle terminal collects vehicle status information and transmits it to an information center); determining a malfunction that may be occurring in the target vehicle, based on the vehicle-related data (Takashi: Pg. 2, Line 11; the information center diagnoses the vehicle based on the transmitted information); …….. ; notifying a user of the determined area to be captured (Takashi: Pg. 8, Lines 8-11; a photographed image of the camera is displayed on the display, and the portion to be photographed is indicated by an arrow, thereby informing the driver, as the photographer, of the photographing location).

Takashi does not explicitly teach determining an area to be captured in the target vehicle for failure diagnosis, based on the determined malfunction. However, Takashi is deemed to disclose an equivalent teaching. Takashi teaches a camera necessity flag that indicates whether image information is necessary for diagnosis. This flag causes the system to request that the driver take an image with a cellphone (Takashi: Pg. 6, Lines 12-14). In the example taught, the information center needs a photograph of a certain part in the vehicle's trunk for diagnosis. The image request from the information system is displayed on the cellphone, with the determined area in question noted by an arrow (Takashi: Pg. 8, Lines 5-11). The information center sending an image noting the area needing a picture with an arrow is an example of determining an area to be captured in the target vehicle for failure diagnosis, based on the determined malfunction. It would have been obvious to one of ordinary skill before the effective filing date to determine a target area to be imaged based on the determined malfunction taught in Takashi, with a reasonable expectation of success, because adding a driver-provided image of an area to the already submitted vehicle data creates a more accurate diagnosis (Takashi: Pg. 8, Lines 27-31).

Takashi does not explicitly teach providing the user with a sample image of the area to be captured of the vehicle. However, Van Os, solving the same problem, teaches providing the user with a sample image of the area to be captured of the vehicle. Van Os teaches the preview of an image so that a user can properly obtain a needed image (Van Os: Col. 63, Lines 2-6, Fig. 81). Van Os sends the image to a remote server to complete the process (Van Os: Col. 63, Lines 32-41). Takashi's system uses the user-taken photograph sent to the information center to supplement the electronic control unit's data and perform a more accurate diagnosis (Takashi: Pg. 2, Lines 9-10; Pg. 8, Lines 5-8; Pg. 8, Lines 27-30). It would have been obvious to one of ordinary skill in the art to provide a sample image to the user in order to obtain an accurate image of the vehicle's component for diagnostic purposes. It would have been obvious to one having ordinary skill in the art to modify the remote vehicle diagnostics (Takashi: Pg. 2, Lines 12-20) with the display of an image preview (Van Os: Col. 63, Lines 2-6, Fig. 81), with a reasonable expectation of success, because the preview instructs the user how to capture an image of the desired item before the image is transmitted to a remote server to complete the process (Van Os: Col. 63, Lines 32-41).
Regarding claim 10, Takashi teaches a non-transitory storage medium storing a program causing a computer to: acquire vehicle-related data related to a target vehicle (Takashi: Pg. 2, Lines 9-10; the in-vehicle terminal collects vehicle status information and transmits it to an information center); determine a malfunction that may be occurring in the target vehicle, based on the vehicle-related data (Takashi: Pg. 2, Line 11; the information center diagnoses the vehicle based on the transmitted information); ……… ; notify a user of the determined area to be captured (Takashi: Pg. 8, Lines 8-11; a photographed image of the camera 2120 is displayed on the display 2130, and the portion to be photographed is indicated by an arrow 2131, thereby informing the driver 2000, as the photographer, of the photographing location).

Takashi does not explicitly teach determining an area to be captured in the target vehicle for failure diagnosis, based on the determined malfunction. However, Takashi is deemed to disclose an equivalent teaching. Takashi teaches a camera necessity flag that indicates whether image information is necessary for diagnosis. This flag causes the system to request that the driver take an image with a cellphone (Takashi: Pg. 6, Lines 12-14). In the example taught, the information center needs a photograph of a certain part in the vehicle's trunk for diagnosis. The image request from the information system is displayed on the cellphone, with the determined area in question noted by an arrow (Takashi: Pg. 8, Lines 5-11). The information center sending an image noting the area needing a picture with an arrow is an example of determining an area to be captured in the target vehicle for failure diagnosis, based on the determined malfunction. It would have been obvious to one of ordinary skill before the effective filing date to determine a target area to be imaged based on the determined malfunction taught in Takashi, with a reasonable expectation of success, because adding a driver-provided image of an area to the already submitted vehicle data creates a more accurate diagnosis (Takashi: Pg. 8, Lines 27-31).

Takashi does not explicitly teach providing the user with a sample image of the area to be captured of the vehicle. However, Van Os, solving the same problem, teaches providing the user with a sample image of the area to be captured of the vehicle. Van Os teaches the preview of an image so that a user can properly obtain a needed image (Van Os: Col. 63, Lines 2-6, Fig. 81). Van Os sends the image to a remote server to complete the process (Van Os: Col. 63, Lines 32-41). Takashi's system uses the user-taken photograph sent to the information center to supplement the electronic control unit's data and perform a more accurate diagnosis (Takashi: Pg. 2, Lines 9-10; Pg. 8, Lines 5-8; Pg. 8, Lines 27-30). It would have been obvious to one of ordinary skill in the art to provide a sample image to the user in order to obtain an accurate image of the vehicle's component for diagnostic purposes. It would have been obvious to one having ordinary skill in the art to modify the remote vehicle diagnostics (Takashi: Pg. 2, Lines 12-20) with the display of an image preview (Van Os: Col. 63, Lines 2-6, Fig. 81), with a reasonable expectation of success, because the preview instructs the user how to capture an image of the desired item before the image is transmitted to a remote server to complete the process (Van Os: Col. 63, Lines 32-41).

Claims 4-5 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Takashi et al. (Foreign Reference JP 200405877 A) in view of Van Os et al. (US Patent 10,783,576 B1) and in further view of Wang et al. (US Publication 2020/0312056 A1).
Regarding claim 4, Takashi teaches the failure diagnosis system according to claim 3, wherein the processor is further configured to execute the one or more instructions to: acquire symptom information indicating a symptom occurring in the target vehicle (Takashi: Pg. 1, Lines 40-43; abnormal noise that changes depending on the gear position of the car). Takashi does not explicitly teach determining, based on the symptom information, the vehicle-related data used for detection of the data abnormality from a plurality of types of the vehicle-related data. However, Wang, in the same field of endeavor, teaches determining, based on the symptom information, the vehicle-related data used for detection of the data abnormality from a plurality of types of the vehicle-related data (Wang: Para. 79, 82, Fig. 8; faulty feature signals are identified for the combustion, air, and fuel subsystems to detect and isolate various subsystem-level faults; a Bank 1 air fault class 140 is defined by a hyperplane 142, representing a fault in the air subsystem connected to Bank 1, and a Bank 2 air fault class 144 is defined by a hyperplane 146, representing a fault in the air subsystem connected to Bank 2). It would have been obvious to one having ordinary skill in the art to modify the remote vehicle diagnostics (Takashi: Pg. 2, Lines 12-20) displaying an image preview (Van Os: Col. 63, Lines 2-6, Fig. 81) with the isolation of various subsystem-level faults (Wang: Para. 79), with a reasonable expectation of success, because comparing vehicle data with a feature-space multi-class classifier isolates subsystem-level faults during a multi-fault vehicle diagnosis (Wang: Para. 42, 79, 89).

Regarding claim 5, Takashi does not explicitly teach, in a case that a plurality of types of the data abnormalities are detected, excluding the data abnormality occurring due to occurrence of another data abnormality from the detected data abnormalities and determining the malfunction based on content of the unexcluded data abnormality in the detected data abnormalities. However, Wang, in the same field of endeavor, teaches this limitation (Wang: Para. 79, 82, Fig. 8; faulty feature signals are identified for the combustion, air, and fuel subsystems to detect and isolate various subsystem-level faults; a Bank 1 air fault class 140 is defined by a hyperplane 142, representing a fault in the air subsystem connected to Bank 1, and a Bank 2 air fault class 144 is defined by a hyperplane 146, representing a fault in the air subsystem connected to Bank 2). It would have been obvious to one having ordinary skill in the art to modify the remote vehicle diagnostics (Takashi: Pg. 2, Lines 12-20) displaying an image preview (Van Os: Col. 63, Lines 2-6, Fig. 81) with the isolation of various subsystem-level faults (Wang: Para. 79), with a reasonable expectation of success, because comparing vehicle data with a feature-space multi-class classifier isolates subsystem-level faults during a multi-fault vehicle diagnosis (Wang: Para. 42, 79, 89).

Regarding claim 11, Takashi does not explicitly teach determining the area to be captured for failure diagnosis when the determined malfunction is occurring, by referring to malfunction-inspection-location relation information that indicates an area to be captured for each of a plurality of malfunctions. However, Wang, in the same field of endeavor, teaches this limitation (Wang: Para. 19, 92; the system monitors the engine subsystem and detects any misfires that occur; when the number of misfires reaches or exceeds a selected threshold number, a malfunction is identified; identifying one or more individual components or vehicle subsystems as having a contribution to the malfunction). It would have been obvious to one having ordinary skill in the art to modify the remote vehicle diagnostics (Takashi: Pg. 2, Lines 12-20) displaying an image preview (Van Os: Col. 63, Lines 2-6, Fig. 81) with the isolation of various subsystem-level faults (Wang: Para. 79), with a reasonable expectation of success, because comparing vehicle data with a feature-space multi-class classifier isolates subsystem-level faults during a multi-fault vehicle diagnosis (Wang: Para. 42, 79, 89).

Response to Arguments

Applicant's arguments, filed on 2 December 2025, with respect to the rejection of claims 1-10 under 35 U.S.C. 103 have been fully considered, but they are not persuasive.

The applicant's attorney argues that Takashi does not teach claim 1's "provid[ing] the user with a sample image of the area to be captured of the vehicle." In response, the above limitation is taught by Takashi in view of Van Os. Van Os teaches the preview of an image so that a user can properly obtain a needed image (Van Os: Col. 63, Lines 2-6, Fig. 81). Van Os sends the image to a remote server to complete the process (Van Os: Col. 63, Lines 32-41). Takashi's system uses the user-taken photograph sent to the information center to supplement the electronic control unit's data and perform a more accurate diagnosis (Takashi: Pg. 2, Lines 9-10; Pg. 8, Lines 5-8; Pg. 8, Lines 27-30). It would have been obvious to one of ordinary skill in the art to provide a sample image to the user in order to obtain an accurate image of the vehicle's component for diagnostic purposes.

The applicant next argues that Takashi does not teach claim 4's "determine ….. the vehicle-related data used for detection of the data abnormality from a plurality of types of the vehicle-related data." In response, Wang teaches capturing paired measurement data from various components, creating a two-dimensional picture of the component's function. The system then compares the two-dimensional graphs to the graphs in the knowledge base to identify a component's failure, fault, or sub-optimal operation (Wang: Para. 19, 92).

The applicant next argues that Takashi does not teach claim 11's "referring to malfunction-inspection-location relation information that indicates an area to be captured for each of a plurality of malfunctions." In response, Wang teaches capturing paired measurement data from various components, creating a two-dimensional picture of the component's function. The system then compares the two-dimensional graphs to the graphs in the knowledge base to identify a component's failure, fault, or sub-optimal operation (Wang: Para. 19, 92).

The applicant's arguments have failed to point out the distinguishing characteristics of the amended claim language over the prior art. For the above reasons, Takashi's vehicle diagnostics in view of Van Os's sample image read on the applicant's failure diagnosis system. The rejection is maintained.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to LAURA E LINHARDT whose telephone number is (571) 272-8325. The examiner can normally be reached M-TR, M-F: 8am-4pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Angela Ortiz, can be reached at (571) 272-1206. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/L.E.L./
Examiner, Art Unit 3663

/ANGELA Y ORTIZ/
Supervisory Patent Examiner, Art Unit 3663

Prosecution Timeline

Jul 24, 2023: Application Filed
Aug 26, 2025: Non-Final Rejection — §103
Dec 02, 2025: Response Filed
Mar 05, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586463: DETERMINATION DEVICE, DETERMINATION METHOD, AND PROGRAM
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12578197: Tandem Riding Detection on Personal Mobility Vehicles
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12540822: WATER AREA OBJECT DETECTION SYSTEM AND MARINE VESSEL
Granted Feb 03, 2026 (2y 5m to grant)
Patent 12517275: SUBMARINE EXPLORATION SYSTEM COMPRISING A FLEET OF DRONES
Granted Jan 06, 2026 (2y 5m to grant)
Patent 12459564: ELECTRONIC STEERING APPARATUS OF VEHICLE AND CONTROL METHOD THEREOF
Granted Nov 04, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 70% (92% with interview, +22.7%)
Median Time to Grant: 3y 1m
PTA Risk: Moderate
Based on 223 resolved cases by this examiner. Grant probability derived from career allow rate.
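The with-interview figure follows from adding the interview lift to the unrounded career allow rate. A sketch of that derivation (capping at 100% is an assumption; the dashboard's exact method is not stated):

```python
# Career allow rate from the examiner's counts, plus the interview lift above.
base = 100.0 * 155 / 223      # ~69.5%, displayed as 70%
INTERVIEW_LIFT = 22.7         # percentage-point lift on cases with an interview

with_interview = min(base + INTERVIEW_LIFT, 100.0)
print(round(base), round(with_interview))  # 70 92
```

Note that only the unrounded base reproduces the displayed 92%; adding the lift to the rounded 70% would give 93%.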
