Prosecution Insights
Last updated: April 19, 2026
Application No. 18/897,303

CONTROL DEVICE FOR MOVING BODY, CONTROL METHOD FOR MOVING BODY, AND STORAGE MEDIUM

Non-Final OA • §102, §103, §112
Filed
Sep 26, 2024
Examiner
RIOS-AGUIRRE, IZCALLI ANDRE
Art Unit
3666
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Honda Motor Co., Ltd.
OA Round
1 (Non-Final)
73%
Grant Probability
Favorable
1-2
OA Rounds
2y 7m
To Grant
99%
With Interview

Examiner Intelligence

Grants 73% — above average
73%
Career Allow Rate
16 granted / 22 resolved
+20.7% vs TC avg
Strong +29% interview lift
+29.1%
Interview Lift
resolved cases with interview
Typical timeline
2y 7m
Avg Prosecution
18 currently pending
Career history
40
Total Applications
across all art units

Statute-Specific Performance

§101
15.8%
-24.2% vs TC avg
§103
38.9%
-1.1% vs TC avg
§102
23.2%
-16.8% vs TC avg
§112
19.5%
-20.5% vs TC avg
Deltas shown vs. Tech Center average estimate • Based on career data from 22 resolved cases

Office Action

§102 §103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on 26 September 2024 and 16 December 2024 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Status of Application

Claims 1-10 are pending. Claims 1, 9, and 10 are independent. This NON-FINAL action is in response to communications received 02 December 2025.

Specification

The abstract of the disclosure is objected to because it exceeds 150 words. A corrected abstract of the disclosure is required and must be presented on a separate sheet, apart from any other text. See MPEP § 608.01(b). The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant's cooperation is requested in correcting any errors of which applicant may become aware in the specification.

Claim Objections

Claims 1, 9, and 10 are objected to because of the following informalities:

Claim 1, line 5: "a surrounding situation of a moving body," should be corrected to "a surrounding situation of the moving body," to avoid antecedent issues.
Claim 1, line 7: "the set destination," should be changed to "the destination," for consistency with "a destination".
Claim 9, line 4: "the set destination," should be changed to "the destination," for consistency with "a destination".
Claim 10, lines 4-5: "the set destination," should be changed to "the destination," for consistency with "a destination".

Appropriate correction is required.

Claim Rejections - 35 USC § 112

Claim 10 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Claim 10 recites the limitation "to generate a route from the moving body to the destination," in lines 3-4. There is insufficient antecedent basis for "the destination" in the claim.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-6 and 9 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ishikawa et al. (US 20230205234 A1), hereinafter Ishikawa.
Regarding claim 1, Ishikawa discloses:

A control device for a moving body comprising ([0087], To set and control the movement path for the robot 10, such as for path creation processing, a data processing unit of an information processing device provided inside the robot 10 or an external information processing device that can communicate with the robot 10 may be used):

a storage medium configured to store computer-readable instructions (Fig. 25, 308; [0042], a storage medium provided in a computer-readable form or a program that can be provided by a communication medium, the storage medium or the program being provided to an information processing device or a computer system that can execute various program codes. By providing such a program in a computer-readable form, processing according to the program can be implemented on an information processing device or a computer system); and

a processor connected to the storage medium ([0485], The data processing unit 303 includes a processor such as a CPU having a function of executing a program, and performs the processing according to the flowcharts described in the above embodiments; [0486], the program is stored in the storage unit),

wherein the processor executes the computer-readable instructions to recognize a surrounding situation of a moving body based at least on an image of the surrounding situation of the moving body, to generate a route from the moving body to a destination based on the recognized surrounding situation and the set destination, to determine whether or not the generated route is within a predetermined region defined by a current position of the moving body and the destination, to control the moving body so that the moving body moves along the generated route to the destination, and to turn the moving body on a spot and to regenerate a route from the moving body that has turned on the spot to the destination when it is determined that the route has deviated from the predetermined region (Fig. 1; Fig. 13; Fig. 14; Fig. 15; Fig. 16; Fig. 21; Fig. 24; [0150], The target recognition unit 102 receives the image captured by the camera 101 and identifies the target. The target image is stored in advance in a storage unit (not illustrated), and the target recognition unit 102 compares the target image stored in the storage unit with the image captured by the camera to extract the target from the image captured by the camera; [0151], The target recognition unit 102 analyzes the position, posture, and movement speed of the target extracted from the image captured by the camera, and creates target information including these pieces of information; [0117], The processing executed by the robot 10 includes only the processing of creating a piece of path position information (waypoint) in accordance with the above-described Rule and the processing of creating a path connecting the pieces of path position information (waypoints), so that it is possible to create a reliable following route with a less processing load; [0157], The path planning unit 105 performs processing of planning the movement path and movement speed for the mobile device (robot) 100; [0097], As the target 20 to be followed moves, the robot 10 sequentially sets a new piece of path position information (waypoint) near the target 20 in accordance with a predetermined rule; [0102], Rule: Set a new piece of path position information (waypoint) on a circle with a radius Ra centered on the target and at an angular position of a specified angle (for example, 150 degrees counterclockwise) with respect to the travel direction of the target to be followed; [0103], Note that this Rule is an example, and various values can be used as the size of the radius Ra and parameters such as an angle of 150 degrees).
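Ishikawa's waypoint Rule quoted above ([0102]) is concrete enough to sketch in code. The following is an illustrative implementation only; the function name, parameter names, and default values are our assumptions, not anything taken from the reference or the application:

```python
import math

def next_waypoint(target_xy, travel_heading_rad, radius_ra=1.0, angle_deg=150.0):
    """Place the next path position (waypoint) on a circle of radius Ra
    centered on the target, at a specified angle (e.g. 150 degrees
    counterclockwise) relative to the target's travel direction,
    following the Rule quoted from Ishikawa [0102]."""
    theta = travel_heading_rad + math.radians(angle_deg)  # CCW offset from heading
    tx, ty = target_xy
    return (tx + radius_ra * math.cos(theta), ty + radius_ra * math.sin(theta))
```

For instance, with the target at the origin heading along +x and a 90-degree offset, the waypoint lands directly to the target's left; per [0103], both Ra and the angle are tunable parameters.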
Regarding claim 2, Ishikawa discloses:

wherein the processor determines whether or not the regenerated route is within the predetermined region ([0498], The data processing unit 501 of the user terminal 500 creates a piece of path position information (waypoint) and a connection path, further creates drive control information for driving the mobile device 300 along the created pieces of path position information (waypoints) and connection path, and transmits the drive control information to the mobile device 300 via the communication unit 503; [0500], The data processing unit 501 includes a processor such as a CPU having a function of executing a program, and performs the processing according to the flowcharts described in the above embodiments), and when it is determined that the regenerated route is within the predetermined region, the processor controls the moving body so that the moving body moves along the regenerated route to the destination (Fig. 13; Fig. 14; Fig. 15; Fig. 16; Fig. 21; Fig. 24; [0044], According to the configuration of one embodiment of the present disclosure, it is possible to realize a device and a method for efficiently creating a following path of a target to be followed by a mobile device to enable reliable follow up. Specifically, for example, it includes a movement path creation unit that creates a path that is the following path on which the target is to be followed by the mobile device. The movement path creation unit sequentially sets a piece of path position information (waypoint) near the target, and sequentially connects set pieces of path position information to create the path. The movement path creation unit sets the piece of path position information at a position determined in accordance with a predetermined rule based on a position and a travel direction of the target. When the target moves a predetermined distance, a new piece of path position information is set).
Regarding claim 3, Ishikawa discloses:

wherein the processor determines whether or not the regenerated route is within the predetermined region ([0498], The data processing unit 501 of the user terminal 500 creates a piece of path position information (waypoint) and a connection path, further creates drive control information for driving the mobile device 300 along the created pieces of path position information (waypoints) and connection path, and transmits the drive control information to the mobile device 300 via the communication unit 503; [0500], The data processing unit 501 includes a processor such as a CPU having a function of executing a program, and performs the processing according to the flowcharts described in the above embodiments), and when it is determined that the regenerated route has deviated from the predetermined region, the processor controls the moving body so that the moving body moves along the regenerated route to a vicinity of a boundary of the predetermined region, turns the moving body on the spot in the vicinity of the boundary of the predetermined region, and generates a route from the moving body that has turned on the spot to the destination (Fig. 13; Fig. 14; Fig. 15; Fig. 16; Fig. 21; Fig. 24; [0100], In the example illustrated in FIG. 1, the new piece of path position information (waypoint) 12 is set on a circle with a radius Ra centered on the target 20 and at an angular position of 150 degrees counterclockwise with respect to the travel direction of the target 20 to be followed).

Regarding claim 4, Ishikawa discloses:

wherein the predetermined region is a region which represents a range within a predetermined distance from a reference line that connects the current position of the moving body and the destination (Fig. 13; Fig. 14; Fig. 15; Fig. 16; Fig. 21; Fig. 24; [0104], The piece of path position information (waypoint) is sequentially set as the target 20 moves. For example, when the target 20 moves a predetermined distance (L), a new piece of path position information (waypoint) is set).

Regarding claim 5, Ishikawa discloses:

wherein the moving body is operated in either a following mode in which the moving body moves to follow a user, or a guide mode in which the moving body moves in front of the user at a moving speed corresponding to that of the user (Fig. 1, Basic Algorithm of Target Following Process; [0011], a movement path creation unit that creates a path that is a following path on which a target is to be followed by a mobile device).

Regarding claim 6, Ishikawa discloses:

wherein when the moving body is operated in the following mode, the destination is the user or a point within a predetermined range from the user (Fig. 1, Basic Algorithm of Target Following Process; [0011], a movement path creation unit that creates a path that is a following path on which a target is to be followed by a mobile device; Fig. 13; Fig. 14; Fig. 15; Fig. 16; Fig. 21; Fig. 24; [0104], The piece of path position information (waypoint) is sequentially set as the target 20 moves. For example, when the target 20 moves a predetermined distance (L), a new piece of path position information (waypoint) is set).
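Claim 4's "predetermined region" (a range within a predetermined distance of the reference line connecting the current position and the destination) reduces to a point-to-segment distance test. A minimal sketch of that geometry follows; the function names, the corridor representation, and the treatment of the route as a list of points are our assumptions for illustration, not code from the application or the references:

```python
import math

def within_corridor(point, start, dest, half_width):
    """True if `point` lies within `half_width` of the reference line
    segment connecting the current position (`start`) and `dest`,
    i.e. inside the claim-4 style predetermined region."""
    px, py = point
    ax, ay = start
    bx, by = dest
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        dist = math.hypot(px - ax, py - ay)  # degenerate: start == dest
    else:
        # Project the point onto the segment, clamped to its endpoints.
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
        cx, cy = ax + t * dx, ay + t * dy
        dist = math.hypot(px - cx, py - cy)
    return dist <= half_width

def route_within_region(route, start, dest, half_width):
    """A generated route deviates from the region if any of its points
    falls outside the corridor around the reference line."""
    return all(within_corridor(p, start, dest, half_width) for p in route)
```

Under this reading, the claimed control loop would check `route_within_region` after each (re)generation and trigger the turn-on-the-spot/regenerate branch when it returns False.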
Regarding claim 9, Ishikawa discloses:

A control method for a moving body, wherein a computer recognizes a surrounding situation of a moving body based at least on an image of the surrounding situation of the moving body, generates a route from the moving body to a destination based on the recognized surrounding situation and the set destination, determines whether or not the generated route is within a predetermined region defined by a current position of the moving body and the destination, controls the moving body so that the moving body moves along the generated route to the destination, and turns the moving body on a spot and regenerates a route from the moving body that has turned on the spot to the destination when it is determined that the route deviates from the predetermined region ([0024], an information processing method performed in an information processing device, the information processing device including a movement path creation unit that creates a path that is a following path on which a target is to be followed by a mobile device; Fig. 1; Fig. 13; Fig. 14; Fig. 15; Fig. 16; Fig. 21; Fig. 24; [0150], The target recognition unit 102 receives the image captured by the camera 101 and identifies the target. The target image is stored in advance in a storage unit (not illustrated), and the target recognition unit 102 compares the target image stored in the storage unit with the image captured by the camera to extract the target from the image captured by the camera; [0151], The target recognition unit 102 analyzes the position, posture, and movement speed of the target extracted from the image captured by the camera, and creates target information including these pieces of information; [0117], The processing executed by the robot 10 includes only the processing of creating a piece of path position information (waypoint) in accordance with the above-described Rule and the processing of creating a path connecting the pieces of path position information (waypoints), so that it is possible to create a reliable following route with a less processing load; [0157], The path planning unit 105 performs processing of planning the movement path and movement speed for the mobile device (robot) 100; [0097], As the target 20 to be followed moves, the robot 10 sequentially sets a new piece of path position information (waypoint) near the target 20 in accordance with a predetermined rule; [0102], Rule: Set a new piece of path position information (waypoint) on a circle with a radius Ra centered on the target and at an angular position of a specified angle (for example, 150 degrees counterclockwise) with respect to the travel direction of the target to be followed; [0103], Note that this Rule is an example, and various values can be used as the size of the radius Ra and parameters such as an angle of 150 degrees).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 7, 8, and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Ishikawa in view of Komuro et al. (US 20230315130 A1), hereinafter Komuro.

Regarding claim 7, Ishikawa does not specifically state: wherein when the moving body is operated in the guidance mode, the destination is a point set by the user or a point within a predetermined range in front of the user.

Komuro teaches: wherein when the moving body is operated in the guidance mode, the destination is a point set by the user or a point within a predetermined range in front of the user (Fig. 5A, Guide Mode and Leading Mode; Fig. 6B; Fig. 6D; [0028], The mobile body 100 can guide a specific person (also referred to as a user) to a specific location in a space, can store a load of the user in a housing and follow the user, or can deliver a load from a specific location in a space to a location of a user (or from the location of the user to the specific location)).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Komuro into the invention of Ishikawa to include a guidance mode where a destination is set by a user as Komuro discloses with a reasonable expectation of success. One would be motivated to incorporate aspects of the cited prior art to create a more robust system that guides a user to a desired location (Komuro: [0036]). Additionally, the claimed invention is merely a combination of old, well-known elements of a following robot that maintains a radius distance away from a user as disclosed by Ishikawa and a robot that leads/guides a user to a destination as taught by Komuro. In the combination, each element merely would have performed the same function as it did separately, and one of ordinary skill in the art before the effective filing date of the claimed invention would have recognized that the results of the combination would have been predictable.

Regarding claim 8, Ishikawa does not specifically state: wherein when the moving body is operated in the guidance mode, the destination is a temporary point that is temporarily set to reach a final point set by the user.

Komuro teaches: wherein when the moving body is operated in the guidance mode, the destination is a temporary point that is temporarily set to reach a final point set by the user (Fig. 6B, S626; Fig. 6D, S669; [0088], In a case of a user detection failure (loss) during traveling, the mobile body 100 is stopped when temporarily failing to detect the user in the leading mode and the guide mode, and requires user authentication again. On the other hand, in the follow mode, for example, in a case where the user walking in front of the mobile body is detected again from the sensor information, the traveling is resumed without performing the user authentication).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the additional teachings of Komuro into the invention of Ishikawa to include a temporary waiting destination when a user cannot be detected as Komuro discloses with a reasonable expectation of success. One would be motivated to incorporate aspects of the cited prior art to create a more robust system that sets a temporary waiting destination to allow a user to reach the guiding robot and to continue navigation. Additionally, the claimed invention is merely a combination of old, well-known elements of a following robot that maintains a radius distance away from a user as disclosed by Ishikawa and a robot that leads/guides a user to a destination as taught by Komuro. In the combination, each element merely would have performed the same function as it did separately, and one of ordinary skill in the art before the effective filing date of the claimed invention would have recognized that the results of the combination would have been predictable.
Regarding claim 10, Ishikawa discloses:

wherein a computer is caused to recognize a surrounding situation of a moving body based at least on an image of the surrounding situation of the moving body, to generate a route from the moving body to the destination based on the recognized surrounding situation and the set destination, to determine whether or not the generated route is within a predetermined region defined by a current position of the moving body and the destination, to control the moving body so that the moving body moves along the generated route to the destination, and to turn the moving body on a spot and to regenerate a route from the moving body that has turned on the spot to the destination when it is determined that the route deviates from the predetermined region (Fig. 1; Fig. 13; Fig. 14; Fig. 15; Fig. 16; Fig. 21; Fig. 24; [0150], The target recognition unit 102 receives the image captured by the camera 101 and identifies the target. The target image is stored in advance in a storage unit (not illustrated), and the target recognition unit 102 compares the target image stored in the storage unit with the image captured by the camera to extract the target from the image captured by the camera; [0151], The target recognition unit 102 analyzes the position, posture, and movement speed of the target extracted from the image captured by the camera, and creates target information including these pieces of information; [0117], The processing executed by the robot 10 includes only the processing of creating a piece of path position information (waypoint) in accordance with the above-described Rule and the processing of creating a path connecting the pieces of path position information (waypoints), so that it is possible to create a reliable following route with a less processing load; [0157], The path planning unit 105 performs processing of planning the movement path and movement speed for the mobile device (robot) 100; [0097], As the target 20 to be followed moves, the robot 10 sequentially sets a new piece of path position information (waypoint) near the target 20 in accordance with a predetermined rule; [0102], Rule: Set a new piece of path position information (waypoint) on a circle with a radius Ra centered on the target and at an angular position of a specified angle (for example, 150 degrees counterclockwise) with respect to the travel direction of the target to be followed; [0103], Note that this Rule is an example, and various values can be used as the size of the radius Ra and parameters such as an angle of 150 degrees).

However, Ishikawa does not specifically state: A non-transient computer-readable storage medium for storing a program.

Komuro teaches: A non-transient computer-readable storage medium for storing a program ([0009], Still another aspect of the present disclosure provides a non-transitory computer-readable storage medium storing a program for causing a computer to function as each unit of a mobile body control device).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Komuro into the invention of Ishikawa to include a non-transitory computer-readable storage medium storing a program for causing a computer to function as each unit of a mobile body control device as Komuro discloses with a reasonable expectation of success. One would be motivated to incorporate aspects of the cited prior art to create a more robust system that stores control instructions even in the absence of power. Additionally, the claimed invention is merely a combination of old, well-known elements of a following robot that maintains a radius distance away from a user as disclosed by Ishikawa and non-transitory computer-readable storage media as taught by Komuro.
In the combination, each element merely would have performed the same function as it did separately, and one of ordinary skill in the art before the effective filing date of the claimed invention would have recognized that the results of the combination would have been predictable.

Documents Considered but Not Relied Upon

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Motoyama et al. (US 20240393805 A1) discloses an information processing method executed by one processor or executed by a plurality of processors in cooperation, the method including: an estimation step of estimating a relative position or a relative attitude of an aerial vehicle with respect to a moving body; an acquisition step of acquiring information related to a distance between the moving body and the aerial vehicle; and a switching step of switching an estimation method for estimating the relative position or the relative attitude of the aerial vehicle, based on the information related to the distance.

Kuhara (US 20250271870 A1) discloses that robot current location information, guidance destination location information, robot travelable area information, and person movable area information are obtained; a person route that is a route for a person from a current location to the location of a guidance destination is calculated, to satisfy a predetermined condition for lightening a burden on the person, based on the robot current location information, the guidance destination location information, and the person movable area information obtained; the person route calculated is compared with the robot travelable area information; and a display format of guidance to the guidance destination, the display format being displayed on a display of the robot, is changed in accordance with a comparison result obtained from the comparing.

Oshima et al. (US 20230341862 A1) discloses an autonomous movement device that includes a sensor to detect a surrounding object, driven wheels, and a processor. The processor acquires teaching control data, generates route data of a surrounding environment based on a point cloud detected by the sensor while controlling the driven wheels in accordance with the acquired teaching control data, and memorizes a movement path along which the autonomous movement device moves in the generated route data.

Kishimoto et al. (US 20170349258 A1) discloses a moving body control device that includes a moving body direction sensor, a position sensor, and processing circuitry. The processing circuitry estimates a direction of disturbance. The processing circuitry sets a target position and a starting position. The processing circuitry controls a propulsion generator and a movement direction adjuster such that a heading direction sensed by the moving body direction sensor is opposite to the direction of the disturbance estimated by the processing circuitry, and when the moving body has been drifted at least a specific distance from the target position, the moving body returns to the starting position, and the heading direction at the starting position is opposite to the direction of the disturbance. The processing circuitry changes the starting position based on a distance between the target position and a position of the moving body sensed by the position sensor.

Okada (US 20170293795 A1) discloses a moving device for moving along with a terminal device that includes a first control unit and a second control unit. The first control unit is configured to move the moving device from a position far from the terminal device to a vicinity of the terminal device based on a current position of the terminal device. The second control unit is configured to recognize the terminal device or a user of the terminal device in the vicinity of the current position of the terminal device.
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to IZCALLI ANDRE RIOS-AGUIRRE whose telephone number is (571) 272-0790. The examiner can normally be reached Monday through Friday, 8:30-17:00 EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Scott A. Browne, can be reached at (571) 270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/I.A.R./
Examiner, Art Unit 3666

/SCOTT A BROWNE/
Supervisory Patent Examiner, Art Unit 3666

Prosecution Timeline

Sep 26, 2024
Application Filed
Feb 04, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12589735
DRIVING ASSISTANCE DEVICE, VEHICLE CONTROL SYSTEM, AND DRIVING ASSISTANCE METHOD
2y 5m to grant Granted Mar 31, 2026
Patent 12567286
SYSTEMS AND METHODS FOR DETECTING VEHICLE CHARGING BEHAVIOR
2y 5m to grant Granted Mar 03, 2026
Patent 12552411
METHOD FOR CHECKING A CONTROL DEVICE OF A REMOTE DRIVING STATION
2y 5m to grant Granted Feb 17, 2026
Patent 12539760
METHOD OF ADJUSTING VEHICLE DISPLAY DEVICE, AND VEHICLE DISPLAY DEVICE
2y 5m to grant Granted Feb 03, 2026
Patent 12528436
APPARATUS FOR PROTECTING PEDESTRIAN FROM VEHICLE COLLISION
2y 5m to grant Granted Jan 20, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
73%
Grant Probability
99%
With Interview (+29.1%)
2y 7m
Median Time to Grant
Low
PTA Risk
Based on 22 resolved cases by this examiner. Grant probability derived from career allow rate.
