Prosecution Insights
Last updated: April 19, 2026
Application No. 17/930,727

DISTANCE MEASURING DEVICE AND DISTANCE MEASURING METHOD

Non-Final OA (§102, §103)
Filed
Sep 09, 2022
Examiner
HAUT, EVAN HARRISON
Art Unit
3645
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Kabushiki Kaisha Toshiba
OA Round
1 (Non-Final)
Grant Probability
Favorable
OA Rounds
1-2
To Grant
3y 0m

Examiner Intelligence

Career Allow Rate: 0% (grants 0% of cases; 0 granted / 0 resolved; -52.0% vs TC avg)
Interview Lift: +0.0% (minimal lift; based on resolved cases with interview)
Typical Timeline: 3y 0m avg prosecution; 17 currently pending
Career History: 17 total applications across all art units

Statute-Specific Performance

§103: 64.6% (+24.6% vs TC avg)
§102: 22.9% (-17.1% vs TC avg)
§112: 12.5% (-27.5% vs TC avg)
Tech Center averages are estimates. Based on career data from 0 resolved cases.

Office Action

§102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

Claims 1, 6, and 8 are objected to because of the following informalities:
- The final limitation in Claim 1 should be amended to recite: "divide at least one or some of the pixels"
- Claim 6 should be amended to recite: "based on a result of calculating each of the areas"
- Claim 8 should be amended to recite: "configured to calculate each of the areas"
- Claim 20 should be amended to recite: "based on a result of calculating each of the areas"
Appropriate correction is required.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 14, and 15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Yoshida (U.S. 20230384436 A1).

Regarding claim 1, Yoshida discloses a distance measuring device ([0024] a distance measurement device) comprising: a plurality of light receiving elements ([0030] the multiple light receiving elements) each of which receives a reflected optical signal reflected by an object ([0023] device measures a distance to a reflection point on a target by detecting, as a pixel, light reflected on the reflection point); and an image processor that generates a distance image in accordance with distances to the object ([0046] the distance correction unit 150 corrects the distance values for all the reflection points… and generates a distance image based on the corrected distance values), based on signal intensities and light reception timings of the reflected optical signal received by the plurality of light receiving elements ([0045] the signal intensity of the peak of the waveform decreases, and the light reception time may be delayed. Therefore, the corrected distance value closer to the true value can be calculated by calculating the correction amount for correcting the distance value in a direction in which the delay of the light reception time, that is, the increase in the length of the distance value is eliminated), wherein the image processor is configured to: detect a direction of the object, based on at least either the signal intensities of the reflected optical signal received by the light receiving elements ([0045] On the target T, a partial surface SA, which reflects light beams to be incident on the same pixel, constitutes a reflection point detected by the pixel. In FIG. 2, the partial surface SA is included in the detection range PR on the surface of the target T. As the inclination {direction} of the partial surface SA with respect to the reference surface R increases, an optical path length difference of the reflected light on the partial surface increases. The peak of the waveform of the reflection intensity also decreases with an increase of the inclination) or the distances to the object measured based on the reflected optical signal; and divide at least one or some of pixels included in the distance image, based on the direction of the detected object ([0086]-[0088] The pixel information acquisition unit 110 acquires pixel information for each sub-pixel obtained by dividing one pixel into the high speed range A and the low speed range B… The pixel corresponding to the reflection point is divided into multiple sub-pixels corresponding to the different scan speeds… Specifically, the change degree calculation unit 135 calculates the change degree between the detected waveform in the high speed range A and the detected waveform in the low speed range B as the inclination feature amount. The change degree calculation unit 135 may use, as the change degree, the change amount of at least one or more waveform characteristics. The change degree calculation unit 135 may use, as the change degree, the change amount based on all points of the detected waveform. This change degree is an example of the inclination {direction} feature amount).
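For readers less familiar with the Yoshida mechanism the rejection relies on, here is a minimal sketch of the idea quoted in [0086]-[0088]: a pixel is split into sub-pixels for the high speed range A and the low speed range B, and the change degree between the two detected waveforms serves as an inclination (direction) feature that drives pixel division. All identifiers and the threshold below are hypothetical illustrations, not code or values from either reference.

```python
import numpy as np

# Assumed tuning constant; neither reference specifies a value.
INCLINATION_THRESHOLD = 0.3

def change_degree(waveform_a: np.ndarray, waveform_b: np.ndarray) -> float:
    """Change between the waveforms detected in the two sub-pixel scan ranges.

    Yoshida [0088] permits the change amount of one or more waveform
    characteristics; peak intensity is used here purely for illustration.
    """
    peak_a, peak_b = float(waveform_a.max()), float(waveform_b.max())
    return abs(peak_a - peak_b) / max(peak_a, peak_b)

def subdivide_pixel(waveform_a: np.ndarray, waveform_b: np.ndarray) -> int:
    """Return how many sub-pixels to split a distance-image pixel into.

    A large change degree suggests an inclined partial surface inside the
    pixel, so the pixel is divided; otherwise it is left whole.
    """
    return 2 if change_degree(waveform_a, waveform_b) > INCLINATION_THRESHOLD else 1
```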
Regarding claim 14, Yoshida discloses the light emitter that emits the optical signal ([0030] The light emitting unit 2 is a semiconductor element that emits directional laser light).

Regarding claim 15, Yoshida discloses a distance measuring method that comprises: receiving, by a light receiving element provided in each of a plurality of pixels arranged in a one-dimensional direction or a two-dimensional direction ([0030] the multiple light receiving elements may be arrayed in a two-dimensional direction), a reflected optical signal reflected by an object ([0023] device measures a distance to a reflection point on a target by detecting, as a pixel, light reflected on the reflection point); detecting a direction of the object, based on at least either signal intensities of the reflected optical signal received by the light receiving elements ([0045] On the target T, a partial surface SA, which reflects light beams to be incident on the same pixel, constitutes a reflection point detected by the pixel. In FIG. 2, the partial surface SA is included in the detection range PR on the surface of the target T. As the inclination of the partial surface SA with respect to the reference surface R increases, an optical path length difference of the reflected light on the partial surface increases. The peak of the waveform of the reflection intensity also decreases with an increase of the inclination) or distances to the object measured based on the reflected optical signal; and dividing at least one or some of the plurality of pixels, based on the detected direction of the object ([0086]-[0088] The pixel information acquisition unit 110 acquires pixel information for each sub-pixel obtained by dividing one pixel into the high speed range A and the low speed range B… The pixel corresponding to the reflection point is divided into multiple sub-pixels corresponding to the different scan speeds… Specifically, the change degree calculation unit 135 calculates the change degree between the detected waveform in the high speed range A and the detected waveform in the low speed range B as the inclination feature amount. The change degree calculation unit 135 may use, as the change degree, the change amount of at least one or more waveform characteristics. The change degree calculation unit 135 may use, as the change degree, the change amount based on all points of the detected waveform. This change degree is an example of the inclination feature amount).
Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-4, 12, 13, and 15-18 are rejected under 35 U.S.C. 103 as being unpatentable over Watanabe (U.S. 20180172830 A1) in view of Yoshida (U.S. 20230384436 A1).

Regarding claim 1, Watanabe discloses a distance measuring device ([0047] to generate a distance image before noise removal, the control processor 2 first measures the transmission/reception period for each of the plurality of directions) comprising: an image processor that generates a distance image in accordance with distances to the object ([0002] a distance image is an image composed of a plurality of pixels represented by information indicating a distance to a subject), based on signal intensities and light reception timings of the reflected optical signal received by the plurality of light receiving elements ([0047] When receiving a reflected wave corresponding to the transmitted wave, the receiver 12 outputs a digital signal corresponding to the intensity of the reflected wave to the control processor 2. The distance image pre-processor 221 of the control processor 2 obtains a transmission/reception period τ2 from the transmission timing).

Watanabe is not relied upon as disclosing a plurality of light receiving elements, each of which receives a reflected optical signal reflected by an object. Watanabe is also not relied upon as disclosing an image processor configured to: detect a direction of the object, based on at least either the signal intensities of the reflected optical signal received by the light receiving elements or the distances to the object measured based on the reflected optical signal; and divide at least one or some of pixels included in the distance image, based on the direction of the detected object.

However, Yoshida teaches a plurality of light receiving elements ([0030] the multiple light receiving elements) each of which receives a reflected optical signal reflected by an object ([0023] device measures a distance to a reflection point on a target by detecting, as a pixel, light reflected on the reflection point).
Yoshida also teaches the image processor configured to: detect a direction of the object, based on at least either the signal intensities of the reflected optical signal received by the light receiving elements ([0045] On the target T, a partial surface SA, which reflects light beams to be incident on the same pixel, constitutes a reflection point detected by the pixel. In FIG. 2, the partial surface SA is included in the detection range PR on the surface of the target T. As the inclination {direction} of the partial surface SA with respect to the reference surface R increases, an optical path length difference of the reflected light on the partial surface increases. The peak of the waveform of the reflection intensity also decreases with an increase of the inclination) or the distances to the object measured based on the reflected optical signal; and divide at least one or some of pixels included in the distance image, based on the direction of the detected object ([0086]-[0088] The pixel information acquisition unit 110 acquires pixel information for each sub-pixel obtained by dividing one pixel into the high speed range A and the low speed range B… The pixel corresponding to the reflection point is divided into multiple sub-pixels corresponding to the different scan speeds… Specifically, the change degree calculation unit 135 calculates the change degree between the detected waveform in the high speed range A and the detected waveform in the low speed range B as the inclination feature amount. The change degree calculation unit 135 may use, as the change degree, the change amount of at least one or more waveform characteristics. The change degree calculation unit 135 may use, as the change degree, the change amount based on all points of the detected waveform. This change degree is an example of the inclination {direction} feature amount).

Watanabe and Yoshida are both considered to be analogous to the claimed invention because they are in the same field of LiDAR. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Watanabe to incorporate the teachings of Yoshida to include a plurality of light receiving elements, as is standard in the art. Further, it would also have been obvious to modify Watanabe to incorporate the teachings of Yoshida to include detecting a direction of the object and dividing at least one or some of pixels included in the distance image based on the direction of the detected object. This would have the predictable result of minimizing the noise region of Watanabe.

Regarding claim 2, Watanabe teaches a measuring device comprising an image processor configured to divide, among pixels included in the distance image, a pixel including distance information of two or more objects ([0017]-[0019] FIG. 7 are charts showing a case where noise is generated by the front object and the rear object located one after the other along the transmission direction of the pulsed laser light… FIG. 9 is a view showing noise regions judged by a process shown in the flow chart shown in FIG. 8 as white regions and regions not judged as noise as black regions). The teachings of Watanabe have been interpreted to mean that pixels with noise caused by two objects (white) are grouped separately (divided) from the pixels without noise caused by two objects (black).
Regarding claim 3, Watanabe teaches the image processor configured to: detect a direction of at least one of the two or more objects, based on at least either signal intensities of the reflected optical signal from the two or more objects received by the light receiving elements ([0041] if the distance between the front object Obf and the rear object Obr becomes shorter (the front object Obf and the rear object Obr move toward each other) from the state shown in FIG. 6A, the respective peaks in the double-peak pulse waveform shown in FIG. 6C move toward each other. When the front object Obf and the rear object Obr are relatively close as shown in FIG. 7A, the signal waveform of the reflected wave becomes a single-peak pulse waveform as shown in FIG. 7C) or distances to the two or more objects measured based on the reflected optical signal, and divide the pixel including the distance information of the two or more objects, based on the direction of the at least one of the two or more objects ([0032] and [0060] the noise determiner 222, for example, determines the pixel as noise for each of the plurality of pixels (plurality of directions) if a gradient Gr of a distance value between this pixel (target pixel) and a predetermined peripheral pixel located around this pixel is within the predetermined range of thg1 to thg2 (thg1 ≤ Gr ≤ thg2)… a binary image as shown in FIG. 9 in which the noise pixel flags are pixel values is obtained. In FIG. 9, the noise pixel flag "1" is set as a white pixel value and the noise pixel flag "0" is set as a black pixel value. As is understood from FIG. 9, an outline part of the front object).

Regarding claim 4, Watanabe teaches the image processor configured to divide the pixel including the distance information of the two or more objects, depending on positions of individual ones of the two or more objects in the pixel including the distance information of the two or more objects ([0043] Whether or not the signal waveform of the reflected wave changes from a double-peak pulse waveform to a single-peak pulse waveform is related to the distance between the front object Obf and the rear object Obr and can be determined based on the gradient Gr between the target region and the predetermined peripheral region located around the target region… Then, whether or not the pixel value of the pixel in the distance image is noise can be determined from these respective determination results (gradient determination result)).
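As a concrete illustration of the gradient test quoted above from Watanabe [0032]/[0060], the sketch below flags a pixel as noise when the distance gradient Gr to a peripheral pixel falls within [thg1, thg2], producing the white (1) / black (0) binary image described for FIG. 9. The choice of the right-hand neighbor as the "peripheral pixel" and all names are assumptions made only for illustration.

```python
import numpy as np

def noise_mask(distance_image: np.ndarray, thg1: float, thg2: float) -> np.ndarray:
    """Binary noise-flag image: 1 (white) = judged noise, 0 (black) = not noise."""
    # Gradient Gr of the distance value between each target pixel and a
    # peripheral pixel (here simply the neighbor one column to the right).
    gr = np.abs(np.diff(distance_image, axis=1))
    gr = np.pad(gr, ((0, 0), (0, 1)), mode="edge")  # restore the original width
    # Watanabe's test: flag the pixel when thg1 <= Gr <= thg2.
    return ((gr >= thg1) & (gr <= thg2)).astype(np.uint8)
```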
Regarding claim 12, Watanabe teaches the image processor further configured to measure a distance to the object, based on a time difference between a timing at which the reflected optical signal is received by the light receiving elements and a timing at which a light emitter emitted an optical signal toward the object ([0031] calculates the distance to the object having reflected the transmitted wave by multiplying a propagation speed of the transmitted wave by half the obtained transmission/reception period τ (TOF (Time Of Flight) method)).

Regarding claim 13, Watanabe teaches the image processor further configured to: measure the distance for each of the divided pixels ([0039] The three-dimensional point cloud shown in FIG. 5 is obtained by a known technique from the pixel values (distances) of the distance image shown in FIG. 4, the variation angle θ (vertical angle) and the variation angle ϕ (lateral angle). As is understood from FIG. 5, if a plurality of objects Obk (a person Ob1 and a white wall Ob2 in an example shown in FIGS. 4 and 5) are arranged one after the other along the propagation direction of the transmitted wave, there is also a point cloud PGn between a point cloud PGf (point cloud PG1 corresponding to the person Ob1 in the example shown in FIG. 5) representing the front object Obf (person Ob1 in the example shown in FIGS. 4 and 5) and a point cloud PGr (point cloud PG2 corresponding to the white wall Ob2 in the example shown in FIG. 5) representing the rear object Obr (white wall Ob2 in the example shown in FIGS. 4 and 5)), and generate the distance image having a resolution corresponding to the divided pixels ([0039] representing the above distance image before noise removal shown as an example in FIG. 4 and generated by the distance image pre-processor 221 by a three-dimensional point cloud, the distance image before noise removal is a point cloud shown in FIG. 5).
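The TOF relation quoted from Watanabe [0031] reduces to distance = v × τ/2. A small worked sketch follows; the function name and the example period are illustrative, not taken from the reference:

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0  # propagation speed of the pulsed laser

def tof_distance_m(tau_s: float, v: float = SPEED_OF_LIGHT_M_PER_S) -> float:
    """Distance from the round-trip transmission/reception period tau (seconds)."""
    return v * tau_s / 2.0

# Example: a 200 ns round trip corresponds to roughly 30 m.
print(tof_distance_m(200e-9))  # ~29.98
```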
Regarding claim 15, Watanabe discloses a distance measuring method ([0047] to generate a distance image before noise removal, the control processor 2 first measures the transmission/reception period for each of the plurality of directions) comprising: receiving, by a light receiving element, a reflected optical signal ([0025]-[0026] a transmitted wave is laser light and a reflected wave is reflected light of this laser light… The transmitter/receiver… configured to transmit predetermined pulsed transmitted waves… and receive a plurality of reflected waves) reflected by an object ([0003] a transmitted wave is incident on a subject and returned as a reflected wave).

Watanabe is not relied upon as disclosing a light receiving element provided in each of a plurality of pixels arranged in a one-dimensional direction or a two-dimensional direction; detecting a direction of the object, based on at least either signal intensities of the reflected optical signal received by the light receiving elements or distances to the object measured based on the reflected optical signal; and dividing at least one or some of the plurality of pixels, based on the detected direction of the object.

However, Yoshida teaches a light receiving element provided in each of a plurality of pixels arranged in a one-dimensional direction or a two-dimensional direction ([0030] the multiple light receiving elements may be arrayed in a two-dimensional direction); detecting a direction of the object, based on at least either signal intensities of the reflected optical signal received by the light receiving elements ([0045] On the target T, a partial surface SA, which reflects light beams to be incident on the same pixel, constitutes a reflection point detected by the pixel. In FIG. 2, the partial surface SA is included in the detection range PR on the surface of the target T. As the inclination of the partial surface SA with respect to the reference surface R increases, an optical path length difference of the reflected light on the partial surface increases. The peak of the waveform of the reflection intensity also decreases with an increase of the inclination) or distances to the object measured based on the reflected optical signal; and dividing at least one or some of the plurality of pixels, based on the detected direction of the object ([0086]-[0088] The pixel information acquisition unit 110 acquires pixel information for each sub-pixel obtained by dividing one pixel into the high speed range A and the low speed range B… The pixel corresponding to the reflection point is divided into multiple sub-pixels corresponding to the different scan speeds… Specifically, the change degree calculation unit 135 calculates the change degree between the detected waveform in the high speed range A and the detected waveform in the low speed range B as the inclination feature amount. The change degree calculation unit 135 may use, as the change degree, the change amount of at least one or more waveform characteristics. The change degree calculation unit 135 may use, as the change degree, the change amount based on all points of the detected waveform. This change degree is an example of the inclination feature amount).

Watanabe and Yoshida are both considered to be analogous to the claimed invention because they are in the same field of LiDAR. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Watanabe to incorporate the teachings of Yoshida to include a plurality of light receiving elements, as is standard in the art. Further, it would also have been obvious to modify Watanabe to incorporate the teachings of Yoshida to include detecting a direction of the object and dividing at least one or some of pixels included in the distance image based on the direction of the detected object. This would have the predictable result of minimizing the noise region of Watanabe.

Regarding claim 16, Watanabe teaches the distance measuring method according to claim 15, wherein a pixel that is among pixels included in the distance image and includes distance information of two or more objects is divided ([0017]-[0019] FIG. 7 are charts showing a case where noise is generated by the front object and the rear object located one after the other along the transmission direction of the pulsed laser light… FIG. 9 is a view showing noise regions judged by a process shown in the flow chart shown in FIG. 8 as white regions and regions not judged as noise as black regions). The teachings of Watanabe have been interpreted to mean that pixels with noise caused by two objects (white) are grouped separately (divided) from the pixels without noise caused by two objects (black).

Regarding claim 17, Watanabe teaches or suggests the distance measuring method according to claim 16, wherein a direction of at least one of the two or more objects is detected based on at least either signal intensities of the reflected optical signal from the two or more objects received by the light receiving elements ([0041] if the distance between the front object Obf and the rear object Obr becomes shorter (the front object Obf and the rear object Obr move toward each other) from the state shown in FIG. 6A, the respective peaks in the double-peak pulse waveform shown in FIG. 6C move toward each other. When the front object Obf and the rear object Obr are relatively close as shown in FIG. 7A, the signal waveform of the reflected wave becomes a single-peak pulse waveform as shown in FIG. 7C) or distances to the two or more objects measured based on the reflected optical signal, and the pixel including the distance information of the two or more objects is divided based on the direction of the at least one of the two or more objects ([0032] and [0060] the noise determiner 222, for example, determines the pixel as noise for each of the plurality of pixels (plurality of directions) if a gradient Gr of a distance value between this pixel (target pixel) and a predetermined peripheral pixel located around this pixel is within the predetermined range of thg1 to thg2 (thg1 ≤ Gr ≤ thg2)… a binary image as shown in FIG. 9 in which the noise pixel flags are pixel values is obtained. In FIG. 9, the noise pixel flag "1" is set as a white pixel value and the noise pixel flag "0" is set as a black pixel value. As is understood from FIG. 9, an outline part of the front object).

Regarding claim 18, Watanabe teaches or suggests the distance measuring method according to claim 17, wherein the pixel including the distance information of the two or more objects is divided depending on positions of individual ones of the two or more objects in the pixel including the distance information of the two or more objects ([0043] Whether or not the signal waveform of the reflected wave changes from a double-peak pulse waveform to a single-peak pulse waveform is related to the distance between the front object Obf and the rear object Obr and can be determined based on the gradient Gr between the target region and the predetermined peripheral region located around the target region… Then, whether or not the pixel value of the pixel in the distance image is noise can be determined from these respective determination results (gradient determination result)).
Allowable Subject Matter

Claims 5-11, 19, and 20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. The following is a statement of reasons for the indication of allowable subject matter:

Regarding claim 5, Yoshida teaches the distance measuring device according to claim 1, wherein the image processor is further configured to: detect the signal intensities of the reflected optical signal received by the light receiving elements ([0045] The peak of the waveform of the reflection intensity also decreases with an increase of the inclination); and wherein the image processor is configured to divide at least one or some of the pixels included in the distance image, based on the direction of the detected object and the area of the object in the calculated pixel ([0086]-[0088] The pixel information acquisition unit 110 acquires pixel information for each sub-pixel obtained by dividing one pixel into the high speed range A and the low speed range B… The pixel corresponding to the reflection point is divided into multiple sub-pixels corresponding to the different scan speeds… Specifically, the change degree calculation unit 135 calculates the change degree between the detected waveform in the high speed range A and the detected waveform in the low speed range B as the inclination feature amount. The change degree calculation unit 135 may use, as the change degree, the change amount of at least one or more waveform characteristics. The change degree calculation unit 135 may use, as the change degree, the change amount based on all points of the detected waveform. This change degree is an example of the inclination feature amount). However, Yoshida and Watanabe, alone or in any combination, do not teach, suggest, or disclose calculating an area of the object in a pixel corresponding to the light receiving element that has received the reflected optical signal, based on the detected signal intensities. Therefore, Claim 5 contains allowable subject matter. Claims 6-11 depend from Claim 5 and therefore contain allowable subject matter by virtue of their dependency.

Regarding claim 19, Yoshida teaches the distance measuring method according to claim 15, wherein a signal intensity of the reflected optical signal received by the light receiving element is detected ([0045] the peak of the waveform of the reflection intensity also decreases with an increase of the inclination), and at least one or some of pixels included in the distance image is divided based on the detected direction of the object and the calculated area of the object in the pixel ([0086]-[0088] The pixel information acquisition unit 110 acquires pixel information for each sub-pixel obtained by dividing one pixel into the high speed range A and the low speed range B… The pixel corresponding to the reflection point is divided into multiple sub-pixels corresponding to the different scan speeds… Specifically, the change degree calculation unit 135 calculates the change degree between the detected waveform in the high speed range A and the detected waveform in the low speed range B as the inclination feature amount. The change degree calculation unit 135 may use, as the change degree, the change amount of at least one or more waveform characteristics. The change degree calculation unit 135 may use, as the change degree, the change amount based on all points of the detected waveform. This change degree is an example of the inclination feature amount). However, Yoshida and Watanabe, alone or in any combination, do not teach, suggest, or disclose that an area of the object in a pixel corresponding to the light receiving element that has received the reflected optical signal is calculated based on the detected signal intensity. Therefore, Claim 19 contains allowable subject matter. Claim 20 depends from Claim 19 and therefore contains allowable subject matter by virtue of its dependency.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to EVAN H HAUT, whose telephone number is (571) 272-7927. The examiner can normally be reached Monday-Thursday, 10am-8pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Isam Alsomiri, can be reached at (571) 272-7559. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/E.H.H./
Patent Examiner, Art Unit 3645

/ISAM A ALSOMIRI/
Supervisory Patent Examiner, Art Unit 3645

Prosecution Timeline

Sep 09, 2022
Application Filed
Oct 27, 2025
Non-Final Rejection — §102, §103 (current)

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: Favorable
Median Time to Grant: 3y 0m
PTA Risk: Low
Based on 0 resolved cases by this examiner. Grant probability derived from career allow rate.
