Prosecution Insights
Last updated: April 19, 2026
Application No. 18/803,214

DIGITAL COLOR ASSESSMENT

Non-Final OA: §103, §112
Filed
Aug 13, 2024
Examiner
VANCHY JR, MICHAEL J
Art Unit
2666
Tech Center
2600 — Communications
Assignee
Graftek Imaging Inc.
OA Round
1 (Non-Final)
Grant Probability: 67% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 4m
Grant Probability with Interview: 87%

Examiner Intelligence

Career Allow Rate: 67% (404 granted / 606 resolved; +4.7% vs TC avg, above average)
Interview Lift: +20.1% for resolved cases with interview
Typical Timeline: 3y 4m average prosecution (16 applications currently pending)
Career History: 622 total applications across all art units

Statute-Specific Performance

§101: 11.7% (-28.3% vs TC avg)
§103: 60.8% (+20.8% vs TC avg)
§102: 8.4% (-31.6% vs TC avg)
§112: 10.4% (-29.6% vs TC avg)
Based on career data from 606 resolved cases.

Office Action

§103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 24, 29, and 37 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

The term “approximately” in claims 24, 29, and 37 is a relative term which renders the claim indefinite. The term “approximately” is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. Applicant does define the word “approximately” in the Specification (see paragraph [0109]); however, that definition is itself indefinite because it does not specifically state how “approximately” should be construed. For example, the Specification states “The term approximately is intended to mean at least close to a given value (e.g., within 10% of)”. The word “close” is also relative, and the “10%” is given only as an example, not a statement that “approximately” means “within 10% of”. Appropriate correction is required.
Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 21-30 are rejected under 35 U.S.C. 103 as being unpatentable over Ishizaki et al., US 2014/0022571 A1 (Ishizaki), and further in view of Ozaki et al., US 2009/0027705 A1 (Ozaki).
Regarding claim 21, Ishizaki teaches a method of creating a color target specification for a workpiece (creating the correct color tones for a sheet with an image formed thereon) ([0003]), comprising: digitally imaging the workpiece (imaging the sheet with an image formed thereon using the image forming apparatus 10) (Figs. 1 and 14; [0091]); estimating a first set of L*a*b* values from the digital image of the workpiece (estimating a set of L1*a1*b1* data/values from the image of the sheet with an image formed thereon) (Fig. 13; [0087-0090]); selecting a first set of colors from a wide gamut of colors related to the first set of L*a*b* values (wherein the creator selects colors from RGB data (converted into CMYK data) that are satisfactory and are related to the L1*a1*b1* data) (Figs. 13 and 14; [0087] and [0090-0091]); creating a digital chart of the first set of colors (wherein when making adjustments in the image forming apparatus 10, various charts, such as various color patches, are arranged on a sheet) (Fig. 1; [0041]) (Fig. 14, step S3; [0092]); forming the digital chart (various charts such as various color patches can be arranged on the sheet) ([0041]) (wherein the color patches (chart) are formed on the sheet) (Fig. 14, step S3; [0092]); measuring the first set of colors, from the formed chart, to create a second set of L*a*b* values (the optical measuring instrument 26 is used to take colorimetric measurements of the color patches (color chart) to create L*a*b* data (L2*a2*b2*)) (Fig. 1, Fig. 14, step S4; [0092-0093]); calculating a value between the first set of L*a*b* values and the second set of L*a*b* values (comparing the values between L2*a2*b2* and L1*a1*b1* to determine if there is a sufficient match (the value being a distance between the two values)) (Fig. 14, step S5; [0094]); determining whether or not the value is below a predetermined threshold (determining if the distance value is within a predetermined distance, or in other words, whether or not the two sets of data values sufficiently match) (Fig. 14, step S6; [0094]); and, if the value is below the predetermined threshold (if the distance is within a predetermined distance) ([0094]), then creating the color target specification (creating the color target using those colors instead of doing a correction) ([0003] and [0094]).

Although Ishizaki does not explicitly state that the imaging is “digital”, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention that, since the image data is processed using an image processor 30, which includes components such as a memory and a computational circuit (Fig. 1; [0045]), the imaging being done is digital. Ishizaki also states that the various color patches are “formed” on the sheet; however, Ishizaki does not explicitly state “printing” or calculating a “Delta E” value.

Ozaki teaches picture color tone control for a printing press (Abstract); wherein a color chart is printed ([0126-0127]); and wherein a Delta E value is obtained based on the color difference (Lab difference), which is compared to a color difference threshold value ([0145-0146]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ishizaki to include detecting a Delta E value for the difference between the Lab values, since doing so makes it possible to confirm a color shade before printing is performed, thereby preventing failure in printing and suppressing the incidence of paper loss (Ozaki; Abstract).

Regarding claim 22, Ishizaki teaches further comprising the step of: if the value is not below the predetermined threshold (if the distance value is above a predetermined distance; i.e., the values do not sufficiently match) (Fig. 14, step S6; [0094]), then, selecting a second set of colors from the wide gamut of colors (selecting a second set of colors by correcting the CMYK values of the RGB data) (Fig. 14, steps S7 and S8; [0094-0095]). Ozaki teaches picture color tone control for a printing press (Abstract); wherein a color chart is printed ([0126-0127]); and wherein a Delta E value is obtained based on the color difference (Lab difference), which is compared to a color difference threshold value ([0145-0146]).

Regarding claim 23, Ozaki teaches further comprising the step of: converting the digital chart to a digital form printable on an inkjet printer (wherein the color chart can be printed by the printing press) ([0031], [0073], and [0126-0127]).

Regarding claim 24, Ishizaki teaches wherein the step of determining further comprises the steps of: specifying that the predetermined threshold is approximately 3 (wherein the threshold is a “predetermined distance”, which could obviously be set as 3 by the user) ([0094]). Ozaki teaches wherein the step of determining further comprises the steps of: specifying that the predetermined threshold is approximately 3 (wherein the threshold value is provided and thus can obviously be set to 3 if desired) ([0145-0146]).

Regarding claim 25, Ozaki teaches wherein the step of estimating further comprises the steps of: measuring the workpiece to create a third set of L*a*b* values; and, averaging the third set of L*a*b* values (wherein a third set of Lab values can be calculated and averaged to be used with the Delta E predetermined threshold) ([0146] and [0196]).
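For context on the “Delta E” value and the “approximately 3” threshold at issue in these claims: Delta E in its simplest (CIE76) form is just the Euclidean distance between two points in CIE L*a*b* space, and values in the 2-3 range are commonly treated as near the limit of a just-noticeable color difference. The following is an illustrative sketch only, not code from either reference, and the sample L*a*b* values are invented:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 Delta E: Euclidean distance between two L*a*b* colors."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# Hypothetical values: an estimated target and a measurement from a printed chart.
target   = (52.0, 41.5, 23.0)   # first set of L*a*b* values (estimated)
measured = (51.2, 42.1, 24.4)   # second set of L*a*b* values (measured)

de = delta_e_cie76(target, measured)
print(f"Delta E = {de:.2f}")    # prints: Delta E = 1.72
# Compare against the claimed threshold of approximately 3.
print("within tolerance" if de < 3.0 else "correct colors and reprint")
```

Note that CIE76 is only the oldest Delta E formula; later variants (CIE94, CIEDE2000) weight the distance components differently, which is part of why an unqualified “approximately 3” can be attacked as indefinite.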
Regarding claim 26, Ishizaki teaches a method of creating a color target specification for a workpiece (creating the correct color tones for a sheet with an image formed thereon) ([0003]), comprising: digitally selecting a color target for the workpiece (wherein the creator selects a target color on a monitor) ([0089-0090]); estimating a first set of L*a*b* values from the color target (estimating a set of L1*a1*b1* data/values from the image of the sheet with an image formed thereon) (Fig. 13; [0087-0090]); selecting a first set of colors from a wide gamut of colors related to the first set of L*a*b* values (wherein the creator selects colors from RGB data (converted into CMYK data) that are satisfactory and are related to the L1*a1*b1* data) (Figs. 13 and 14; [0087] and [0090-0091]); creating a digital chart of the first set of colors (wherein when making adjustments in the image forming apparatus 10, various charts, such as various color patches, are arranged on a sheet) (Fig. 1; [0041]) (Fig. 14, step S3; [0092]); forming the digital chart (various charts such as various color patches can be arranged on the sheet) ([0041]) (wherein the color patches (chart) are formed on the sheet) (Fig. 14, step S3; [0092]); measuring the first set of colors, from the formed chart, to create a second set of L*a*b* values (the optical measuring instrument 26 is used to take colorimetric measurements of the color patches (color chart) to create L*a*b* data (L2*a2*b2*)) (Fig. 1, Fig. 14, step S4; [0092-0093]); calculating a value between the first set of L*a*b* values and the second set of L*a*b* values (comparing the values between L2*a2*b2* and L1*a1*b1* to determine if there is a sufficient match (the value being a distance between the two values)) (Fig. 14, step S5; [0094]); determining whether or not the value is below a predetermined threshold (determining if the distance value is within a predetermined distance, or in other words, whether or not the two sets of data values sufficiently match) (Fig. 14, step S6; [0094]); and, if the value is below the predetermined threshold (if the distance is within a predetermined distance) ([0094]), then creating the color target specification (creating the color target using those colors instead of doing a correction) ([0003] and [0094]).

Although Ishizaki does not explicitly state that the imaging is “digital”, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention that, since the image data is processed using an image processor 30, which includes components such as a memory and a computational circuit (Fig. 1; [0045]), the imaging being done is digital. Ishizaki also states that the various color patches are “formed” on the sheet; however, Ishizaki does not explicitly state “printing” or calculating a “Delta E” value.

Ozaki teaches picture color tone control for a printing press (Abstract); wherein a target color is selected (Abstract, [0023], and [0027]); wherein a color chart is printed ([0126-0127]); and wherein a Delta E value is obtained based on the color difference (Lab difference), which is compared to a color difference threshold value ([0145-0146]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ishizaki to include detecting a Delta E value for the difference between the Lab values, since doing so makes it possible to confirm a color shade before printing is performed, thereby preventing failure in printing and suppressing the incidence of paper loss (Ozaki; Abstract).
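The print/measure/compare loop that the rejection attributes to Ishizaki's Fig. 14 (form the chart at step S3, measure at S4, compare at S5-S6, correct and repeat at S7-S8) can be sketched as follows. This is an illustrative outline only, not code from either reference; `print_chart`, `measure_chart`, and `correct` are hypothetical callables standing in for the hardware steps:

```python
import math

def delta_e(lab1, lab2):
    """Euclidean distance between two L*a*b* triples (CIE76 Delta E)."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

def match_color_target(target_lab, print_chart, measure_chart, correct,
                       threshold=3.0, max_rounds=5):
    """Iterate print -> measure -> compare until the measured chart matches
    the target within `threshold`, correcting the colors between rounds."""
    colors = target_lab
    for _ in range(max_rounds):
        chart = print_chart(colors)         # form the chart on a sheet (S3)
        measured = measure_chart(chart)     # colorimetric measurement (S4)
        if delta_e(target_lab, measured) < threshold:  # compare (S5-S6)
            return measured                 # sufficient match: create the spec
        colors = correct(colors, measured)  # correct the colors (S7-S8)
    return None                             # no acceptable match within max_rounds
```

The first acceptable measurement ends the loop, mirroring the Office Action's characterization that a sufficient match means “creating the color target using those colors instead of doing a correction.”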
Regarding claim 27, Ishizaki teaches further comprising: if the value is not below the predetermined threshold (if the distance value is above a predetermined distance; i.e., the values do not sufficiently match) (Fig. 14, step S6; [0094]), then, selecting a second set of colors from the wide gamut of colors (selecting a second set of colors by correcting the CMYK values of the RGB data) (Fig. 14, steps S7 and S8; [0094-0095]). Ozaki teaches picture color tone control for a printing press (Abstract); wherein a color chart is printed ([0126-0127]); and wherein a Delta E value is obtained based on the color difference (Lab difference), which is compared to a color difference threshold value ([0145-0146]).

Regarding claim 28, Ozaki teaches further comprising: converting the digital chart to a digital form printable on an inkjet printer (wherein the color chart can be printed by the printing press) ([0031], [0073], and [0126-0127]).

Regarding claim 29, Ishizaki teaches wherein the step of determining further comprises the steps of: specifying that the predetermined threshold is approximately 2 (wherein the threshold is a “predetermined distance”, which could obviously be set as 2 by the user) ([0094]). Ozaki teaches wherein the step of determining further comprises the steps of: specifying that the predetermined threshold is approximately 2 (wherein the threshold value is provided and thus can obviously be set to 2 if desired) ([0145-0146]).

Regarding claim 30, Ishizaki teaches wherein the step of selecting the color target further comprises the steps of: selecting the color target by use of color modification software (wherein the creator can select the color based on using the CPU for executing programs) ([0072], [0086], and [0089-0090]). Ozaki teaches selecting the color target by use of color modification software (selecting the color target using a simulation printing tool) (Abstract, [0023], and [0077]).

Claim(s) 31-37 are rejected under 35 U.S.C. 103 as being unpatentable over Ishizaki et al., US 2014/0022571 A1 (Ishizaki), Ozaki et al., US 2009/0027705 A1 (Ozaki), and further in view of Darel et al., 6,024,018 (Darel).

Regarding claim 31, Ishizaki teaches a method of creating a color target specification for a workpiece (creating the correct color tones for a sheet with an image formed thereon) ([0003]), comprising: estimating a first set of space-specific color values (L*a*b* values) ([0087-0090]) corresponding to a workpiece (estimating a set of L1*a1*b1* data/values from the image of the sheet with an image formed thereon) (Fig. 13; [0087-0090]), said space-specific color values being L*a*b* values ([0087-0090]); selecting a first set of colors from a wide gamut of colors related to the first set of space-specific color values (wherein the creator selects colors from RGB data (converted into CMYK data) that are satisfactory and are related to the L1*a1*b1* data) (Figs. 13 and 14; [0087] and [0090-0091]); creating a digital chart of the first set of colors (wherein when making adjustments in the image forming apparatus 10, various charts, such as various color patches, are arranged on a sheet) (Fig. 1; [0041]) (Fig. 14, step S3; [0092]); forming the digital chart (various charts such as various color patches can be arranged on the sheet) ([0041]) (wherein the color patches (chart) are formed on the sheet) (Fig. 14, step S3; [0092]); measuring the first set of colors, from the formed chart, to create a second set of space-specific color values (the optical measuring instrument 26 is used to take colorimetric measurements of the color patches (color chart) to create L*a*b* data (L2*a2*b2*)) (Fig. 1, Fig. 14, step S4; [0092-0093]); calculating an error measurement between the first set of space-specific color values and the second set of space-specific color values (comparing the values between L2*a2*b2* and L1*a1*b1* to determine if there is a sufficient match (the value being a distance between the two values)) (Fig. 14, step S5; [0094]); determining whether or not the error measurement is below a predetermined threshold (determining if the distance value is within a predetermined distance, or in other words, whether or not the two sets of data values sufficiently match) (Fig. 14, step S6; [0094]); and, if the error measurement is below the predetermined threshold (if the distance is within a predetermined distance) ([0094]), then creating the color target specification (creating the color target using those colors instead of doing a correction) ([0003] and [0094]).

Although Ishizaki does not explicitly state that the imaging is “digital”, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention that, since the image data is processed using an image processor 30, which includes components such as a memory and a computational circuit (Fig. 1; [0045]), the imaging being done is digital. Ishizaki also states that the various color patches are “formed” on the sheet; however, Ishizaki does not explicitly state “printing”.

Ozaki teaches picture color tone control for a printing press (Abstract); wherein a target color is selected (Abstract, [0023], and [0027]); and wherein a color chart is printed ([0126-0127]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ishizaki to include printing the chart for detecting a Delta E value for the difference between the Lab values, since doing so makes it possible to confirm a color shade before printing is performed, thereby preventing failure in printing and suppressing the incidence of paper loss (Ozaki; Abstract).

Ishizaki teaches using L*a*b* values ([0087-0090]) and Ozaki teaches using L*a*b* values ([0144-0145]). However, neither explicitly teaches “a color space that encompasses all visible colors and is an absolute standard”. Darel teaches a color control system for maintaining the color of a printed page of a printing press constant within the context of the human perceptual color space; the system optimizes the settings of a plurality of ink keys in a printing press in accordance with a test image and a reference image (Abstract); and wherein a color space that encompasses all visible colors and is an absolute standard is used (wherein the color comparison can be performed using any color space, such as CIE Lab) (col. 8, lines 9-10). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of prior arts to include a color space that encompasses all visible colors and is an absolute standard, such as the CIE Lab color space, since it closely imitates human visual perception (Darel; col. 8, lines 10-12).

Regarding claim 32, Ishizaki teaches creating a color target specification for a workpiece (creating the correct color tones for a sheet with an image formed thereon) ([0003]), wherein the color space that encompasses all visible colors and is an absolute standard comprises a CIE Lab space and the space-specific color values comprise L*a*b* values (L*a*b* values) ([0087-0090]). Ozaki teaches using L*a*b* values ([0144-0145]).
However, neither explicitly teaches “wherein the color space that encompasses all visible colors and is an absolute standard comprises a CIE Lab space”. Darel teaches a color control system for maintaining the color of a printed page of a printing press constant within the context of the human perceptual color space; the system optimizes the settings of a plurality of ink keys in a printing press in accordance with a test image and a reference image (Abstract); and wherein the color space that encompasses all visible colors and is an absolute standard comprises a CIE Lab space (wherein the color comparison can be performed using any color space, such as CIE Lab) (col. 8, lines 9-10). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of prior arts to include a color space that encompasses all visible colors and is an absolute standard, such as the CIE Lab color space, since it closely imitates human visual perception (Darel; col. 8, lines 10-12).

Regarding claim 33, Ishizaki teaches creating a color target specification for a workpiece (creating the correct color tones for a sheet with an image formed thereon) ([0003]), wherein estimating a first set of space-specific color values corresponding to a workpiece (estimating a set of L1*a1*b1* data/values from the image of the sheet with an image formed thereon) (Fig. 13; [0087-0090]) comprises: digitally imaging the workpiece (imaging the sheet with an image formed thereon using the image forming apparatus 10) (Figs. 1 and 14; [0091]); and estimating the first set of space-specific color values from the digital image of the workpiece (estimating a set of L1*a1*b1* data/values from the image of the sheet with an image formed thereon) (Fig. 13; [0087-0090]).
Although Ishizaki does not explicitly state that the imaging is “digital”, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention that, since the image data is processed using an image processor 30, which includes components such as a memory and a computational circuit (Fig. 1; [0045]), the imaging being done is digital.

Regarding claim 34, Ishizaki teaches creating a color target specification for a workpiece (creating the correct color tones for a sheet with an image formed thereon) ([0003]), wherein estimating a first set of space-specific color values corresponding to a workpiece (estimating a set of L1*a1*b1* data/values from the image of the sheet with an image formed thereon) (Fig. 13; [0087-0090]) comprises: digitally selecting a color target for the workpiece (wherein the creator selects a target color on a monitor) ([0089-0090]); and estimating the first set of space-specific color values from the color target (estimating a set of L1*a1*b1* data/values from the image of the sheet with an image formed thereon) (Fig. 13; [0087-0090]). Ozaki teaches picture color tone control for a printing press (Abstract); wherein a target color is selected (Abstract, [0023], and [0027]). Although Ishizaki does not explicitly state that the imaging is “digital”, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention that, since the image data is processed using an image processor 30, which includes components such as a memory and a computational circuit (Fig. 1; [0045]), the imaging being done is digital.
Regarding claim 35, Ishizaki teaches creating a color target specification for a workpiece (creating the correct color tones for a sheet with an image formed thereon) ([0003]), wherein calculating the error measurement comprises computing the difference between an intended color value (reference color value) ([0094]) and a measured color value (measured color value) ([0094]) (comparing the values between L2*a2*b2* and L1*a1*b1* to determine if there is a sufficient match (the value being a distance between the two values)) (Fig. 14, step S5; [0094]). Ozaki teaches wherein a Delta E value is obtained based on the color difference (Lab difference), which is compared to a color difference threshold value ([0145-0146]) (the difference between the Lab of the galley and the Lab of the simulation printing tool is used to perform the comparison) ([0145]).

Regarding claim 36, Ozaki teaches creating a color target specification for a workpiece (Abstract, [0023], and [0027]), wherein calculating the error measurement comprises computing Delta E (wherein a Delta E value is obtained based on the color difference (Lab difference), which is compared to a color difference threshold value) ([0145-0146]).

Regarding claim 37, Ishizaki teaches creating a color target specification for a workpiece (creating the correct color tones for a sheet with an image formed thereon) ([0003]), wherein the predetermined threshold is approximately 3 (wherein the threshold is a “predetermined distance”, which could obviously be set as 3 by the user) ([0094]). Ozaki teaches wherein the predetermined threshold is approximately 3 (wherein the threshold value is provided and thus can obviously be set to 3 if desired) ([0145-0146]).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Nishizawa, US 2016/0134784 A1: teaches a method for obtaining a cross-section of a color gamut of an image forming apparatus by slicing the color gamut along a plane (Abstract).
Mishima, US 2017/0195503 A1: teaches that, in a case where a correspondence relationship between colors of the manuscript image data and the printed material is established, it is preferable for an optimum reading area of a portion of the printed material suitable for establishment of the correspondence relationship between the colors to be read by a scanner and for a color distribution of read image data of the scanner to be obtained ([0012]).

Contact

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MICHAEL J VANCHY JR, whose telephone number is (571) 270-1193. The examiner can normally be reached Monday - Friday, 9am - 5pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Emily Terrell, can be reached at (571) 270-3717. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MICHAEL J VANCHY JR/
Primary Examiner, Art Unit 2666
Michael.Vanchy@uspto.gov

Prosecution Timeline

Aug 13, 2024
Application Filed
Jan 02, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602906: IMAGE RECOGNITION APPARATUS (2y 5m to grant; granted Apr 14, 2026)
Patent 12579596: MANAGING ARTIFICIAL-INTELLIGENCE DERIVED IMAGE ATTRIBUTES (2y 5m to grant; granted Mar 17, 2026)
Patent 12579634: REAL-TIME PROCESS DEFECT DETECTION AUTOMATION SYSTEM AND METHOD USING MACHINE LEARNING MODEL (2y 5m to grant; granted Mar 17, 2026)
Patent 12573225: METHODS AND SYSTEMS OF FIELD DETECTION IN A DOCUMENT (2y 5m to grant; granted Mar 10, 2026)
Patent 12551101: SYSTEM AND METHOD FOR DIGITAL MEASUREMENTS OF SUBJECTS (2y 5m to grant; granted Feb 17, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 67%
With Interview: 87% (+20.1%)
Median Time to Grant: 3y 4m
PTA Risk: Low
Based on 606 resolved cases by this examiner. Grant probability derived from career allow rate.
