Prosecution Insights
Last updated: April 19, 2026
Application No. 18/665,976

VIRTUAL HAPTIC TEXTURE RENDERING METHOD AND DEVICE, DISPLAY DEVICE, AND STORAGE MEDIUM

Non-Final OA (§102, §103)
Filed: May 16, 2024
Examiner: ABEBE, SOSINA
Art Unit: 2626
Tech Center: 2600 — Communications
Assignee: BOE TECHNOLOGY GROUP CO., LTD.
OA Round: 1 (Non-Final)
Grant Probability: 73% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 0m
With Interview: 91%

Examiner Intelligence

Career Allow Rate: 73% — above average (332 granted / 457 resolved; +10.6% vs TC avg)
Interview Lift: +18.5% among resolved cases with interview (strong)
Typical Timeline: 3y 0m avg prosecution; 16 currently pending
Career History: 473 total applications across all art units

Statute-Specific Performance

§101: 2.0% (-38.0% vs TC avg)
§102: 25.4% (-14.6% vs TC avg)
§103: 59.6% (+19.6% vs TC avg)
§112: 5.8% (-34.2% vs TC avg)
Tech Center averages are estimates. Based on career data from 457 resolved cases.
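As a sanity check, the headline figures can be reproduced from the raw career counts (332 granted of 457 resolved). The only assumption here is that the dashboard applies the +18.5% interview lift additively, in percentage points:

```python
# Reproduce the dashboard's headline figures from the raw career counts.
granted, resolved = 332, 457

allow_rate = granted / resolved          # career allow rate
print(f"{allow_rate:.1%}")               # 72.6%, shown rounded as 73%

# Assumption: the +18.5% "interview lift" is additive, in percentage points.
with_interview = allow_rate + 0.185
print(f"{with_interview:.0%}")           # 91%
```

The two printed values match the "Grant Probability" and "With Interview" cards, which suggests the projections are derived directly from the career allow rate rather than from case-mix modeling.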

Office Action

Grounds: §102, §103
DETAILED ACTION

This is a first office action in response to application No. 18/665,976 filed on 05/16/2024, in which claims 1-20 are presented for examination.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Interpretation

1. The following is a quotation of 35 U.S.C. 112(f): (f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof. The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph: An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof. 2. The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph: (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and (C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function. Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function. Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. 
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. 3. Claim limitations “a first obtaining module configured to”, “an image conversion module configured to”, “a multi-valued processing module configured to” and “a coordinate system association module configured to”, have been interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because it uses/they use a generic placeholder “module” coupled with functional language “configured to” without reciting sufficient structure to achieve the function. Furthermore, the generic placeholder is not preceded by a structural modifier. Since the claim limitation(s) invokes 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, claim 9 has been interpreted to cover the corresponding structure described in the specification that achieves the claimed function, and equivalents thereof. A review of the specification shows that the following appears to be the corresponding structure described in the specification for the 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph limitation: a first obtaining module configured to: Fig. 9; 81 and par. [0075] an image conversion module configured to: Fig. 9; 82 and par. [0075] a multi-valued processing module configured to: Fig. 9; 83 and par. [0075] a coordinate system association module configured to: Fig. 9; 84 and par. [0075] If applicant does not intend to have the claim limitation(s) treated under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112 , sixth paragraph, applicant may amend the claim(s) so that it/they will clearly not invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, or present a sufficient showing that the claim recites/recite sufficient structure, material, or acts for performing the claimed function to preclude application of 35 U.S.C. 
112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. For more information, see MPEP § 2173 et seq. and Supplementary Examination Guidelines for Determining Compliance With 35 U.S.C. 112 and for Treatment of Related Issues in Patent Applications, 76 FR 7162, 7167 (Feb. 9, 2011).

Claim Rejections - 35 USC § 102

4. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. 5. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

6. Claims 1-2, 5-13 and 16-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Shih “US 2012/0327100”. Re-claim 1, Shih teaches a virtual haptic texture rendering method, (fig. 2 and par. [0018] The tactile activation unit 240 generates tactile feedback, e.g., haptics. The processing module 210 is coupled to the display panel 220, the touch panel 230 and the tactile activation unit 240. The processing 210 detects a touch input on the touch panel 230, so as to correspondingly control display information on the display panel 220 to generate tactile feedback) comprising: obtaining a to-be-displayed visual image; (fig. 1 and par. [0017] Step S110 includes inputting an original image.) converting the visual image into a grayscale image; (par. 
[0017] Step S120 includes converting the original image to a binary image, which includes a first grayscale value and a second grayscale value) performing multi-valued processing on the grayscale image to obtain a haptic response image, all pixels of the haptic response image being grouped into at least two grayscale regions, and different haptic parameters corresponding to different grayscale regions; (par. [0017] Step S130 includes converting the original image to a grayscale image, which at least includes a third grayscale value between the first grayscale value and the second grayscale value and par. [0025] The approaches for automatically identifying the threshold may be carried out according to luminance characteristics of objects, sizes of objects, areas or area ratios occupied by objects, or the number of types of objects. par. [0027] In some embodiments, for example, the grayscale image Im3 is a luminance information of the original image Im1. For example, for one of pixels of the original image Im1, a luminance value of the pixel may be but not limited obtained by converting R/G/B image information of the pixel. For example, the luminance value is a mean value or a weighted sum of the R/G/B image information. In other embodiments, for example, the grayscale image Im3 is a Y component of the original image Im1, or other image information indicative of detail characteristics of the original image Im1. and par. [0028] The grayscale image Im3 may be utilized to generate diversified tactile feedback in different strengths. As the image is processed by binary processing, intermediate transitional grayscale information no longer exists. Therefore, a transitional area value may be provided from the grayscale image to serve as reference for tactile feedback strength.) 
and associating a first coordinate system of the visual image with a second coordinate system of the haptic response image to obtain an association relationship between the first coordinate system and the second coordinate system, the association relationship being configured to determine a haptic parameter corresponding to a pixel at a touch point of a user in the case that the visual image is displayed and touched by the user, and generate a haptic vibration signal based on the haptic parameter. (par. [0017] Step S140 includes generating an index map according to the binary image and the grayscale image. The index map includes index values indicative of different tactile feedback strengths. Step S150 includes detecting a touch input on a display panel when the original image is displayed on the display panel. Step S160 includes driving a tactile activation unit according to the index map to generate tactile feedback in response to the detected touch input. par. [0036] For example, a corresponding index value KS in the index map Im4 is obtained according to a coordinate P(x,y) of a touch input by a tactile strength processing unit 217, and the index value KS is converted to a strength variable (SV) of a tactile activation unit 240 by a value converting unit 218. par. [0040] FIG. 7 shows a schematic diagram of an example of the original image. As shown in FIG. 7, the main object of the original image Im1 is a cactus in this example. After combining the binary image and the grayscale image, distributions of the index map Im4 are associated with the grayscale values of the original image Im1 while the main object corresponds to stronger tactile feedback. 
For coordinates A1, A2 and A3, suppose the coordinate A1 is located at the main object and has a grayscale value corresponding to strongest tactile feedback, the coordinate A2 is located at the main object and has a grayscale value corresponding to stronger tactile feedback, and the coordinate A3 is located at the background and has a grayscale value corresponding to weakest tactile feedback. Thus, different strengths for tactile feedback associated with image content may be provided.) Re-claim 2, Shih teaches the virtual haptic texture rendering method according to claim 1, further comprising: determining first coordinates of a first touch point of the user in the first coordinate system in the case that the visual image displayed on a display screen is touched by the user; (fig. 1 and par. [0017] Step S150 includes detecting a touch input on a display panel when the original image is displayed on the display panel.) determining second coordinates of the first touch point in the second coordinate system according to the first coordinates and the association relationship between the first coordinate system and the second coordinate system; (par. [0017] Step S140 includes generating an index map according to the binary image and the grayscale image. The index map includes index values indicative of different tactile feedback strengths. Step S150 includes detecting a touch input on a display panel when the original image is displayed on the display panel. Step S160 includes driving a tactile activation unit according to the index map to generate tactile feedback in response to the detected touch input. and par. [0036] For example, a corresponding index value KS in the index map Im4 is obtained according to a coordinate P(x,y) of a touch input by a tactile strength processing unit 217, and the index value KS is converted to a strength variable (SV) of a tactile activation unit 240 by a value converting unit 218.) 
determining a target grayscale region in which a pixel corresponding to the second coordinates is located; (pars. [0026], [0027] and [0028]) invoking a target haptic parameter corresponding to the target grayscale region, and generating the haptic vibration signal in accordance with the target haptic parameter; (par. [0017] Step S160 includes driving a tactile activation unit according to the index map to generate tactile feedback in response to the detected touch input. par. [0039] In some embodiments, for example, the tactile activation unit 240 is a piezoelectric vibrator or a motor vibrator, which generates piezoelectric or vibration tactile feedback.) and driving the display screen to vibrate at a position corresponding to the first coordinates in accordance with the haptic vibration signal. (par. [0017] Step S160 includes driving a tactile activation unit according to the index map to generate tactile feedback in response to the detected touch input. Thus, tactile feedback associated with image content is provided to enhance manipulation conveniences of the electronic device. and par. [0038] Step S160 includes driving the tactile activation unit according to the index map to generate tactile feedback in response to the detected touch input. For example, the strength variable SV is transmitted to the tactile activation unit 240 by the value converting unit 218, so as to drive the tactile activation unit 240 to generate tactile feedback.) Re-claim 5, Shih teaches wherein the performing the multi-valued processing on the grayscale image to obtain the haptic response image comprises performing binarization on the grayscale image to obtain the haptic response image, and pixels of the haptic response image are grouped into two grayscale regions of 0 and 255. (pars. 
[0027] - [0028] and [0037] the tactile activation unit 240 by a ratio between the index value KS and the maximum grayscale value to generate the strength variable SV, … represents the maximum grayscale value (e.g., 255)) Re-claim 6, Shih teaches wherein prior to associating the first coordinate system of the visual image with the second coordinate system of the haptic response image, the virtual haptic texture rendering (fig. 2 and par. [0018] The tactile activation unit 240 generates tactile feedback, e.g., haptics. The processing module 210 is coupled to the display panel 220, the touch panel 230 and the tactile activation unit 240. The processing 210 detects a touch input on the touch panel 230, so as to correspondingly control display information on the display panel 220 to generate tactile feedback) method further comprises: displaying a corrected image on a display screen, a touch reference point being displayed on the corrected image; (par. [0017] Step S150 includes detecting a touch input on a display panel when the original image is displayed on the display panel. Step S160 includes driving a tactile activation unit according to the index map to generate tactile feedback in response to the detected touch input.) and determining a coordinate difference value between a second touch point of the user and the touch reference point in the case that the corrected image is touched by the user, (pars. [0022] - [0023]) wherein the associating the first coordinate system of the visual image with the second coordinate system of the haptic response image comprises associating the first coordinate system of the visual image with the second coordinate system of the haptic response image in accordance with the coordinate difference value. (par. [0017] Step S140 includes generating an index map according to the binary image and the grayscale image. The index map includes index values indicative of different tactile feedback strengths. 
Step S150 includes detecting a touch input on a display panel when the original image is displayed on the display panel. Step S160 includes driving a tactile activation unit according to the index map to generate tactile feedback in response to the detected touch input. par. [0036] For example, a corresponding index value KS in the index map Im4 is obtained according to a coordinate P(x,y) of a touch input by a tactile strength processing unit 217, and the index value KS is converted to a strength variable (SV) of a tactile activation unit 240 by a value converting unit 218.) Re-claim 7, Shih teaches wherein the corrected image is the visual image. (par. [0041] tactile feedback associated with image content is provided to enhance manipulation conveniences of the electronic device. By the tactile feedback, reliance on visual feedback for a user is reduced to enhance user experiences.) Re-claim 8, Shih teaches wherein the quantity of the touch reference points is plural, and touch reference points comprise at least one of a lower-left coordinate point, an upper-left coordinate point, a lower-right coordinate point or an upper-right coordinate point of the corrected image. (fig. 7 and pars. [0036] and [0040]) Re-claim 9, Shih teaches a virtual haptic texture rendering device, (fig. 2 and par. [0018] The tactile activation unit 240 generates tactile feedback, e.g., haptics. The processing module 210 is coupled to the display panel 220, the touch panel 230 and the tactile activation unit 240. The processing 210 detects a touch input on the touch panel 230, so as to correspondingly control display information on the display panel 220 to generate tactile feedback) comprising: a first obtaining module (fig. 2; 211) configured to obtain a to-be-displayed visual image; (fig. 1 and par. [0017] Step S110 includes inputting an original image.) an image conversion module (fig. 3; 212 & 213) configured to convert the visual image into a grayscale image; (par. 
[0017] Step S120 includes converting the original image to a binary image, which includes a first grayscale value and a second grayscale value.) a multi-valued processing module (fig. 3; 210) configured to perform multi-valued processing on the grayscale image to obtain a haptic response image, all pixels of the haptic response image being grouped into at least two grayscale regions, different haptic parameters corresponding to different grayscale regions; (par. [0017] Step S130 includes converting the original image to a grayscale image, which at least includes a third grayscale value between the first grayscale value and the second grayscale value and par. [0025] The approaches for automatically identifying the threshold may be carried out according to luminance characteristics of objects, sizes of objects, areas or area ratios occupied by objects, or the number of types of objects. par. [0027] In some embodiments, for example, the grayscale image Im3 is a luminance information of the original image Im1. For example, for one of pixels of the original image Im1, a luminance value of the pixel may be but not limited obtained by converting R/G/B image information of the pixel. For example, the luminance value is a mean value or a weighted sum of the R/G/B image information. In other embodiments, for example, the grayscale image Im3 is a Y component of the original image Im1, or other image information indicative of detail characteristics of the original image Im1. par. [0028] The grayscale image Im3 may be utilized to generate diversified tactile feedback in different strengths. As the image is processed by binary processing, intermediate transitional grayscale information no longer exists. Therefore, a transitional area value may be provided from the grayscale image to serve as reference for tactile feedback strength.) and a coordinate system association module (fig. 
3; 218) configured to associate a first coordinate system of the visual image with a second coordinate system of the haptic response image to obtain an association relationship between the first coordinate system and the second coordinate system, the association relationship being configured to determine a haptic parameter corresponding to a pixel at a touch point of a user in the case that the visual image is displayed and touched by the user, and generate a haptic vibration signal in accordance with the haptic parameter. (par. [0017] Step S140 includes generating an index map according to the binary image and the grayscale image. The index map includes index values indicative of different tactile feedback strengths. Step S150 includes detecting a touch input on a display panel when the original image is displayed on the display panel. Step S160 includes driving a tactile activation unit according to the index map to generate tactile feedback in response to the detected touch input. par. [0025] The approaches for automatically identifying the threshold may be carried out according to luminance characteristics of objects, sizes of objects, areas or area ratios occupied by objects, or the number of types of objects. Par. [0036] For example, a corresponding index value KS in the index map Im4 is obtained according to a coordinate P(x,y) of a touch input by a tactile strength processing unit 217, and the index value KS is converted to a strength variable (SV) of a tactile activation unit 240 by a value converting unit 218. And par. [0040] FIG. 7 shows a schematic diagram of an example of the original image. As shown in FIG. 7, the main object of the original image Im1 is a cactus in this example. After combining the binary image and the grayscale image, distributions of the index map Im4 are associated with the grayscale values of the original image Im1 while the main object corresponds to stronger tactile feedback. 
For coordinates A1, A2 and A3, suppose the coordinate A1 is located at the main object and has a grayscale value corresponding to strongest tactile feedback, the coordinate A2 is located at the main object and has a grayscale value corresponding to stronger tactile feedback, and the coordinate A3 is located at the background and has a grayscale value corresponding to weakest tactile feedback. Thus, different strengths for tactile feedback associated with image content may be provided.) Re-claim 10, the rejection of claim 1 is incorporated into the rejection of claim 10, and only the further limitations will be addressed below. Shih teaches a display device, comprising a processor, a memory, and a program stored in the memory and executed by the processor, wherein the processor is configured to read the program (fig. 3 and par. [0019]) so as to: Re-claim 11, Shih teaches a computer-readable storage medium storing therein a computer program, wherein the computer program is executed by a processor so as to implement the steps of the virtual haptic texture rendering method according to claim 1. (see claim 1 rejection above) Re-claims 12-13 and 16-20 are rejected as applied to claims 2 and 5-8 above because the scope and contents of the recited limitations are substantially the same.

Claim Rejections - 35 USC § 103

7. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. 8. 
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

9. Claims 3-4 and 14-15 are rejected under 35 U.S.C. 103 as being unpatentable over Shih “US 2012/0327100”. Re-claim 3, Shih teaches wherein the performing the multi-valued processing on the grayscale image to obtain the haptic response image (pars. [0026] - [0027]) comprises: performing the multi-valued processing on the grayscale image to obtain a multi-valued image; (par. [0026]) segmenting the multi-valued image into a plurality of pixel regions; (par. [0027]) and filtering out a target pixel region to obtain the haptic response image, (pars. [0029] and [0037]) Shih does not explicitly teach wherein a side length of the target pixel region is smaller than or equal to a first threshold value, and/or an area of the target pixel region is smaller than or equal to a second threshold value. It would have been an obvious matter of design choice to provide “a side length of the target pixel region is smaller than or equal to a first threshold value, and/or an area of the target pixel region is smaller than or equal to a second threshold value”, since such a modification would have involved a mere change in the size of a component. A change in size is generally recognized as being within the level of ordinary skill in the art. In re Rose, 105 USPQ 237 (CCPA 1955). Re-claim 4, Shih does not explicitly teach wherein the first threshold is 1 mm, and/or the second threshold is 1 mm². 
It would have been an obvious matter of design choice to set the first threshold to 1 mm and/or the second threshold to 1 mm², as these values could be different as necessitated by the specific requirements of the particular application; additionally, it has been held that where the general conditions of a claim are disclosed in the prior art, discovering the optimum or workable ranges of an invention involves only routine skill in the art. Gardner v. TEC Syst., Inc., 725 F.2d 1338, 220 USPQ 777. Re-claims 14-15 are rejected as applied to claims 3-4 above because the scope and contents of the recited limitations are substantially the same.

Contact Information

10. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Sosina Abebe, whose telephone number is (571) 270-7929. The examiner can normally be reached Monday-Friday from 9:00-5:30. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Temesghen Ghebretinsae, can be reached at (571) 272-3017. The fax phone number for the organization where this application or proceeding is assigned is 703-872-9306. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

/S.A/
Examiner, Art Unit 2626

/TEMESGHEN GHEBRETINSAE/
Supervisory Patent Examiner, Art Unit 2626

3/10/26
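The pipeline recited in claim 1 and mapped onto Shih above (grayscale conversion, multi-valued thresholding into grayscale regions with a haptic parameter per region, and a coordinate association used to look up the parameter at a touch point) can be sketched as follows. This is an illustrative reconstruction, not code from the application or from Shih; the thresholds, the haptic parameter values, and the simple offset mapping between the two coordinate systems are all assumptions.

```python
# Illustrative sketch of the claimed pipeline: visual image -> grayscale
# -> multi-valued haptic response image -> haptic parameter at a touch point.
# All thresholds and parameter values below are invented for illustration.

def to_grayscale(rgb_image):
    """Convert an RGB image (rows of (r, g, b) tuples) to grayscale using
    a weighted sum of the channels (cf. Shih par. [0027])."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_image]

def multi_value(gray_image, thresholds=(85, 170)):
    """Group every pixel into one of len(thresholds)+1 grayscale regions.
    Two thresholds yield three regions (0, 1, 2); a single threshold
    degenerates to the binarization of dependent claim 5."""
    def region(v):
        return sum(v > t for t in thresholds)
    return [[region(v) for v in row] for row in gray_image]

# Different haptic parameters (here: vibration amplitudes) per region.
HAPTIC_PARAMS = {0: 0.0, 1: 0.5, 2: 1.0}

def haptic_param_at(haptic_image, touch_xy, offset=(0, 0)):
    """Map touch coordinates from the visual image's coordinate system into
    the haptic response image via the association relationship (here a
    simple offset, standing in for the calibration of claim 6) and return
    the haptic parameter for that pixel's region."""
    x, y = touch_xy
    dx, dy = offset
    return HAPTIC_PARAMS[haptic_image[y + dy][x + dx]]

rgb = [[(10, 10, 10), (200, 200, 200)],
       [(120, 120, 120), (255, 255, 255)]]
haptic = multi_value(to_grayscale(rgb))
print(haptic_param_at(haptic, (1, 1)))  # 1.0: brightest pixel, strongest feedback
```

In a real device the returned parameter would drive a vibration signal generator rather than be printed, and the offset would come from the touch-reference-point calibration described in claims 6-8.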

Prosecution Timeline

May 16, 2024
Application Filed
Mar 07, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12578821: TOUCH SENSING DEVICE HAVING MALFUNCTION PREVENTION FUNCTION
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12578815: TOUCH-CONTROL DISPLAY PANEL AND DISPLAY APPARATUS
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12572209: TACTILE-FEEDBACK MODULE AND DRIVING METHOD THEREOF, AND TACTILE-FEEDBACK DEVICE
Granted Mar 10, 2026 (2y 5m to grant)
Patent 12566515: ARCHITECTURE FOR DIFFERENTIAL DRIVE AND SENSE TOUCH TECHNOLOGY
Granted Mar 03, 2026 (2y 5m to grant)
Patent 12554356: TOUCH DEVICE, TOUCH SYSTEM INCLUDING THE SAME, AND DRIVING METHOD OF THE TOUCH DEVICE
Granted Feb 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 73%
With Interview: 91% (+18.5%)
Median Time to Grant: 3y 0m
PTA Risk: Low
Based on 457 resolved cases by this examiner. Grant probability derived from career allow rate.
