Prosecution Insights
Last updated: April 19, 2026
Application No. 18/384,153

METHOD AND SYSTEM FOR TASK RECORDING USING ROBOTIC PROCESS AUTOMATION TECHNOLOGY

Non-Final OA (§101, §103, §DP)
Filed
Oct 26, 2023
Examiner
DUNN, DARRIN D
Art Unit
2117
Tech Center
2100 — Computer Architecture & Software
Assignee
Samsung Electronics
OA Round
1 (Non-Final)
75%
Grant Probability
Favorable
1-2
OA Rounds
3y 3m
To Grant
99%
With Interview

Examiner Intelligence

Grants 75% — above average
75%
Career Allow Rate
678 granted / 899 resolved
+20.4% vs TC avg
Strong +24% interview lift
Without
With
+24.0%
Interview Lift
resolved cases with interview
Typical timeline
3y 3m
Avg Prosecution
34 currently pending
Career history
933
Total Applications
across all art units
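As a sanity check, the headline percentages in this panel follow directly from the counts shown above; the sketch below assumes (as the dashboard implies) that the career allow rate is simply granted over resolved, rounded, and that the stated +24-point interview lift is added to that rate.

```python
# Figures shown in the Examiner Intelligence panel above; "interview_lift"
# is the dashboard's stated +24.0-point lift (an assumption that it is
# applied additively, not a recomputation from PTO data).
granted, resolved, interview_lift = 678, 899, 24

career_rate = round(granted / resolved * 100)   # career allow rate, in %
with_interview = career_rate + interview_lift   # "with interview" estimate

print(career_rate)     # 75
print(with_interview)  # 99
```

The same two numbers reappear in the Prosecution Projections section below, so they are derived once from the 678/899 career record rather than from application-specific data.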

Statute-Specific Performance

§101
15.6%
-24.4% vs TC avg
§103
52.8%
+12.8% vs TC avg
§102
13.8%
-26.2% vs TC avg
§112
11.4%
-28.6% vs TC avg
Black line = Tech Center average estimate • Based on career data from 899 resolved cases

Office Action

§101 §103 §DP
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Election/Restrictions

I. Claims 1-13, drawn to correcting events in an event sequence, class G05B 13/027.
II. Claims 14-18, drawn to removing a noise event in event sequences, G06F 11/00.
III. Claims 19-20, drawn to partial event replacement in a series of events based on pattern analysis, G06F 11/28.

The inventions are distinct, each from the other, for the following reasons: Inventions I-III are related as subcombinations usable together. Inventions in this relationship are distinct if it can be shown that (1) the combination as claimed does not require the particulars of the subcombination as claimed for patentability, and (2) the subcombination has utility by itself or in other combinations (MPEP § 806.05(c)). In the instant case, Group I does not require the particulars of the subcombination as claimed in Groups II-III because the event correction of Group I does not require the particulars of the noise filtering and pattern analysis. Group I has separate utility for identifying abnormal robotic task implementation due to positional errors in target object positioning and/or identifying long-running tasks indicative of robotic automation error; see PG/PUB 20210023709.

6. The Examiner has required restriction between combination and subcombination inventions. Where applicant elects a subcombination, and claims thereto are subsequently found allowable, any claim(s) depending from or otherwise requiring all the limitations of the allowable subcombination will be examined for patentability in accordance with 37 CFR 1.104. See MPEP § 821.04(a).
Applicant is advised that if any claim presented in a continuation or divisional application is anticipated by, or includes all the limitations of, a claim that is allowable in the present application, such claim may be subject to provisional statutory and/or nonstatutory double patenting rejections over the claims of the instant application.

3. Restriction for examination purposes as indicated is proper because all these inventions listed in this action are independent or distinct for the reasons given above, and there would be a serious search and/or examination burden if restriction were not required because at least the following reasons apply: (a) the inventions have acquired a separate status in the art in view of their different classification; (b) the inventions have acquired a separate status in the art due to their recognized divergent subject matter; (c) the inventions require a different field of search (for example, searching different classes/subclasses or electronic resources, or employing different search queries); (d) the prior art applicable to one invention would not likely be applicable to another invention; (e) the inventions are likely to raise different non-prior art issues under 35 U.S.C. 101 and/or 35 U.S.C. 112, first paragraph.

7. Applicant is advised that the reply to this requirement, to be complete, must include (i) an election of an invention to be examined even though the requirement may be traversed (37 CFR 1.143) and (ii) identification of the claims encompassing the elected invention.

8. The election of an invention may be made with or without traverse. To reserve a right to petition, the election must be made with traverse. If the reply does not distinctly and specifically point out supposed errors in the restriction requirement, the election shall be treated as an election without traverse. Traversal must be presented at the time of election in order to be considered timely.
Failure to timely traverse the requirement will result in the loss of right to petition under 37 CFR 1.144. If claims are added after the election, applicant must indicate which of these claims are readable upon the elected invention. Should applicant traverse on the ground that the inventions are not patentably distinct, applicant should submit evidence or identify such evidence now of record showing the inventions to be obvious variants, or clearly admit on the record that this is the case. In either instance, if the examiner finds one of the inventions unpatentable over the prior art, the evidence or admission may be used in a rejection under 35 U.S.C. 103(a) of the other invention. Ms. Park elected Group I responsive to a telephonic restriction request on 1/16/26.

Allowable Subject Matter

Claims 10 and 13 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. Claims 11-12 depend upon claim 10.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-9 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claims recite a mental process directed to correcting, reproducing, and repeating the obtaining, correcting, and reproducing until the event sequence is exhausted. But for the inclusion of general computing components, the abstract idea is not meaningfully limited because opinion, judgment, and observation are involved in automating manual activities, MPEP 2106.04(a)(2).
In particular, the correcting, reproducing, and repeating limitations involve mental identification of event sequences needing correction; reproducing involves applying a corrected event; and repeating involves a replication of the corrected events (claim 1). The determining whether events exceed a reference value (claim 2); determining whether an event holding time exceeds a second reference value (claim 3); calculating a representative value (claim 4); correcting the target object type (claim 5); correcting the target object type of the obtained event (claim 6); and determining the type of display object and correcting the target object (claim 8) represent mental processes.

This judicial exception is not integrated into a practical application because the recording, recording result correction, and sequential obtaining represent insignificant extra-solution activity (claim 1), MPEP 2106.05(g). The task and RPA are generally recited so as to generically link the abstract idea to the field of automation (claim 2), MPEP 2106.05(h); performing recording result correction steps in response to the determination (claims 2-4, 6, and 8); calling object type returns and waiting until an object return method returns (claims 5-6); executing a first process or a third process (claim 7); and automatically performing a recording result correction and storing a final result (claim 9) represent insignificant extra-solution activity because the performing, calling, executing a process, and recording results involve obtaining at least a target object position via applying methods for obtaining and returning results, MPEP 2106.05(g).
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the insignificant extra-solution activity described above is well understood, routine, and conventional; see MPEP 2106.05(d); see repeating testing all cases, USPN 7096459, USPN 10509693, 20240031313-0170.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1 and 4-9 are rejected under 35 U.S.C. 103 as being unpatentable over Ma et al.
(PG/PUB 20200206920) in view of P.K. et al. (USPN 11710064). Claim 1. Ma et al. teaches a method for recording a task based on robotic process automation (RPA), performed by a computing system in which an RPA solution is installed, but does not expressly teach the correcting-the-target-object limitations described below in combination with the claim as a whole. P.K. et al. teaches the correcting-the-target-object limitations described below, the method comprising: recording a task using the RPA solution (Figure 3-302-314, 0230, e.g. see event recording) and performing, in response to completion of the recording, recording result correction (Figure 3-304, 306, 308, e.g. see the pre-processing teachings as reading on "result correction" preceded by recording the results via identifying processes for robotic process automation; see also "cleaning and normalization," 0106-0131), wherein the performing the recording result correction comprises: sequentially obtaining each event included in an event sequence generated as a result of the recording (Figure 3-302, e.g. see recording a plurality of event streams), wherein the event sequence comprises information about each event included in the event sequence (0007-0008, 0152, 0236-0237, e.g. see task types), and the information about each event comprises information about a type of user manipulation (0111, 0113, 0122, 0148, 0243, e.g. see "clicking the same element"), information about a target object position (0111, 0113, 0122, 0161, e.g. see determining position of windows, for example), and information about a target object type (0111, 0113, 0122, 0161, e.g. see "windows" as one example of multiple target object types); correcting the target object type of the obtained event using the target object position of the obtained event (Ma, 0232-0234, 0329, e.g. see cleaning at least redundant events; see P.K. as correcting user interface issues involving the selection of target object types (e.g.
display elements, windows, text boxes, etc.) responsive to identifying time of interaction exceeds a threshold duration indicating user difficulties encountered during sequential event/task execution, the correction involving suggested size, location, or identifiers of user interface elements, resizing, relocating, relabeling, and other changes made to the target object type based on its position during user selection in light of errors/time durations exceeded, see ABSTRACT, Col 1 lines 38-57, Col 4 lines 23-34, Col 6 lines 29-54, Col 8 lines 38-52, Col 10 lines 41-54, Col 13 lines 30-40, Col 17 lines 56-67 thru. Col 18 lines 1-67, figure 12 , **Col 8 lines 63-67, **Col 20 lines 36-51 (see general corrective actions) reproducing the obtained event using the target object position of the obtained event and a corrected target object type of the obtained event (P.K., see reproducing as implementing the changes to the target object type to simply user interaction with the target object type during interaction with the user interface in executing sequential tasks, see ABSTRACT, Col 1 lines 38-57, Col 4 lines 23-34, Col 6 lines 29-54, Col 8 lines 38-52, Col 10 lines 41-54, Col 13 lines 30-40, Col 17 lines 56-67 thru. Col 18 lines 1-67, figure 12 , **Col 8 lines 63-67, **Col 20 lines 36-51 (see general corrective actions) repeating the obtaining, the correcting, and the reproducing until an event included in the event sequence is exhausted (Ma et al., see identifying all recorded event streams sequentially, correcting all necessary event streams. 
The limitation “repeating” is interpreted as sequentially analyzing each event for necessary cleaning, normalization, and/or modification prior to selecting a process for robotic automation until all tasks have been analyzed) One of ordinary skill in the art before the effective filing date of the claimed invention applying the teachings of P.K., namely performing corrective actions via applying new resolutions to target objects based on duration of use associated with positions, to the teachings of Ma et al., namely cleaning and normalization recorded events for automation based on repeatedly identifying tasks for correction, would achieve an expected and predictable result of correcting interface display elements and locations to facilitate tasks optimization for automating user activities using robotic process automation. P.K is reasonably pertinent to a problem of improving user interactions and would commend itself to optimizing sequential processes executed by a user, as described, ABSTRACT, Summary of Invention. Claim 4. The method of claim 1, wherein the information about each event further comprises information about event holding time, which is interval between an occurrence time of the event and an occurrence time of an event immediately after the event (P.K.. e.g. see time intervals between interactions) wherein the performing the recording result correction comprises: calculating a representative value of the event holding time of all events included in the event sequence (P.K. see determining expected durations between time intervals e.g. “User interaction durations or delays determined from the times 528 can be used for a variety of purposes. For example, the 7 seconds taken to move from event 504 b to 504 a can be compared with a threshold or range. If a normal time to move to a text box is less than 10 seconds, then the event sequence of table 500 can be determined to be normal or acceptable. 
On the other hand, if the normal time taken to move to a text box is less than 5 seconds, then the 7 second time can indicate the presence of a user interface design issue. That is, it may be possible to reduce the interaction time by providing a more understandable label for the text box, making the text box larger, moving the text box to another location, highlighting the text box, etc.”, see ABSTRACT, Col 1 lines 38-57, Col 4 lines 23-34, Col 6 lines 29-54, Col 8 lines 38-52, Col 10 lines 41-54, Col 13 lines 30-40, Col 17 lines 56-67 thru. Col 18 lines 1-67, figure 12 , **Col 8 lines 63-67, **Col 20 lines 36-51 (see general corrective actions) performing, when it is determined that the representative value of the event holding time is less than a third reference value, the recording result correction (supra claim 1 for correction when duration is expected or less than a third reference value e.g. “User interaction durations or delays determined from the times 528 can be used for a variety of purposes. For example, the 7 seconds taken to move from event 504 b to 504 a can be compared with a threshold or range. If a normal time to move to a text box is less than 10 seconds, then the event sequence of table 500 can be determined to be normal or acceptable. On the other hand, if the normal time taken to move to a text box is less than 5 seconds, then the 7 second time can indicate the presence of a user interface design issue. That is, it may be possible to reduce the interaction time by providing a more understandable label for the text box, making the text box larger, moving the text box to another location, highlighting the text box, etc.”, see ABSTRACT, Col 1 lines 38-57, Col 4 lines 23-34, Col 6 lines 29-54, Col 8 lines 38-52, Col 10 lines 41-54, Col 13 lines 30-40, Col 17 lines 56-67 thru. Col 18 lines 1-67, figure 12 , **Col 8 lines 63-67, **Col 20 lines 36-51 (see general corrective actions) Claim 5. 
The method of claim 1, wherein the correcting the target object type comprises: calling an object type return method that receives a target object position of the obtained event (P.K. see obtaining user interactions per display element , Figure 1-140) waiting until the object type return method returns (P.K, Figure 1-140 e.g. see obtaining results , see ABSTRACT, Col 1 lines 38-57, Col 4 lines 23-34, Col 6 lines 29-54, Col 8 lines 38-52, Col 10 lines 41-54, Col 13 lines 30-40, Col 17 lines 56-67 thru. Col 18 lines 1-67, figure 12 , **Col 8 lines 63-67, **Col 20 lines 36-51 (see general corrective actions) correcting the target object type of the obtained event to an object type returned from the object type return method (P.K., Figure 1-132, see ABSTRACT, Col 1 lines 38-57, Col 4 lines 23-34, Col 6 lines 29-54, Col 8 lines 38-52, Col 10 lines 41-54, Col 13 lines 30-40, Col 17 lines 56-67 thru. Col 18 lines 1-67, figure 12 , **Col 8 lines 63-67, **Col 20 lines 36-51 (see general corrective actions) Claim 6. The method of claim 1, wherein a target object type of each event included in the event sequence is obtained as a result of calling a first method (supra claim 1, P.K, Figure 1-140 e.g. see method of identifying a target object and associated interactions) wherein the correcting the target object type comprises: calling a second method that receives a target object position of the obtained event (Figure 1-132, Figure 16 (A-C) e.g. see correcting the target object using a second method based on results of first method, see ABSTRACT, Col 1 lines 38-57, Col 4 lines 23-34, Col 6 lines 29-54, Col 8 lines 38-52, Col 10 lines 41-54, Col 13 lines 30-40, Col 17 lines 56-67 thru. 
Col 18 lines 1-67, figure 12 , **Col 8 lines 63-67, **Col 20 lines 36-51 (see general corrective actions) correcting the target object type of the obtained event to an object type returned from the second method (P.K., Figure 1-132, Figure 16(A-C), see ABSTRACT, Col 1 lines 38-57, Col 4 lines 23-34, Col 6 lines 29-54, Col 8 lines 38-52, Col 10 lines 41-54, Col 13 lines 30-40, Col 17 lines 56-67 thru. Col 18 lines 1-67, figure 12 , **Col 8 lines 63-67, **Col 20 lines 36-51 (see general corrective actions) wherein the first method and the second method are different from each other (P.K., supra claim 1, Figure 16 (A-C), see ABSTRACT, Col 1 lines 38-57, Col 4 lines 23-34, Col 6 lines 29-54, Col 8 lines 38-52, Col 10 lines 41-54, Col 13 lines 30-40, Col 17 lines 56-67 thru. Col 18 lines 1-67, figure 12 , **Col 8 lines 63-67, **Col 20 lines 36-51 (see general corrective actions) Claim 7. The method of claim 6, wherein the first method is executed by a second process that is different from a first process of the RPA solution by an inter-process communication method (P.K, Figure 1-120 vs. 132, Figures 16(A-C) wherein the second method is executed by the first process or a third process (P.K., Figure 16(A-C), ABSTRACT, see ABSTRACT, Col 1 lines 38-57, Col 4 lines 23-34, Col 6 lines 29-54, Col 8 lines 38-52, Col 10 lines 41-54, Col 13 lines 30-40, Col 17 lines 56-67 thru. Col 18 lines 1-67, figure 12 , **Col 8 lines 63-67, **Col 20 lines 36-51 (see general corrective actions) Claim 8. 
The method of claim 1, wherein the correcting the target object type comprises: determining a type of an object displayed at the target object position of the obtained event on a display screen output by an operating system installed in the computing system using an object type classification model (P.K., Figure 16(A-C), see application of learning model) correcting the target object type of the obtained event to the determined type (P.K., Figure 1, supra claim 1, see ABSTRACT, Col 1 lines 38-57, Col 4 lines 23-34, Col 6 lines 29-54, Col 8 lines 38-52, Col 10 lines 41-54, Col 13 lines 30-40, Col 17 lines 56-67 thru. Col 18 lines 1-67, Figure 12, **Col 8 lines 63-67, **Col 20 lines 36-51 (see general corrective actions) Claim 9. The method of claim 1, wherein the performing the recording result correction comprises: automatically performing, in response to completion of the recording, the recording result correction without a separate user command (supra claim 1, P.K., for automated correction, Figure 1, see ABSTRACT, Col 1 lines 38-57, Col 4 lines 23-34, Col 6 lines 29-54, Col 8 lines 38-52, Col 10 lines 41-54, Col 13 lines 30-40, Col 17 lines 56-67 thru. Col 18 lines 1-67, Figure 12, **Col 8 lines 63-67, **Col 20 lines 36-51 (see general corrective actions) the method further comprises: storing final result data of the task recording generated using a corrected event sequence obtained according to the recording result correction (supra claim 1 for storing valid recordings upon cleaning, normalization, and correction) Claims 2-3 are rejected under 35 U.S.C. 103 as being unpatentable over Ma et al. (PG/PUB 20200206920) in view of P.K. et al. (USPN 11710064), and further in view of Nychis et al. (USPN 12020046). Claim 2. Ma et al., as modified, teaches the method of claim 1 but does not expressly teach the event holding time limitations described below. Nychis et al.
teaches the event holding times described below wherein the information about each event further comprises information about event holding time, which is interval between an occurrence time of the event and an occurrence time of an event immediately after the event (Nychis et al., see comparison of duration exceeding an expected duration or first reference value, supra claim 1 for recording valid, corrective tasks for automation, Col 23 lines 30-45, Col 23 lines 50-67 thru. Col 24 lines 1-5, Col 24 lines 41-56) wherein the performing the recording result correction comprises: determining whether the event holding time of all events included in the event sequence exceeds a first reference value (Nychis et al., see comparison of duration exceeding an expected duration or first reference value, supra claim 1 for recording valid, corrective tasks for automation, , Col 23 lines 30-45, Col 23 lines 50-67 thru. Col 24 lines 1-5, Col 24 lines 41-56) performing, when it is determined as a result of the determination that the event holding time of at least some events is less than the first reference value, the recording result correction (Nychis et al., see determining whether process/event is within expected duration/less than a first reference value for inclusion while excluding instances exceeding the duration, supra claim 1 for recording valid processes or events in response to cleaning, normalization, and correction) One of ordinary skill in the art before the effective filing date of the claimed invention applying the teachings of Nychis et al., namely determining whether total event durations exceed or remain under a threshold duration for identifying valid or invalid tasks, to the teachings of Ma, as modified, namely recording valid tasks upon correction, would achieve an expected and predictable result of identifying tasks to be excluded from the recordings for robotic process automation. 
Nychis is in the same field of endeavor and reasonably pertinent to a problem of robotic task automation as described, Summary of Invention. Claim 3. Ma et al., as modified, teaches the method of claim 1 but does not expressly teach the holding time limitations described below. Nychis et al. teaches the holding time limitations described below: wherein the information about each event further comprises information about event holding time, which is interval between an occurrence time of the event and an occurrence time of an event immediately after the event (Nychis, e.g. see UI design timing between events); wherein the performing the recording result correction comprises: determining whether the event holding time of each event included in the event sequence exceeds a second reference value, treating excessive holding times as invalid (Nychis, e.g. see process duration exceeding a threshold time, Col 23 lines 30-45, Col 23 lines 50-67 thru. Col 24 lines 1-5, Col 24 lines 41-56); performing the recording result correction only for an event in which the event holding time is determined to be less than the second reference value (Nychis, see determining a valid process duration less than the threshold duration, Col 23 lines 30-45, Col 23 lines 50-67 thru. Col 24 lines 1-5, Col 24 lines 41-56). One of ordinary skill in the art before the effective filing date of the claimed invention, applying the teachings of Nychis et al., namely determining whether total event durations exceed or remain under a threshold duration for identifying valid or invalid tasks, to the teachings of Ma, as modified, namely recording valid tasks upon correction, would achieve an expected and predictable result of identifying tasks to be excluded from the recordings for robotic process automation. Nychis is in the same field of endeavor and reasonably pertinent to a problem of robotic task automation as described, Summary of Invention.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
12118490 – time duration wasteful
12020046 – time duration not instance of process
10986252 – touch accommodation time
20200206960 – 0102, event recording filter time periods
20200150851 – selecting UI based on response times
20200019418 – interaction time abnormal
20190114572 – task timing
20130346950 – usability action recording
20160110277
20200206920

UI elements: 12020046 (excluding time), 20230214239, 10986252, 20190373004, 20140053106, 20110246905, 20130346950

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DARRIN D DUNN, whose telephone number is (571) 270-1645. The examiner can normally be reached M-Sat (10-8) PST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Robert Fennema, can be reached at 571-272-2748. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DARRIN D DUNN/
Patent Examiner, Art Unit 2117
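For readers mapping the §101 and §103 discussion back to the elected Group I claims, the claimed method is essentially a correction loop: sequentially obtain each recorded event, correct its target object type using its target object position, reproduce the event, and repeat until the sequence is exhausted, with holding-time thresholds (claims 2-4) gating when correction runs. The sketch below is only an illustrative paraphrase of that claim language; every name, the threshold defaults, and the `classify_object_at` helper are hypothetical, not taken from the application or the cited references.

```python
from dataclasses import dataclass
from statistics import median


@dataclass
class Event:
    manipulation: str           # type of user manipulation, e.g. "click"
    position: tuple[int, int]   # target object position (x, y)
    object_type: str            # recorded target object type
    holding_time: float         # seconds until the next event occurs


def classify_object_at(position):
    # Hypothetical stand-in for the claimed "object type return method":
    # a real RPA tool would query the OS/UI tree (or, per claim 8, an
    # object type classification model) for the object at `position`.
    x, y = position
    return "button" if y < 100 else "text_box"


def correct_recording(events, first_ref=30.0, third_ref=10.0):
    """Illustrative paraphrase of the claimed recording-result correction."""
    # Claim 4: compute a representative value (here: median) of all
    # holding times; correct only when it is below the third reference.
    if events and median(e.holding_time for e in events) >= third_ref:
        return events
    corrected = []
    for e in events:                    # claim 1: sequentially obtain
        if e.holding_time < first_ref:  # claims 2-3: holding-time gate
            # claim 1: correct the target object type using its position
            e.object_type = classify_object_at(e.position)
        corrected.append(e)             # claim 1: reproduce the event
    return corrected                    # loop repeats until exhausted
```

A short usage example: two recorded "click" events whose object types were logged generically as "window" get reclassified from their positions, which is the kind of position-driven type correction the examiner maps onto P.K.'s UI corrective actions.

```python
events = [Event("click", (10, 50), "window", 2.0),
          Event("click", (10, 200), "window", 3.0)]
fixed = correct_recording(events)
```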

Prosecution Timeline

Oct 26, 2023
Application Filed
Feb 07, 2026
Non-Final Rejection — §101, §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603497
CONTROL APPARATUS, CONTROL METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM FOR OCCUPANT-BASED ENERGY PREDICTION
2y 5m to grant Granted Apr 14, 2026
Patent 12595924
SYSTEM AND METHOD FOR PROVIDING COOLING DURING REFRIGERANT LEAK
2y 5m to grant Granted Apr 07, 2026
Patent 12591226
CLOUD-BASED VIBRATORY FEEDER CONTROLLER
2y 5m to grant Granted Mar 31, 2026
Patent 12590726
AIR CONDITIONING LOAD LEARNING APPARATUS AND AIR CONDITIONING LOAD PREDICTION APPARATUS
2y 5m to grant Granted Mar 31, 2026
Patent 12585241
METHODS AND APPARATUS FOR SENSOR-ASSISTED PART DEVELOPMENT IN ADDITIVE MANUFACTURING
2y 5m to grant Granted Mar 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.
Powered by AI — typically takes 5-10 seconds

Prosecution Projections

1-2
Expected OA Rounds
75%
Grant Probability
99%
With Interview (+24.0%)
3y 3m
Median Time to Grant
Low
PTA Risk
Based on 899 resolved cases by this examiner. Grant probability derived from career allow rate.
