Prosecution Insights
Last updated: April 19, 2026
Application No. 17/588,896

MULTIPLE SELECTION ON DEVICES WITH MANY GESTURES

Final Rejection §103
Filed: Jan 31, 2022
Examiner: SPRATT, BEAU D
Art Unit: 2143
Tech Center: 2100 — Computer Architecture & Software
Assignee: Fujifilm Business Innovation Corp.
OA Round: 6 (Final)

Grant Probability: 79% (Favorable)
Expected OA Rounds: 7-8
Time to Grant: 3y 1m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 79% (342 granted / 432 resolved; +24.2% vs TC avg; above average)
Interview Lift: +26.6% (strong; measured on resolved cases with interview)
Typical Timeline: 3y 1m avg prosecution; 37 currently pending
Career History: 469 total applications across all art units

Statute-Specific Performance

§101: 12.2% (-27.8% vs TC avg)
§103: 63.7% (+23.7% vs TC avg)
§102: 11.9% (-28.1% vs TC avg)
§112: 5.4% (-34.6% vs TC avg)
Deltas are vs estimated Tech Center averages. Based on career data from 432 resolved cases.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application is being examined under the pre-AIA first to invent provisions.

Response to Amendment

The Amendment filed 09/08/2025 has been entered. Claims 178-204 remain pending in the application.

Allowable Subject Matter

Claims 179, 186, 188, 195, 197 and 204 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 178 is rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Sunday et al. (US 20070220444 A1, hereinafter Sunday) in view of Tseng et al.
(US 20090249247 A1, hereinafter Tseng; provisional 61/024,869 filed 01/30/2008).

As to independent claim 178, Sunday teaches an information processing apparatus comprising: [computer ¶27]

a display; and [monitor with touch ¶29]

a processor [¶26] configured to process a first action in response to the display sensing a first activation point gesture beginning at a first activation point, which is not explicitly displayed, and which is provided on an edge portion of the display, and ending at a first inner point of the display, [Fig. 6, 602-603 illustrate a gesture starting on an edge portion (not previously shown in Fig. 5) and ending at an inner point 603; ¶40: "touching a location on the display with the user 602's finger and dragging it in a direction, such as towards the second user, may identify the orientation to use for the second interface 601"]

wherein the processor is configured to process a reserved action in response to the display sensing a reserved gesture beginning at a point on a portion other than the edge portion and ending at another point of the display, and [Figs. 7-8 illustrate a drag on interface portion 601 for moving it, ending elsewhere; ¶43: "The interface 601 may also be moved and/or dragged to a different location as part of, or in addition to, this change in orientation."]

wherein the processor is configured to process a second action, which is different from the first action, in response to the display sensing a second activation point gesture beginning at the first activation point and ending at a second inner point. [Fig. 6, 602 supports a gesture starting on the same edge portion and ending at a different inner point, causing a different orientation; ¶40: "touching a location on the display with the user 602's finger and dragging it in a direction, such as towards the second user, may identify the orientation to use for the second interface 601"]

Sunday does not specifically teach distinguishing the second inner point from the first inner point, wherein the first action does not depend on content displayed on the display when the first activation point gesture begins, and wherein the processor is configured to process the second action, which is different from the first action, even in a case where the second activation point gesture is performed in a same direction as the first activation point gesture.

However, Tseng teaches distinguishing the second inner point from the first inner point [different gestures that stop at different inner points cause the status bar to change according to those inner points; ¶58-59];

wherein the first action does not depend on content displayed on the display when the first activation point gesture begins [Fig. 2B, 222 illustrates a status bar that is displayed and accepts gestures regardless of content (it can expand and collapse); ¶58-59: "Upon the user selecting the status bar, the bar may expand slightly, and a "slide down" indicator may be displayed"]; and

wherein the processor is configured to process the second action, which is different from the first action, even in a case where the second activation point gesture is performed in a same direction as the first activation point gesture. [Fig. 2C illustrates a gesture starting at the same top edge and performing different actions according to the distance, not the direction, of the downward gesture (like a roll shade); ¶58-59: "Upon the user selecting the status bar, the bar may expand slightly, and a "slide down" indicator may be displayed, in the form of a notification bar with up and down arrows. The indicator may visually signal the user that they can pull the indicator down to show messages associated with the notification icons. Alternatively, a tab like that shown in FIG. 2A may be displayed below the status area. [0059] The third display shows the user dragging the notification bar downward across the main part of the display, like a roll shade. At the point shown, more messages remain above the top of the display area, so that the top-displayed message is faded to black at its upper edge to indicate to the user that more messages are left to review. The initially shown message at the bottom of the pulled down section may be a message computed to be at the bottom of the message area once the bar is pulled all the way down to the bottom of the display."]

Accordingly, it would have been obvious to a person of ordinary skill in the art at the time the invention was made to modify the touch gesture interface of Sunday by incorporating the features, disclosed by Tseng, of distinguishing the second inner point from the first inner point, of the first action not depending on content displayed on the display when the first activation point gesture begins, and of processing the second action, which is different from the first action, even in a case where the second activation point gesture is performed in a same direction as the first activation point gesture, because both references address the same field of touch interfaces, and incorporating Tseng into Sunday enhances user interface and interaction capabilities for mobile devices, allowing multitasking [Tseng ¶47].

Claims 180-181 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Sunday in view of Novick et al.
(US 20080098331 A1, hereinafter Novick) and Tseng.

As to independent claim 180, Sunday teaches an information processing apparatus comprising: [computer ¶27]

a display; and [monitor with touch ¶29]

a processor [¶26] configured to process a first action in response to the display sensing a first activation point gesture beginning at a first activation point, which is not explicitly displayed, and which is provided on an edge portion of the display, and ending at a first inner point of the display, [Fig. 6, 602-603 illustrate a gesture starting on an edge portion and ending at an inner point 603; ¶40: "touching a location on the display with the user 602's finger and dragging it in a direction, such as towards the second user, may identify the orientation to use for the second interface 601"]

wherein the processor is configured to process a reserved action in response to the display sensing a reserved gesture beginning at a point on a portion other than the edge portion and ending at another point of the display. [Figs. 7-8 illustrate a drag on interface portion 601 for moving it, ending elsewhere; ¶43: "The interface 601 may also be moved and/or dragged to a different location as part of, or in addition to, this change in orientation."]

Sunday does not specifically teach wherein the processor is configured to cancel the first action in response to the display sensing a third activation point gesture beginning at the first activation point and ending at a third point, which is different from the first inner point, by distinguishing the third point from the first inner point.

However, Novick teaches this cancellation. [a gesture outside of a predefined range undoes (cancels) the action; ¶155: "Referring back to FIG. 8A, after the finger moves outside a predefined range around the first key icon, the portable device de-highlights the first key icon (505) or undoes any visual distinguishing effects it applied to the first key icon previously (605)"]

Accordingly, it would have been obvious to a person of ordinary skill in the art at the time the invention was made to modify the touch gesture interface of Sunday by incorporating the feature, disclosed by Novick, of canceling the first action in response to the display sensing a third activation point gesture beginning at the first activation point and ending at a third point, which is different from the first inner point, by distinguishing the third point from the first inner point, because both references address the same field of touch interfaces, and incorporating Novick into Sunday enhances interfaces with more transparent and intuitive keys that are easy to use, configure, and/or adapt [Novick ¶8].

Sunday and Novick do not specifically teach wherein the first action does not depend on content displayed on the display when the first activation point gesture begins, and wherein the processor is configured not to process the first action in a case where the display senses another gesture beginning at a point near the first inner point and ending at the first inner point of the display.

However, Tseng teaches wherein the first action does not depend on content displayed on the display when the first activation point gesture begins [Fig. 2B, 222 illustrates a status bar that is displayed and accepts gestures regardless of content (it can expand and collapse); ¶58-59: "Upon the user selecting the status bar, the bar may expand slightly, and a "slide down" indicator may be displayed"], and wherein the processor is configured not to process the first action in a case where the display senses another gesture beginning at a point near the first inner point and ending at the first inner point of the display. [Fig. 2A, 216 illustrates another gesture beginning near the end point for closing the shade menu (not the first action); ¶46: "the tab 216 may take other forms, and may be dragged upward by a user in a manner like a window shade, to fully or partially close the message area 206"]

Accordingly, it would have been obvious to a person of ordinary skill in the art at the time the invention was made to modify the touch gesture interface of Sunday by incorporating these features disclosed by Tseng, because both references address the same field of touch interfaces, and incorporating Tseng into Sunday enhances user interface and interaction capabilities for mobile devices, allowing multitasking [Tseng ¶47].

As to dependent claim 181, the rejection of claim 180 is incorporated. Sunday, Novick and Tseng further teach wherein the third activation point gesture continues to outside of the display without interrupting at any inner point of the display.
[Novick: a gesture outside of a predefined range undoes (cancels) the action; ¶155: "Referring back to FIG. 8A, after the finger moves outside a predefined range around the first key icon, the portable device de-highlights the first key icon (505) or undoes any visual distinguishing effects it applied to the first key icon previously (605)"]

They further teach wherein the processor is configured to cancel the first action by reverting the information processing apparatus to a state prior to the beginning of the third activation point gesture. [Tseng Fig. 2A, 216 illustrates another gesture beginning near the end point for closing the shade menu (not the first action); ¶46: "the tab 216 may take other forms, and may be dragged upward by a user in a manner like a window shade, to fully or partially close the message area 206"]

Claim 182 is rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Sunday in view of Novick, as applied to claim 180 above, and further in view of Chaudhri et al. (US 8046721 B2, hereinafter Chaudhri).

As to dependent claim 182, Sunday, Novick and Tseng teach the limitations of claim 180 above, and further teach wherein the processor is configured to cancel the first action, by reverting the information processing apparatus to a state prior to the beginning of the third activation point gesture, in response to the first activation point gesture being completed as the finger is lifted up from the display. [Tseng Fig. 2A, 216; ¶46: "the tab 216 may take other forms, and may be dragged upward by a user in a manner like a window shade, to fully or partially close the message area 206"]

Sunday, Novick and Tseng do not specifically teach wherein the display is configured to sense the first activation point gesture and the reserved gesture being performed by a finger, wherein the processor is configured to, during the display sensing the first activation point gesture, visually change an area extending from the edge portion, where the first activation point gesture began, and extending to another position corresponding to a point where the first activation point gesture is being sensed, and wherein the processor is configured to process the first action in response to the first activation point gesture being completed as the finger is lifted up from the display.

However, Chaudhri teaches these limitations. [Fig. 5B illustrates dragging of an unlock image from an edge along a sensed path to a predefined location, accompanied by visual cues to guide the gesture's performance, completing on release (lift); col. 13, ln. 8-24: "the user is in the process of performing the gesture by moving her finger, which is in continuous contact with the touch screen 408, in the direction of movement 504. The unlock image 402 is dragged along the channel 404 as a result of the gesture. The channel 404 reminds the user that the unlock gesture is a horizontal motion. In some embodiments, the channel 404 indicates the predefined location (in FIGS. 5A-5D, the right end of the channel) to which the user drags the unlock image 402 to complete the unlock action and/or the predefined path along which the user drags the unlock image 402 to complete the unlock action" ... "Once the user releases the unlock image 402 at the right end of the channel 404, the unlock action is complete"]

Accordingly, it would have been obvious to a person of ordinary skill in the art at the time the invention was made to modify the touch gesture interfaces disclosed by Sunday and Novick by incorporating these features disclosed by Chaudhri, because all three references address the same field of touch interfaces, and incorporating Chaudhri into Sunday and Novick enhances the efficiency of touch screens with friendlier transitions of states [Chaudhri col. 1-2, ln. 56-67].

Claims 183-185 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Sunday in view of Kupka (US 20050024239 A1).
As to independent claim 183, Sunday teaches an information processing apparatus comprising: [computer ¶27]

a display including a plurality of sides; and [the monitor has sides ¶29]

a processor configured to process a first predetermined action in response to the display sensing a first activation point gesture beginning at a first activation point, which is not explicitly displayed, and which is provided on a first edge portion of the display, and ending at a first inner point of the display, [Fig. 6, 602-603 illustrate a gesture starting on an edge portion and ending at an inner point 603; ¶40: "touching a location on the display with the user 602's finger and dragging it in a direction, such as towards the second user, may identify the orientation to use for the second interface 601"]

wherein the processor is configured to process a second predetermined action in response to the display sensing a second activation point gesture beginning at a second activation point, which is not explicitly displayed, and which is provided on a second edge portion of the display, and ending at a second inner point of the display, [Fig. 12 illustrates a second action after a gesture near 1201a, which may end inside; ¶40, ¶52]

wherein the processor is configured to process a reserved action in response to the display sensing a reserved gesture beginning at a point on a portion other than any edge portion of the display and ending at another point of the display. [Figs. 7-8; ¶43: "The interface 601 may also be moved and/or dragged to a different location as part of, or in addition to, this change in orientation."]

Sunday does not specifically teach wherein the first edge portion and the second edge portion are provided at different portions on a same one of the plurality of sides of the display, and wherein the second predetermined action is different from the first predetermined action.

However, Kupka teaches these limitations. [Kupka teaches zones on the same side of the display, such as Fig. 3A 103A-C, having different actions responsive to gestures; ¶57: "Input field 700 includes start zone 103A for changing the start time of the appointment, and end zone 103H for changing the end time. Zones 103A and 103H may be displayed with an outline, text label, distinct color, icon, or other indicator or any combination thereof. Alternatively, zones 103A and 103H may contain no visible demarcations"]

Accordingly, it would have been obvious to a person of ordinary skill in the art at the time the invention was made to modify the touch gesture interface of Sunday by incorporating the features, disclosed by Kupka, of the first edge portion and the second edge portion being provided at different portions on a same one of the plurality of sides of the display, and of the second predetermined action being different from the first predetermined action, because both references address the same field of touch interfaces, and incorporating Kupka into Sunday provides an easier to use and intuitive interface for smaller devices [Kupka ¶14].

As to dependent claim 184, the rejection of claim 183 is incorporated. Sunday and Kupka further teach wherein each of the first edge portion and the second edge portion extend in a direction along the one of the plurality of sides on which the first edge portion and the second edge portion are provided. [Sunday: user inputs within edge regions automatically influence the orientation of application interfaces, with interfaces' bottoms oriented towards the edge from which the input was made; ¶40, ¶52]

As to dependent claim 185, the rejection of claim 184 is incorporated. Sunday and Kupka further teach wherein the first edge portion and the second edge portion are provided with a gap between the first edge portion and the second edge portion in the direction. [Kupka: zones on the same side of the display, such as Fig. 3A 103A-C, have gaps or partitions between them; ¶57, quoted above]

Claims 187-204 are rejected similarly to the rejections above; see the corresponding language above.

Response to Arguments

Applicant's arguments filed 09/08/2025 have been considered. In the remarks, applicant argues that:

(1) Sunday and Lee fail to teach "wherein the first action does not depend on content displayed on the display when the first activation point gesture begins, and wherein the processor is configured to process the second action, which is different from the first action, even in a case where the second activation point gesture is performed in a same direction as the first activation point gesture" as recited by amended claim 178.

(2) Chaudhri fails to teach "wherein an object is displayed in a first display manner in a case in which the object is in the area and the object is displayed in a second display manner in a case in which the object is not in the area" as recited by amended claim 179.

(3) Sunday and Novick fail to teach "wherein the first action does not depend on content displayed on the display when the first activation point gesture begins, and wherein the processor is configured not to process the first action in a case where the display senses another gesture beginning at a point near the first inner point and ending at the first inner point of the display" as recited by amended claim 180.
(4) Sunday and Kupka fail to teach "a first activation point gesture beginning at a first activation point, which is not explicitly displayed, and which is provided on a first edge portion of the display, and ending at a first inner point of the display... a second activation point gesture beginning at a second activation point, which is not explicitly displayed, and which is provided on a second edge portion of the display, and ending at a second inner point of the display..." and "wherein the first edge portion and the second edge portion are provided at different portions on a same one of the plurality of sides of the display" as recited by claim 183.

As to points (1)-(3): Applicant's arguments with respect to the claims have been considered but are moot in view of the new ground of rejection made under pre-AIA 35 U.S.C. 103 over Sunday in view of Tseng, and Novick or Kupka, as set forth above.

As to point (4): Sunday and Kupka do teach these limitations of claim 183. See Sunday Figs. 5-6, ¶39-41 ("touching a location on the display with the user 602's finger and dragging it in a direction"), where the gesture includes dragging without displaying any indicators, and the interface allows this for multiple users/edges. Kupka teaches zones allowing gestures in various areas within a zone, including edges and a beginning zone location (see ¶39, ¶47).
According to MPEP 2111, the examiner is obliged to give claim terms or phrases their broadest reasonable interpretation as would be understood by one of ordinary skill in the art, unless applicant has provided some indication of a definition of the claimed terms or phrases. Hence, under that standard, Sunday and Kupka do teach the limitations of claim 183 quoted above.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Applicant is required under 37 C.F.R. § 1.111(c) to consider these references fully when responding to this action. Kurtenbach (US 5689667 A): Fig. 7 illustrates a menu interface with radial and linear sections where gestures such as 76 can be in the same direction and trigger item 3 or item 5 actions in the same direction (col. 5, ln. 37-67).

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BEAU SPRATT, whose telephone number is (571) 272-9919. The examiner can normally be reached M-F 8:30-5 PST. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jennifer Welch, can be reached at 571-212-7212. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BEAU D SPRATT/ Primary Examiner, Art Unit 2143
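For readers outside prosecution, the gesture scheme the rejection walks through for claims 178, 180 and 181 can be sketched in code: an "activation point" gesture starts in a non-displayed edge band and is classified by where it ends rather than by its direction, interior-start gestures are "reserved" for content, and an activation gesture that returns to the edge cancels. This is a hypothetical illustration, not the applicant's implementation; the display size, edge-band width, and left/right zone split are all assumed values:

```python
# Hypothetical sketch of the claimed scheme (assumed geometry throughout).
DISPLAY_W, DISPLAY_H = 800, 480   # assumed display size in pixels
EDGE_MARGIN = 20                  # assumed width of the activation band

def in_edge_band(point):
    """True if the point lies within the edge band of the display."""
    x, y = point
    return (x <= EDGE_MARGIN or x >= DISPLAY_W - EDGE_MARGIN
            or y <= EDGE_MARGIN or y >= DISPLAY_H - EDGE_MARGIN)

def classify_gesture(start, end):
    """Map a completed drag from `start` to `end` to an action name."""
    if not in_edge_band(start):
        return "reserved_action"   # interior start: ordinary content gesture
    if in_edge_band(end):
        return "cancel"            # returned to the edge: undo (cf. claim 181)
    # Activation gesture: the action is keyed to the end point alone, so two
    # same-direction swipes can trigger different actions (cf. claim 178).
    x, _ = end
    return "first_action" if x < DISPLAY_W / 2 else "second_action"

# Two rightward swipes from the same left-edge start, different end points:
print(classify_gesture((0, 240), (200, 240)))    # first_action
print(classify_gesture((0, 240), (600, 240)))    # second_action
print(classify_gesture((400, 240), (300, 240)))  # reserved_action
```

The point of contention in the rejection maps onto the last branch: Sunday keys the action to the swipe's direction, while the claims (and, per the examiner, Tseng's roll-shade status bar) key it to the end point, so direction alone does not determine the outcome.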

Prosecution Timeline

Jan 31, 2022: Application Filed
Feb 03, 2023: Non-Final Rejection — §103
Jul 10, 2023: Response Filed
Aug 29, 2023: Final Rejection — §103
Jan 19, 2024: Examiner Interview Summary
Jan 19, 2024: Applicant Interview (Telephonic)
Feb 06, 2024: Request for Continued Examination
Feb 15, 2024: Response after Non-Final Action
Apr 02, 2024: Non-Final Rejection — §103
Jul 16, 2024: Examiner Interview Summary
Jul 16, 2024: Applicant Interview (Telephonic)
Aug 30, 2024: Response Filed
Oct 02, 2024: Final Rejection — §103
Jan 07, 2025: Request for Continued Examination
Jan 13, 2025: Response after Non-Final Action
May 05, 2025: Non-Final Rejection — §103
Aug 29, 2025: Examiner Interview Summary
Aug 29, 2025: Applicant Interview (Telephonic)
Sep 08, 2025: Response Filed
Oct 21, 2025: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12595715: Cementing Lab Data Validation based On Machine Learning (granted Apr 07, 2026; 2y 5m to grant)
Patent 12596955: REWARD FEEDBACK FOR LEARNING CONTROL POLICIES USING NATURAL LANGUAGE AND VISION DATA (granted Apr 07, 2026; 2y 5m to grant)
Patent 12596956: INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD FOR PRESENTING REACTION-ADAPTIVE EXPLANATION OF AUTOMATIC OPERATIONS (granted Apr 07, 2026; 2y 5m to grant)
Patent 12561464: CATALYST 4 CONNECTIONS (granted Feb 24, 2026; 2y 5m to grant)
Patent 12561606: TECHNIQUES FOR POLL INTENTION DETECTION AND POLL CREATION (granted Feb 24, 2026; 2y 5m to grant)

Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 7-8
Grant Probability: 79%
With Interview: 99% (+26.6%)
Median Time to Grant: 3y 1m
PTA Risk: High
Based on 432 resolved cases by this examiner. Grant probability derived from career allow rate.
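As a rough check, the headline percentages above can be reproduced from the raw counts reported for this examiner (342 granted of 432 resolved). The Tech Center average is not stated directly anywhere on the page, so the second figure below is only what the quoted "+24.2% vs TC avg" delta implies:

```python
# Reproduce the dashboard's headline figures from the raw counts shown above.
# The TC average is an inference from the stated "+24.2% vs TC avg" delta.
granted, resolved = 342, 432

allow_rate = granted / resolved * 100           # career allow rate
print(f"career allow rate: {allow_rate:.1f}%")  # ≈ 79.2%, displayed as 79%

tc_avg_implied = allow_rate - 24.2              # implied Tech Center average
print(f"implied TC average: {tc_avg_implied:.1f}%")  # ≈ 55.0%
```

Note that the 99% "with interview" figure is not simply 79% plus the 26.6% lift; the page does not disclose how the two are combined, so that number should be read as the tool's own model output.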
