Prosecution Insights
Last updated: April 19, 2026
Application No. 18/820,856

CAPTURE CONTROL APPARATUS, CAPTURE CONTROL METHOD, AND IMAGE CAPTURE SYSTEM

Non-Final OA §103
Filed: Aug 30, 2024
Examiner: BILLAH, MASUM
Art Unit: 2486
Tech Center: 2400 — Computer Networks
Assignee: Canon Kabushiki Kaisha
OA Round: 1 (Non-Final)
Grant Probability: 80% — Favorable
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 80% — above average (335 granted / 419 resolved; +22.0% vs TC avg)
Interview Lift: +21.4% — strong, among resolved cases with interview
Typical Timeline: 2y 6m avg prosecution; 31 currently pending
Career History: 450 total applications across all art units

Statute-Specific Performance

§101: 3.9% (-36.1% vs TC avg)
§103: 60.5% (+20.5% vs TC avg)
§102: 14.2% (-25.8% vs TC avg)
§112: 11.2% (-28.8% vs TC avg)
Deltas measured against the estimated Tech Center average • Based on career data from 419 resolved cases
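The per-statute figures above are internally consistent: subtracting each "vs TC avg" delta from the corresponding examiner rate implies the same Tech Center baseline for every statute. A minimal sketch of that arithmetic, assuming nothing beyond the numbers shown on this page (the function names and rounding conventions are illustrative, not the dashboard's actual code):

```python
# Sketch of how the examiner stats above could be derived from raw counts.
# The counts (335 granted / 419 resolved) and the deltas come from the page
# itself; the helper names and rounding are assumptions for illustration.

def allow_rate_pct(granted: int, resolved: int) -> int:
    """Career allow rate as a whole-number percentage."""
    return round(100 * granted / resolved)

def implied_tc_avg(examiner_pct: float, delta_vs_tc: float) -> float:
    """Recover the implied Tech Center average from a 'vs TC avg' delta."""
    return round(examiner_pct - delta_vs_tc, 1)

print(allow_rate_pct(335, 419))      # 80, matching the 80% shown
print(implied_tc_avg(60.5, +20.5))   # §103 -> 40.0
print(implied_tc_avg(3.9, -36.1))    # §101 -> 40.0
print(implied_tc_avg(14.2, -25.8))   # §102 -> 40.0
print(implied_tc_avg(11.2, -28.8))   # §112 -> 40.0
```

All four statute deltas imply the same ~40.0% Tech Center average, which suggests the chart's "TC avg" baseline is a single estimate applied across statutes.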

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

DETAILED ACTION

This Office Action is in response to application 18/820,856, filed on 08/30/2024. Claims 1–25 have been examined and are pending in this application.

Information Disclosure Statement

The information disclosure statements (IDS) were submitted on 01/23/2025 and 08/30/2024. The submissions are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Specification

The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant's cooperation is requested in correcting any errors of which applicant may become aware in the specification.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

This application includes one or more claim limitations that use the word "means" or "step" but are nonetheless not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitations recite sufficient structure, materials, or acts to entirely perform the recited function. Such claim limitations are: an obtainment unit and a control unit in claims 1–9 and 11–13. Because these claim limitations are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are not being interpreted to cover only the corresponding structure, material, or acts described in the specification as performing the claimed function, and equivalents thereof. If applicant intends to have these limitations interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitations to remove the structure, materials, or acts that perform the claimed function; or (2) present a sufficient showing that the claim limitations do not recite sufficient structure, materials, or acts to perform the claimed function.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Patentability shall not be negated by the manner in which the invention was made.

Claims 1–5, 8–18, and 21–25 are rejected under 35 U.S.C. 103 as being unpatentable over Wakamatsu (US 2021/0152731 A1) in view of Sakakima (JP 2025008645 A).

Regarding claim 1, Wakamatsu discloses: "a capture control apparatus, comprising one or more processors that execute a program stored in a memory [see para: 0065; In the configuration illustrated in FIG. 2, a first control unit 223 includes a processor (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a microprocessor, or a micro processing unit (MPU)) and a memory (e.g., a dynamic random access memory (DRAM) or a static RAM). These components are used to control each block of the image capturing apparatus 101 by executing various processes] and thereby function as: an obtainment unit configured to obtain information related to a first image capture apparatus included among a plurality of image capture apparatuses [see para: 0081; FIG. 3 illustrates an example of an automatic image capturing system (control system) in which a plurality of image capturing apparatuses works in conjunction with each other]".

Wakamatsu does not explicitly disclose: "a control unit configured to control an operation of a second image capture apparatus included among the plurality of image capture apparatuses based on a role that is set to the second image capture apparatus and on the information related to the first image capture apparatus, wherein the obtainment unit is configured to obtain, as the information related to the first image capture apparatus, information of a subject of interest of the first image capture apparatus and information related to an angle of view of the first image capture apparatus, and the control unit is configured to control a subject that is a tracking target of the second image capture apparatus and an angle of view of the second image capture apparatus so that at least one of the subject that is the tracking target of the second image capture apparatus and the angle of view of the second image capture apparatus varies between a case where a first role is set to the second image capture apparatus and a case where a second role different from the first role is set to the second image capture apparatus".

However, Sakakima, from the same or similar field of endeavor, teaches: "a control unit [see para: 0018; The controller 123 includes a control station 124] configured to control an operation of a second image capture apparatus included among the plurality of image capture apparatuses based on a role that is set to the second image capture apparatus and on the information related to the first image capture apparatus [see para: 0026; The role determination unit 303 determines the role of each camera in the multiple sensor systems 110 according to the attention area determined by the area determination unit 302. Determining the role of the camera means assigning the area to be photographed by each camera 112, including the setting of the photographing direction by the camera platform 113, and further assigning how the photographed image of each camera 112 is to be handled in the image generating device 122], wherein the obtainment unit is configured to obtain, as the information related to the first image capture apparatus, information of a subject of interest of the first image capture apparatus [see para: 0026; The role determination unit 303 determines the role of each camera in the multiple sensor systems 110 according to the attention area determined by the area determination unit 302. Determining the role of the camera means assigning the area to be photographed by each camera 112, including the setting of the photographing direction by the camera platform 113, and further assigning how the photographed image of each camera 112 is to be handled in the image generating device 122] and information related to an angle of view of the first image capture apparatus [see para: 0103; The information processing apparatus according to any one of configurations 7 to 10, further comprising a setting control means for controlling a shooting direction and an angle of view of the camera device, the setting control means controlling the shooting direction and the angle of view of the camera device determined to have a role of shooting the first area of interest so that the first area of interest fits as large as possible on the entire screen], and the control unit is configured to control a subject that is a tracking target [see para: 0086; Furthermore, for example, if the player 1103 as a subject of interest moves and goes out of the range that the camera can shoot, the role determination unit 303 sets another camera to the role of shooting the player 1103, enabling continuation of tracking] of the second image capture apparatus and an angle of view of the second image capture apparatus so that at least one of the subject that is the tracking target of the second image capture apparatus and the angle of view of the second image capture apparatus varies between a case where a first role is set to the second image capture apparatus and a case where a second role different from the first role is set to the second image capture apparatus [see para: 0050; On the other hand, when the region determination unit 302 receives scene information indicating that the players are scattered throughout the entire field and are not crowded, it determines the entire field of the shooting target range as an attention region for capturing an image for generating a three-dimensional model. Then, the region determination unit 302 sends information on the attention region determined according to the scene to the role determination unit 303. The role determination unit 303 of the second embodiment determines the role of each camera 112 based on the information on the attention region determined according to the scene as described above. And see para: 0053; 0063]".

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the information processing apparatus disclosed by Wakamatsu, which includes a reception unit configured to receive data from a plurality of image capturing apparatuses, with the teachings of Sakakima as above, in order to improve camera control: the image processor assigns a role to each camera, including sub-cameras, and the collection unit gathers information relating each camera to its subject of interest. Because the roles differ from camera to camera, each sub-camera can be controlled individually or as a group based on its role [Sakakima, see para: 0018; 0026; 0103; 0086; 0050].
Regarding claim 2, Wakamatsu and Sakakima disclose all the limitations of claim 1 and are analyzed as previously discussed with respect to that claim.

Wakamatsu does not explicitly disclose: "wherein the control unit is configured to cause a relationship between a change direction of an extent of the angle of view of the first image capture apparatus and a change direction of an extent of the angle of view of the second image capture apparatus to vary in accordance with the role that is set to the second image capture apparatus".

However, Sakakima, from the same or similar field of endeavor, teaches: "wherein the control unit is configured to cause a relationship between a change direction of an extent of the angle of view of the first image capture apparatus and a change direction of an extent of the angle of view of the second image capture apparatus to vary in accordance with the role that is set to the second image capture apparatus [see para: 0030; FIGS. 4A and 4B are diagrams used to explain the determination process of the region of interest performed by the region determination unit 302 of the first embodiment. In FIGS. 4A and 4B, an example is given in which there are two courts within the shooting range of a badminton match venue, and a badminton match is shot using these courts. In FIGS. 4A and 4B, cameras 401a-401n indicate some of the cameras 112 (14 in this example) including the pan head 113 described above, and the dotted lines extending from each camera 112 indicate the shooting direction. In addition, players 402-413 are players playing a badminton match, and are the main subjects of the cameras 401a-401n]".

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the information processing apparatus disclosed by Wakamatsu, which includes a reception unit configured to receive data from a plurality of image capturing apparatuses, with the teachings of Sakakima as above, in order to improve the cameras' functionality: the image processing algorithm detects specific subjects or targets, assigns each to one of the sub-cameras, and issues commands or instructions that vary camera by camera [Sakakima, see para: 0030].

Regarding claim 3, Wakamatsu and Sakakima disclose all the limitations of claim 2 and are analyzed as previously discussed with respect to that claim.

Wakamatsu does not explicitly disclose: "wherein the control unit is configured to perform control so that whether to increase the extent of the angle of view of the second image capture apparatus in a case where the extent of the angle of view of the first image capture apparatus is increased varies in accordance with the role that is set to the second image capture apparatus".

However, Sakakima, from the same or similar field of endeavor, teaches: "wherein the control unit is configured to perform control so that whether to increase the extent of the angle of view of the second image capture apparatus in a case where the extent of the angle of view of the first image capture apparatus is increased varies in accordance with the role that is set to the second image capture apparatus [see para: 0030; FIGS. 4A and 4B are diagrams used to explain the determination process of the region of interest performed by the region determination unit 302 of the first embodiment. In FIGS. 4A and 4B, an example is given in which there are two courts within the shooting range of a badminton match venue, and a badminton match is shot using these courts. In FIGS. 4A and 4B, cameras 401a-401n indicate some of the cameras 112 (14 in this example) including the pan head 113 described above, and the dotted lines extending from each camera 112 indicate the shooting direction. In addition, players 402-413 are players playing a badminton match, and are the main subjects of the cameras 401a-401n]".

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the information processing apparatus disclosed by Wakamatsu, which includes a reception unit configured to receive data from a plurality of image capturing apparatuses, with the teachings of Sakakima as above, in order to improve the cameras' functionality: the image processing algorithm detects specific subjects or targets, assigns each to one of the sub-cameras, and issues commands or instructions that vary camera by camera [Sakakima, see para: 0030].

Regarding claim 4, Wakamatsu and Sakakima disclose all the limitations of claim 1 and are analyzed as previously discussed with respect to that claim.

Wakamatsu does not explicitly disclose: "wherein the control unit is configured to cause a relationship between the subject of interest of the first image capture apparatus and the subject that is the tracking target of the second image capture apparatus to vary in accordance with the role that is set to the second image capture apparatus".
However, Sakakima, from the same or similar field of endeavor, teaches: "wherein the control unit is configured to cause a relationship between the subject of interest of the first image capture apparatus and the subject that is the tracking target of the second image capture apparatus to vary in accordance with the role that is set to the second image capture apparatus [see para: 0030; FIGS. 4A and 4B are diagrams used to explain the determination process of the region of interest performed by the region determination unit 302 of the first embodiment. In FIGS. 4A and 4B, an example is given in which there are two courts within the shooting range of a badminton match venue, and a badminton match is shot using these courts. In FIGS. 4A and 4B, cameras 401a-401n indicate some of the cameras 112 (14 in this example) including the pan head 113 described above, and the dotted lines extending from each camera 112 indicate the shooting direction. In addition, players 402-413 are players playing a badminton match, and are the main subjects of the cameras 401a-401n]".

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the information processing apparatus disclosed by Wakamatsu, which includes a reception unit configured to receive data from a plurality of image capturing apparatuses, with the teachings of Sakakima as above, in order to improve the cameras' functionality: the image processing algorithm detects specific subjects or targets, assigns each to one of the sub-cameras, and issues commands or instructions that vary camera by camera [Sakakima, see para: 0030].

Regarding claim 5, Wakamatsu and Sakakima disclose all the limitations of claim 4 and are analyzed as previously discussed with respect to that claim.

Wakamatsu does not explicitly disclose: "wherein the control unit is configured to perform control so that whether the subject of interest of the first image capture apparatus is set as the subject that is the tracking target of the second image capture apparatus varies in accordance with the role that is set to the second image capture apparatus".

However, Sakakima, from the same or similar field of endeavor, teaches: "wherein the control unit is configured to perform control so that whether the subject of interest of the first image capture apparatus [see para: 0086; Furthermore, for example, if the player 1103 as a subject of interest moves and goes out of the range that the camera can shoot, the role determination unit 303 sets another camera to the role of shooting the player 1103, enabling continuation of tracking] is set as the subject that is the tracking target of the second image capture apparatus varies in accordance with the role that is set to the second image capture apparatus [see para: 0050; On the other hand, when the region determination unit 302 receives scene information indicating that the players are scattered throughout the entire field and are not crowded, it determines the entire field of the shooting target range as an attention region for capturing an image for generating a three-dimensional model. Then, the region determination unit 302 sends information on the attention region determined according to the scene to the role determination unit 303. The role determination unit 303 of the second embodiment determines the role of each camera 112 based on the information on the attention region determined according to the scene as described above. And see para: 0053; 0063]".

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the information processing apparatus disclosed by Wakamatsu, which includes a reception unit configured to receive data from a plurality of image capturing apparatuses, with the teachings of Sakakima as above, in order to improve the cameras' functionality: the image processing algorithm detects specific subjects or targets, assigns each to one of the sub-cameras, and issues commands or instructions that vary camera by camera [Sakakima, see para: 0086; 0050].

Regarding claim 8, Wakamatsu and Sakakima disclose all the limitations of claim 1 and are analyzed as previously discussed with respect to that claim.

Wakamatsu does not explicitly disclose: "wherein the control unit is configured to control a capture direction of the second image capture apparatus so as to track a plurality of subjects including the tracking target subject".

However, Sakakima, from the same or similar field of endeavor, teaches: "wherein the control unit is configured to control a capture direction of the second image capture apparatus so as to track a plurality of subjects including the tracking target subject [see para: 0027; The setting control unit 304 controls the settings of each camera 112 and the setting of the shooting direction by the camera platform 113 according to the role of the camera determined by the role determination unit 303. For example, the setting control unit 304 controls camera settings such as the zoom ratio (shooting angle of view) of each camera 112, and the setting of the shooting direction of the camera 112 by the angle setting of the camera platform 113.
The setting control unit 304 then transmits camera setting information, which includes information indicating the role of the camera in addition to information indicating the settings of the camera 112 and the settings of the camera platform 113, to the image generating device 122]".

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the information processing apparatus disclosed by Wakamatsu, which includes a reception unit configured to receive data from a plurality of image capturing apparatuses, with the teachings of Sakakima as above, in order to track subjects in the scene: the controller of the processing unit sends signals that control the capture direction angle or change the direction of one camera among the multiple cameras and sub-cameras [Sakakima, see para: 0027].

Regarding claim 9, Wakamatsu and Sakakima disclose all the limitations of claim 8 and are analyzed as previously discussed with respect to that claim.

Furthermore, Wakamatsu discloses: "wherein when controlling the capture direction of the second image capture apparatus so as to track the plurality of subjects including the tracking target subject, the control unit is configured to control the capture direction of the second image capture apparatus so as to track a mass center of positions of the plurality of subjects [see para: 0027; Accordingly, tracking of the subject is controlled in such a manner that the subject is captured in the vicinity of the center of the screen by performing an automatic pan/tilt/zoom control on the currently set subject. After the processing of steps S1804 and S1805 is performed, the processing proceeds to step S1813]".

Regarding claim 10, Wakamatsu and Sakakima disclose all the limitations of claim 1 and are analyzed as previously discussed with respect to that claim.

Furthermore, Wakamatsu discloses: "wherein the subject of interest and the angle of view of the first image capture apparatus are controlled by a user operation [see para: 0189; By the above-described method, designation of the image capturing area can be performed with a simple operation. Further, in the designated image capturing area, assistance can be provided for framing adjustment which is performed by the plurality of image capturing apparatuses 101 working in conjunction with each other and for automatic image capturing processing which is performed by the image capturing apparatuses 101 focusing the designated image capturing area. Accordingly, automatic image capturing that enables the user to easily capture a desired video image can be performed]".

Regarding claim 11, Wakamatsu and Sakakima disclose all the limitations of claim 1 and are analyzed as previously discussed with respect to that claim.

Furthermore, Wakamatsu discloses: "wherein the plurality of image capture apparatuses include a plurality of the second image capture apparatus, and the control unit is configured to control subjects that are tracking targets of the plurality of respective second image capture apparatuses and angles of view thereof [see para: 0130; In a case where pan, tilt, or zoom driving is automatically performed in such a manner that the designated subject is captured at a predetermined position on the screen (e.g., in the vicinity of the center of the screen), the subject may be designated by a touch operation as illustrated in FIG. 5C. In a case where the subject is designated by a touch operation, tracking of the subject is automatically controlled and a subject frame 1709 is displayed in such a manner that the subject that is currently tracked can be visually recognized as illustrated in FIG. 5D. See FIG. 3 for the plurality of cameras]".

Regarding claim 12, Wakamatsu and Sakakima disclose all the limitations of claim 1 and are analyzed as previously discussed with respect to that claim.

Furthermore, Wakamatsu discloses: "wherein the control unit is configured to determine a capture direction of the second image capture apparatus based on a video captured by an image capture apparatus [see para: 0117; The area 1202 is an area in which a live video image captured by the designated image capturing apparatus 101 is displayed. In a case where the installation position of the image capturing apparatus 101 is tapped on the display portion 1201 as illustrated in FIG. 13E, the live video image captured by the designated image capturing apparatus 101 is displayed on the area 1202] that is different from the first image capture apparatus and the second image capture apparatus and that captures an area including entirety of captured areas of the first image capture apparatus and the second image capture apparatus [see para: 0082; Image capturing apparatuses (101a, 101b, 101c, and 101d) are each connected to a controller (smart device) 301, which includes a communication function, by wireless communication, and therefore each of the image capturing apparatuses is capable of sending an operation instruction to the other image capturing apparatuses and acquiring control information supplied from the other image capturing apparatuses. In the configuration illustrated in FIG. 3, the image capturing apparatuses (101a, 101b, 101c, and 101d) and the smart device 301 are each connected to an access point 302 to communicate with each other via the access point 302 and transfer information]".

Regarding claims 13 and 14, these claims are rejected under the same art and evidentiary limitations as determined for the method of claim 1.

Regarding claim 15, claim 15 is rejected under the same art and evidentiary limitations as determined for the method of claim 2.

Regarding claim 16, claim 16 is rejected under the same art and evidentiary limitations as determined for the method of claim 3.

Regarding claim 17, claim 17 is rejected under the same art and evidentiary limitations as determined for the method of claim 4.

Regarding claim 18, claim 18 is rejected under the same art and evidentiary limitations as determined for the method of claim 5.

Regarding claim 21, claim 21 is rejected under the same art and evidentiary limitations as determined for the method of claim 8.

Regarding claim 22, claim 22 is rejected under the same art and evidentiary limitations as determined for the method of claim 10.

Regarding claim 23, claim 23 is rejected under the same art and evidentiary limitations as determined for the method of claim 11.

Regarding claim 24, claim 24 is rejected under the same art and evidentiary limitations as determined for the method of claim 12.

Regarding claim 25, claim 25 is rejected under the same art and evidentiary limitations as determined for the method of claim 1, but for a non-transitory computer readable medium [see Wakamatsu para: 0230; Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a 'non-transitory computer-readable storage medium') to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s)].

Allowable Subject Matter

Claims 6, 7, 19, and 20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: JP 2001025003 A.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Masum Billah, whose telephone number is (571) 270-0701. The examiner can normally be reached Monday through Friday, 9 AM to 5 PM ET.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jamie J. Atala, can be reached at (571) 272-7384. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MASUM BILLAH/
Primary Patent Examiner, Art Unit 2486

Prosecution Timeline

Aug 30, 2024
Application Filed
Feb 07, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603983
APPARATUS AND METHOD FOR GENERATING OBJECT-BASED STEREOSCOPIC IMAGES
2y 5m to grant • Granted Apr 14, 2026
Patent 12597123
RAIL FEATURE IDENTIFICATION SYSTEM
2y 5m to grant • Granted Apr 07, 2026
Patent 12597258
ALERT DIRECTIVES AND FOCUSED ALERT DIRECTIVES IN A BEHAVIORAL RECOGNITION SYSTEM
2y 5m to grant • Granted Apr 07, 2026
Patent 12591954
DEPTH INFORMATION DETECTOR, TIME-OF-FLIGHT CAMERA, AND DEPTH IMAGE ACQUISITION METHOD
2y 5m to grant • Granted Mar 31, 2026
Patent 12581101
TEMPLATE MATCHING REFINEMENT FOR AFFINE MOTION
2y 5m to grant • Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 80%
With Interview (+21.4%): 99%
Median Time to Grant: 2y 6m
PTA Risk: Low
Based on 419 resolved cases by this examiner. Grant probability derived from career allow rate.
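The "With Interview" figure above is consistent with adding the +21.4% interview lift to the 80% base probability and capping the result short of certainty. A hedged sketch under that assumption (the additive-with-cap model is an illustrative reconstruction, not the dashboard's published methodology):

```python
# Hedged sketch of the interview-adjusted projection above.
# The 80% base and +21.4% lift come from this page; the additive model
# with a 99% cap is an assumed reconstruction, not the actual formula.

def with_interview_pct(base: float, lift: float, cap: float = 99.0) -> float:
    """Apply an additive interview lift to a base grant probability,
    capped below certainty."""
    return min(base + lift, cap)

print(with_interview_pct(80.0, 21.4))  # 99.0 — matches the displayed 99%
```

Since 80 + 21.4 exceeds 100, the cap binds here; for lower base rates the lift would apply in full.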
