Prosecution Insights
Last updated: April 19, 2026
Application No. 18/567,614

STORE OPERATION SUPPORT DEVICE, AND STORE OPERATION SUPPORT METHOD

Final Rejection — §101, §102, §103
Filed
Dec 06, 2023
Examiner
BYRD, UCHE SOWANDE
Art Unit
3624
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Panasonic Intellectual Property Management Co., Ltd.
OA Round
2 (Final)
23%
Grant Probability
At Risk
3-4
OA Rounds
4y 8m
To Grant
51%
With Interview

Examiner Intelligence

Grants only 23% of cases
23%
Career Allow Rate
81 granted / 350 resolved
-28.9% vs TC avg
Strong +28% interview lift
+27.9%
Interview Lift
resolved cases with interview
Typical timeline
4y 8m
Avg Prosecution
51 currently pending
Career history
401
Total Applications
across all art units
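The headline allow rate above is simple arithmetic on the examiner's resolved docket. A minimal sketch, assuming only that the displayed 23% is the 81 granted / 350 resolved figure rounded for display (the function name and rounding rule are illustrative assumptions, not the product's actual code):

```python
# Hedged reconstruction of the "Career Allow Rate" card: the 81 granted /
# 350 resolved counts come from the page; the round-for-display rule is
# an assumption.
def allow_rate(granted: int, resolved: int) -> float:
    """Return the allow rate as a percentage of resolved cases."""
    if resolved == 0:
        raise ValueError("no resolved cases")
    return 100.0 * granted / resolved

rate = allow_rate(granted=81, resolved=350)
print(f"{rate:.1f}%")                # 23.1%
print(f"displayed: {round(rate)}%")  # displayed: 23%
```

The same counts also explain the "-28.9% vs TC avg" card if the Tech Center average is around 52%, though that average is not shown on the page.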

Statute-Specific Performance

§101
42.2%
+2.2% vs TC avg
§103
41.9%
+1.9% vs TC avg
§102
10.0%
-30.0% vs TC avg
§112
5.3%
-34.7% vs TC avg
Tech Center averages are estimates • Based on career data from 350 resolved cases
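Each "vs TC avg" figure is just the examiner's per-statute rate minus a Tech Center average. The averages themselves are not published on this page; back-computing from the displayed deltas implies roughly 40% for every statute, so a flat 40.0 is assumed below purely for illustration:

```python
# Sketch of the per-statute "vs TC avg" deltas. The examiner's rates are
# from the cards above; the flat 40.0 Tech Center average is an assumption
# back-computed from the displayed deltas, not a published figure.
examiner_rate = {"101": 42.2, "103": 41.9, "102": 10.0, "112": 5.3}
tc_average = {"101": 40.0, "103": 40.0, "102": 40.0, "112": 40.0}  # assumed

deltas = {
    statute: round(examiner_rate[statute] - tc_average[statute], 1)
    for statute in examiner_rate
}
print(deltas)  # {'101': 2.2, '103': 1.9, '102': -30.0, '112': -34.7}
```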

Office Action

§101 §102 §103
DETAILED ACTION

Status of the Application

Claims 1-7 have been examined in this application. This communication is the first action on the merits. The information disclosure statement (IDS) submitted on 12/06/2023 was filed with this application. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

This action is a Non-Final Action on the merits in response to the application filed on 12/06/2023. Claims 1-7 remain pending in this application.

Specification

The abstract of the disclosure is objected to because it exceeds 150 words (i.e., 154 words) in length. A corrected abstract of the disclosure is required and must be presented on a separate sheet, apart from any other text. See MPEP § 608.01(b).

Foreign Priority

The Examiner acknowledges that the applicant claims foreign priority to the date 6/11/2021.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-6 are directed towards a device and claim 7 is directed towards a method, both of which are among the statutory categories of invention. Claims 1-7 are rejected under 35 U.S.C. 101 because the claims are directed to a judicial exception without significantly more.

Regarding claims 1-7, under Step 2A, claims 1-7 recite a judicial exception (abstract idea) that is not integrated into a practical application. With respect to claims 1-7, the independent claims (claims 1 and 7) are directed to managing of customer interactions (e.g., presenting a result to a user, detecting and identifying persons, acquiring persons' behavior information). These claim elements are considered to be abstract ideas because they are directed to a method of organizing human activity, which includes managing personal behavior such as social activities and following rules or instructions. The managing of personal behavior is entered into when the customers' activities are detected, then analyzed to generate results. If a claim limitation, under its broadest reasonable interpretation, covers managing personal behavior, then it falls within the "method of organizing human activity" grouping of abstract ideas. Accordingly, the claim recites an abstract idea.

This judicial exception is not integrated into a practical application. In particular, the claim recites additional elements (a device, processor, and camera) to perform the claim steps. The processor in both steps is recited at a high level of generality (i.e., as a generic processor performing a generic computer function of providing and processing information at 0031), such that it amounts to no more than mere instructions to apply the exception using a generic computer component. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea. The independent claims are additionally directed to claim elements such as device, processor, camera.
When considered individually, the device, processor, and camera claim elements only contribute generic recitations of technical elements to the claims. It is readily apparent, for example, that the claim is not directed to any specific improvements of these elements. Examiner looks to Applicant's specification at ([0003]) "analysis regarding the merchandise evaluation state of customers in the store, there is conventionally known a technology which detects, based on the camera images of the exhibition shelfs, "a change caused by placing a merchandise item on the merchandise shelf" or "a change caused by shifting the position of a merchandise item that has been placed on the merchandise shelf," identifies a merchandise item that a customer was interested in but did not purchase, and acquires the frequency at which a customer was interested in but did not purchase a merchandise item." [0039] "The analysis server 2 performs analysis regarding the state of merchandise evaluation by the customers in the store. The analysis server 2 is constituted of a PC or the like. Note that other than being installed in the store, the analysis server 2 may be a cloud computer." These passages, as well as others, make it clear that the invention is not directed to a technical improvement. When the claims are considered individually and as a whole, the additional elements noted above appear to merely apply the abstract concept to a technical environment in a very general sense, i.e., a generic computer receives information from another generic computer, processes the information, and then sends information back. The most significant elements of the claims, that is, the elements that really outline the inventive elements of the claims, are set forth in the elements identified as an abstract idea. The fact that the generic computing devices are facilitating the abstract concept is not enough to confer statutory subject matter eligibility.
Dependent claims 2-6 are directed to managing of customer interactions. This process is similar to the abstract idea noted in the independent claims because these claims further limit the independent claims, which are directed to a method of organizing human activity that includes managing personal behavior such as social activities and following rules or instructions. Accordingly, these claim elements do not serve to confer subject matter eligibility to the claims, since they are directed to abstract ideas.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-3, 6, and 7 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by United States Patent Publication US 20190104866, Kobayashi, et al.

Referring to Claim 1, Kobayashi teaches a store operation support device provided with a processor which executes a process of performing, based on camera images of persons staying in front of exhibition areas in a store, an analysis regarding a merchandise evaluation state of the persons and presenting a result of the analysis to a user, wherein the processor (Kobayashi: Sec. 0057, The storage unit 280 is realized using a storage device provided in the image display device 200. The storage device provided in the image display device 200 may be a storage device built into the image display device 200 or a storage device externally attached to the image display device 200. Kobayashi: Sec. 0058, The controller 290 controls each part of the image display device 200 to perform various processes.
The controller 290 is realized, for example, by a central processing unit (CPU) provided in the image display device 200 reading and executing a program from the storage unit 280.) detects persons from the camera images and identifies persons to be analyzed ( Kobayashi: Sec. 0025, The image display system 1 analyzes a customer's shelf-front behavior and displays the analysis result such that the analysis result is superimposed on an image of shelves) detects behaviors of the persons from the camera images, acquires behavior information of each person to be analyzed, in association with merchandise items, and accumulates the behavior information of each person in a storage ( Kobayashi: Sec. 0025, The image display system 1 analyzes a customer's shelf-front behavior and displays the analysis result such that the analysis result is superimposed on an image of shelves. The shelf-front referred to here is the front of an item display shelves (in particular, the vicinity of the front face of the item display shelves), and the shelf-front behavior referred to here is a behavior performed by the customer in front of the item display shelves. Hereinafter, the item display shelf is simply referred to as a shelf. Kobayashi: Sec. 0036, The image display device 200 analyzes a shelf-front behavior on the basis of sensing data from the shelf-front behavior measurement sensor(s) 110. Kobayashi: Sec. 0095, As a result, the image display system 1 can display information indicating the customer's behavior in more detail. In particular, the image display system 1 can display information indicating the customer's behavior for each item in more detail than that for each display shelf. 
By referring to this display, the user can perform more sophisticated analysis of customers' behavior.), generates, based on the behavior information accumulated in the storage, the merchandise evaluation information including at least a time needed for evaluation of each merchandise item, and accumulates the merchandise evaluation information for each person in the storage ( Kobayashi: Sec. 0054, On the basis of an image captured by the shelf situation imaging device 120, the image display device 200 detects a period from when the customer stops in front of the shelves to when leaving the shelves and adds the same group number to information (a combination of the sensing time and the item identification information) which is based on the sensing data that the shelf-front behavior measurement sensor 110 transmitted during this period. This makes it possible to detect that one customer has extended their hand to the shelves 910 a plurality of times as shown in the example of FIG. 5. Kobayashi: Sec. 0055, the information stored in the storage unit 280 is not limited to that shown in the example of FIG. 6 in which the sensing time and the item identification information are associated with each other. For example, the storage unit 280 may store sensing data from the shelf-front behavior measurement sensor 110 and time information indicating the sensing time in association with each other. Alternatively, the storage unit 280 may store information, which is obtained by converting the sensing data from the shelf-front behavior measurement sensor 110 into positions in the vertical and horizontal directions of the shelves 910, in the form of coordinate values. Alternatively, the storage unit 280 may store information indicating, for each item 920, the number of times that the customer has extended their hand to the item.), and Kobayashi describes storing the time period for when a customer is evaluating an item, which is equivalent to the Applicants spec. at 0046. 
acquires, based on the merchandise evaluation information accumulated in the storage, an analysis result in which the merchandise evaluation state corresponding to each merchandise item is visualized (Kobayashi: Sec. 0041, FIG. 4 is an explanatory diagram showing an example of display of the customer behavior index value by the display unit 220. In the example shown in FIG. 4, the display unit 220 displays an image of shelves 910 in which items 920 are arranged in an area A11. The display unit 220 displays images of the items in colors according to the number of times that customers have extended their hands to each of the items. Kobayashi: Sec. 0041, FIG. 5 is an explanatory diagram showing an example of display, by the display unit 220, of a customer behavior index value indicating a correlation between a behavior that a customer has performed for an item designated by the user of the image display system 1 and a behavior that the same customer has performed for an item other than the designated item on the display unit 220. In the example of FIG. 5, the display unit 220 displays an image of the shelves 910 in which items 920 are arranged in an area A11, similar to the example of FIG. 4. The display unit 220 also displays a legend indicating the association between the number of times that customers have extended their hands to each item and the color in an area A12.). Kobayashi describes based on customer and item evaluation presenting a visual representation, which is equivalent to the Applicants spec. at 0027.

Referring to Claim 2, Kobayashi teaches the store operation support device according to claim 1, wherein when, based on feature information of a person detected from the camera image, the processor determines that the person is a store clerk, the processor excludes the person from an analysis target (Kobayashi: Sec.
0106, The image display system 1 may display information other than the shelf-front behavior information together with the display of the shelf-front behavior index value described above. For example, the storage unit 280 may previously store information indicating whether each customer is a member or not and the image display system 1 may extract and display information such as information indicating that people who are members are likely to pick up the item as the shelf-front behavior index value.). Kobayashi describes distinguishing among various types of people, which the Examiner interprets as including clerks.

Referring to Claim 3, Kobayashi teaches the store operation support device according to claim 1, wherein the processor detects, as a behavior related to merchandise evaluation by each person, an item holding behavior and an item gazing behavior, and acquires the behavior information including a detection result thereof (Kobayashi: Sec. 0029, comparing images before and after the customer extends their hand to the shelves, the image display device 200 can detect that the customer has picked up an item and that the customer has returned the item picked up by the customer to the shelves Kobayashi: Sec. 0054, On the basis of an image captured by the shelf situation imaging device 120, the image display device 200 detects a period from when the customer stops in front of the shelves to when leaving the shelves and adds the same group number to information (a combination of the sensing time and the item identification information) which is based on the sensing data that the shelf-front behavior measurement sensor 110 transmitted during this period. This makes it possible to detect that one customer has extended their hand to the shelves 910 a plurality of times as shown in the example of FIG. 5. Kobayashi: Sec. 0055, the information stored in the storage unit 280 is not limited to that shown in the example of FIG.
6 in which the sensing time and the item identification information are associated with each other. For example, the storage unit 280 may store sensing data from the shelf-front behavior measurement sensor 110 and time information indicating the sensing time in association with each other. Alternatively, the storage unit 280 may store information, which is obtained by converting the sensing data from the shelf-front behavior measurement sensor 110 into positions in the vertical and horizontal directions of the shelves 910, in the form of coordinate values. Alternatively, the storage unit 280 may store information indicating, for each item 920, the number of times that the customer has extended their hand to the item. Kobayashi: Sec. 0102, The display unit 220 may display a period of time during which a customer picked up an item. For example, the display unit 220 may display an average period of time per pickup of an item by a customer, using each of a period of time from when the customer picks up the item to when returning the item and a period of time from when the customer picks up the item to when leaving the shelf as the period of time during which the customer picked up the item.)

Referring to Claim 6, Kobayashi teaches the store operation support device according to claim 1, wherein based on the behavior information, the processor acquires, as the merchandise evaluation information, a number of times of item holding, an item gazing time, and a number of held items, and, based on the number of times of item holding, the item gazing time, and the number of held items, acquires a merchandise evaluation degree which quantifies a degree of undecidedness of each person in merchandise evaluation (Kobayashi: Sec. 0037, The image display device 200 stores the arrangement of the items 920 on the shelves 910 in advance and estimates an item to which the customer has extended the hand on the basis of the position of the hand of the customer.
For example, for each item, the image display device 200 counts the number of times that customers have extended their hands to the item and displays the count result (for example, a total count for all customers within a predetermined period) such that the count result is superimposed on the image of the shelves 910). Kobayashi describes storing a number of times of item holding. Kobayashi: Sec. 0045, FIG. 5 is an explanatory diagram showing an example of display, by the display unit 220, of a customer behavior index value indicating a correlation between a behavior that a customer has performed for an item designated by the user of the image display system 1 and a behavior that the same customer has performed for an item other than the designated item on the display unit 220. In the example of FIG. 5, the display unit 220 displays an image of the shelves 910 in which items 920 are arranged in an area A11, similar to the example of FIG. 4. The display unit 220 also displays a legend indicating the association between the number of times that customers have extended their hands to each item and the color in an area A12. Kobayashi: Sec. 0047, FIG. 5, the display unit 220 displays the count result of the number of times that customers who extended their hands to the designated items (one or more times) have extended their hands to each item 920 other than the designated items (for example, a total count for all customers within a predetermined period) in a heat map format. Specifically, the display unit 220 displays an image of each item 920 other than the designated items in a color corresponding to the number of times that customers who extended their hands to the designated items (one or more times) have extended their hands to the item 920 other than the designated items. In the example of FIG. 
5, customers who extended their hands to the items (designated items) displayed in the area A21 have extended their hands to an item displayed in an area A22 many times and therefore the display unit 220 displays an image of the item 920 in the area A22 in a color indicating that the number of times that the customers have extended their hands to the item is large. Kobayashi describes storing a number of times of item holding and a number of held items Kobayashi: Sec. 0055, the information stored in the storage unit 280 is not limited to that shown in the example of FIG. 6 in which the sensing time and the item identification information are associated with each other. For example, the storage unit 280 may store sensing data from the shelf-front behavior measurement sensor 110 and time information indicating the sensing time in association with each other. Alternatively, the storage unit 280 may store information, which is obtained by converting the sensing data from the shelf-front behavior measurement sensor 110 into positions in the vertical and horizontal directions of the shelves 910, in the form of coordinate values. Alternatively, the storage unit 280 may store information indicating, for each item 920, the number of times that the customer has extended their hand to the item.), and Kobayashi describes storing the item gazing time, i.e. the time period for when a customer is evaluating an item, which is equivalent to the Applicants spec. at 0046. Kobayashi: Sec. 0106, The image display system 1 may display information other than the shelf-front behavior information together with the display of the shelf-front behavior index value described above. For example, the storage unit 280 may previously store information indicating whether each customer is a member or not and the image display system 1 may extract and display information such as information indicating that people who are members are likely to pick up the item as the shelf-front behavior index value. 
Kobayashi describes based on stored information determining the likelihood of a customer getting an item. Claim 7 recites limitations that stand rejected via the art citations and rationale applied to claim 1.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 4 and 5 are rejected under 35 U.S.C. 103 as being unpatentable over United States Patent Publication US 20190104866, Kobayashi, et al., in view of United States Patent Publication US 20190102612, Takemoto, et al.

Referring to Claim 4, Kobayashi teaches the store operation support device according to claim 1, wherein the processor outputs the analysis result including a map image in which an image visualizing the merchandise evaluation information for each exhibition area is depicted on an image representing a layout in the store (Kobayashi: Sec. 0041, FIG. 4 is an explanatory diagram showing an example of display of the customer behavior index value by the display unit 220. In the example shown in FIG. 4, the display unit 220 displays an image of shelves 910 in which items 920 are arranged in an area A11. The display unit 220 displays images of the items in colors according to the number of times that customers have extended their hands to each of the items. Kobayashi: Sec. 0041, FIG.
5 is an explanatory diagram showing an example of display, by the display unit 220, of a customer behavior index value indicating a correlation between a behavior that a customer has performed for an item designated by the user of the image display system 1 and a behavior that the same customer has performed for an item other than the designated item on the display unit 220. In the example of FIG. 5, the display unit 220 displays an image of the shelves 910 in which items 920 are arranged in an area A11, similar to the example of FIG. 4. The display unit 220 also displays a legend indicating the association between the number of times that customers have extended their hands to each item and the color in an area A12.). Kobayashi describes based on customer and item evaluation presenting a visual representation, which is equivalent to the Applicants spec. at 0027. Kobayashi does not explicitly teach image representing a layout in the store. However, Takemoto teaches image representing a layout in the store ( Takemoto: Sec. 0012, FIG. 2 is a store plan diagram illustrating a layout of a store and an installation state of cameras 1. Takemoto: Sec. 0013, FIG. 3A is an explanatory diagram illustrating a target area which is set on a store map image and a state in which a digest image of the target area is superimposed on the store map image. Takemoto: Sec. 0014, FIG. 3B is an explanatory diagram illustrating the target area which is set on the store map image and a state in which the digest image of the target area is superimposed on the store map image.) Kobayashi and Takemoto are both directed to the analysis of customers in a store environment (See Kobayashi at 0025-0029; Takemoto at 0002, 0057, 0060). Kobayashi discloses that additional elements, such customer behavior can be considered (See Kobayashi at 0026). 
It would have been obvious for one having ordinary skill in the art before the effective filing date of the claimed invention to have modified Kobayashi, which teaches analyzing customer shelf-front behavior in a store, in view of Takemoto, to efficiently apply analysis of customers in a store environment, enhancing the capability of determining customer positions within a store. (See Takemoto at 0091, 0098, 0186).

Referring to Claim 5, Kobayashi teaches the store operation support device according to claim 4. Kobayashi does not explicitly teach wherein the processor outputs the analysis result including the camera image corresponding to the exhibition area selected by an operation of the user to select the exhibition area on a screen displaying the map image. However, Takemoto teaches wherein the processor outputs the analysis result including the camera image corresponding to the exhibition area selected by an operation of the user to select the exhibition area on a screen displaying the map image (Takemoto: Sec. 0023, FIG. 12 is an explanatory diagram illustrating the relevant information display screen which is displayed in a case where a display item of a camera image is selected. Takemoto: Sec. 0023, In addition, in a case where an operation of selecting (clicking) the target area is performed on the map display screen, display item selection menu (selector) 125 is displayed. Display item selection menu 125 includes the time-series heat map, the histogram, the graph, the camera image, the merchandise exhibition state, and respective correlated display items. In a case where one of them is selected, transition is performed to the relevant information display screen (refer to FIG. 9 to FIG. 14). Takemoto: Sec. 0136, In addition, in a case where an operation of selecting (clicking) the target area is performed on the map display screen, display item selection menu (selector) 125 is displayed.
Display item selection menu 125 includes the time-series heat map, the histogram, the graph, the camera image, the merchandise exhibition state, and respective correlated display items. In a case where one of them is selected, transition is performed to the relevant information display screen (refer to FIG. 9 to FIG. 14). In addition, display item selection menu 125 includes a densified display item. In a case where the densified display item is selected, transition is performed to the map display screen (refer to FIG. 15) which displays heat map image 161 acquired by densifying digest image 62.).

Kobayashi and Takemoto are both directed to the analysis of customers in a store environment (See Kobayashi at 0025-0029; Takemoto at 0002, 0057, 0060). Kobayashi discloses that additional elements, such as customer behavior, can be considered (See Kobayashi at 0026). It would have been obvious for one having ordinary skill in the art before the effective filing date of the claimed invention to have modified Kobayashi in view of Takemoto, to efficiently apply analysis of customers in a store environment, enhancing the capability of determining customer positions within a store. (See Takemoto at 0091, 0098, 0186).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Higa et al., WO Pub. 2019171574 (discussing detecting the behavior of customers in a store environment).

Cai et al., U.S. Pub. 20150269642 (discussing capturing and analyzing imaging of customers shopping).

Lu et al., A Video-Based Automated Recommender VAR System for Garments, https://www.jstor.org/stable/44012166?seq=1, Proceedings of the 2nd ACM SIGCOMM workshop on Green networking, 2011 (discussing capturing and analyzing imaging of customers to assist with shopping).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to UCHE BYRD, whose telephone number is (571) 272-3113. The examiner can normally be reached Mon.-Fri. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Patricia Munson, can be reached at (571) 270-5396. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/UCHE BYRD/
Examiner, Art Unit 3624

Prosecution Timeline

Dec 06, 2023
Application Filed
May 30, 2025
Non-Final Rejection — §101, §102, §103
Aug 29, 2025
Response Filed
Dec 12, 2025
Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12499469
DATA ANALYSIS TO DETERMINE OFFERS MADE TO CREDIT CARD CUSTOMERS
2y 5m to grant Granted Dec 16, 2025
Patent 12499460
INFORMATION DELIVERY METHOD, APPARATUS, AND DEVICE, AND COMPUTER-READABLE STORAGE MEDIUM
2y 5m to grant Granted Dec 16, 2025
Patent 12282930
USING A PICTURE TO GENERATE A SALES LEAD
2y 5m to grant Granted Apr 22, 2025
Patent 12236377
METHOD AND SYSTEM FOR SWITCHING AND HANDOVER BETWEEN ONE OR MORE INTELLIGENT CONVERSATIONAL AGENTS
2y 5m to grant Granted Feb 25, 2025
Patent 12147927
Machine Learning System and Method for Predicting Caregiver Attrition
2y 5m to grant Granted Nov 19, 2024
Study what changed in these applications to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
23%
Grant Probability
51%
With Interview (+27.9%)
4y 8m
Median Time to Grant
Moderate
PTA Risk
Based on 350 resolved cases by this examiner. Grant probability derived from career allow rate.
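The "With Interview" figure is consistent with the base grant probability plus the interview lift (23% + 27.9% ≈ 51%). A hedged sketch of that arithmetic; whether the product's lift is actually additive, and the cap at 100%, are assumptions made for illustration:

```python
# Hedged sketch of the "With Interview" projection. The 23% base and
# +27.9% lift come from the cards above; the additive model and the
# 100% cap are assumptions, not the product's documented method.
def with_interview(base_pct: float, lift_pct: float) -> float:
    """Interview-adjusted grant probability, assuming an additive lift."""
    return min(base_pct + lift_pct, 100.0)

adjusted = with_interview(base_pct=23.0, lift_pct=27.9)
print(f"{adjusted:.1f}% -> displayed as {round(adjusted)}%")  # 50.9% -> displayed as 51%
```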
