Prosecution Insights
Last updated: April 19, 2026
Application No. 17/641,060

TRACKING SYSTEM FOR IDENTIFICATION OF SUBJECTS

Final Rejection §103
Filed
Mar 07, 2022
Examiner
TAYLOR, MEREDITH IREENE DUPAI
Art Unit
2671
Tech Center
2600 — Communications
Assignee
The Johns Hopkins University
OA Round
4 (Final)
Grant Probability: 67% (Favorable)
OA Rounds: 5-6
To Grant: 3y 6m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 67% (above average; 33 granted / 49 resolved; +5.3% vs TC avg)
Interview Lift: +54.3% (strong), comparing resolved cases with an interview against those without
Typical Timeline: 3y 6m average prosecution; 27 applications currently pending
Career History: 76 total applications across all art units
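For readers who want to reproduce the panel figures above from raw case data, the sketch below shows one way to compute a career allow rate and an interview lift. The record shape (`granted`, `had_interview`) and the difference-of-rates definition of lift are assumptions for illustration, not this report's actual data model.

```python
# Illustrative only: hypothetical record shape, not this dashboard's actual pipeline.
from dataclasses import dataclass

@dataclass
class ResolvedCase:
    granted: bool        # application issued as a patent
    had_interview: bool  # an examiner interview was held during prosecution

def _allow_rate(cases: list["ResolvedCase"]) -> float:
    return sum(c.granted for c in cases) / len(cases) if cases else 0.0

def examiner_metrics(cases: list["ResolvedCase"]) -> dict[str, float]:
    """Career allow rate, plus the allowance lift for cases that had an interview."""
    with_iv = [c for c in cases if c.had_interview]
    without_iv = [c for c in cases if not c.had_interview]
    return {
        "career_allow_rate": _allow_rate(cases),                           # e.g. 33/49 ≈ 0.67
        "interview_lift": _allow_rate(with_iv) - _allow_rate(without_iv),  # e.g. +0.543
    }
```

With 33 grants across 49 resolved cases, `career_allow_rate` comes out to roughly 0.67, matching the panel above.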

Statute-Specific Performance

§101: 8.7% (-31.3% vs TC avg)
§103: 56.7% (+16.7% vs TC avg)
§102: 15.8% (-24.2% vs TC avg)
§112: 15.8% (-24.2% vs TC avg)
Tech Center averages are estimates • Based on career data from 49 resolved cases
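One plausible reading of the per-statute figures is the share of this examiner's rejections citing each statute, compared against a Tech Center baseline. That reading is an assumption; the sketch below only illustrates the arithmetic, with hypothetical inputs.

```python
# Illustrative only: assumes the panel percentages are the share of this examiner's
# rejections citing each statute, compared to a supplied Tech Center baseline.
# The input format and baseline values are hypothetical.
from collections import Counter

def statute_breakdown(rejection_statutes: list[str],
                      tc_baseline: dict[str, float]) -> dict[str, tuple[float, float]]:
    """Return {statute: (share_percent, delta_vs_tc_percent)}."""
    counts = Counter(rejection_statutes)
    total = sum(counts.values()) or 1
    result = {}
    for statute, n in counts.items():
        share = 100.0 * n / total
        result[statute] = (round(share, 1), round(share - tc_baseline.get(statute, 0.0), 1))
    return result

# Example with made-up inputs:
# statute_breakdown(["§103", "§103", "§102", "§112"],
#                   {"§103": 40.0, "§102": 40.0, "§112": 40.0})
```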

Office Action

§103
DETAILED ACTION

Response to Arguments

Applicant has amended claims 1 and 26, with claims 1-16 and 26-29 currently pending. Applicant's arguments, filed 08/21/2025, with respect to the rejection(s) of claim(s) 1-3, 9, 11, 14-16, and 26 under 35 U.S.C. 102 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of newly found reference de Chaumont (de Chaumont F, Ey E, Torquet N, Lagache T, Dallongeville S, Imbert A, Legou T, Le Sourd AM, Faure P, Bourgeron T, Olivo-Marin JC. Live Mouse Tracker: real-time behavioral analysis of groups of mice. BioRxiv. 2018 Jun 14:345132.). Therefore this action is made Final.

Claim Interpretation

Claim 10 recites the limitation "substantially real time," which includes a term of degree. However, ¶[0064] of applicant's specification defines "substantially" as "frame-by-frame, as frames are received, before a next frame is received, and/or the like."

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 9-11, 14-16, and 26 are rejected under 35 U.S.C. 103 as being unpatentable over Unger (Unger J, Mansour M, Kopaczka M, Gronloh N, Spehr M, Merhof D. An unsupervised learning approach for tracking mice in an enclosed area. BMC Bioinformatics. 2017 Dec;18:1-4.) in view of de Chaumont (de Chaumont F, Ey E, Torquet N, Lagache T, Dallongeville S, Imbert A, Legou T, Le Sourd AM, Faure P, Bourgeron T, Olivo-Marin JC. Live Mouse Tracker: real-time behavioral analysis of groups of mice. BioRxiv. 2018 Jun 14:345132.).

Regarding claim 1, Unger discloses A method performed by one or more processors, comprising: (Unger Section Discussion – found on p. 11; the method is implemented in MATLAB using an Intel i5 at 3.3 GHz.)

identifying, in a first frame of a video feed captured by a camera and using a first computer vision technique, a first subject based on a plurality of reference points of the first subject; (Unger Fig. 3 and Section Initializing the learning process – found on p. 5; foreground/background separation is performed, and then shape matching is performed with the right and left ears, nose, and tail as reference points.)

determining whether the first subject is merged with a second subject in a second frame of the video feed; (Unger Fig. 3, Section Occlusion events: separation of individuals – found on p. 6, and Section Initialization and adaptation of the ASM – found on p. 7; when subjects are close together, the segmentation covers both individuals and they need to be separated using shape and image information. When subjects overlap/occlude each other (i.e., the subjects are merged), the algorithm only uses landmark points outside the overlapping area. Therefore the algorithm determines whether the subjects overlap. See also Fig. 7.)
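As an illustrative aside (not part of the Office Action text, and not Unger's or the applicant's actual algorithm), the kind of merge determination the examiner maps to Unger's occlusion handling can be sketched as a connected-component test on the foreground mask: if two tracked subjects fall inside the same foreground blob, they are treated as merged for that frame.

```python
# Illustrative sketch only, not the claimed method or Unger's algorithm:
# treat two subjects as "merged" in a frame when their last known positions
# fall inside the same connected component of the binary foreground mask.
import numpy as np
from scipy import ndimage

def subjects_merged(foreground_mask: np.ndarray,
                    pos_a: tuple[int, int],
                    pos_b: tuple[int, int]) -> bool:
    """foreground_mask: 2-D bool array; pos_a/pos_b: (row, col) subject centers."""
    labels, _ = ndimage.label(foreground_mask)
    label_a = labels[pos_a]
    label_b = labels[pos_b]
    # Both subjects lie on foreground and share one blob -> merged/occluded.
    return label_a != 0 and label_a == label_b
```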
selectively processing to identify the first subject in the second frame using the first computer vision technique, or using a second computer vision technique, based on whether the first subject is merged with the second subject in the second frame, (Unger Section Initialization and adaptation of the ASM – found on p. 7; when subjects overlap/occlude each other (i.e., the subjects are merged), the algorithm only uses landmark points outside the overlapping area. That is, a second computer vision technique is used because not all reference points are used in the model. See also Fig. 7.)

wherein the second computer vision technique is based on a shape context of the first subject; (Unger Section Identity preservation – found on p. 7; identity is assigned based on the maximum overlap between shapes of successive frames (i.e., the context of the shape).)

wherein the shape context includes histograms of points on contours of a shape of the first subject that are obtained by mapping an inner distance and an inner angle, and wherein the inner distance is defined as a length of the shortest path connecting two points within the shape; (Unger Section Shape matching – found on p. 5-6; a distribution of matching contour points represented by the log-polar histogram is disclosed. The last paragraph of the section discloses using the inner-distance and the shortest path. Further, the instant application utilizes the same equations; see ¶65-67 of the originally filed specification.)

determining, based on selectively processing to identify, that an identity of at least one of the first subject or the second subject is not identifiable; (Unger Section Results – Tracking performance ¶2 – found on p. 9; identity switches are likely to happen during/after mice contact/overlap, so identities are manually fixed after such interactions. Therefore, during an occlusion it is determined that the identities are not accurate/are not identifiable.)

determining log information associated with the first subject or the second subject based on identifying the first subject in the first frame and the second frame; and (Unger Section Background, paragraphs 5-6 – found on p. 2; tracking of individuals and specific conditions and social interactions are identified (log information), and the method is compared to previous methods.)

storing or providing the log information. (Unger Section Tracking performance, paragraphs 2-3 – found on p. 8-9; the tracking information (part of the logged information) is compared to the MiceProfiler method. Therefore the log information was provided to the comparison.)

Unger does not explicitly disclose backpropagating one or more identities, associated with the identity of at least one of the first subject or the second subject, to older frames of the video feed in which the identity have been accurately determined. De Chaumont, however, discloses backpropagating one or more identities, associated with the identity of at least one of the first subject or the second subject, to older frames of the video feed in which the identity have been accurately determined. (de Chaumont Section Automatic tracking quality control based on RFID reading – found on p. 29 of the attached pdf, which is p. 11 of the supplementary methods; identities are verified using RFID tags. If the identity needs correction, it can be propagated back in time.)
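Again as an illustrative aside rather than part of the Office Action: the inner-distance shape context recited in claim 1 (per-point histograms over inner distance and inner angle, where the inner distance is the shortest path between two contour points that stays inside the shape) can be sketched as below. This is a generic re-implementation of the technique under assumed inputs (a simple polygonal contour, with the inner angle simplified to the direction of the first hop of the inner path), not the applicant's specification equations (¶65-67) or Unger's code; it uses shapely and networkx.

```python
# Illustrative sketch only (assumed simple polygonal contour); not the applicant's
# equations or Unger's implementation.
import numpy as np
import networkx as nx
from shapely.geometry import Polygon, LineString

def inner_distance_shape_context(contour, n_dist_bins=5, n_angle_bins=12):
    """One (n_dist_bins x n_angle_bins) histogram per contour point.

    Inner distance: length of the shortest path between two contour points
    that stays inside the shape.  Inner angle (simplified here): direction of
    the first edge of that path, measured in the image frame.
    """
    pts = [tuple(p) for p in contour]
    poly = Polygon(pts)
    n = len(pts)
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            seg = LineString([pts[i], pts[j]])
            if poly.covers(seg):          # segment stays inside or on the shape
                g.add_edge(i, j, weight=seg.length)
    dist = dict(nx.all_pairs_dijkstra_path_length(g, weight="weight"))
    path = dict(nx.all_pairs_dijkstra_path(g, weight="weight"))
    hists = []
    for i in range(n):
        d, a = [], []
        for j in range(n):
            if j == i or j not in dist[i]:
                continue
            d.append(dist[i][j])
            first_hop = path[i][j][1]
            v = np.subtract(pts[first_hop], pts[i])
            a.append(np.arctan2(v[1], v[0]))
        logd = np.log(np.asarray(d))
        h, _, _ = np.histogram2d(
            logd, np.asarray(a),
            bins=[n_dist_bins, n_angle_bins],
            range=[[logd.min(), logd.max() + 1e-9], [-np.pi, np.pi]],
        )
        hists.append(h)                   # log-polar histogram for point i
    return hists
```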
It would have been obvious, before the effective filing date of the claimed invention, to one of ordinary skill in the art to modify the method of Unger with the teachings of de Chaumont by including automatic identity checks using RFID from de Chaumont in order to reduce the need for human intervention by automating identity checks, especially after mouse interactions and occlusions.

Regarding claim 26, it is the device claim corresponding to claim 1, and the rejection is incorporated herein. The combination of Unger and de Chaumont further discloses A device, comprising: one or more memories; and one or more processors, coupled to the one or more memories, (Unger Section Discussion – found on p. 11; the method implemented in MATLAB on an Intel i5 with 16 GB of memory is disclosed.)

Regarding claim 2, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 1, as described above. They further disclose wherein the first subject and the second subject are laboratory animals. (Unger Section Animals – found on p. 2; lab mice are disclosed.)

Regarding claim 3, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 1, as described above. They further disclose wherein the first subject in the second frame is identified based on a plurality of reference points that include or are based on at least one of: a head point, a tail point, or one or more ear tags. (Unger Fig. 3, box (a) Preprocessing, shows landmarks of the nose (a head point), tail, and right and left ears.)

Regarding claim 9, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 1, as described above. They further disclose wherein the log information indicates at least one of: a distance moved by the first subject, a position of the first subject, a pose of the first subject, a speed of the first subject, a social behavior of the first subject, or a feeding behavior of the first subject. (Unger Section Background, paragraphs 5-6 – found on p. 2; tracking of individuals (positions of subjects and distances moved) and specific conditions and social interactions are identified/logged (social behavior). Fig. 5 shows the shape (pose) information. Fig. 2 indicates speed and distances of subjects as an indication of their behavior.)

Regarding claim 10, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 1, as described above. They further disclose wherein the first computer vision technique is performed in real time or substantially real time. (de Chaumont Section Discussion ¶2 – found on p. 14-15; tracking mice in real time is disclosed.) It would have been obvious, before the effective filing date of the claimed invention, to one of ordinary skill in the art to optimize the method of the combination of Unger and de Chaumont as in de Chaumont by performing the method in real time in order to reduce processing time.

Regarding claim 11, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 1, as described above. They further disclose further comprising: providing, to a management device associated with the one or more processors, at least a segment of the video feed. (Unger Section Tracking performance, paragraph 2 – found on p. 9; identity switches were corrected for all precision evaluations for USM (the currently disclosed method). Therefore, segments of the video feed were provided to be able to make the corrections.)
Regarding claim 14, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 1, as described above. They further disclose wherein the second computer vision technique is based on an inner distance shape context calculation regarding at least one of the first subject or the second subject. (Unger Section Shape matching, paragraph 2 – found on p. 6; an inner-distance helps to increase tracking accuracy.)

Regarding claim 15, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 1, as described above. They further disclose wherein the log information includes information regarding a social interaction between the first subject and the second subject. (Unger Section Automatic recognition of behavioral states – found on p. 9-11; different interactions between mice are logged. Fig. 2 lists the logged states.)

Regarding claim 16, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 1, as described above. They further disclose wherein the first subject and the second subject are included in a plurality of subjects, and wherein the method further comprises: (Unger Sections Video data and Manual annotation – found on p. 2-3; the videos show interactions between two mice.) identifying each subject, of the plurality of subjects, in the video feed; and (Unger Section Identity preservation – found on p. 7; each mouse has an active shape model built for tracking, and its identity is kept track of. See Fig. 7, where different colors represent the identities of the mice.) storing log information identifying each subject of the plurality of subjects and including information regarding the plurality of subjects. (Unger Section Tracking performance, paragraphs 2-4 – found on p. 9; the log information from the disclosed method is compared to MiceProfiler, including identity switches. Therefore each subject's information is logged, including its identity.)

Claims 4-7 and 27 are rejected under 35 U.S.C. 103 as being unpatentable over Unger (Unger J, Mansour M, Kopaczka M, Gronloh N, Spehr M, Merhof D. An unsupervised learning approach for tracking mice in an enclosed area. BMC Bioinformatics. 2017 Dec;18:1-4.) in view of de Chaumont (de Chaumont F, Ey E, Torquet N, Lagache T, Dallongeville S, Imbert A, Legou T, Le Sourd AM, Faure P, Bourgeron T, Olivo-Marin JC. Live Mouse Tracker: real-time behavioral analysis of groups of mice. BioRxiv. 2018 Jun 14:345132.) and Salem (Pub. No. US20160150758A1).

Regarding claim 4, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 3, as described above. The combination of Unger and de Chaumont does not explicitly disclose wherein the first subject is differentiated from the second subject based on which ear tags, of the one or more ear tags, are affixed to the first subject and the second subject. Salem, however, discloses wherein the first subject is differentiated from the second subject based on which ear tags, of the one or more ear tags, are affixed to the first subject and the second subject. (Salem ¶[0058]; ear tags are used to help identify and distinguish animals housed together.) It would have been obvious, before the effective filing date of the claimed invention, to one of ordinary skill in the art to modify the method of the combination of Unger and de Chaumont with the teachings of Salem by including ear tags from Salem in order to help identify animals housed together (Salem ¶[0058]).
Regarding claim 5, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 3, as described above. Unger does not explicitly disclose wherein the one or more ear tags are observable by the camera. Salem, however, discloses wherein the one or more ear tags are observable by the camera. (Salem ¶[0058]; ear tags are readily detectable by the cameras.) It would have been obvious, before the effective filing date of the claimed invention, to one of ordinary skill in the art to modify the method of the combination of Unger and de Chaumont with the teachings of Salem by including ear tags from Salem in order to help identify animals housed together (Salem ¶[0058]).

Regarding claim 6, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 1, as described above. Unger further discloses using a Panasonic WV-CP480 camera (Unger Section Experimental setup), but does not explicitly disclose wherein the camera is associated with a wide angle lens. Salem, however, discloses wherein the camera is associated with a wide angle lens. (Salem ¶[0054]; fisheye lenses (wide angle lenses) for use in monitoring animal cages are disclosed.) It would have been obvious, before the effective filing date of the claimed invention, to one of ordinary skill in the art to modify the method of the combination of Unger and de Chaumont with the teachings of Salem by including a fisheye lens from Salem in order to provide a view sufficiently wide to capture images of the outer edges of the entire cage volume (Salem ¶[0054]).

Regarding claim 7, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 1, as described above. De Chaumont discloses utilizing infrared imaging (de Chaumont Section Introduction ¶3 – found on p. 2); however, the combination of Unger and de Chaumont does not explicitly disclose wherein the camera captures images in a near-infrared range, and wherein the first subject and the second subject are illuminated using near-infrared light. Salem, however, discloses wherein the camera captures images in a near-infrared range, and wherein the first subject and the second subject are illuminated using near-infrared light. (Salem ¶[0036]; infrared light is used for mice.) It would have been obvious, before the effective filing date of the claimed invention, to one of ordinary skill in the art to modify the method of the combination of Unger and de Chaumont with the teachings of Salem by including infrared light as in Salem in order to use a light that does not significantly affect the circadian rhythm of mice (Salem ¶[0036]).

Regarding claim 27, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 26, as described above. The combination of Unger and de Chaumont does not explicitly disclose wherein the first subject is differentiated from the second subject based on a plurality of tags affixed on the first subject and the second subject. Salem, however, discloses wherein the first subject is differentiated from the second subject based on a plurality of tags affixed on the first subject and the second subject.
(Salem ¶[0058]; ear tags are used to help identify and distinguish animals housed together.) It would have been obvious, before the effective filing date of the claimed invention, to one of ordinary skill in the art to modify the method of the combination of Unger and de Chaumont with the teachings of Salem by including ear tags from Salem in order to help identify animals housed together (Salem ¶[0058]).

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Unger (Unger J, Mansour M, Kopaczka M, Gronloh N, Spehr M, Merhof D. An unsupervised learning approach for tracking mice in an enclosed area. BMC Bioinformatics. 2017 Dec;18:1-4.) in view of de Chaumont (de Chaumont F, Ey E, Torquet N, Lagache T, Dallongeville S, Imbert A, Legou T, Le Sourd AM, Faure P, Bourgeron T, Olivo-Marin JC. Live Mouse Tracker: real-time behavioral analysis of groups of mice. BioRxiv. 2018 Jun 14:345132.) and Chen (Patent No. US 9342759B1).

Regarding claim 8, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 1, as described above. They further disclose wherein the first computer vision technique is based on determining respective first outlines and respective first centers of the first subject and (Unger Section Shape catalog, paragraph 1; the midpoint (center) between both ears is found to estimate the viewing direction of the mouse. Fig. 5 shows the viewing direction and outlines of the mice.) the second subject in the first frame and respective second outlines and respective second centers of the first subject and the second subject in the second frame, (Unger Fig. 9; it can be seen in the first column of frames that each mouse has an outline and a viewing direction (calculated using the midpoint/center) indicated before any overlap between subjects occurs (the first computer vision technique, since all points in the model for both mice are used).) Unger discloses assigning identity according to the maximum overlap between shapes of successive frames (Unger Section Identity preservation); however, the combination of Unger and de Chaumont does not explicitly disclose wherein identifying the first subject in the second frame is based on a distance between the first center of the first subject and the second center of the first subject being smaller than a distance between the first center of the first subject and the second center of the second subject. Chen, however, discloses wherein identifying the first subject in the second frame is based on a distance between the first center of the first subject and the second center of the first subject being smaller than a distance between the first center of the first subject and the second center of the second subject. (Chen Col. 6, lines 27-41 and Fig. 2; the detection that is closest between frames, closeness being defined as the distance between the centers of the detections, is assigned the same identity.) It would have been obvious, before the effective filing date of the claimed invention, to one of ordinary skill in the art to modify the method of the combination of Unger and de Chaumont with the teachings of Chen by including the rule that the tracked object be matched to the detected object with the closest distance between frames in order to ensure high confidence and unambiguous matching (Chen Col. 6, lines 27-41).

Claims 12, 13, and 28 are rejected under 35 U.S.C. 103 as being unpatentable over Unger (Unger J, Mansour M, Kopaczka M, Gronloh N, Spehr M, Merhof D. An unsupervised learning approach for tracking mice in an enclosed area. BMC Bioinformatics. 2017 Dec;18:1-4.) in view of de Chaumont (de Chaumont F, Ey E, Torquet N, Lagache T, Dallongeville S, Imbert A, Legou T, Le Sourd AM, Faure P, Bourgeron T, Olivo-Marin JC. Live Mouse Tracker: real-time behavioral analysis of groups of mice. BioRxiv. 2018 Jun 14:345132.) and Mead (Pub. No. WO2004066705A2).

Regarding claim 12, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 1, as described above. The combination of Unger and de Chaumont does not explicitly disclose further comprising: determining that a condition for an interaction associated with the first subject or the second subject is satisfied; and triggering an interaction device to perform the interaction based on the condition for the interaction being satisfied. Mead, however, discloses further comprising: determining that a condition for an interaction associated with the first subject or the second subject is satisfied; and triggering an interaction device to perform the interaction based on the condition for the interaction being satisfied. (Mead p. 4, lines 20-30; a triggering element that causes administration of aerosolized drugs is disclosed.) It would have been obvious, before the effective filing date of the claimed invention, to one of ordinary skill in the art to modify the method of the combination of Unger and de Chaumont with the teachings of Mead by including triggered drug administration from Mead in order to expand the utility of the method to include the study of drug addiction (Mead p. 4, lines 5-19).

Regarding claim 13, the combination of Unger, de Chaumont, and Mead discloses the claim limitations with regards to claim 12, as disclosed above. Mead further discloses wherein the log information includes information that is determined based on the interaction. (Mead p. 13, lines 29-33; the frequency of aerosol dispersions is recorded (logged).) It would have been obvious, before the effective filing date of the claimed invention, to one of ordinary skill in the art to modify the method of the combination of Unger, de Chaumont, and Mead further with the teachings of Mead by including logging of the trigger frequency in order to provide more information when studying drug addiction (Mead p. 4, lines 5-19).

Regarding claim 28, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 26, as disclosed above. The additional limitations correspond to claim 12 and are rejected for similar reasons.

Claim 29 is rejected under 35 U.S.C. 103 as being unpatentable over Unger (Unger J, Mansour M, Kopaczka M, Gronloh N, Spehr M, Merhof D. An unsupervised learning approach for tracking mice in an enclosed area. BMC Bioinformatics. 2017 Dec;18:1-4.) in view of de Chaumont (de Chaumont F, Ey E, Torquet N, Lagache T, Dallongeville S, Imbert A, Legou T, Le Sourd AM, Faure P, Bourgeron T, Olivo-Marin JC. Live Mouse Tracker: real-time behavioral analysis of groups of mice. BioRxiv. 2018 Jun 14:345132.) and Perez-Escudero (Pérez-Escudero A, Vicente-Page J, Hinz RC, Arganda S, De Polavieja GG. idTracker: tracking individuals in a group by automatic identification of unmarked animals. Nature Methods. 2014 Jul;11(7):743-8.).

Regarding claim 29, the combination of Unger and de Chaumont discloses the claim limitations with regards to claim 26, as described above.
The combination of Unger and de Chaumont does not explicitly disclose wherein the one or more processors are further configured to: process each video segment with well identified tracks for each of the first subject and the second subject as an independent stream of video; and produce disjointed logs based on separately analyzing behaviors associated with the first subject and the second subject within each of the independent stream of video.

Perez-Escudero, however, discloses wherein the one or more processors are further configured to: process each video segment with well identified tracks for each of the first subject and the second subject as an independent stream of video; (Perez-Escudero Fig. 2 and Online Methods – Fragments of Trajectories, Selection of images that belong to a single individual, and Collection of reference images; video segments where no subjects are overlapping are grouped. Each fragment is extracted as a collection of single-individual blobs from the video (an independent stream). It can be seen in Fig. 2 that each subject has its own track that is color coded (well identified tracks).) and produce disjointed logs based on separately analyzing behaviors associated with the first subject and the second subject within each of the independent stream of video. (Perez-Escudero Fig. 2; colored tracks for each subject can be seen in Fig. 2. These are disjointed logs showing each subject's movements (behaviors).)

It would have been obvious, before the effective filing date of the claimed invention, to one of ordinary skill in the art to modify the device of the combination of Unger and de Chaumont with the teachings of Perez-Escudero by including fragment trajectories in order to prevent perpetuation of subject identification errors when subjects overlap (Perez-Escudero Abstract).

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MEREDITH TAYLOR, whose telephone number is (571) 270-5805. The examiner can normally be reached M-Th 7:30-5. The examiner's email is Meredith.taylor@uspto.gov. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Vincent Rudolph, can be reached at (571) 272-8243. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center.
Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MEREDITH TAYLOR/
Examiner, Art Unit 2671

/VINCENT RUDOLPH/
Supervisory Patent Examiner, Art Unit 2671

Prosecution Timeline

Mar 07, 2022
Application Filed
Nov 06, 2024
Non-Final Rejection — §103
Jan 27, 2025
Response Filed
Feb 07, 2025
Final Rejection — §103
Apr 23, 2025
Response after Non-Final Action
May 06, 2025
Request for Continued Examination
May 08, 2025
Response after Non-Final Action
May 29, 2025
Non-Final Rejection — §103
Aug 21, 2025
Response Filed
Sep 05, 2025
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12579602
END-TO-END CAMERA CALIBRATION FOR BROADCAST VIDEO
2y 5m to grant • Granted Mar 17, 2026
Patent 12551299
SYSTEM AND METHOD OF UTILIZING COMPUTER-AIDED IDENTIFICATION WITH MEDICAL PROCEDURES
2y 5m to grant • Granted Feb 17, 2026
Patent 12511724
IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND MAGNETIC RESONANCE IMAGING DEVICE
2y 5m to grant • Granted Dec 30, 2025
Patent 12511888
COMPUTER-IMPLEMENTED METHOD OF HANDLING AN EMERGENCY INCIDENT, COMMUNICATION NETWORK, AND EMERGENCY PROCESSING UNIT
2y 5m to grant • Granted Dec 30, 2025
Patent 12505651
Image Identification System and Image Identification Method for Identifying Images Based on Divided Training Images
2y 5m to grant • Granted Dec 23, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 67%
With Interview: 99% (+54.3%)
Median Time to Grant: 3y 6m
PTA Risk: High
Based on 49 resolved cases by this examiner. Grant probability derived from career allow rate.
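The note above says the grant probability is derived from the examiner's career allow rate, with the interview figure reflecting the observed lift. A minimal sketch of one such derivation follows; the additive-lift model and the 99% cap are assumptions for illustration, not this tool's documented methodology.

```python
# Illustrative only: one simple way to turn the examiner's career allow rate and
# interview lift into the projection shown above.  The 99% cap and the additive
# lift are hypothetical modeling choices, not the tool's actual model.
def projected_grant_probability(career_allow_rate: float,
                                interview_lift: float = 0.0,
                                plan_interview: bool = False,
                                cap: float = 0.99) -> float:
    p = career_allow_rate + (interview_lift if plan_interview else 0.0)
    return min(max(p, 0.0), cap)

# 0.67 career allow rate with a +0.543 interview lift is capped at 0.99.
print(projected_grant_probability(0.67, 0.543, plan_interview=True))  # 0.99
```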
