Prosecution Insights
Last updated: April 19, 2026
Application No. 18/350,692

PROCESSING A MEDICAL IMAGE

Non-Final OA: §101, §102, §103, §112

Filed: Jul 11, 2023
Examiner: SHIN, SOO JUNG
Art Unit: 2667
Tech Center: 2600 — Communications
Assignee: Ib Lab GmbH
OA Round: 1 (Non-Final)

Grant Probability: 87% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 4m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 87% (above average; 527 granted / 604 resolved; +25.3% vs TC avg)
Interview Lift: +16.0% (strong; allow rate among resolved cases with vs. without interview)
Typical Timeline: 2y 4m avg prosecution; 28 applications currently pending
Career History: 632 total applications across all art units
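The headline figures on this card reduce to simple arithmetic on the stated counts. A quick sketch that reproduces them (the whole-percent rounding convention is an assumption):

```python
# Career figures taken from the card above: 527 granted out of
# 604 resolved applications, 632 total applications filed.
granted = 527
resolved = 604
total_applications = 632

allow_rate = granted / resolved                    # 0.8725..., displayed as 87%
currently_pending = total_applications - resolved  # 28, matching the card

print(f"Career allow rate: {allow_rate:.1%}")
print(f"Currently pending: {currently_pending}")
```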

Statute-Specific Performance

§101: 7.6% (-32.4% vs TC avg)
§103: 37.5% (-2.5% vs TC avg)
§102: 19.9% (-20.1% vs TC avg)
§112: 24.2% (-15.8% vs TC avg)
Tech Center averages are estimates. Based on career data from 604 resolved cases.
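The four deltas above are internally consistent: subtracting each delta from the examiner's per-statute rate implies the same Tech Center average in every row. A quick check, treating the displayed one-decimal figures as exact:

```python
# Examiner rate and delta vs. the Tech Center average, in percent,
# exactly as displayed on the card above.
stats = {
    "§101": (7.6, -32.4),
    "§103": (37.5, -2.5),
    "§102": (19.9, -20.1),
    "§112": (24.2, -15.8),
}

# delta = examiner_rate - tc_avg, so tc_avg = examiner_rate - delta.
implied_tc_avgs = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(implied_tc_avgs)  # the same 40.0 for every statute
```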

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Priority

Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art.
The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “means adapted to execute” in claim 14.

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 3-6 and 12 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claims 3-4, 6, and 12 recite the limitations “when applicable” and “in particular.” The limitations render the claims indefinite because it is not clear whether the processes are required or optional, and thus the examiner is required to subjectively determine whether other inventions would infringe on these claimed limitations. In addition, the term “particular” is a relative and/or subjective term that is not defined by the claim. The specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. A claim that requires the exercise of subjective judgment without restriction renders the claim indefinite. In re Musgrave, 431 F.2d 882, 893, 167 USPQ 280, 289 (CCPA 1970). Claim scope cannot depend solely on the unrestrained, subjective opinion of a particular individual purported to be practicing the invention. Datamize LLC v. Plumtree Software, Inc., 417 F.3d 1342, 1350, 75 USPQ2d 1801, 1807 (Fed. Cir. 2005); see also Interval Licensing LLC v. AOL, Inc., 766 F.3d 1364, 1373, 112 USPQ2d 1188 (Fed. Cir. 2014).

Claim 5 depends from claim 4 and therefore inherits all of the deficiencies of claims 3-4.

For the purpose of further examination, the claim limitations have been interpreted as follows:

Claim 3: “The method of claim 1, wherein the object detection and classification is configured for objects detecting and classifying one or more of instances of body parts, one or more instances of body implants, and one or more instances of outside structures, wherein the outside structures comprise one or more of annotations, measurements, and calibration objects.”

Claim 4: “The method of claim 3, wherein the at least one convolutional neural network used for object detection and classification is trained with training data comprising medical images with annotated and classified objects, wherein the annotated and classified objects are one or more from a group consisting of the* body parts, the* body implants, and the* outside structures, wherein the outside structures comprise one or more of annotations, measurements, and calibration objects.” Note the difference between “consist” and “comprise.” * “the” can be changed to “said”, corresponding to the limitations already recited in claim 3.

Claim 6: “The method of claim 1, wherein the object classification is configured to discriminate laterality of the detected objects.”

Claim 12: “The method of claim 1, wherein the medical image is a radiographic image, an ultrasound image, a computer tomography image, a magnetic resonance image, or a positron emission image, wherein the radiographic image is a two-dimensional x-ray image.”

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

35 U.S.C. 101 requires that a claimed invention must fall within one of the four eligible categories of invention (i.e. process, machine, manufacture, or composition of matter) and must not be directed to subject matter encompassing a judicially recognized exception as interpreted by the courts. MPEP 2106. The four eligible categories of invention include: (1) process, which is an act, or a series of acts or steps, (2) machine, which is a concrete thing, consisting of parts, or of certain devices and combination of devices, (3) manufacture, which is an article produced from raw or prepared materials by giving to these materials new forms, qualities, properties, or combinations, whether by hand labor or by machinery, and (4) composition of matter, which is all compositions of two or more substances and all composite articles, whether they be the results of chemical union, or of mechanical mixture, or whether they be gases, fluids, powders or solids. MPEP 2106(I).

Claims 15-16 are rejected under 35 U.S.C. 101 as not falling within one of the four statutory categories of invention because the broadest reasonable interpretation of the instant claims in light of the specification encompasses transitory signals ([0032]-[0033] of the specification only states “computer-readable medium” or “computer program product” and does not specify that transitory signals are excluded). Transitory signals are not within one of the four statutory categories (i.e. non-statutory subject matter). See MPEP 2106(I). Claims directed toward a non-transitory computer readable medium may qualify as a manufacture and make the claim patent-eligible subject matter. MPEP 2106(I). Therefore, amending the claims to recite a “non-transitory computer-readable medium” would resolve this issue.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1, 2, 6-9, and 11-16 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bai et al. (“Feature fusion Siamese network for breast cancer detection comparing current and prior mammograms,” Med Phys. 2022;49:3654–3669, DOI: 10.1002/mp.15598), hereinafter referred to as Bai.

Regarding claim 1, Bai teaches a method for processing a medical image, the method comprising the following steps: receiving a medical image (Bai pg. 3656 left column: “developing a novel end-to-end model based on the Siamese CNN model that uses previous year and current year images as paired inputs to predict the probability of malignancy”); performing an image content analysis for object detection and classification on said medical image (Bai pg. 3656 left column: “extract intraimage and interimage features from pairs of patients’ previous and current year FFDMs for more accurate breast cancer classification”; Bai pg. 3656 right column: “Vector d1 is concatenated with scalar d2 to build the distance feature for classification”), comprising: propagating the medical image in one iteration through at least one convolutional neural network (Bai pg. 3666 left column: “The one-shot learning characteristic of Siamese-based models can contribute to the superior performance of the proposed model”; Bai Figs. 1-3: the CNNs show the images are propagated in one iteration, indicated by an arrow), and determining after said one iteration one or more detected objects together with a respective classification label identifying one of two or more different available classes and with respective positional parameters relative to the medical image (Bai Table 3 & pg. 3661 left column: “493 mammogram pairs are labeled cancer … and 581 mammogram pairs are labeled normal”; Bai pg. 3656: “designing a new distance learning function … The distance learning network measures the distance between the feature maps from the twin networks and employs a fully connected (FC) network to learn the differences between the feature maps (interimage features) … d1 measures the pixel-wise distance of fc and fp, d2 measures the Euclidean distance between fc and fp … Vector d1 is concatenated with scalar d2 to build the distance feature for classification”; Bai Fig. 7: “from the same location”); and storing the determined classification label and positional parameters of one or more detected objects in association with the medical image (Bai Fig. 3 & pg. 3659-3660: “We used four datasets (three for pretraining and one for training and testing): (1) Digital Database for Screening Mammography (DDSM) … ”; Bai pg. 3662 right column: “We used Tesla V100 GPUs with 32 GB memory to train and test all models”).

Regarding claim 2, Bai teaches the method of claim 1, wherein the image content analysis is configured to detect and classify also partially cropped and/or at least partially overlapped objects and the determined classification label and positional parameters of said partially cropped and/or at least partially overlapped objects detected in the medical image (Bai pg. 3655 left column: “In a mammography there is a likelihood of missing small tumors surrounded by dense fibroglandular breast tissue, resulting in delays in the diagnosis and missing early detection”; Bai pg. 3661: “In order to increase the generalizability of the data set, we included a variety of tumor and breast density types. The mass type in the data set contains round, oval, architectural distortion, irregular, and lobulated … The data set contains all types of breast density including fatty breast, fibroglandular dense breast, heterogeneously dense breast, and extremely dense breast … All mammograms … where N is height, and M is width, are cropped”; Bai Fig. 4: shows that the cancer is detected and classified even in dense breast tissue, in which the cancer is cropped/overlaid by the dense tissue).

Regarding claim 6, Bai teaches the method of claim 1, wherein the object classification is configured to discriminate laterality of the detected objects when applicable (Bai pg. 3661 discussed above teaches distinguishing between different mass types; also see Bai pg. 3656 left column: “learn both interimage (between images) and intraimage features form both Craniocaudal (CC) and (mediolateral oblique) MLO views of patient’s particular breast”).

Regarding claim 7, Bai teaches the method of claim 6, wherein two or more specialized processing modules are selected based on the at least one matching parameter, wherein the image is processed with all of the selected processing modules, wherein labels corresponding to medical conditions detected by different processing modules are collectively stored in association with the same image or stored in association with separate copies of the image (Bai pg. 3656 right column: “The goal of the proposed model is to predict the similarity between a current year image, denoted by C, and its corresponding previous year image, denoted by P, where ‘similar’ means normal and ‘dissimilar’ means cancer”; Bai pg. 3657 left column: “at the output layer a sigmoid function, as given in Equation (3) is applied to the distance feature to predict the probability of dissimilarity (cancer) or similarity (normal) … the similarity probability represents the likelihood of abnormal changes between current year and previous year images”; Bai Figs. 1-3).

Regarding claim 8, Bai teaches the method of claim 1, wherein the object classification is configured to discriminate view position of the detected objects (Bai pg. 3661 left column: “The FFDMS of two breasts and two view for each breast (LCC, RCC, LMLO, and RMLO) from a majority of patients are included in the data set … The cancer cases were defined as labeled breast CC views and MLO views with biopsy confirmed cancerous breast lesions”).

Regarding claim 9, Bai teaches the method of claim 1, further comprising the following steps: providing two or more specialized processing modules, wherein each specialized processing module is associated with one or more compatible mandatory object classes (Bai pg. 3656-3657 & Figs. 1-3 discussed above); comparing the one or more detected object classes associated with the image with each of the one or more compatible mandatory object classes to determine at least one matching parameter for each specialized processing module (Bai pg. 3656-3657 & Figs. 1-3 discussed above), selecting at least one of the two or more specialized processing modules based on the at least one matching parameter (Bai pg. 3656-3657 discussed above; see Bai Eq. (3)-(7) for FFS-CNN & Eq. (8) for Siamese CNN), processing the image with the selected at least one processing module, wherein the selected processing module detects one or more medical conditions and stores one or more corresponding labels in association with the image for displaying to a viewer of the image (Bai pg. 3656-3657 & Figs. 1-3 discussed above; also see Bai Fig. 4 & 7).
Regarding claim 11, Bai teaches the method of claim 9, wherein each specialized processing module is associated with zero or more compatible optional object classes, wherein the comparing step comprises comparing the one or more detected object classes associated with the image with each of the one or more compatible mandatory object classes and each of the zero or more compatible optional object classes to determine at least one matching parameter for each specialized processing module (Bai pg. 3656-3657, Figs. 1-3, & Eq. (3)-(8) discussed above).

Regarding claim 12, Bai teaches the method of claim 1, wherein the medical image is a radiographic image, in particular a two-dimensional x-ray image, an ultrasound image, a computer tomography image, a magnetic resonance image, or a positron emission image (Bai Figs. 1-4 & pg. 3661 left column: “493 mammogram pairs are labeled cancer … 581 mammogram pairs are labeled normal”).

Regarding claim 13, Bai teaches the method of claim 1, wherein the medical image is received in the Digital Imaging and Communications in Medicine (DICOM) format (Bai pg. 3661 left column: “the DICOMs were exported from Picture Archiving and Communication Systems (PACS)”).

Regarding claim 14, Bai teaches a medical image processing system comprising means adapted to execute the method described in claim 1 (Bai Figs. 1-3 & pg. 3662 right column: “We used Tesla V100 GPUs with 32 GB memory”). Therefore, claim 14 is rejected using the same rationale as applied to claim 1 discussed above.

Regarding claim 15, Bai teaches a computer program product comprising instructions to cause a medical image processing system to execute the method described in claim 1 (Bai Figs. 1-3 & pg. 3662 right column: “We used Tesla V100 GPUs with 32 GB memory”). Therefore, claim 15 is rejected using the same rationale as applied to claim 1 discussed above.
Regarding claim 16, Bai teaches a computer-readable medium having stored thereon the computer program of claim 15 (Bai Fig. 3 & pg. 3662 right column discussed above).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 3 is/are rejected under 35 U.S.C. 103 as being unpatentable over Bai et al. (Med Phys. 2022;49:3654–3669, DOI: 10.1002/mp.15598), in view of Poltaretskyi et al. (US 2019/0380792 A1), hereinafter referred to as Bai and Poltaretskyi, respectively.

Regarding claim 3, Bai teaches the method of claim 1, wherein the object detection and classification is configured for objects detecting and classifying one or more instances of body parts (Bai Fig. 4). However, Bai does not appear to explicitly teach detecting body implants and outside structures, e.g., measurement/calibration objects.
Pertaining to the same field of endeavor, Poltaretskyi teaches detecting body implants and outside structures (Poltaretskyi Fig. 28: 2200 bone structure and 1506 implant components & Poltaretskyi ¶0277: “The optimization process can employ any suitable optimization algorithm (e.g., a minimization algorithm such as an Iterative Closest Point or genetic algorithm) to perfect alignment of virtual bone model 1008 with observed bone structure 2200. At block 2016 of FIG. 20A, upon completion of execution of the optimization algorithm, the registration procedure is complete”; Poltaretskyi Fig. 30 & ¶¶0286-0290: “fixed optical marker 3010 may include a planar fiducial marker … stickers 3016A-3016C that include planar fiducial markers”).

Bai and Poltaretskyi are considered to be analogous art because they are directed to medical image processing. It would have been obvious to one of ordinary skill in the art at the time the invention was made to have modified the feature fusion Siamese network for breast cancer detection (as taught by Bai) to detect implants and other structures (as taught by Poltaretskyi) because the combination aids the surgeons by guiding tools to correct locations during surgical procedures (Poltaretskyi Abstract & ¶0162).

Allowable Subject Matter

Claims 4-5 and 10 would be allowable if rewritten to overcome the rejection(s) under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), 2nd paragraph, set forth in this Office action and to include all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter:

Regarding claim 4, the prior art of record teaches that it was known at the time the application was filed to use the method of claim 3, wherein the at least one convolutional neural network used for object detection and classification is trained with training data comprising medical images with annotated and classified objects, wherein the annotated and classified objects are one or more of body parts, body implants and outside structures, in particular annotation, measurements and calibration objects (Bai pg. 3656: “We used pretrained ResNet as the backbone … Pairs of current and previous mammogram images are inputs of the proposed model, FFS-CNN … Define S … to present the training data set, where yi represent the class label”; Bai Fig. 2; Poltaretskyi Figs. 28, 30 & ¶¶0277, 0286-0290). However, the prior art, alone or in combination, does not appear to teach or suggest that the annotated and classified objects are one or more from a group consisting of body parts, body implants and outside structures, in particular annotation, measurements and calibration objects, i.e., the group consists of no other additional objects and no less than the listed objects due to the use of the phrase “consisting of”. Refer to the 35 U.S.C. 112(b) rejection above regarding the limitation “in particular” for the examiner’s interpretation.

The examiner’s statement of reasons for indicating allowable subject matter applies to the claims only as interpreted by the examiner due to the indefinite claim limitation. The statement may no longer apply if any amendments change the scope of the claim.

Claim 5 is objected to for the same reason as claim 4 discussed above due to dependency.
Regarding claim 10, the prior art of record teaches that it was known at the time the application was filed to use the method of claim 9, wherein the selecting step uses a distance measure applied to the at least one matching parameter and selects processing module(s) corresponding to the smallest distance measure of the two or more specialized processing modules (Bai pg. 3656: “The proposed model consists of two identical parallel CNNs (twin CNNs) with shared weights as twin networks followed by a distance learning network … These feature vectors are input to the distance learning functions … measures the Euclidean distance between fc and fp”; Bai pg. 3656-3657 & Eq. (3)-(8) discussed above). However, the prior art, alone or in combination, does not appear to teach or suggest selecting exactly one processing module corresponding to the smallest distance measure of the two or more specialized processing modules.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SOO J SHIN whose telephone number is (571)272-9753. The examiner can normally be reached M-F, 10-6.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Matthew Bella, can be reached at (571)272-7778. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Soo Shin/
Primary Examiner, Art Unit 2667
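The §102 rejection above repeatedly quotes Bai's distance-learning construction: a pixel-wise distance vector d1 and a scalar Euclidean distance d2 between the twin networks' feature vectors fc and fp, concatenated into a distance feature and passed through a fully connected layer with a sigmoid output. A minimal pure-Python sketch of that construction follows; the feature dimension, random weights, and single-layer head are illustrative assumptions, not Bai's actual architecture:

```python
import math
import random

random.seed(0)

# Illustrative twin-network outputs for the current (fc) and prior (fp)
# images; in Bai these come from shared-weight CNN backbones.
dim = 8
fc = [random.gauss(0, 1) for _ in range(dim)]
fp = [random.gauss(0, 1) for _ in range(dim)]

# d1: element-wise ("pixel-wise") distance vector between the features.
d1 = [abs(c - p) for c, p in zip(fc, fp)]
# d2: scalar Euclidean distance between the same features.
d2 = math.sqrt(sum((c - p) ** 2 for c, p in zip(fc, fp)))

# Vector d1 is concatenated with scalar d2 to build the distance feature.
distance_feature = d1 + [d2]

# A single fully connected layer with a sigmoid output stands in for the
# distance-learning head: it maps the distance feature to a dissimilarity
# (cancer) probability.
weights = [random.gauss(0, 1) for _ in distance_feature]
z = sum(w * x for w, x in zip(weights, distance_feature))
p_dissimilar = 1.0 / (1.0 + math.exp(-z))
print(f"P(dissimilar / cancer) = {p_dissimilar:.3f}")
```

Bai trains the head end-to-end with the twin backbones; the random weights here only show the shape of the computation that the rejection maps onto the claimed "one iteration" propagation and classification steps.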

Prosecution Timeline

Jul 11, 2023
Application Filed
Oct 15, 2025
Non-Final Rejection — §101, §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602768: SURFACE DEFECT DETECTION MODEL TRAINING METHOD, AND SURFACE DEFECT DETECTION METHOD AND SYSTEM (granted Apr 14, 2026; 2y 5m to grant)
Patent 12586411: TARGET IDENTIFICATION DEVICE, ELECTRONIC DEVICE, TARGET IDENTIFICATION METHOD, AND STORAGE MEDIUM (granted Mar 24, 2026; 2y 5m to grant)
Patent 12586204: Detecting Optical Discrepancies In Captured Images (granted Mar 24, 2026; 2y 5m to grant)
Patent 12586216: METHOD OF DETERMINING A MOTION OF A HEART WALL (granted Mar 24, 2026; 2y 5m to grant)
Patent 12573021: ULTRASONIC DEFECT DETECTION AND CLASSIFICATION SYSTEM USING MACHINE LEARNING (granted Mar 10, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 87%
Grant Probability with Interview: 99% (+16.0%)
Median Time to Grant: 2y 4m
PTA Risk: Low
Based on 604 resolved cases by this examiner. Grant probability derived from career allow rate.
