Prosecution Insights
Last updated: April 19, 2026
Application No. 18/473,101

CANCER MAPPING USING MACHINE LEARNING

Final Rejection (§103, §112)
Filed: Sep 22, 2023
Examiner: ROGERS, SCOTT A
Art Unit: 2683
Tech Center: 2600 — Communications
Assignee: Avenda Health, Inc.
OA Round: 2 (Final)
Grant Probability: 92% (Favorable)
OA Rounds: 3-4
To Grant: 2y 1m
With Interview: 93%

Examiner Intelligence

Career Allow Rate: 92% (above average); 574 granted / 625 resolved; +29.8% vs TC avg
Interview Lift: +0.9% (minimal, about +1%); based on resolved cases with interview
Avg Prosecution: 2y 1m (fast prosecutor); 18 currently pending
Total Applications: 643 (career history, across all art units)

Statute-Specific Performance

§101: 10.5% (-29.5% vs TC avg)
§103: 37.7% (-2.3% vs TC avg)
§102: 25.6% (-14.4% vs TC avg)
§112: 12.9% (-27.1% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 625 resolved cases
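The headline allow rate can be cross-checked against the reported grant counts. A quick sketch, using only the values shown above; the Tech Center average is not stated on the page and is inferred here from the "+29.8% vs TC avg" lift:

```python
# Cross-check of the dashboard figures above; `tc_avg` is inferred, not stated.
granted, resolved = 574, 625          # career grants vs resolved cases
allow_rate = granted / resolved       # 0.9184, displayed as 92%
tc_avg = allow_rate - 0.298           # implied by "+29.8% vs TC avg"
print(f"allow rate {allow_rate:.1%}, implied TC average {tc_avg:.1%}")
```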

Office Action

§103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment & Arguments

Applicant's amendment and arguments filed 04 December 2025 have been fully considered. The arguments on pages 8-9 with respect to “an encapsulation confidence score representing an estimated likelihood that the lesion contour encompasses all clinically significant cancer,” as recited in amended claims 1-12, and “an encapsulation confidence score representing a probability of all clinically significant cancer being within a specified lesion contour," as recited in amended claims 13-16, are addressed in the 35 USC 112(a) rejection below. The arguments on pages 9-10 with respect to “an adjustable lesion contour representing at least one of a size of a cancer lesion or a margin determination for the cancer lesion,” as recited in amended claims 13-16, have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 112(a)

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-16 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

Claims 1-12 are drawn to a device for mapping cancer comprising a machine learning algorithm configured to receive inputs and produce an output based on the inputs, wherein the inputs comprise data elements from a medical image and the output comprises an estimate of clinically significant cancer likelihood at each voxel of a three-dimensional image, and a lesion contour representing a lesion size of a cancer lesion and comprising an encapsulation confidence score representing an estimated likelihood that the lesion contour encompasses all clinically significant cancer.
Claims 13-16 are drawn to a method for mapping cancer comprising inputting data elements from medical images into a machine learning model estimating a likelihood of clinically significant cancer in a patient, and generating an output, via the machine learning model, including an estimate of the clinically significant cancer likelihood at each voxel of a three-dimensional image, the output further comprising an adjustable lesion contour representing at least one of a size of a cancer lesion or a margin determination for the cancer lesion, and an encapsulation confidence score representing a probability of all clinically significant cancer being within a specified lesion contour.

The MPEP states that the purpose of the written description requirement is to ensure that the inventor had possession, as of the filing date of the application, of the specific subject matter later claimed. The MPEP lists factors that can be used to determine if sufficient evidence of possession has been furnished in the disclosure of the application. These include: (1) level of skill and knowledge in the art, (2) predictability of the claimed invention in the art, (3) scope of the invention description and drawings, (4) sufficient, relevant, identifying (structural and/or functional) characteristics of the claimed invention, (5) method of making the claimed invention, and (6) actual reduction to practice of the claimed invention. See MPEP 2163. While all of the factors have been considered, a sufficient amount for a prima facie case is discussed below.

(1) Level of skill and knowledge, and (2) Predictability in the art: The level of skill in the art of detecting cancerous lesions is high. In particular, devices and methods for mapping cancer by inputting data elements from medical images into a machine learning model to estimate a likelihood (probability) of clinically significant cancer in a patient were known to those skilled in the art at the time of the invention.
However, one skilled in the art would not reasonably have the knowledge, nor would it be reasonably predictable in the art, to produce an encapsulation confidence score representing an estimated likelihood that the lesion contour encompasses all clinically significant cancer or an encapsulation confidence score representing a probability of all clinically significant cancer being within a specified lesion contour.

(3) Scope of the invention description and drawings: In the Detailed Description (par. 35, 38-39, 51-52 of the specification), Applicant provides examples of the machine learning model estimating the likelihood of clinically significant cancer. In one example of the methods described herein, a machine learning algorithm can receive MRI and biopsy data, including biopsy pathology labels, as inputs to determine lesion thresholds and the likelihood of cancer encapsulation, with associated encapsulation confidence scores, to minimize the risk of overexposure and over-resection during ablative, radiation, and/or surgical interventions. The machine learning algorithms described herein can be trained on large-population datasets to increase the accuracy of cancer mapping, confidence scores, and threshold boundaries. In at least one example, the algorithm can output an estimate of clinically significant cancer likelihood at each voxel of a three-dimensional image. In another example, inputs from one, two, or more data elements are analyzed, producing an output including a cancer estimation map (CEM), otherwise referred to as a cancer probability map (CPM). It should be understood within the context of this disclosure that the words cancer estimation map (CEM) and cancer probability map (CPM) are used interchangeably and define the likelihood of clinically significant prostate cancer at each voxel of a 3-dimensional image.
In some examples, inputs from at least one, two, or more data elements, such as medical imaging including MRI imaging, X-ray and ultrasound imaging, other relevant medical imaging, tracked biopsy, biopsy pathology, biopsy core locations, fusion based biopsy data, biomarkers such as PSA, patient demographics such as age, genomic markers, and/or other inputs can be utilized by a machine learning algorithm, which can output an estimate of clinically significant cancer at each voxel of a three-dimensional image to create the CEM, CLC, and the encapsulation confidence score noted above. The estimate of clinically significant cancer can be used to identify and narrow treatment therapies (e.g., chemotherapy, radiation, surgery, etc.). In one example, the encapsulation confidence score represents the estimated likelihood that a lesion contour encompasses all csPCa.

FIG. 2 illustrates an exemplary data flow diagram 200 of a machine learning model estimating the likelihood of clinically significant cancer. As an example, the systems described herein are described with reference to prostate cancer. However, the systems and methods described herein can be applied to other types of cancer as well. In at least one example, a software program can utilize at least one, two, or many data elements as an input 202 including an MRI data element 204, a biopsy pathology data element 206, a prostate specific antigen (PSA) data element 208, and/or other data elements for determining cancer probability. The detection of prostate cancer, and specifically the use of PSA as an input to systems described herein, are exemplary only and not meant to be limiting. Rather, as noted above, the systems and methods described herein can be applied to other types of cancers with other types of antigens, imaging modalities, genetic information, demographic data, or biomarkers indicative of other types of cancer used as inputs to the system.
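The voxel-wise map and encapsulation score described above can be made concrete with a minimal sketch. This is purely illustrative: all names are hypothetical, the trained model is replaced by a synthetic probability map, and the score is computed as the fraction of total predicted probability mass enclosed by the contour, which is only one plausible definition and is not attributed to the application.

```python
# Illustrative sketch of the described data flow: a voxel-wise cancer
# estimation map (CEM) is thresholded into a binary lesion contour, and a
# hypothetical encapsulation confidence score is computed for that contour.
import numpy as np

def lesion_contour(cem: np.ndarray, threshold: float) -> np.ndarray:
    """Binary lesion mask obtained by thresholding the CEM."""
    return cem >= threshold

def encapsulation_confidence(cem: np.ndarray, mask: np.ndarray) -> float:
    """Fraction of total predicted probability mass inside the contour.

    One possible (assumed) definition of an encapsulation confidence score;
    the claims recite the score without tying it to a particular formula.
    """
    total = cem.sum()
    return float(cem[mask].sum() / total) if total > 0 else 0.0

# Toy 3D CEM standing in for a trained model's voxel-wise output.
rng = np.random.default_rng(0)
cem = rng.random((8, 8, 8)) * 0.2     # low-probability background
cem[2:5, 2:5, 2:5] = 0.9              # synthetic high-probability lesion

mask = lesion_contour(cem, threshold=0.5)
score = encapsulation_confidence(cem, mask)
print(f"contour voxels: {int(mask.sum())}, encapsulation confidence: {score:.3f}")
```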
In one example, the input 202 can exclude the PSA data element 208. In at least one example, the data elements 204, 206, 208 can serve as the input 202 to the machine learning model 210. The machine learning model can be a single model, multiple models operating in series or in parallel, and/or one or more models with the addition of post-processing analyses. In one example, the machine learning model 210 can then estimate the likelihood of clinically significant cancer. In one example, clinically significant cancer can be defined as Gleason grade group 2 or higher disease in the case of prostate cancer. In at least one example, the machine learning model 210 can subsequently provide an output 216 including an estimate of clinically significant cancer likelihood at each voxel of a 3D image defined herein as a cancer estimation map (CEM) 212. The final output can be a lesion contour 214. The lesion contour can be a 3D surface generated by thresholding the cancer probability. The output can include additional data related to or derived from the machine learning model such as the estimated probability of tumor encapsulation, an estimate of tumor staging, an estimate of the patient's suitability for a particular therapy (surgery, radiation, ablative therapy, etc.), a segmentation of the tumor, and/or a segmentation of anatomical structures (the prostate, urethra, bladder, seminal vesicles, prostatic zones, vas deferens, rectum, pelvic bone, etc.).

In par. 39 of the specification, Applicant states: the metadata includes the encapsulation confidence score, which represents the probability of all clinically significant cancer being contained within a specified lesion contour. Applicant notes the systems and methods described can be applied to other types of cancer and notes a range of anatomical structures in which the likelihood of cancer can be estimated using the disclosed machine learning model.
However, there are no specific details supporting this conclusion nor showing that Applicant had possession of such a capability. Applicant also states that the encapsulation confidence score represents the probability of all clinically significant cancer being contained within a specified lesion contour. There is no evidence in the specification that supports Applicants having possession of such an all-encompassing invention to determine or estimate the probability of all clinically significant cancer being contained within a specified lesion contour. At most, Applicants provide specific support for the likelihood or probability of clinically significant prostate cancer (csPCa).

(4) Sufficient, relevant, identifying characteristics, (5) Method of making, and (6) Actual reduction to practice of the claimed invention: As described above, the specification does not contain sufficient, relevant, identifying characteristics, describe a method of making, or demonstrate an actual reduction to practice of the claimed invention with respect to producing an encapsulation confidence score representing the probability of all clinically significant cancer being contained within a specified lesion contour. Although the claims and disclosure may recite some functional characteristics with respect to the results of the machine learning model estimating a likelihood (probability) of clinically significant cancer in a patient based on input data elements from medical images, there is no specific disclosure of functional characteristics with respect to an encapsulation confidence score representing an estimated likelihood that the lesion contour encompasses all clinically significant cancer or an encapsulation confidence score representing a probability of all clinically significant cancer being within a specified lesion contour.
Moreover, the specification lacks any actual reduction to practice of producing an encapsulation confidence score representing an estimated likelihood that the lesion contour encompasses all clinically significant cancer or an encapsulation confidence score representing a probability of all clinically significant cancer being within a specified lesion contour. The description requirement of the patent statute requires a description of an invention, not an indication of a result that one might achieve if one made that invention. See In re Wilder, 736 F.2d 1516, 1521, 222 USPQ 369, 372-73 (Fed. Cir. 1984) (affirming rejection because the specification does “little more than outline goals appellants hope the claimed invention achieves and the problems the invention will hopefully ameliorate.”). Accordingly, it is deemed that the specification fails to provide adequate written description for the genus of the claims and does not reasonably convey to one skilled in the relevant art that the inventor(s), at the time the application was filed, had possession of the claimed invention.

Claims 1-16 are further rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, because the specification, while being enabling for mapping cancer by inputting data elements from medical images into a machine learning model to estimate a likelihood (probability) of clinically significant cancer in a patient (par. 35, 38-39, 51-52), does not reasonably provide enablement for producing an encapsulation confidence score representing an estimated likelihood that the lesion contour encompasses all clinically significant cancer or an encapsulation confidence score representing a probability of all clinically significant cancer being within a specified lesion contour. The specification does not enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make or use the invention commensurate in scope with these claims.
The claims are generally directed to a device and method for mapping cancer by inputting data elements from medical images into a machine learning model to output an estimate of clinically significant cancer likelihood at each voxel of a three-dimensional image and an encapsulation confidence score representing an estimated likelihood (probability) that the lesion contour encompasses all clinically significant cancer. Therefore, included within the scope of the claims is an encapsulation confidence score representing an estimated likelihood (probability) that the lesion contour encompasses all clinically significant cancer. First, the scope sought by this limitation cannot be determined (see 112(b) rejection below). Second, the claims include non-working embodiments within the scope of the claimed invention. For example, one of ordinary skill would not understand how to produce an encapsulation confidence score representing an estimated likelihood (probability) that the lesion contour encompasses all clinically significant cancer. Further, it is unclear if the probability of all types of clinically significant cancer could be determined or estimated, including types of cancer that have not yet been discovered. There is also no guidance presented within the specification with regard to how this is achieved. That is, one skilled in the art would not know how to make or use the invention to produce an encapsulation confidence score representing an estimated likelihood (probability) that the lesion contour encompasses all clinically significant cancer (i.e., all types or any type of clinically significant cancer). Rather, additional training, testing, assessment, and use of a machine learning model for this purpose would need to be done well beyond simply performing the claimed methods.

The courts have stated that “tossing out the mere germ of an idea does not constitute enabling disclosure.” Genentech, 108 F.3d at 1366 (quoting Brenner v. Manson, 383 U.S. 519, 536 (1966) (stating, in context of the utility requirement, that “a patent is not a hunting license. It is not a reward for the search, but compensation for its successful conclusion”)). “[R]easonable detail must be provided in order to enable members of the public to understand and carry out the invention.” Id. In the instant case, such reasonable detail is lacking because the claims encompass producing an encapsulation confidence score representing an estimated likelihood (probability) that the lesion contour encompasses all clinically significant cancer (i.e., all types or any type of clinically significant cancer). However, the specification does not describe with any particularity how this is achieved. As indicated above, it would take a large amount of experimentation with training, testing, and assessment of a machine learning model in order to be able to carry out the invention as claimed in its full scope, assuming that would even be possible, which is suspect. Such an endeavor, without knowledge of the outcome, would constitute undue experimentation. For at least the reasons indicated above, the specification fails to teach the skilled artisan how to make and use the claimed invention in its full scope without undue experimentation.

Claim Rejections - 35 USC § 112(b)

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-16 are rejected under 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. The scope of the claimed invention is unclear with respect to an encapsulation confidence score representing an estimated likelihood or probability of “all clinically significant cancer”, since one of ordinary skill would not understand what this encompasses in light of the disclosure. For purposes of examination, the claims have been interpreted to encompass what is actually disclosed, an encapsulation confidence score representing an estimated likelihood or probability of clinically significant cancer, and specifically with respect to some dependent claims, the likelihood of prostate cancer.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-7 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Rix (WO 2022/053817 A1) in view of Guyon et al (US 2012/0008838 A1).
Referring to claim 1: Rix discloses a device for mapping cancer, comprising: a processor electrically coupled to a memory component storing electronic instructions that, when executed by the processor, cause the device to execute a machine learning algorithm configured to receive inputs and produce an output based on the inputs (inherent as the machine learning method is performed by a computer executing a program – page 6, line 30 to page 7, line 4 and page 7, lines 12-18), wherein: the inputs comprise data elements from a medical image ("embodiments that are able to account for a biomarker test result, such as PSA, by reference to medical imaging, available clinical and patient data, and histopathology findings." – page 4, lines 9-11; "processing images . . . to identify tissue types within regions of interest and determining, by a machine learning method, a relationship between the amount of each tissue type and the known biomarker level" – page 6, line 30 to page 7, line 4); and the output comprises an estimate of clinically significant cancer likelihood at each voxel of a three-dimensional image ("In a preferred embodiment, each tissue type output by the segmentation method is associated with a measure of biomarker production, such as a biomarker density, and the segmentation method enables each voxel of a 3D image (or pixel for a 2D image) that falls within the determined segmentations to be associated with at least one tissue type, or to be determined that no relevant tissue type is present at that location.
For each location in the segmentations, the biomarker calculation method (140) retrieves the applicable measure of biomarker production or biomarker density, and sums these measures (accounting for the relative volume of each voxel or pixel) to determine at least one overall biomarker level indicative of the biomarker production of the imaged tissues...The at least one overall biomarker level is output as a result (150) and may be used for clinical interpretation or subsequent tests. The clinical history data may also be output. The at least one overall biomarker level may be used in place of or in addition to a physical biomarker test result for any of the following purposes: confirming a physical biomarker test result; identifying patients who may avoid biopsy; determining the need to biopsy; identifying disease progression; identifying treatment response; selecting treatment; calculating a risk of cancer" – page 10, lines 1-18).

Rix fails to explicitly teach the output further comprising a lesion contour representing a lesion size of a cancer lesion shown in the three-dimensional image and the lesion contour including an encapsulation confidence score representing an estimated likelihood that the lesion contour encompasses a clinically significant cancer. However, Guyon et al teach such a lesion contour representing a lesion size of a cancer lesion shown in the three-dimensional image ("In the exemplary embodiment, edge detection is used to identify the outer contours of the suspected lesion, as shown in FIG. 7," par. 131; "A data browser was written to visualize the data and the annotations, and to allow editing of the annotations, thus permitting the extraction of approximate values of size and evolution for every cropped image," par. 247) and the lesion contour including an encapsulation confidence score representing an estimated likelihood that the lesion contour encompasses clinically significant cancer ("The resulting vote (score) of the ensemble or second level classifier is post-processed to obtain a mapping of the output to probabilities. The output is converted into an alphanumeric and/or graphical display that may be stored in a memory medium and/or transmitted to the remote user to provide an overall probability, i.e., a confidence level, that the lesion in the image is melanoma," par. 34).

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the medical imaging and cancer prediction system as taught by Rix with the support for calculating a confidence score of a cancer detection based on a lesion contour and size as taught by Guyon et al, because such systems and methods allow for segmenting lesions, determining their size, and using machine learning to calculate a likelihood of a cancer diagnosis (Guyon: par. 34, 131, and 247). Furthermore, both Rix and Guyon et al are directed to systems and methods for medical imaging and cancer prediction.

Referring to claim 2: Rix discloses the inputs further comprise prostate specific antigen (PSA) and the clinically significant cancer likelihood includes clinically significant prostate cancer (csPCa) likelihood ("By combining PSA density and other clinical indications with MRI scores according to the PI-RADS 2.1 or similar classification scheme, these are able to obtain an accurate prediction about whether a patient has clinically significant prostate cancer.” – page 2, lines 12-16).
Referring to claim 3: Rix discloses the output further comprises a cancer estimation map (CEM) ("Where such ML methods have been applied in cancer imaging, researchers have focused on providing information such as heatmaps for analysis by radiologists (computer aided diagnostics, image fusion), segmentations of organs or tumors to support biopsy, radiotherapy or analysis of tumor progression, and in risk calculations such as identifying patients for biopsy or evaluating prognosis. Systems using these methods have shown promising performance when compared to manual methods such as PI-RADS. As with manual evaluation, performance of such systems, in particular of risk calculators, may be enhanced by introducing information from biomarkers such as PSA or PSA density, as well as scores from radiological or clinical evaluation." – page 3, lines 23-31).

Referring to claim 4: Rix discloses the CEM illustrates a color-coded heat map representing a likelihood of cancer at each voxel of the three-dimensional image (page 3, lines 23-31; page 10, lines 1-18).

Referring to claim 5: Rix discloses the medical image is an MRI image of a patient's anatomy ("Optionally, the biomarker is PSA, the imaging method is MRI, and the region of interest comprises the subject's prostate." – page 6, lines 19-20).

Referring to claim 6: Rix discloses the anatomy includes a prostate (page 6, lines 19-20).

Referring to claim 7: In the combination of Rix and Guyon et al, the CEM in Rix comprises the lesion contour representing the lesion size of the cancer lesion shown in the three-dimensional image as taught in Guyon et al (as cited therein above).

Referring to claim 12: Rix discloses the medical image includes an MRI image (page 6, lines 19-20).

Claims 13-16 are rejected under 35 U.S.C. 103 as being unpatentable over Rix in view of Guyon et al and Park et al (US 20220059227 A1).
Referring to claim 13: Rix discloses a method for mapping cancer, comprising: inputting data elements from medical images into a machine learning model estimating a likelihood of clinically significant cancer in a patient (page 4, lines 9-11; page 6, line 30 to page 7, line 4); and generating an output, via the machine learning model, including an estimate of the clinically significant cancer likelihood at each voxel of a three-dimensional image (page 10, lines 1-18).

Rix fails to explicitly teach the output further comprising a lesion contour representing a size of a cancer lesion shown in the three-dimensional image and an encapsulation confidence score representing a probability of a clinically significant cancer being within a specified lesion contour. However, Guyon et al teach such a lesion contour representing a size of a cancer lesion shown in the three-dimensional image ("In the exemplary embodiment, edge detection is used to identify the outer contours of the suspected lesion, as shown in FIG. 7," par. 131; "A data browser was written to visualize the data and the annotations, and to allow editing of the annotations, thus permitting the extraction of approximate values of size and evolution for every cropped image," par. 247) and an encapsulation confidence score representing a probability of a clinically significant cancer being within a specified lesion contour ("The resulting vote (score) of the ensemble or second level classifier is post-processed to obtain a mapping of the output to probabilities. The output is converted into an alphanumeric and/or graphical display that may be stored in a memory medium and/or transmitted to the remote user to provide an overall probability, i.e., a confidence level, that the lesion in the image is melanoma," par. 34).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the medical imaging and cancer prediction system as taught by Rix with the support for calculating a confidence score of a cancer detection based on a lesion contour and size as taught by Guyon et al, because such systems and methods allow for segmenting lesions, determining their size, and using machine learning to calculate a probability of a cancer diagnosis (Guyon: par. 34, 131, and 247). Furthermore, both Rix and Guyon et al are directed to systems and methods for medical imaging and cancer prediction.

The combination of Rix and Guyon et al does not provide an adjustable lesion contour representing at least one of a size of a cancer lesion or a margin determination for the cancer lesion. However, Park et al teach that when the contour of the identified lesion area is changed, the shape of the lesion area, the orientation of the lesion, the margin feature of the lesion, and the like may be changed, and accordingly, the diagnostic result of the lesion may also be varied (par. 159). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Rix and Guyon et al in view of Park et al to provide an adjustable lesion contour representing at least one of a size of a cancer lesion or a margin determination for the cancer lesion in order to allow the lesion area to be accurately extracted and thus the diagnostic result (probability of clinically significant cancer) to be accurately and reliably derived (Park et al – par. 159).
Referring to claim 14: Rix discloses inputting data elements from biopsy and biopsy pathology labels into the machine learning model ("Biopsy and histopathology analysis is often available for certain tissues of interest as part of the process of diagnosing or managing cancer, and the presence of cancer in such analysis provides a positive diagnosis with high confidence. In a preferred embodiment, the clinical data input (111) permits the entry of at least one histopathology finding indicating that a particular tissue type is present at a location. An annotation segmentation may be provided indicating the expected extent of such tissue type. In this embodiment, tissues in proximity to such at least one histopathology finding are assigned to the indicated tissue type" – page 12, lines 14-21).

Referring to claim 15: Rix discloses the machine learning model is trained on a population data set including the data elements ("A segmentation method may be implemented using a plurality of deep learning or machine learning segmentation models, as described in any of references [10][11][12][13], trained and validated using expert human annotations of the respective tissue types in a selection of patient cases according to training and validation methods known in the art of machine learning" – page 13, lines 16-22).

Referring to claim 16: Rix discloses the output includes a visual representation of the three-dimensional image with a color-coded heat map representing the clinically significant cancer likelihood at each voxel (page 3, lines 23-31; page 10, lines 1-18).

Allowable Subject Matter

Claims 17-20 are allowed, and claims 8-11 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten to overcome the rejection(s) under 35 U.S.C. 112(a & b) set forth in this Office action and to include all of the limitations of the base claim and any intervening claims.
Referring to claims 8-11: the prior art does not teach or suggest a visual curve representing the encapsulation confidence score versus the lesion size. Although Guyon discusses the use of GUI sliders to adjust certain imaging parameters, this does not teach or suggest a graphical curve being adjusted by a user. Referring to claims 17-20: while Rix discloses a method for mapping cancer, comprising: inputting data elements from medical images into a machine learning model estimating a likelihood of clinically significant cancer in a patient (page 4, lines 9-11; page 6, line 30 to page 7, line 4); and displaying a visual representation of the likelihood at each voxel of a three-dimensional image, the visual representation comprising: a cancer estimation map (CEM) illustrating a color-coded heat map representing the likelihood of clinically significant cancer overlying the image (page 3, lines 23-31; page 10, lines 1-18), and Guyon et al teach such a method, wherein the CEM includes a lesion contour representing a size of a cancer lesion (par. 131, 247), the prior art fails to explicitly teach or suggest such a method further comprising in combination: a curve representing an encapsulation confidence score versus the size, the curve including a point representing the lesion size and the encapsulation confidence score; wherein: the point is configured to be visually manipulated along the curve to change the lesion size and the encapsulation confidence score represented by the point; and manipulating the point alters the lesion contour, as discussed above.

Information Disclosure Statement

The information disclosure statement(s) submitted on 04 December 2025 was filed in compliance with the provisions of 37 CFR 1.97 and 1.98. Accordingly, the statement has been considered by the examiner.
The relevance of the cited documents, not otherwise applied above or indicated below, can be found in the European Search Report and/or Written Opinion from the EPO dated 25 September 2025 for application no. EP 23 86 9253 (of record).

Cited Art

The prior art and other references made of record and not relied upon are considered pertinent to applicant's disclosure. Ray et al (US 7844087 B2) disclose a method of segmenting a lesion (910) from normal anatomy in a 3-dimensional image comprising the steps of: receiving an initial set of voxels (520) that are contained within the lesion to be segmented; growing a region which includes the lesion from the initial set of voxels; identifying a second set of voxels (530) on a surface of the normal anatomy; determining a surface containing the second set of voxels which demarks a boundary (540) between the lesion and the normal anatomy; and classifying voxels which are part of the lesion. Ye et al (US 8379950 B2) disclose a computer-implemented method of detecting an object in a three-dimensional medical image that comprises determining the values of a plurality of features at each voxel in at least a portion of the medical image. Each feature characterizes a respective property of the medical image at a particular voxel. The likelihood probability distribution of each feature is calculated based on the values of the features and prior medical knowledge. A probability map is generated by using Bayes' law to combine the likelihood probability distributions, and the probability map is analyzed to detect an object. See claim 1. Voros et al (US 8970578 B2) disclose methods and systems utilizing the data provided by a non-contrast-enhanced CAC scan that is left unused by the "whole-heart" Agatston or volume scores.
Agatston and volume scores summarize overall coronary calcium burden, but do not show the number of vessels involved, the geographic distribution of the lesions, the size and shape of the individual lesions, or the distance of the lesions from the coronary ostium. The methods and systems described herein extract and use the enhanced information provided by 3-D CAC scan data and significantly increase its clinical predictive value by providing vessel- and lesion-specific CAC scores which are superior to the whole-heart Agatston and volume scores in predicting obstructive coronary artery disease (CAD). See claim 1. Zhang et al (US 10603007 B2) disclose a method and system for acquiring, processing and displaying breast ultrasound images in a way that makes breast ultrasound screening more practical and thus more widely used, and reduces the occurrence of missed cancers in screening and diagnosis, using automated scanning of chestwardly compressed breasts with ultrasound. Enhanced, whole-breast navigator overview images are produced from scanning breasts with ultrasound that emphasize abnormalities in the breast while excluding obscuring influences of non-breast structures, particularly those external to the breast such as ribs and chest wall, and differentiating between likely malignant and likely benign abnormalities and otherwise enhancing the navigator overview image and other images, thereby reducing the time to read, screen, and/or diagnose to practical time limits and also reducing screening or diagnostic errors. See claim 1. Anand et al (US 12417533 B2) disclose systems and methods that provide for automated analysis of medical images to determine a predicted disease status (e.g., prostate cancer status) and/or a value corresponding to predicted risk of the disease status for a subject.
The approaches described herein leverage artificial intelligence (AI) to analyze intensities of voxels in a functional image, such as a PET image, and determine a risk and/or likelihood that a subject's disease, e.g., cancer, is aggressive. The approaches described herein can provide predictions of whether a subject that presents a localized disease has and/or will develop aggressive disease, such as metastatic cancer. These predictions are generated in a fully automated fashion and can be used alone, or in combination with other cancer diagnostic metrics (e.g., to corroborate predictions and assessments or highlight potential errors). As such, they represent a valuable tool in support of improved cancer diagnosis and treatment. Rajagopal et al (US 20230410301 A1) disclose techniques for non-invasive tumor identification, classification, and grading using mixed exam-, region-, and voxel-wise supervision. Particularly, aspects are directed to a computer-implemented method that includes obtaining medical images of a subject, inputting the medical images into a three-dimensional neural network model constructed to produce a voxelwise cancer risk map of lesion occupancy and cancer grade as two output channels using an objective function having a first loss function that captures strongly supervised loss for regression in lesions and a second loss function that captures weakly supervised loss for regression in regions, generating an estimated segmentation boundary around one or more lesions, predicting a cancer grade for each pixel or voxel within the medical images, and outputting the voxelwise cancer risk map of lesion occupancy determined based on the estimated segmentation boundary and the cancer grade for each pixel or voxel within the medical images. The use of fusion-based biopsy data is disclosed (par. 34).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL.
See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Scott Rogers whose telephone number is 571-272-7467. The examiner can normally be reached 8 am to 7 pm flex. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abderrahim Merouan can be reached on 571-270-5254. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. 
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /Scott A Rogers/ Primary Examiner, Art Unit 2683 19 February 2026
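As a closing illustration of the feature on which claims 8-11 and 17-20 were found allowable, the curve of encapsulation confidence versus lesion size with a draggable point can be sketched in miniature as follows. The confidence model and all names here are hypothetical placeholders, not the applicant's disclosure.

```python
import math

def confidence_curve(confidence_fn, sizes):
    """Tabulate (lesion size, encapsulation confidence) pairs for display
    as the claimed curve. `confidence_fn` stands in for whatever model
    relates contour size to encapsulation confidence."""
    return [(s, confidence_fn(s)) for s in sizes]

def move_point(curve, index):
    """'Dragging' the displayed point to a new position on the curve
    returns the lesion size and confidence the redrawn contour should
    reflect (index clamped to the curve's extent)."""
    index = max(0, min(index, len(curve) - 1))
    return curve[index]
```

With a toy monotone model such as `lambda s: 1 - math.exp(-s / 10)`, confidence rises with lesion size, so dragging the point along the curve trades a tighter contour against a higher estimated likelihood of encapsulating all clinically significant cancer.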

Prosecution Timeline

Sep 22, 2023
Application Filed
Sep 06, 2025
Non-Final Rejection — §103, §112
Dec 04, 2025
Response Filed
Feb 19, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597125
APPARATUS AND METHOD FOR CORRECTING A CONTOUR OF AN OBJECT IN A MEDICAL IMAGE
2y 5m to grant Granted Apr 07, 2026
Patent 12597120
PRINTED IMAGE DEFECT DISCRIMINATION DEVICE AND METHOD DISPLAYING DETECTED DEFECTS IN LIST BY TYPE IN DISPLAY MODE ACCORDING TO STATE OF DEFECT
2y 5m to grant Granted Apr 07, 2026
Patent 12597138
SYSTEMS AND METHODS FOR ANNOTATING TARGET IMAGES BASED ON FEATURES THEREIN AND SELECTED CANDIDATE SAMPLE IMAGES WITH ANNOTATIONS
2y 5m to grant Granted Apr 07, 2026
Patent 12586391
Systems and Methods for Deconvolving Cell Types in Histology Slide Images, Using Super-Resolution Spatial Transcriptomics Data
2y 5m to grant Granted Mar 24, 2026
Patent 12578488
IMPROVED ATTENUATION MAP GENERATED BY LSO BACKGROUND
2y 5m to grant Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
92%
Grant Probability
93%
With Interview (+0.9%)
2y 1m
Median Time to Grant
Moderate
PTA Risk
Based on 625 resolved cases by this examiner. Grant probability derived from career allow rate.
