Prosecution Insights
Last updated: April 19, 2026
Application No. 18/298,468

System and Method for Monitoring Lesion Progression Over Multiple Medical Scans

Status: Non-Final OA (§101, §103)
Filed: Apr 11, 2023
Examiner: KOETH, MICHELLE M
Art Unit: 2671
Tech Center: 2600 — Communications
Assignee: Wisconsin Alumni Research Foundation
OA Round: 3 (Non-Final)
Grant Probability: 77% (Favorable)
Projected OA Rounds: 3-4
Time to Grant: 2y 4m
Grant Probability With Interview: 94%

Examiner Intelligence

Career Allow Rate: 77% — above average (331 granted / 429 resolved; +15.2% vs TC avg)
Interview Lift: +16.7% — strong (resolved cases with interview)
Typical Timeline: 2y 4m average prosecution
Currently Pending: 34
Total Applications: 463 (across all art units)
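The headline figures above are simple ratios. A minimal sketch of how they fall out of the counts; the with/without-interview split below is hypothetical, since the report states only the resulting lift, not the underlying counts:

```python
# Career allow rate from the reported counts: 331 granted of 429 resolved.
granted, resolved = 331, 429
allow_rate = 100 * granted / resolved  # ~77.2%, reported as 77%

# Interview lift compares allow rates of resolved cases with and without an
# examiner interview. These counts are hypothetical placeholders chosen to
# sum to the reported totals; the report gives only the lift itself.
with_int_granted, with_int_resolved = 90, 100
without_granted, without_resolved = 241, 329
lift = 100 * (with_int_granted / with_int_resolved
              - without_granted / without_resolved)  # points of lift
```

With these placeholder counts the lift works out to roughly +16.7 points, matching the reported figure.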

Statute-Specific Performance

§101: 7.4% (-32.6% vs TC avg)
§103: 62.2% (+22.2% vs TC avg)
§102: 8.5% (-31.5% vs TC avg)
§112: 14.7% (-25.3% vs TC avg)
Tech Center averages are estimates • Based on career data from 429 resolved cases
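The deltas above are stated relative to per-statute Tech Center averages. As a quick consistency check using only the numbers shown, each implied TC average can be recovered by subtracting the delta from the examiner's rate:

```python
# (examiner rate %, delta vs TC avg in points) per statute, from the table above.
stats = {"101": (7.4, -32.6), "103": (62.2, +22.2),
         "102": (8.5, -31.5), "112": (14.7, -25.3)}

# Implied Tech Center average per statute: examiner rate minus delta.
tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
```

Notably, every implied TC average comes out to 40.0%, suggesting the deltas are computed against a single rounded baseline estimate.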

Office Action

Rejections: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/23/2025 has been entered.

Response to Arguments

Applicant's arguments and amendments in the Amendment with RCE filed December 23, 2025 (herein "Amendment"), with respect to the rejection of claim 3 under 35 U.S.C. 112(b), have been fully considered and are persuasive. The rejection of claim 3 under 35 U.S.C. 112(b) has been withdrawn.

Applicant's arguments and amendments in the Amendment regarding the rejection of the independent claims, and claims depending therefrom, under 35 U.S.C. 101 have been fully considered but are not persuasive. Applicant argues on pages 5–6 of the Amendment that the amendment, with claim 2 as an example, of "receive a set of at least three scans … in the form of an array of digitized image values in an electronic format from a diagnostic imaging machine" overcomes the 101 rejection because the human mind does not have the ability to receive digitized medical records in electronic form. 
However, setting aside that this argument depends on a narrowed interpretation of "electronic form," even if the amended "receive a set of at least three scans" limitation is no longer directed towards an abstract idea, it is nonetheless not enough to provide a practical application, as such operations can be considered insignificant extra-solution activity; nor is it significantly more, as such operations are well-understood, routine, and conventional in the image processing arts. Therefore, the rejection under 101 is maintained, with the rejection rationale updated below to reflect the claim amendments.

Applicant's arguments and amendments in the Amendment with respect to the rejections of claims 2 and 4, and claims depending therefrom, under 102 or 103 have been fully considered and are persuasive in part. The scopes of claims 2 and 4 differ: in claim 2, the further narrowing amendments regarding "an aggregate measure" are not explicitly taught in Jeraj or Brynolfsson1, and for these reasons, upon further consideration, a new ground of rejection is made in view of Brynolfsson et al., US 2023/0410985 (herein "Brynolfsson2"), previously cited as pertinent to Applicant's disclosure, albeit not previously cited in a rejection.

It is noted, however, that Brynolfsson1 does at least teach claim 2's newly amended "link lesion changes to particular organs for multiple organs and lesions": ¶¶138 and 144 teach analyzing PET and SPECT images to determine the particular organs in which a radiopharmaceutical, designed to bind to cancerous tissue in a variety of organs including the prostate, lungs, and bones, has accumulated. Further, ¶¶146, 151, and 154 disclose that the locations of hotspots indicative of accumulated radiopharmaceutical and a potential lesion are identified, displayed, and labeled as corresponding (linked) to a bone, lymph, or prostate (multiple organs). 
Regarding claim 4, while Jeraj does not teach the newly amended "graphically depicted link," already-of-record reference Brynolfsson (now Brynolfsson1) does teach these new limitations, in a new ground of rejection under 103 over the combination of Jeraj and Brynolfsson, set forth below.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 2 and 4–12 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without a practical application or significantly more. Regarding claims 2 and 4, these claims recite the following limitations, which are found to be abstract ideas not reciting a practical application or significantly more: (a) receive a set of at least three scans of tissue of the patient revealing diseased tissue; (b) determine lesion volumes in the scan as assigned to identifiers; (c) determine an overlapping of lesion volumes between all pairs of scans of the set to provide a set of overlap measures for each pair of scans for each pair of identifiers; [claim 2 only: (f) link lesion changes to particular organs for multiple organs and lesions] (all of these limitations are directed towards an abstract idea as a mental process, as a human mind is capable of receiving a set of scans, determining lesion volumes and assigning identifiers in the scans, and determining overlapping and overlap measures for the lesions in the scans); and (d) link pairs of the identifiers of different scans to globally maximize the overlap measures of the set over all of the scans (abstract idea as a mathematical concept per MPEP §2106.04(a)(2)(I)). 
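Limitations (c) and (d) as characterized above amount to computing pairwise overlap measures between lesion volumes and then solving an assignment problem over them. A minimal sketch, assuming toy voxel sets and hypothetical lesion identifiers, with a brute-force search standing in for the linear-assignment solver the references describe:

```python
from itertools import permutations

# Hypothetical lesion volumes: sets of voxel coordinates per identifier, per scan.
scan_a = {"A1": {(0, 0), (0, 1), (1, 0)}, "A2": {(5, 5), (5, 6)}}
scan_b = {"B1": {(0, 1), (1, 0), (1, 1)}, "B2": {(5, 6), (6, 6)}}

def overlap(u, v):
    # Voxel-count overlap between two lesion volumes.
    return len(u & v)

# (c) overlap measure for each pair of identifiers across the scan pair.
ids_a, ids_b = list(scan_a), list(scan_b)
M = {(i, j): overlap(scan_a[i], scan_b[j]) for i in ids_a for j in ids_b}

# (d) link identifiers to globally maximize total overlap; brute force over
# permutations stands in for a proper linear-assignment solver.
best = max(permutations(ids_b),
           key=lambda p: sum(M[(i, j)] for i, j in zip(ids_a, p)))
links = dict(zip(ids_a, best))
```

Here `links` pairs A1 with B1 and A2 with B2, the assignment with the largest total overlap; at real scale a Hungarian-algorithm-style solver would replace the permutation search.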
Further regarding claim 2 specifically, the additionally claimed "identifies a set of different organs within the scans and wherein the output further identifies an aggregate measure of lesion changes for multiple lesions in a given organ linked to the given organ for each of the multiple organs" also recites an abstract idea as a mental process, as a human mind is capable of identifying a change of a lesion in different organs; for example, a doctor studying different images is able to mentally identify changes of lesions in different organs.

Further regarding claim 4, the additionally claimed "outputs a graphic display graphically indicating a linkage between representations of lesions of different scans" would be an abstract idea as a mental process, since it can be practically performed by a doctor using a pencil-and-paper aid: a doctor could sketch linkages between two images for lesions on different scans. Also, the graphic display is an additional element, and outputting on the graphic display is insignificant extra-solution activity since it is merely data output (see MPEP §2106.05(g)). Moreover, this element amounts to outputting data in a computer-based system and is well-understood, routine, conventional activity. See MPEP 2106.05(d), subsection II.

Further regarding claims 2 and 4, the additional element of "receive a set of at least three scans … in the form of an array of digitized image values in an electronic format from a diagnostic imaging machine," if not an abstract idea, would be considered insignificant extra-solution activity, as it is directed towards data gathering per MPEP 2106.05(g), and also would be well-understood, routine, and conventional activity, and hence not significantly more. This judicial exception is not integrated into a practical application for the following reasons. 
Claims 2 and 4 recite the additional element of step (e), output a display indicating a lesion change identified to given linked lesions, which, while not necessarily being an abstract idea, is insignificant extra-solution activity since it is merely data output (see MPEP §2106.05(g)). Moreover, this element amounts to outputting data in a computer-based system and is well-understood, routine, conventional activity. See MPEP 2106.05(d), subsection II.

Claims 2 and 4 further recite the additional elements of "an apparatus for assessing treatment of a patient comprising: an electronic computer executing a stored program to." While the apparatus and the computer executing a stored program are additional elements, they are not sufficient to recite a practical application of the recited abstract ideas, as they amount to mere generic computer elements and thus to no more than a recitation of the words "apply it" (or an equivalent), or are no more than mere instructions to implement an abstract idea or other exception on a computer. See MPEP §2106.05(f).

Further, the claims do not include any additional elements sufficient to amount to significantly more than the judicial exception because, when considered separately and in combination, the above-recited additional elements do not add significantly more (also known as an "inventive concept") to the exception. Rather, the additional elements disclosed above perform well-understood, routine, conventional computer functions as recognized by the court decisions listed in MPEP § 2106.05(d). Therefore, independent claims 2 and 4 are directed towards an abstract idea without a practical application or significantly more.

Regarding claim 5, the limitations in this claim do not integrate the recited abstract ideas into a practical application for the following reasons. 
Claim 5 recites the additional element directed towards receiving input from a user, which, while not necessarily being an abstract idea, is insignificant extra-solution activity since it is merely data gathering (see MPEP §2106.05(g)). Moreover, this element amounts to receiving data in a computer-based system and is well-understood, routine, conventional activity. See MPEP 2106.05(d), subsection II.

Regarding claim 6, the limitations in this claim do not integrate the recited abstract ideas into a practical application for the following reasons. Claim 6 recites additional elements directed towards a graphic display and outputting on the graphic display, which, while not necessarily being abstract ideas, are insignificant extra-solution activity since they are merely data output (see MPEP §2106.05(g)). Moreover, these elements amount to outputting data in a computer-based system and are well-understood, routine, conventional activity. See MPEP 2106.05(d), subsection II.

Regarding claim 7, this claim simply further limits the abstract ideas recited in claim 4 as to the type of data being processed, and therefore is simply an extension of the abstract idea, without providing a practical application or significantly more.

Regarding claims 8–9, the limitations are merely directed towards further abstract ideas, specifically mathematical concepts per MPEP §2106.04(a)(2)(I).

Regarding claims 10–11, the limitations in these claims extend the insignificant extra-solution activity claimed in claim 4, as merely being data output (see MPEP §2106.05(g)). Moreover, these elements amount to outputting data in a computer-based system and are well-understood, routine, conventional activity. See MPEP 2106.05(d), subsection II. 
Regarding claim 12, the limitations of "the output indicating a linkage is superimposed on at least one scan image" recite an abstract idea as a mental process, as a doctor using a pen-and-paper aid would be capable of indicating a linkage and superimposition on a scan image.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Jeraj et al., US Patent Application Publication No. 
US 2022/0338805 A1 (herein "Jeraj") in view of Brynolfsson et al., US Patent Application Publication No. US 2023/0410985 A1 (herein "Brynolfsson2").

Regarding claim 2, with deficiencies of Jeraj noted in square brackets [], Jeraj teaches an apparatus for assessing treatment of a patient comprising (Jeraj Abstract, ¶¶5–6, apparatus for tracking disease progression and therapeutic response): an electronic computer executing a stored program to (Jeraj ¶44): (a) receive a set of at least three scans of tissue of the patient revealing diseased tissue, the three scans in the form of an array of digitized image values in an electronic format from a diagnostic imaging machine (Jeraj ¶43, patient imaged over at least two different scans of image data (digitized image values in an electronic format), then supplemented with scans from other scanners (diagnostic imaging machine)); (b) determine lesion volumes in the scan as assigned to identifiers (Jeraj ¶49, lesion mask representing volumes, with each voxel having a value of either 1 for lesion present or 0 for absence of a lesion); (c) determine an overlapping of lesion volumes between all pairs of scans of the set to provide a set of overlap measures for each pair of scans for each pair of identifiers (Jeraj ¶52, amounts of overlap for each lesion in the scans are compiled and recorded in a matrix); (d) link pairs of the identifiers of different scans to globally maximize the overlap measures of the set over all of the scans (Jeraj ¶¶56, 58, linear assignment (link) of lesions in one scan to lesions in another scan is solved such that the amount of overlap between corresponding lesions is globally maximized); (e) output a display indicating a lesion change identified to given linked lesions (Jeraj ¶¶58, 44, identification of corresponding lesions from among the scans is provided on a display 30 in the form of a chart); and (f) link lesion changes to particular organs for multiple organs and lesions (Jeraj ¶¶44–45, output indicating disease progression or regression based on measures from the scans, where the tracking can be of skin or brain lesions (skin and brain are organs), and where ¶58 teaches the changes of lesions indicated with "x" for disappearing lesions or "n" for appearing lesions); and wherein the electronic computer executing the stored program further identifies a set of different organs within the scans (Jeraj ¶¶44–45, output indicating disease progression or regression based on measures from the scans, where the tracking can be of skin or brain lesions (skin and brain are organs)) and wherein the output further identifies [an aggregate measure of] lesion changes for multiple lesions in a given organ linked to the given organ for each of the multiple organs (Jeraj ¶58, each lesion is identified and marked with an "x" for disappearing lesions or "n" for appearing lesions (change), where ¶¶44–45 teach indicating disease progression or regression based on measures from the scans, and where the tracking can be of skin or brain lesions (skin and brain are multiple organs)).

While Jeraj at least suggests that the output identifies some measure of a lesion change to one of the different organs for each of the different organs, Jeraj does not explicitly teach that the identification is "an aggregate measure of." Brynolfsson2 teaches an aggregate measure of (Brynolfsson2 ¶¶41, 81–82, identifying hotspots indicating lesions and calculating a set of values tracking a change (aggregate measure) of the tumor burden over time for a plurality of medical images, and causing the display of a graphical representation of the values). 
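The disputed "aggregate measure" limitation can be pictured as summing per-lesion changes grouped by their linked organ. A toy sketch, where the organ labels and volume deltas are hypothetical illustrations, not values from any cited reference:

```python
# Hypothetical per-lesion volume changes (mm^3) already linked to organs,
# sketching "an aggregate measure of lesion changes ... for each of the
# multiple organs" from claim 2.
changes = [("prostate", -120), ("prostate", -30), ("bone", +45)]

# Aggregate measure per organ: sum of linked lesion changes.
agg = {}
for organ, delta in changes:
    agg[organ] = agg.get(organ, 0) + delta
```

The per-organ sums (here, a net decrease for the prostate and a net increase for bone) are the kind of organ-linked aggregate the amendment recites.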
Therefore, taking the teachings of Jeraj and Brynolfsson2 together as a whole, it would have been obvious to a person having ordinary skill in the art (herein "PHOSITA") before the effective filing date of the claimed invention to have modified the image output of Jeraj to include the aggregate measures as disclosed in Brynolfsson2, at least because doing so would allow for informing clinical decision making, evaluating treatment efficacy, and predicting patient responses. See Brynolfsson2 Abstract.

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Brynolfsson et al., US Patent Application Publication No. US 2023/0115732 A1 (herein "Brynolfsson1") in view of Dzyubachyk et al., "Comparative exploration of whole-body MR through locally rigid transforms," Int J CARS, Springer, 2013 (herein "Dzyubachyk"), further in view of Brynolfsson2.

Regarding claim 2, with deficiencies noted in square brackets [], Brynolfsson1 teaches an apparatus for assessing treatment of a patient comprising (Brynolfsson1 Abstract, ¶5, systems to analyze 3D images for lesion classification, where such information is used by a physician to provide a recommended course of treatment to the patient and to track the progression of disease): an electronic computer executing a stored program to (Brynolfsson1 ¶284, environment for the disclosed system including application servers (electronic computer) with storage and retrieval capabilities, and software to process data): (a) receive a set of [at least three] scans of tissue of the patient revealing diseased tissue, [the three scans] in the form of an array of digitized image values in an electronic format from a diagnostic imaging machine (Brynolfsson1 Fig. 
7, ¶¶168, 201, 209, an anatomical CT image and a functional PET image forming a composite image pair are received, the images including pelvic lymph regions with hotspots that can be classified as a tumor (diseased tissue)); (b) determine lesion volumes in the scan as assigned to identifiers (Brynolfsson1 ¶¶171–173, 201, 203–204, Fig. 9A, regions and boundaries in the anatomical and functional images are identified from reference markers in a pelvic atlas image that identify particular sub-volumes within the pelvic image, including a sub-volume associated with a particular pelvic lymph sub-region); (c) determine an overlapping of lesion volumes between all pairs of scans of the set to provide a set of overlap measures for each pair of scans for each pair of identifiers (Brynolfsson1 ¶¶180, 183, the Dice score (overlap measure) is calculated as a performance metric to maximize (optimize) overlap between pelvic bone regions of a target segmentation map, which represents a target anatomical image, and pelvic bone regions of a pelvic atlas image (pair of identifiers)); (d) link pairs of the identifiers of different scans to globally maximize the overlap measures of the set over all of the scans (Brynolfsson1 ¶¶180, 183, optimizing alignment (globally maximize) by co-registering the pelvic atlas image pelvic bone regions with the corresponding (link) pelvic bone regions of a target segmentation map derived from a target anatomic image from the CT image (scan)); (e) output a display indicating a lesion [change] identified to given linked lesions (Brynolfsson1 ¶¶287, 204, computer system including a display, where hotspots located within a pelvic region can be overlaid on the functional image to identify corresponding volumes within the functional image, where a hotspot is identified as belonging to a potential lesion located within a particular one of the one or more pelvic lymph sub-regions, and where claim 16, element (e), teaches 
providing the transformed 3D pelvic atlas image); (f) link lesion [changes] to particular organs for multiple organs and lesions (Brynolfsson1 ¶¶138 and 144 teach analyzing PET and SPECT images to determine the particular organs in which a radiopharmaceutical, designed to bind to cancerous tissue in a variety of organs including the prostate, lungs, and bones, has accumulated, and ¶¶146, 151, and 154 disclose that the locations of hotspots indicative of accumulated radiopharmaceutical and a potential lesion are identified, displayed, and labeled as corresponding (linked) to a bone, lymph, or prostate (multiple organs)); wherein the electronic computer executing the stored program further identifies a set of different organs within the scans and wherein the output further identifies [an aggregate measure of lesion changes] for multiple lesions in a given organ linked to the given organ for each of the multiple organs (Brynolfsson1 ¶¶140, 144 teach binding agents administered to a patient for nuclear medicine imaging to facilitate imaging organs (different organs) and regions for evaluating metastatic prostate cancer, where the anatomical images are analyzed together with the nuclear medicine images to determine the organs the radiopharmaceutical has accumulated in).

While Brynolfsson1 teaches that a set of anatomical images is received, Brynolfsson1 does not explicitly teach that the set consists of at least three scans. Further, while Brynolfsson1 teaches lesion hotspots shown on the provided image and that the disclosed system includes a display, Brynolfsson1 does not explicitly teach the lesion change being indicated in the display, or lesion changes or an aggregate measure of lesion changes.

Dzyubachyk teaches receiving a set of at least three scans of tissue (Dzyubachyk page 642, Fig. 
6, visualization of changes between a baseline scan and three consecutive follow-up scans of lesions (tissue)), and the lesion changes being indicated in the display (Dzyubachyk pages 641–642, Figs. 5–6, color fusion in the displayed images simplifying visual assessment of changes between the baseline and follow-up scans). Brynolfsson2 teaches linking lesion changes, and an aggregate measure of lesion changes (Brynolfsson2 ¶¶41, 81–82, identifying hotspots indicating lesions and calculating a set of values tracking (linking) a change (aggregate measure) of the tumor burden over time for a plurality of medical images, and causing the display of a graphical representation of the values).

Therefore, taking the teachings of Brynolfsson1 and Dzyubachyk together as a whole, it would have been obvious to a person having ordinary skill in the art (herein "PHOSITA") to have modified the images processed and displayed in Brynolfsson1 to be at least three scans, with lesion change indicated in the display, as taught in Dzyubachyk, at least because doing so would greatly simplify visual assessment of changes for a radiologist, thus saving time (see Dzyubachyk Abstract, Fig. 6). Further, taking the teachings of Brynolfsson1 and Brynolfsson2 together as a whole, it would have been obvious to a PHOSITA before the effective filing date of the claimed invention to have modified the images processed and displayed in Brynolfsson1 to include the aggregate measures as disclosed in Brynolfsson2, at least because doing so would allow for informing clinical decision making, evaluating treatment efficacy, and predicting patient responses. See Brynolfsson2 Abstract.

Claims 4–12 are rejected under 35 U.S.C. 103 as being unpatentable over Brynolfsson1 in view of Dzyubachyk. 
Regarding claim 4, with deficiencies noted in square brackets [], Brynolfsson1 teaches an apparatus for assessing treatment of a patient comprising (Brynolfsson1 Abstract, ¶5, systems to analyze 3D images for lesion classification, where such information is used by a physician to provide a recommended course of treatment to the patient and to track the progression of disease): an electronic computer executing a stored program to (Brynolfsson1 ¶284, environment for the disclosed system including application servers (electronic computer) with storage and retrieval capabilities, and software to process data): (a) receive a set of [at least three] scans of tissue of the patient revealing diseased tissue, [the three scans] in the form of an array of digitized image values in an electronic format from a diagnostic imaging machine (Brynolfsson1 Fig. 7, ¶¶168, 201, 209, an anatomical CT image and a functional PET image forming a composite image pair are received, the images including pelvic lymph regions with hotspots that can be classified as a tumor (diseased tissue)); (b) determine lesion volumes in the scan as assigned to identifiers (Brynolfsson1 ¶¶171–173, 201, 203–204, Fig. 
9A, regions and boundaries in the anatomical and functional images are identified from reference markers in a pelvic atlas image that identify particular sub-volumes within the pelvic image, including a sub-volume associated with a particular pelvic lymph sub-region); (c) determine an overlapping of lesion volumes between all pairs of scans of the set to provide a set of overlap measures for each pair of scans for each pair of identifiers (Brynolfsson1 ¶¶180, 183, the Dice score (overlap measure) is calculated as a performance metric to maximize (optimize) overlap between pelvic bone regions of a target segmentation map, which represents a target anatomical image, and pelvic bone regions of a pelvic atlas image (pair of identifiers)); (d) link pairs of the identifiers of different scans to globally maximize the overlap measures of the set over all of the scans (Brynolfsson1 ¶¶180, 183, optimizing alignment (globally maximize) by co-registering the pelvic atlas image pelvic bone regions with the corresponding (link) pelvic bone regions of a target segmentation map derived from a target anatomic image from the CT image (scan)); (e) output a display indicating a lesion [change] identified to given linked lesions (Brynolfsson1 ¶¶287, 204, computer system including a display, where hotspots located within a pelvic region can be overlaid on the functional image to identify corresponding volumes within the functional image, where a hotspot is identified as belonging to a potential lesion located within a particular one of the one or more pelvic lymph sub-regions, and where claim 16, element (e), teaches providing the transformed 3D pelvic atlas image); and further including a graphic display, wherein the electronic computer executing the stored program further outputs a graphic display providing a graphically depicted link (Brynolfsson1 ¶¶154–155, 160, Fig. 
4, each segmented hotspot (of a lesion) within a hotspot map is labeled, for example CIR, CIL, OBR, OBL) between representations of lesions of different scans of the patient (Brynolfsson1 ¶¶28, 116, transformed 3D pelvic images comprising the identified one or more pelvic lymph sub-regions aligned to the 3D anatomical images and segmentation thereof, and providing the transformed 3D pelvic atlas images to a display).

While Brynolfsson1 teaches that a set of anatomical images is received, Brynolfsson1 does not explicitly teach that the set consists of at least three scans. Further, while Brynolfsson1 teaches a transformed image with lesion hotspots being provided and that the disclosed system includes a display, Brynolfsson1 does not explicitly teach the lesion change being indicated in the display.

Dzyubachyk teaches receiving a set of at least three scans of tissue (Dzyubachyk page 642, Fig. 6, visualization of changes between a baseline scan and three consecutive follow-up scans of lesions (tissue)), and the lesion change being indicated in the display (Dzyubachyk pages 641–642, Figs. 5–6, color fusion in the displayed images simplifying visual assessment of changes between the baseline and follow-up scans).

Therefore, taking the teachings of Brynolfsson1 and Dzyubachyk together as a whole, it would have been obvious to a PHOSITA to have modified the images processed and displayed in Brynolfsson1 to be at least three scans, with lesion change indicated in the display, as taught in Dzyubachyk, at least because doing so would greatly simplify visual assessment of changes for a radiologist, thus saving time (see Dzyubachyk Abstract, Fig. 6). 
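The Dice score relied on from Brynolfsson1 ¶¶180, 183 as the overlap measure is a standard set-overlap metric, 2|A∩B| / (|A| + |B|). A minimal sketch over toy voxel sets (the example segmentations are hypothetical):

```python
def dice(a, b):
    """Dice score between two voxel sets: 2*|A ∩ B| / (|A| + |B|)."""
    if not a and not b:
        return 1.0  # convention: two empty regions overlap perfectly
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical 3-voxel segmentations sharing two voxels.
seg = {(0, 0), (0, 1), (1, 1)}
atlas = {(0, 1), (1, 1), (1, 0)}
score = dice(seg, atlas)  # 2*2 / (3+3) = 2/3
```

A score of 1.0 means identical regions and 0.0 means no shared voxels, which is why maximizing Dice drives the co-registration described in the reference.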
Regarding claim 5, Brynolfsson1 does not explicitly teach, but Dzyubachyk teaches, wherein the electronic computer executing the stored program further: receives input from a user to alter the linking of (d) and after that alteration repeats (e) (Dzyubachyk pages 640–641, the user interface allows users to click on a displayed image and align (linking) the follow-up image to the baseline using the locally rigid transform estimation surrounding the clicked point, where the images include overlay components including an uncertainty-contours overlay). Therefore, taking the teachings of Brynolfsson1 and Dzyubachyk together as a whole, it would have been obvious to a PHOSITA to have modified the images processed and displayed in Brynolfsson1 to allow for user input resulting in changes in the display as taught in Dzyubachyk, at least because doing so would greatly simplify visual assessment of changes for a radiologist, thus saving time (see Dzyubachyk Abstract, Fig. 6).

Regarding claim 6, Brynolfsson1 does not explicitly teach, but Dzyubachyk teaches, wherein the electronic computer executing the stored program further outputs uncertainty values on the graphics display associated with the linkage (Dzyubachyk page 641, an overlay on the displayed images including uncertainty contours which indicate accuracy of the estimation of the organ structures). Therefore, taking the teachings of Brynolfsson1 and Dzyubachyk together as a whole, it would have been obvious to a PHOSITA to have modified the images processed and displayed in Brynolfsson1 to include the uncertainty contours in the display as taught in Dzyubachyk, at least because doing so would greatly simplify visual assessment of changes for a radiologist, thus saving time (see Dzyubachyk Abstract, Fig. 6). 
Regarding claim 7, Brynolfsson1 teaches wherein the lesion volumes are dilations of lesion images (Brynolfsson1 ¶¶190–191, 193, 201, multiple registration transformations are performed, including a coarse registration transformation and a fine registration transformation, of pelvic regions including lesions, the fine registration being determined using a second, different resolution (dilation)).

Regarding claim 8, Brynolfsson1 teaches wherein the electronic computer executing the stored program further clusters lesions in a given image to present a combined lesion volume at (b) (Brynolfsson1 ¶¶101, 340, processing of 3D images to identify cancerous lesions, including image segmentation based on identification of clusters of voxels connected to each other in an n-component fashion having intensity values above a threshold, thereby defining a segmented volume for the hotspot (combined lesion)).

Regarding claim 9, Brynolfsson1 teaches wherein the clustering is according to a distance derived from an overlapping lesion from an other scan of the pair (Brynolfsson1 ¶¶282–291, in generating the hotspot map identifying cancerous lesions, matching hotspot volumes (overlapping) are identified based on proximity, such as centers of gravity within a threshold distance).

Regarding claim 10, Brynolfsson1 does not explicitly teach, but Dzyubachyk teaches, wherein the output characterizes the lesions as appearing or disappearing (Dzyubachyk pages 641–642, color fusion view depicts areas that are decreasing (disappearing) with an orange color, and areas that are increasing (appearing) with a blue color). 
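Claim 10's appearing/disappearing characterization (and the "x"/"n" marks Jeraj ¶58 is cited for) reduces to set differences over the linked lesion identifiers of consecutive scans. A toy sketch with hypothetical lesion IDs:

```python
# Hypothetical lesion identifiers surviving the linking step for a baseline
# scan and a follow-up scan.
baseline = {"L1", "L2", "L3"}
followup = {"L2", "L3", "L4"}

# Lesions present at baseline but unmatched in the follow-up disappeared
# (Jeraj's "x"); lesions unmatched in the baseline appeared (Jeraj's "n").
disappearing = baseline - followup
appearing = followup - baseline
```

Here L1 would be flagged as disappearing and L4 as appearing, while L2 and L3 are stable linked lesions.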
Therefore, taking the teachings of Brynolfsson1 and Dzyubachyk together as a whole, it would have been obvious to a PHOSITA to have modified the images processed and displayed in Brynolfsson1 to include the color fusion view in the display as taught in Dzyubachyk, at least because doing so would greatly simplify visual assessment of changes for a radiologist, thus saving time (see Dzyubachyk Abstract, Fig. 6).

Regarding claim 11, Brynolfsson1 teaches wherein the output indicates lesion volume (Brynolfsson1 ¶151: volumes within the functional image with the segmentation map overlaid to identify volumes to classify hotspots). Brynolfsson1 does not explicitly teach, but Dzyubachyk teaches, and change in other lesion measurements between scans (Dzyubachyk page 64: the color fusion view depicts areas that are decreasing with an orange color and areas that are increasing with a blue color (change in other lesion measurements)). Therefore, taking the teachings of Brynolfsson1 and Dzyubachyk together as a whole, it would have been obvious to a PHOSITA to have modified the images processed and displayed in Brynolfsson1 to include the color fusion view in the display as taught in Dzyubachyk, at least because doing so would greatly simplify visual assessment of changes for a radiologist, thus saving time (see Dzyubachyk Abstract, Fig. 6).

Regarding claim 12, Brynolfsson1 teaches wherein the output indicating a linkage is superimposed on at least one scan image (Brynolfsson1 ¶¶28, 116: transformed 3D pelvic images comprising the identified one or more pelvic lymph sub-regions aligned (linkage) to the 3D anatomical images and segmentation thereof, and providing the transformed 3D pelvic atlas images (superimposing) to a display).

Claims 4 and 7–12 are rejected under 35 U.S.C. 103 as being unpatentable over Jeraj in view of Brynolfsson1.
Regarding claim 4, Jeraj teaches an apparatus for assessing treatment of a patient comprising (Jeraj Abstract, ¶¶5–6: apparatus for tracking disease progression and therapeutic response): an electronic computer executing a stored program to (Jeraj ¶44): (a) receive a set of at least three scans of tissue of the patient revealing diseased tissue, the three scans in the form of an array of digitized image values in an electronic format from a diagnostic imaging machine (Jeraj ¶43: the patient is imaged in at least two different scans of image data (digitized image values in an electronic format), then supplemented with scans from other scanners (diagnostic imaging machine)); (b) determine lesion volumes in the scan as assigned to identifiers (Jeraj ¶49: a lesion mask representing volumes, with each voxel having a value of either 1 for a lesion present or 0 for the absence of a lesion); (c) determine an overlapping of lesion volumes between all pairs of scans of the set to provide a set of overlap measures for each pair of scans for each pair of identifiers (Jeraj ¶52: amounts of overlap for each lesion in the scans are compiled and recorded in a matrix); (d) link pairs of the identifiers of different scans to globally maximize the overlap measures of the set over all of the scans (Jeraj ¶¶56, 58: a linear assignment (link) of lesions in one scan to lesions in another scan is solved such that the amount of overlap between corresponding lesions is globally maximized); and (e) output a display indicating a lesion change identified to given linked lesions (Jeraj ¶¶44, 58: identification of corresponding lesions from among the scans is provided on a display 30 in the form of a chart); and further including a graphic display (Jeraj ¶44: graphics display 30) and wherein the electronic computer executing the stored program further outputs a graphic display between representations of lesions of different scans of the patient (Jeraj ¶58: each lesion is identified from corresponding lesions in the scans, with an "x" for disappearing lesions or "n" for appearing lesions and a circle for corresponding lesions (linkage); additionally, shading can be applied to the patient image with respect to an increase or decrease in lesion volume between scans 16a and 16b).

While Jeraj teaches a circle indication for corresponding lesions, Jeraj does not teach "providing a graphically depicted link" as claimed. Brynolfsson1 teaches providing a graphically depicted link (Brynolfsson1 ¶¶154–155, 160, fig. 4: each segmented hotspot (of a lesion) within a hotspot map is labeled, for example CIR, CIL, OBR, OBL). Therefore, taking the teachings of Jeraj and Brynolfsson1 together as a whole, it would have been obvious to a PHOSITA before the effective filing date of the claimed invention to have modified the visual display taught in Jeraj to include the graphical labels as disclosed in Brynolfsson1, at least because doing so would help determine predictions of cancer status, progression, and response to treatment. Brynolfsson1 ¶151.

Regarding claim 7, Jeraj teaches wherein the lesion volumes are dilations of lesion images (Jeraj ¶¶50–51: identified lesions are dilated, or expanded, to create an expanded region about the lesions).

Regarding claim 8, Jeraj teaches wherein the electronic computer executing the stored program further clusters lesions in a given image to present a combined lesion volume at (b) (Jeraj ¶¶53–54: the possibility of merged lesions is addressed through a clustering operation in which determined individual lesions are clustered to be a single logical lesion).

Regarding claim 9, Jeraj teaches wherein the clustering is according to a distance derived from an overlapping lesion from an other scan of the pair (Jeraj ¶54: clustering based on the distance d between centroids of the determined individual lesions).
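Elements (c) and (d) of claim 4, as the Office Action maps them to Jeraj, amount to building a pairwise overlap matrix and solving a linear assignment problem over it so that total overlap between linked lesions is maximized. A minimal stdlib-only sketch of that idea follows; the name `link_lesions` is invented for illustration, and the brute-force permutation search stands in for a real assignment solver (e.g., the Hungarian algorithm), which any practical implementation would use instead.

```python
from itertools import permutations

def link_lesions(overlap):
    """Link lesion identifiers of two scans to globally maximize overlap.

    overlap[i][j] is the overlap measure between lesion i of scan A and
    lesion j of scan B (the per-pair matrix the OA attributes to Jeraj
    para. 52). Assumes a square matrix, i.e., equal lesion counts in
    both scans. Returns (links, total) where links is a list of
    (lesion_A, lesion_B) pairs and total is the maximized overlap sum.
    """
    n = len(overlap)
    best_total, best_links = -1, None
    # Exhaustive search over all one-to-one assignments; tractable only
    # for the handful of lesions typical per scan.
    for perm in permutations(range(n)):
        total = sum(overlap[i][perm[i]] for i in range(n))
        if total > best_total:
            best_total, best_links = total, list(enumerate(perm))
    return best_links, best_total

# Toy 3-lesion example: one overlap matrix for one pair of scans.
links, total = link_lesions([[12, 0, 3],
                             [1, 20, 0],
                             [0, 2, 9]])
print(links, total)
```

Extending this per-pair step to "all pairs of scans of the set," as element (c) requires for three or more scans, would mean running the same assignment over each scan pair's matrix and reconciling the resulting links globally.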
Regarding claim 10, Jeraj teaches wherein the output characterizes the lesions as appearing or disappearing (Jeraj ¶58: each lesion is identified from corresponding lesions in the scans, with an "x" for disappearing lesions or "n" for appearing lesions).

Regarding claim 11, Jeraj teaches wherein the output indicates lesion volume and change in other lesion measurements between scans (Jeraj ¶¶49, 58: a lesion mask representing volumes, with each voxel having a value of either 1 for a lesion present or 0 for the absence of a lesion, and each lesion is identified from corresponding lesions in the scans, with an "x" for disappearing lesions or "n" for appearing lesions (change)).

Regarding claim 12, Jeraj teaches wherein the output indicating a linkage is superimposed on at least one scan image (Jeraj ¶58: each lesion is identified from corresponding lesions in the scans, with an "x" for disappearing lesions or "n" for appearing lesions (linkage) and a circle for corresponding lesions, where fig. 5 teaches that the indications are marked (superimposed) on the image 66 of the patient (at least one scan image)).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MICHELLE M KOETH, whose telephone number is (571) 272-5908. The examiner can normally be reached Monday–Thursday, 09:00–17:00, and Friday, 09:00–13:00, EDT/EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Vincent Rudolph, can be reached at 571-272-8243. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/MICHELLE M KOETH/
Primary Examiner, Art Unit 2671

Prosecution Timeline

Apr 11, 2023
Application Filed
Jun 23, 2025
Non-Final Rejection — §101, §103
Sep 22, 2025
Response Filed
Oct 06, 2025
Final Rejection — §101, §103
Dec 23, 2025
Request for Continued Examination
Jan 18, 2026
Response after Non-Final Action
Feb 17, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586221
METHOD AND APPARATUS FOR ESTIMATING DEPTH INFORMATION OF IMAGES
2y 5m to grant Granted Mar 24, 2026
Patent 12579651
IMPEDED DIFFUSION FRACTION FOR QUANTITATIVE IMAGING DIAGNOSTIC ASSAY
2y 5m to grant Granted Mar 17, 2026
Patent 12567241
Method For Generating Training Data Used To Learn Machine Learning Model, System, And Non-Transitory Computer-Readable Storage Medium Storing Computer Program
2y 5m to grant Granted Mar 03, 2026
Patent 12567177
METHOD, ELECTRONIC DEVICE, AND COMPUTER PROGRAM PRODUCT FOR IMAGE PROCESSING
2y 5m to grant Granted Mar 03, 2026
Patent 12566493
METHODS AND SYSTEMS FOR EYE-GAZE LOCATION DETECTION AND ACCURATE COLLECTION OF EYE-GAZE DATA
2y 5m to grant Granted Mar 03, 2026
Based on the 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
77%
Grant Probability
94%
With Interview (+16.7%)
2y 4m
Median Time to Grant
High
PTA Risk
Based on 429 resolved cases by this examiner. Grant probability derived from career allow rate.
