DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 07/28/2025 has been entered.
Response to Amendment
This action is in response to the remarks filed on 07/28/2025.
The amendments filed on 07/28/2025 have been entered. Accordingly, claims 1-2, 4-18, and 20-29 remain pending. Independent claims 1, 18, and 27 are presently amended.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is: “lesion matching engine” in claim 1 and all dependents thereof.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
A review of the specification shows that the following appears to be the corresponding structure described in the specification for the 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph limitation:
Regarding the “lesion matching engine,” the specification discloses “the computing system 102 also includes a tissue deformation model 107 and a lesion matching engine 109” ([0027] of the US PG Pub. version of the specification). Therefore, the means-plus-function limitation of “lesion matching engine” has been interpreted as a computing system/processor (the associated algorithm is provided in prose at least in [0038], [0045], and [0086]), or any equivalents thereof in light of the specification.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 1-2, 4-18, and 20-29 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
Amended independent claim 1 recites the limitation “the architectural map comprising a confidence level indicator providing a likelihood that the target lesion and the potential lesion are a same lesion”. Amended independent claims 18 and 27 recite analogous limitations. However, neither the original claims, original drawings, nor the specification as originally filed provides support for this limitation. Applicant has indicated, in the Remarks filed 07/28/2025, that support for this amendment can allegedly be found in the drawings at Figure 10 and in the specification at least at paragraph [0098]. However, Figure 10 and its corresponding description in paragraph [0098] provide support for a confidence level indicator (610) being displayed on a GUI (130) that displays an x-ray image (602) and an ultrasound image (604). The x-ray image and ultrasound image display is distinct from the architectural map (362) which is disclosed in Figure 8 and its corresponding description. As such, claims 1-2, 4-18, and 20-29 contain new matter.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-2, 6, 8, 9, 16-18, 22, 24, and 25 are rejected under 35 U.S.C. 103 as being unpatentable over Tsymbalenko (US 2021/0212665, filed January 13, 2020) in view of Rijken et al. (US 2021/0248744, corresponding PCT filed June 14, 2019), Venkataraman et al. (US 2018/0008236, January 11, 2018; applicant-submitted prior art via the IDS; hereinafter “Venkataraman”), and Dascal et al. (US 2014/0270436, September 18, 2014).
Regarding claim 1, Tsymbalenko discloses a method of mapping a region of interest within a breast (“Methods and systems are provided for automatically characterizing lesions in ultrasound images.” Abstract; also see “characterizing a lesion, such as a breast lesion” [0010]), the method comprising:
during a first imaging procedure, capturing first diagnostic medical images of breast tissue (“method 300 proceeds to 304 to obtain a B-mode image that includes a region of interest (ROI). The ROI may be a lesion or another anatomical feature of interest, such as a thyroid” [0039]; also see “ROI, such as a breast lesion” [0036]);
identifying, in the first diagnostic medical images, a region of interest within the breast (“At 310, the B-mode image and the processed elastography image are entered as inputs to an A/B ratio model. The A/B ratio model (e.g., A/B ratio model 208) may include one or more a deep learning/machine learning models trained to identify the ROI/anatomical feature of interest in the B-mode image and in the elastography image.” [0044]);
measuring a vascularity of the breast tissue including the region of interest (“the one or more modules may process color Doppler data [vascularity], which may include traditional color flow Doppler, power Doppler, HD flow, and the like.” [0019]);
measuring a stiffness of the breast tissue including the region of interest (“While in the strain mode, the elastography circuit 103 may control the probe 106 to generate a mechanical force (e.g., surface vibration, freehand or step quasi-static surface displacement, or the like) or radiation force on the patient or ROI to measure the stiffness or strain of the ROI of the patient.” [0021]);
saving the first diagnostic medical images and the stiffness as an architectural map of the region of interest (“The elastography image may include color or grayscale elastography information indicative of measured tissue stiffness, and the elastography information may be displayed as an overlay on a B-mode image.” [0041]) in an electronic record (“Non-transitory memory 206 may further store ultrasound image data 212, such as ultrasound images captured by the ultrasound imaging system 100 of FIG. 1. The ultrasound image data 212 may include both B-mode images and elastography images (whether obtained using shear-wave elastography or strain elastography). Further, ultrasound image data 212 may store ultrasound images, ground truth output, iterations of machine learning model output, and other types of ultrasound image data that may be used to train the A/B ratio model 208, when training module 210 is stored in non-transitory memory 206.” [0031]).
Although Tsymbalenko discloses identifying the region of interest, as stated above, Tsymbalenko fails to disclose wherein identifying the region of interest in the first diagnostic medical images comprises: receiving, at a lesion matching engine, a previously-captured set of images including a target lesion corresponding to the region of interest; receiving, at the lesion matching engine, the first diagnostic medical images, wherein the first diagnostic medical images include a potential lesion; analyzing the previously-captured set of images and the first diagnostic medical images; and determining that the potential lesion is the same as the target lesion.
However, Rijken teaches, in the same field of endeavor, wherein identifying the region of interest in the first diagnostic medical images (“lesion of interest” [0020]) comprises: receiving, at a lesion matching engine (“computer” [0061]), a previously-captured set of images including a target lesion corresponding to the region of interest (“one or more mammograms” in training steps [0026]); receiving, at the lesion matching engine, the first diagnostic medical images, wherein the first diagnostic medical images include a potential lesion (“By receiving multiple input images, radiologists can perform case-wise analysis for patients and substantially determine the likelihood of a malignant lesion after analysing multiple mammographic views.” [0010]); analyzing the previously-captured set of images (“The machine learning algorithm analyses the training data” [0065]) and the first diagnostic medical images (“the step of performing the first analysis on the plurality of mammograms is conducted using one or more trained convolutional neural network classifier” [0012]; also see [0017]); and determining that the potential lesion is the same as the target lesion (“The mammograms, pre-processed or not, are then fed into a convolutional neural network (CNN) classifier 30 which has been trained to analyse the images and assess whether the image shows a malignant lesion.” [0043]; also see [0046], [0062]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko with wherein identifying the region of interest in the first diagnostic medical images comprises: receiving, at a lesion matching engine, a previously-captured set of images including a target lesion corresponding to the region of interest; receiving, at the lesion matching engine, the first diagnostic medical images, wherein the first diagnostic medical images include a potential lesion; analyzing the previously-captured set of images and the first diagnostic medical images; and determining that the potential lesion is the same as the target lesion as taught by Rijken in order to provide a trained neural network that is representative of real-world data ([0065] of Rijken).
Although Tsymbalenko discloses measuring vascularity and saving the images and stiffness as an architectural map of the target site, as shown above, Tsymbalenko fails to explicitly disclose saving the first diagnostic medical images, vascularity, and stiffness as an architectural map of the region of interest.
However, Venkataraman teaches, in the same field of endeavor, saving the first diagnostic medical images, vascularity, and stiffness as an architectural map of the region of interest (“As shown in FIG. 7A, in a first rotational scan a standard B-Mode ultrasound image is acquired. In the present example, the image is of a patient's prostate. After the first image set is acquired, the probe is re-rotated using a second imagining modality. In the present example, an elastography ultrasound is performed during the second scan. See FIG. 7B. As shown in the exemplary scan, a number of areas in the scan (A, B and C) having an elasticity above or below a predetermined threshold are illustrated. After the elastography image is obtained, the probe may again be re-rotated using a third imaging modality. In the present example, a Doppler image is performed during the third scan to identify areas (1, 2 and 3) of blood flow above or below a predetermined threshold. Additional or different scans may be performed. FIG. 7D illustrates the registration of the three images into a mpUS image. This image with multiple modes of information may then be analyzed to identify potential regions of interest.” [0098]; also see Figs. 7A-7D, reproduced below, and corresponding descriptions).
[Figs. 7A-7D of Venkataraman reproduced here (media_image1.png, greyscale)]
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above with saving the first diagnostic medical images, vascularity, and stiffness as an architectural map of the region of interest as taught by Venkataraman in order to provide enhanced detection of suspicious regions by providing an image with multiple modes of information (Abstract and [0098] of Venkataraman).
Tsymbalenko also fails to disclose during a second imaging procedure, capturing at least one second diagnostic medical image of the breast tissue within the breast; receiving scan information associated with the at least one second diagnostic medical image; accessing the architectural map of the region of interest in the electronic record; and analyzing the scan information associated with the at least one second diagnostic medical image to identify the region of interest within the at least one second diagnostic medical image based on the architectural map.
However, Venkataraman further teaches, in the same field of endeavor, during a second imaging procedure, capturing at least one second diagnostic medical image of the breast tissue within the breast (“real-time image” [0053]); receiving scan information associated with the at least one second diagnostic medical image (“the image(s) taken by the probe 10 are output from the ultrasound imaging system 16 to an image registration system 2” [0054]); accessing the architectural map of the region of interest in the electronic record (“The mpUS images [architectural map] (i.e., 2D or 3D) may be used for enhanced and/or automated detection of one or more suspicious regions. After identifying one or more suspicious regions, the mpUS images may be utilized with a real-time image to guide biopsy or therapy the region(s).” [0053]); and analyzing the scan information associated with the at least one second diagnostic medical image to identify the region of interest within the at least one second diagnostic medical image based on the architectural map (“See FIG. 10. Of note in FIG. 10, rather than showing a live ultrasound image (e.g., B-mode) registered with the mpUS image, the real-time image simply includes a target point ROI that was identified from the mpUS image. In this example, the system may use a real time image or volume to provide guidance for an introducer (e.g., needle, trocar etc.) of a targeted focal therapy (TFT) device.” [0110]; also see [0104]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko with during a second imaging procedure, capturing at least one second diagnostic medical image of the breast tissue within the breast; receiving scan information associated with the at least one second diagnostic medical image; accessing the architectural map of the region of interest in the electronic record; and analyzing the scan information associated with the at least one second diagnostic medical image to identify the region of interest within the at least one second diagnostic medical image based on the architectural map as taught by Venkataraman in order to provide real time guidance to a target region ([0110] of Venkataraman).
Tsymbalenko fails to disclose, as best understood in light of the 35 U.S.C. 112(a) rejection above, the architectural map comprising a confidence level indicator providing a likelihood that the target lesion and the potential lesion are a same lesion.
However, Dascal teaches, in the same field of endeavor, the architectural map comprising a confidence level indicator providing a likelihood that the target lesion and the potential lesion are a same lesion (“Generating a confidence score/figure of merit (FOM) is performed using one or more software modules 1501. In one embodiment, the confidence score or (FOM) is provided to a user by graphical representation on a computer monitor, for example by providing a color-code on the X-ray or OCT image indicating regions of the OCT pullback that have high or low confidence of being co-registered. Regions of low confidence may, for example, be indicated by a red strip or bar on the X-ray image near the vessel segment where low FOM values were obtained. The FOM/Score reflects a confidence measure in the returned results. The score is in the range of [0, 1] where 0 reflects the lowest confidence and 1 reflects the highest. A FOM threshold value can be selected to define a boundary between high confidence and low confidence co-registration results.” [0114]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko with the architectural map comprising a confidence level indicator providing a likelihood that the target lesion and the potential lesion are a same lesion as taught by Dascal in order to allow for user input or intervention when the result is not acceptable ([0157] of Dascal).
Regarding claim 2, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 1 as stated above and Tsymbalenko further discloses further comprising recording spectral parameters of the breast tissue and saving the spectral parameters with the architectural map (“data may be processed in different mode-related modules by the processor 116 (e.g., B-mode, Color Doppler, M-mode, Color M-mode, spectral Doppler, Elastography, TVI, strain, strain rate, and the like) to form 2D or 3D data. For example, one or more modules may generate B-mode, color Doppler, M-mode, color M-mode, spectral Doppler, Elastography, TVI, strain, strain rate, and combinations thereof, and the like. As one example, the one or more modules may process color Doppler data, which may include traditional color flow Doppler, power Doppler, HD flow, and the like. The image lines and/or frames are stored in memory and may include timing information indicating a time at which the image lines and/or frames were stored in memory.” [0019]).
Regarding claim 6, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 1 as stated above. Tsymbalenko fails to disclose further comprising presenting, on a display, visual guidance to the region of interest based on a current position of an ultrasound probe.
However, Venkataraman further teaches, in the same field of endeavor, presenting, on a display, visual guidance to the region of interest based on a current position of an ultrasound probe (“FIG. 11 shows one biopsy therapy device that is incorporated with the probe 10. As shown, a cradle assembly 40, which connects to the positioning system, supports the probe 10 during image acquisition. Such a cradle assembly is set forth in co-pending U.S. patent application Ser. No. 15/203,417, which is incorporated by reference in its entirety. The cradle assembly 40 includes a guide assembly 50, which supports a biopsy needle or therapy delivery trocar 90 within a plane of the probe. Along these lines, once a region of interest is identified, the probe may be rotated to align with the ROI and the guide assembly rotates to align a trajectory of the needle/trocar 90 with the ROI. The needle/trocar may then be advanced to the ROI under real time guidance. See FIG. 10. Of note in FIG. 10, rather than showing a live ultrasound image (e.g., B-mode) registered with the mpUS image, the real-time image simply includes a target point ROI that was identified from the mpUS image. In this example, the system may use a real time image or volume to provide guidance for an introducer (e.g., needle, trocar etc.) of a targeted focal therapy (TFT) device.” [0110]; also see Fig. 11 and corresponding description).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko with further comprising presenting, on a display, visual guidance to the region of interest based on a current position of an ultrasound probe as taught by Venkataraman in order to more easily align a device with a region of interest that requires treatment ([0110] of Venkataraman).
Regarding claim 8, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 1 as stated above and Tsymbalenko further discloses wherein the first diagnostic medical images of breast tissue are captured using ultrasound (“The ultrasound imaging system 100 includes a transmit beamformer 101 and a transmitter 102 that drives elements (e.g., transducer elements) 104 within a transducer array, herein referred to as probe 106, to emit pulsed ultrasonic signals (referred to herein as transmit pulses) into a body (not shown).” [0014]; also see [0019]).
Regarding claim 9, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 8 as stated above and Tsymbalenko further discloses wherein the ultrasound performs B-Mode imaging (“In various embodiments of the present invention, data may be processed in different mode-related modules by the processor 116 (e.g., B-mode, Color Doppler, M-mode, Color M-mode, spectral Doppler, Elastography, TVI, strain, strain rate, and the like) to form 2D or 3D data.” [0019]).
Regarding claim 16, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 1 as stated above and Tsymbalenko further discloses wherein stiffness is measured using elastography (“the elastography circuit 103 may control the probe 106 to generate a mechanical force (e.g., surface vibration, freehand or step quasi-static surface displacement, or the like) or radiation force on the patient or ROI to measure the stiffness or strain of the ROI of the patient.” [0021]).
Regarding claim 17, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 16 as stated above and Tsymbalenko further discloses wherein the elastography is shear-wave elastography (“The ultrasound imaging system 100 includes an elastography circuit 103 configured to enable shear-wave and/or stain elastography imaging. While in the shear-wave mode, the elastography circuit 103 may control the probe 106 to generate a shear wave at a site within a region of interest (ROI) of an imaging subject (e.g., a patient).” [0020]).
Regarding claim 18, Tsymbalenko discloses a system for mapping a region of interest within a breast (“Methods and systems are provided for automatically characterizing lesions in ultrasound images.” Abstract; also see “characterizing a lesion, such as a breast lesion” [0010]), the system comprising:
at least one data store (ultrasound image data 212 in Fig. 2 and corresponding description);
a processor (processor 116 in Fig. 1 and corresponding description; processor 204 in Fig. 2 and corresponding description); and
a memory storing instructions that, when executed by the processor, facilitate performance of operations (memory 120 in Fig. 1 and corresponding description; non-transitory memory 206 in Fig. 2 and corresponding description; also see [0027]), comprising:
mapping the region of interest within the breast (“Methods and systems are provided for automatically characterizing lesions in ultrasound images.” Abstract; also see “characterizing a lesion, such as a breast lesion” [0010]) by:
recording at least one first image of the region of interest using diagnostic medical imaging (“method 300 proceeds to 304 to obtain a B-mode image that includes a region of interest (ROI). The ROI may be a lesion or another anatomical feature of interest, such as a thyroid” [0039]; also see “ROI, such as a breast lesion” [0036]), wherein:
the at least one first image is captured during a first imaging procedure (obtain B-mode image including ROI, step 304 in Fig. 3 and corresponding description), and
the region of interest is identified in the at least one first image (“At 310, the B-mode image and the processed elastography image are entered as inputs to an A/B ratio model. The A/B ratio model (e.g., A/B ratio model 208) may include one or more a deep learning/machine learning models trained to identify the ROI/anatomical feature of interest in the B-mode image and in the elastography image.” [0044]);
measuring a vascularity of the region of interest (“As one example, the one or more modules may process color Doppler data, which may include traditional color flow Doppler, power Doppler, HD flow, and the like.” [0019]);
measuring a density of the region of interest (“While in the strain mode, the elastography circuit 103 may control the probe 106 to generate a mechanical force (e.g., surface vibration, freehand or step quasi-static surface displacement, or the like) or radiation force on the patient or ROI to measure the stiffness or strain of the ROI of the patient.” [0021]; Examiner notes that applicant’s instant disclosure, in paragraph [0025] of the pre-grant publication of the instant specification, states “The stiffness, or density, can be measured with SHEARWAVE™ elastography, which outputs a color map representing relative values of stiffness throughout the tissue volume”); and
saving the at least one first image and the density as an architectural map (“The elastography image may include color or grayscale elastography information indicative of measured tissue stiffness, and the elastography information may be displayed as an overlay on a B-mode image.” [0041]) in an electronic record associated with the breast (“Non-transitory memory 206 may further store ultrasound image data 212, such as ultrasound images captured by the ultrasound imaging system 100 of FIG. 1. The ultrasound image data 212 may include both B-mode images and elastography images (whether obtained using shear-wave elastography or strain elastography). Further, ultrasound image data 212 may store ultrasound images, ground truth output, iterations of machine learning model output, and other types of ultrasound image data that may be used to train the A/B ratio model 208, when training module 210 is stored in non-transitory memory 206.” [0031]; also see “The image lines and/or frames are stored in memory and may include timing information indicating a time at which the image lines and/or frames were stored in memory. The modules may include, for example, a scan conversion module to perform scan conversion operations to convert the acquired images from beam space coordinates to display space coordinates.” [0019]).
Although Tsymbalenko discloses identifying the region of interest, as stated above, Tsymbalenko fails to disclose wherein identifying the at least one region of interest in the at least one first image comprises: receiving, at a lesion matching engine, a previously-captured at least one image including a target lesion corresponding to the region of interest; receiving, at the lesion matching engine, the at least one first image, wherein the at least one first image includes a potential lesion; analyzing the previously-captured at least one image and the at least one first image; and determining that the potential lesion is the same as the target lesion.
However, Rijken teaches, in the same field of endeavor, wherein identifying the at least one region of interest in the at least one first image (“lesion of interest” [0020]) comprises: receiving, at a lesion matching engine (“computer” [0061]), a previously-captured at least one image including a target lesion corresponding to the region of interest (“one or more mammograms” in training steps [0026]); receiving, at the lesion matching engine, the at least one first image, wherein the at least one first image includes a potential lesion (“By receiving multiple input images, radiologists can perform case-wise analysis for patients and substantially determine the likelihood of a malignant lesion after analysing multiple mammographic views.” [0010]); analyzing the previously-captured at least one image (“The machine learning algorithm analyses the training data” [0065]) and the at least one first image (“the step of performing the first analysis on the plurality of mammograms is conducted using one or more trained convolutional neural network classifier” [0012]; also see [0017]); and determining that the potential lesion is the same as the target lesion (“The mammograms, pre-processed or not, are then fed into a convolutional neural network (CNN) classifier 30 which has been trained to analyse the images and assess whether the image shows a malignant lesion.” [0043]; also see [0046], [0062]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko with wherein identifying the at least one region of interest in the at least one first image comprises: receiving, at a lesion matching engine, a previously-captured at least one image including a target lesion corresponding to the region of interest; receiving, at the lesion matching engine, the at least one first image, wherein the at least one first image includes a potential lesion; analyzing the previously-captured at least one image and the at least one first image; and determining that the potential lesion is the same as the target lesion as taught by Rijken in order to provide a trained neural network that is representative of real-world data ([0065] of Rijken).
Although Tsymbalenko discloses measuring vascularity and saving the at least one first image and the density as an architectural map of the target site, as shown above, Tsymbalenko fails to explicitly disclose saving the at least one first image, the vascularity, and the density as an architectural map of the target site.
However, Venkataraman teaches, in the same field of endeavor, saving the at least one first image, the vascularity, and the density as an architectural map of the target site (“As shown in FIG. 7A, in a first rotational scan a standard B-Mode ultrasound image is acquired. In the present example, the image is of a patient's prostate. After the first image set is acquired, the probe is re-rotated using a second imagining modality. In the present example, an elastography ultrasound is performed during the second scan. See FIG. 7B. As shown in the exemplary scan, a number of areas in the scan (A, B and C) having an elasticity above or below a predetermined threshold are illustrated. After the elastography image is obtained, the probe may again be re-rotated using a third imaging modality. In the present example, a Doppler image is performed during the third scan to identify areas (1, 2 and 3) of blood flow above or below a predetermined threshold. Additional or different scans may be performed. FIG. 7D illustrates the registration of the three images into a mpUS image. This image with multiple modes of information may then be analyzed to identify potential regions of interest.” [0098]; also see Figs. 7A-7D, reproduced below, and corresponding descriptions).
[Figs. 7A-7D of Venkataraman, reproduced in greyscale]
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above with saving the at least one first image, the vascularity, and the density as an architectural map of the target site as taught by Venkataraman in order to provide enhanced detection of suspicious regions by providing an image with multiple modes of information (Abstract and [0098] of Venkataraman).
Tsymbalenko also fails to disclose during a second imaging procedure, capturing at least one second image; receiving scan information associated with the at least one second image; accessing the architectural map of the region of interest in the electronic record; and analyzing the scan information associated with the at least one second image to identify the region of interest within the at least one second image based on the architectural map.
However, Venkataraman further teaches, in the same field of endeavor, during a second imaging procedure, capturing at least one second image (“real-time image” [0053]); receiving scan information associated with the at least one second image (“the image(s) taken by the probe 10 are output from the ultrasound imaging system 16 to an image registration system 2” [0054]); accessing the architectural map of the region of interest in the electronic record (“The mpUS images [architectural map] (i.e., 2D or 3D) may be used for enhanced and/or automated detection of one or more suspicious regions. After identifying one or more suspicious regions, the mpUS images may be utilized with a real-time image to guide biopsy or therapy the region(s).” [0053]); and analyzing the scan information associated with the at least one second image to identify the region of interest within the at least one second image based on the architectural map (“See FIG. 10. Of note in FIG. 10, rather than showing a live ultrasound image (e.g., B-mode) registered with the mpUS image, the real-time image simply includes a target point ROI that was identified from the mpUS image. In this example, the system may use a real time image or volume to provide guidance for an introducer (e.g., needle, trocar etc.) of a targeted focal therapy (TFT) device.” [0110]; also see [0104]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above with during a second imaging procedure, capturing at least one second image; receiving scan information associated with the at least one second image; accessing the architectural map of the region of interest in the electronic record; and analyzing the scan information associated with the at least one second image to identify the region of interest within the at least one second image based on the architectural map as taught by Venkataraman in order to provide real time guidance to a target region ([0110] of Venkataraman).
Tsymbalenko fails to disclose the architectural map comprising a confidence level indicator providing a likelihood that the target lesion and the potential lesion are a same lesion.
However, Dascal teaches, in the same field of endeavor, the architectural map comprising a confidence level indicator providing a likelihood that the target lesion and the potential lesion are a same lesion (“Generating a confidence score/figure of merit (FOM) is performed using one or more software modules 1501. In one embodiment, the confidence score or (FOM) is provided to a user by graphical representation on a computer monitor, for example by providing a color-code on the X-ray or OCT image indicating regions of the OCT pullback that have high or low confidence of being co-registered. Regions of low confidence may, for example, be indicated by a red strip or bar on the X-ray image near the vessel segment where low FOM values were obtained. The FOM/Score reflects a confidence measure in the returned results. The score is in the range of [0, 1] where 0 reflects the lowest confidence and 1 reflects the highest. A FOM threshold value can be selected to define a boundary between high confidence and low confidence co-registration results.” [0114]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko with the architectural map comprising a confidence level indicator providing a likelihood that the target lesion and the potential lesion are a same lesion as taught by Dascal in order to allow for user input or intervention when the result is not acceptable ([0157] of Dascal).
Regarding claim 22, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 18 as stated above. Tsymbalenko further discloses wherein the diagnostic medical imaging is ultrasound imaging (“The ultrasound imaging system 100 includes a transmit beamformer 101 and a transmitter 102 that drives elements (e.g., transducer elements) 104 within a transducer array, herein referred to as probe 106, to emit pulsed ultrasonic signals (referred to herein as transmit pulses) into a body (not shown).” [0014]; also see [0019]).
Tsymbalenko fails to disclose wherein the diagnostic medical imaging is x-ray imaging.
However, Rijken further teaches, in the same field of endeavor, wherein the diagnostic medical imaging is x-ray imaging (“Mammography makes use of “soft” X-rays to produce detailed images of the internal structure of the human breast” [0002]).
Therefore, before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko with wherein the diagnostic medical imaging is x-ray imaging as taught by Rijken in order to provide the gold standard for early detection of breast abnormalities to provide a valid diagnosis of a cancer in a curable phase ([0002] of Rijken).
Regarding claim 24, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 18 as stated above and Tsymbalenko further discloses wherein the vascularity and the density are measured using ultrasound (“In various embodiments of the present invention, data may be processed in different mode-related modules by the processor 116 (e.g., B-mode, Color Doppler, M-mode, Color M-mode, spectral Doppler, Elastography, TVI, strain, strain rate, and the like) to form 2D or 3D data. For example, one or more modules may generate B-mode, color Doppler, M-mode, color M-mode, spectral Doppler, Elastography, TVI, strain, strain rate, and combinations thereof, and the like. As one example, the one or more modules may process color Doppler data, which may include traditional color flow Doppler, power Doppler, HD flow, and the like.” [0019]; also see [0020]).
Regarding claim 25, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 24 as stated above and Tsymbalenko further discloses wherein the ultrasound performs B-Mode imaging (“For example, when characterizing a lesion, such as a breast lesion, a clinician may evaluate the lesion using standard B-mode ultrasound imaging” [0010]).
Claims 4 and 5 are rejected under 35 U.S.C. 103 as being unpatentable over Tsymbalenko in view of Rijken, Venkataraman, and Dascal as applied to claims 1-2 above and further in view of Ramsay et al. (US 2020/0219237, July 9, 2020, applicant submitted prior art via the IDS, hereinafter “Ramsay”).
Regarding claim 4, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 1 as stated above but fails to disclose further comprising comparing the scan information with the architectural map in the electronic record to identify any changes in the breast tissue.
However, Ramsay teaches, in the same field of endeavor, comparing the scan information with the architectural map in the electronic record to identify any changes in the breast tissue (“FIGS. 11a through 11d illustrate the consistency with which one embodiment of this application performs across different imaging modalities. The pattern responses for breast images reveal consistent colors and tissue characterizations for modalities 3D Tomosynthesis in FIG. 11a, synthetic 2D from 3D in FIG. 11b, Full Field Digital Mammography (FFDM) in FIG. 11c, and digitized film in FIG. 11d. This provides a radiologist and their patients the ability to compare changes over time using only one set of algorithms, even when a patient's images were generated historically using different imaging modalities.” [0342]; also see [0343]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko modified by Rijken and Venkataraman with further comprising comparing the scan information with the architectural map in the electronic record to identify any changes in the breast tissue as taught by Ramsay in order to monitor changes in tumors in the region of interest during or after medical treatments ([0343] of Ramsay).
Regarding claim 5, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 1 as stated above but fails to disclose wherein the later time at which scan information is received is at least an hour after the capturing diagnostic medical images.
However, Ramsay suggests, in the same field of endeavor, wherein the later time at which scan information is received is at least an hour after the capturing diagnostic medical images (“FIGS. 11a through 11d illustrate the consistency with which one embodiment of this application performs across different imaging modalities. The pattern responses for breast images reveal consistent colors and tissue characterizations for modalities 3D Tomosynthesis in FIG. 11a, synthetic 2D from 3D in FIG. 11b, Full Field Digital Mammography (FFDM) in FIG. 11c, and digitized film in FIG. 11d. This provides a radiologist and their patients the ability to compare changes over time using only one set of algorithms, even when a patient's images were generated historically using different imaging modalities.” [0342]; also see “The patterns can also be utilized to monitor changes in a tumor during and after medical treatments such as chemo therapy, hormone therapy, immunotherapy, and radiation.” [0343], examiner notes that because the changes are monitored after treatments such as chemo therapy, hormone therapy, etc. it is suggested that the later time would be at least an hour later). Tsymbalenko as modified above does not explicitly teach wherein the later time at which scan information is received is at least an hour after the capturing diagnostic medical images. However, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the later time taught by Venkataraman and Ramsay to be at least an hour after, because such a modification is part of routine experimentation and optimization, and applicant has not disclosed that this timing provides an unexpected advantage, is used for a particular purpose, or solves a stated problem.
Furthermore, it has been held that "it is not inventive to discover the optimum or workable ranges by routine experimentation" (see MPEP 2144.05.II.A). One of ordinary skill in the art could have made the modification with known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art before the effective filing date of the claimed invention.
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Tsymbalenko in view of Rijken, Venkataraman, and Dascal as applied to claim 1 above and further in view of Xu et al. (US 2023/0038498, corresponding PCT filed January 7, 2021, hereinafter “Xu”).
Regarding claim 7, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 1 as stated above but fails to disclose wherein the architectural map of the breast tissue includes a margin of normal tissue around the region of interest.
However, Xu teaches, in the same field of endeavor, wherein the architectural map of the breast tissue includes a margin of normal tissue around the region of interest (“Create 3D grid coordinates to target the tumor volume with a treatment margin—The user of the system can identify the boundaries of the tumor in each slice or image of the prior MR or CT scans. The user can also input the desired treatment margin (e.g., a margin of 1 cm beyond the tumor boundary). The surgical navigation system or the robotic positioning system can then calculate and create 3D grid locations that cover the entirety of the tumor with the desired 1 cm treatment margin surrounding the tumor.” [0165]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above with wherein the architectural map of the breast tissue includes a margin of normal tissue around the region of interest as taught by Xu in order to provide a safety factor when treating a target tissue ([0165] of Xu).
Claims 10 and 26 are rejected under 35 U.S.C. 103 as being unpatentable over Tsymbalenko modified by Rijken, Venkataraman, and Dascal as applied to claims 1, 8, 18, and 22 above, respectively, and further in view of Jerebko et al. (US 2017/0132792, May 11, 2017, hereinafter “Jerebko”).
Regarding claim 10, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 8 as stated above but fails to disclose wherein the first diagnostic medical images of breast tissue are also captured using digital breast tomosynthesis.
However, Jerebko teaches, in the same field of endeavor, wherein the first diagnostic medical images of breast tissue are also captured using digital breast tomosynthesis (“However, in recent times, the two-dimensional mammography is being replaced ever more frequently by digital breast tomosynthesis (DBT), in particular for differential diagnoses. However, the use of DBT has already been proposed for screening for breast cancer.” [0003]; also see “In DBT, a plurality of two-dimensional projection images are recorded from different projection directions, i.e. from different projection angles. Using reconstruction methods employing concepts from computed tomography, it is possible to obtain three-dimensional tomosynthesis image data records which allow an improved spatial localization of target structures, in particular suspect lesions.” [0004]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko with wherein the first diagnostic medical images of breast tissue are also captured using digital breast tomosynthesis as taught by Jerebko in order to provide improved spatial localization of target structures ([0004] of Jerebko).
Regarding claim 26, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 22 as stated above. In particular, Rijken was relied on to teach the x-ray imaging. Tsymbalenko fails to disclose wherein the x-ray imaging is performed using digital breast tomosynthesis.
However, Jerebko further teaches, in the same field of endeavor, wherein the x-ray imaging is performed using digital breast tomosynthesis (“However, in recent times, the two-dimensional mammography is being replaced ever more frequently by digital breast tomosynthesis (DBT), in particular for differential diagnoses. However, the use of DBT has already been proposed for screening for breast cancer.” [0003]; also see “In DBT, a plurality of two-dimensional projection images are recorded from different projection directions, i.e. from different projection angles. Using reconstruction methods employing concepts from computed tomography, it is possible to obtain three-dimensional tomosynthesis image data records which allow an improved spatial localization of target structures, in particular suspect lesions.” [0004]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above with wherein the x-ray imaging is performed using digital breast tomosynthesis as taught by Jerebko in order to provide improved spatial localization of target structures ([0004] of Jerebko).
Claims 11-14, 20, 21, and 23 are rejected under 35 U.S.C. 103 as being unpatentable over Tsymbalenko modified by Rijken, Venkataraman, and Dascal as applied to claims 1 and 18 above, respectively, and further in view of Caluser (US 2019/0000318, January 3, 2019, applicant submitted prior art via the IDS).
Regarding claim 11, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 1 as stated above but fails to disclose wherein the first diagnostic medical images of breast tissue are captured using magnetic resonance imaging (MRI).
However, Caluser teaches, in the same field of endeavor, wherein the first diagnostic medical images of breast tissue are captured using magnetic resonance imaging (MRI) (“images obtained with other modalities like MRI” [0063]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above with wherein the first diagnostic medical images of breast tissue are captured using magnetic resonance imaging (MRI) as taught by Caluser in order to create a more realistic representation of the target region by using additional modalities ([0063] of Caluser).
Regarding claim 12, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 1 as stated above but fails to disclose further comprising recording location coordinates of the region of interest.
However, Caluser teaches, in the same field of endeavor, recording location coordinates of the region of interest (“The non-deformable surface in the medical image is registered to positional coordinates of anatomical reference point(s) within the reference state model. The position of a target pixel in the medical image is projected to the reference state model based on a relative location of the target pixel between the deformable and non-deformable surfaces.” Abstract).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above with further comprising recording location coordinates of the region of interest as taught by Caluser in order to track a target location during times of deformation ([0049] of Caluser).
Regarding claim 13, Tsymbalenko modified by Rijken, Venkataraman, Dascal, and Caluser discloses the limitations of claim 12 as stated above; in particular, Caluser was relied on to teach recording location coordinates of the region of interest as shown above. Tsymbalenko fails to disclose wherein the location coordinates of the region of interest are defined by a clock position relative to a nipple of the breast, a depth from a surface of the breast, and a distance from the nipple.
However, Caluser further teaches, in the same field of endeavor, wherein the location coordinates of the region of interest are defined by a clock position relative to a nipple of the breast (“Positional coordinates of targets F and G also may be displayed on TDMD display 38, either using an hourly format in reference to nipple C or using any other coordinate system.” [0048]), a depth from a surface of the breast (“The algorithm calculates the distance of each pixel in an image from the chest wall surface and from the skin surface and accounts for tissue deformation and compression during scanning.” [0067]; also see [0068]), and a distance from the nipple (“The distance to nipple C and clock face position of a particular pixel or lesion identified in an acquired image can be calculated in the reference state model.” [0084]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above with wherein the location coordinates of the region of interest are defined by a clock position relative to a nipple of the breast, a depth from a surface of the breast, and a distance from the nipple as taught by Caluser in order to track a target location during times of deformation ([0049] of Caluser).
Regarding claim 14, Tsymbalenko modified by Rijken, Venkataraman, Dascal, and Caluser discloses the limitations of claim 13 as stated above and Tsymbalenko further discloses wherein vascularity is measured using Doppler imaging (“In various embodiments of the present invention, data may be processed in different mode-related modules by the processor 116 (e.g., B-mode, Color Doppler, M-mode, Color M-mode, spectral Doppler, Elastography, TVI, strain, strain rate, and the like) to form 2D or 3D data. For example, one or more modules may generate B-mode, color Doppler, M-mode, color M-mode, spectral Doppler, Elastography, TVI, strain, strain rate, and combinations thereof, and the like. As one example, the one or more modules may process color Doppler data, which may include traditional color flow Doppler, power Doppler, HD flow, and the like.” [0019]).
Regarding claim 20, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 18 as stated above but fails to disclose further comprising a tracking system configured to: receive a current position and orientation of an ultrasound probe from a probe localization transceiver of the ultrasound probe; and display the current position and orientation of the ultrasound probe relative to the breast on a graphical user interface including images of the breast that are obtained during at least one of the first imaging procedure and the second imaging procedure.
However, Caluser teaches, in the same field of endeavor, a tracking system configured to: receive a current position and orientation of an ultrasound probe from a probe localization transceiver of the ultrasound probe (“Also a combination of wired and wireless position sensors can be used to provide the position tracking module with positional information from tracked landmarks or anatomical reference (AR) points on the patient's body A and the ultrasound probe 34.” [0042]); and display the current position and orientation of the ultrasound probe relative to the breast on a graphical user interface including images of the breast that are obtained during at least one of the first imaging procedure and the second imaging procedure (“Reconstruction module 27 of processor 41 receives the digital ultrasound images, associates the associated positional information from sensors 48, 49, 52 with the image frames and/or a body diagram, and outputs the information to TDMD computer display 38 and/or to a storage device 39 for review and processing at a later time. TDMD display 38 is then enabled to show images D captured by ultrasound device 22 and associated positional data as collected from sensors 48, 49, and 52.” [0046]; also see Fig. 4 and corresponding description).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above with further comprising a tracking system configured to: receive a current position and orientation of an ultrasound probe from a probe localization transceiver of the ultrasound probe; and display the current position and orientation of the ultrasound probe relative to the breast on a graphical user interface including images of the breast that are obtained during at least one of the first imaging procedure and the second imaging procedure as taught by Caluser in order to track a target location during times of deformation ([0049] of Caluser).
Regarding claim 21, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 19 as stated above but fails to disclose further comprising a tracking system configured to: determine a current position and orientation of an ultrasound probe based on images captured by a camera system; and display the current position and orientation of the ultrasound probe relative to the breast on a graphical user interface including images of the breast that are obtained during at least one of the first imaging procedure and the second imaging procedure.
However, Caluser teaches, in the same field of endeavor, a tracking system configured to: determine a current position and orientation of an ultrasound probe based on images captured by a camera system (“Also a combination of wired and wireless position sensors can be used to provide the position tracking module with positional information from tracked landmarks or anatomical reference (AR) points on the patient's body A and the ultrasound probe 34. In yet other embodiments, elements 48, 49, and 52 are markers that may be tracked using an optional overhead infrared or optical AR tracking system 43 (shown in phantom), which incorporates one or more infrared or optical cameras.” [0042]); and display the current position and orientation of the ultrasound probe relative to the breast on a graphical user interface including images of the breast that are obtained during at least one of the first imaging procedure and the second imaging procedure (“Reconstruction module 27 of processor 41 receives the digital ultrasound images, associates the associated positional information from sensors 48, 49, 52 with the image frames and/or a body diagram, and outputs the information to TDMD computer display 38 and/or to a storage device 39 for review and processing at a later time. TDMD display 38 is then enabled to show images D captured by ultrasound device 22 and associated positional data as collected from sensors 48, 49, and 52.” [0046]; also see Fig. 4 and corresponding description).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above with further comprising a tracking system configured to: determine a current position and orientation of an ultrasound probe based on images captured by a camera system; and display the current position and orientation of the ultrasound probe relative to the breast on a graphical user interface including images of the breast that are obtained during at least one of the first imaging procedure and the second imaging procedure as taught by Caluser in order to track a target location during times of deformation ([0049] of Caluser).
Regarding claim 23, Tsymbalenko modified by Rijken, Venkataraman, and Dascal discloses the limitations of claim 18 as stated above but fails to disclose further comprising recording a set of coordinates indicating a location of the region of interest and saving the set of coordinates with the architectural map. The examiner notes, however, that as stated above with respect to claim 18, Tsymbalenko does teach saving coordinates in general with the architectural map.
However, Caluser teaches, in the same field of endeavor, recording a set of coordinates indicating a location of the region of interest (“The non-deformable surface in the medical image is registered to positional coordinates of anatomical reference point(s) within the reference state model. The position of a target pixel in the medical image is projected to the reference state model based on a relative location of the target pixel between the deformable and non-deformable surfaces.” Abstract).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko to further record a set of coordinates indicating a location of the region of interest, as taught by Caluser, in order to track a target location during times of deformation ([0049] of Caluser).
Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Tsymbalenko modified by Rijken, Venkataraman, Dascal, and Caluser as applied to claims 1 and 12-14 above and further in view of Shi et al. (US 2023/0000467, corresponding PCT filed October 30, 2020, hereinafter “Shi”).
Regarding claim 15, Tsymbalenko modified by Rijken, Venkataraman, Dascal, and Caluser discloses the limitations of claim 14 as stated above. Although Tsymbalenko suggests that the Doppler imaging may be microflow Doppler imaging (“As one example, the one or more modules may process color Doppler data, which may include traditional color flow Doppler, power Doppler, HD flow, and the like.” [0019]), Tsymbalenko fails to explicitly disclose that the Doppler imaging is microflow Doppler imaging.
However, Shi teaches, in the same field of endeavor, wherein the Doppler imaging is microflow Doppler imaging (“The present disclosure is directed to systems and methods for multi-level vascular imaging for construction and display of vasculature from large to small vessels and micro-vessels using a combination of varying resolution CEUS flow imaging modalities. While one or more resolution flow imaging modes may be employed for imaging large to small vessels of a vascular tree within a large region of interest (ROI), a SRI mode is constructed for delineation of the microvascular morphology and directional microcirculation within one or more small ROIs placed in selected locations within a larger ROI. Examples of flow imaging modes include, but are not limited to CEUS, color Doppler, color power angiography (CPA), microflow imaging (MFI), CEUS-MFI, microvascular imaging (MVI), and high definition MVI (HD-MVI). In general, different vascular levels can be imaged with different modes for large vessels to small vessels to capillaries.” [0018]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above such that the Doppler imaging is microflow Doppler imaging, as taught by Shi, in order to be able to measure vascularity of smaller vessels and capillaries ([0018] of Shi).
Claims 27-28 are rejected under 35 U.S.C. 103 as being unpatentable over Tsymbalenko in view of Rijken, Caluser, Shi, Venkataraman, and Dascal.
Regarding claim 27, Tsymbalenko discloses a non-transitory machine-readable storage medium, comprising executable instructions that, when executed by a processor, facilitate performance of operations (“Image processing system 202 includes a processor 204 configured to execute machine readable instructions stored in non-transitory memory 206.” [0027]), comprising:
during a first imaging procedure, capturing first ultrasound images of an entire breast (“method 300 proceeds to 304 to obtain a B-mode image that includes a region of interest (ROI). The ROI may be a lesion or another anatomical feature of interest, such as a thyroid” [0039]; also see “ROI, such as a breast lesion” [0036]; also see Fig. 4 and corresponding description which shows the entire breast is imaged);
identifying, in the first ultrasound images, a region of interest within breast tissue of the breast (“At 310, the B-mode image and the processed elastography image are entered as inputs to an A/B ratio model. The A/B ratio model (e.g., A/B ratio model 208) may include one or more a deep learning/machine learning models trained to identify the ROI/anatomical feature of interest in the B-mode image and in the elastography image.” [0044]);
recording location coordinates (“The modules may include, for example, a scan conversion module to perform scan conversion operations to convert the acquired images from beam space coordinates to display space coordinates.” [0019]);
measuring, using Doppler, a vascularity of the breast tissue including the region of interest (“As one example, the one or more modules may process color Doppler data, which may include traditional color flow Doppler, power Doppler, HD flow, and the like.” [0019]);
measuring, using shear-wave elastography (“The ultrasound imaging system 100 includes an elastography circuit 103 configured to enable shear-wave and/or stain elastography imaging. While in the shear-wave mode, the elastography circuit 103 may control the probe 106 to generate a shear wave at a site within a region of interest (ROI) of an imaging subject (e.g., a patient).” [0020]), a stiffness of the breast tissue including the region of interest (“While in the strain mode, the elastography circuit 103 may control the probe 106 to generate a mechanical force (e.g., surface vibration, freehand or step quasi-static surface displacement, or the like) or radiation force on the patient or ROI to measure the stiffness or strain of the ROI of the patient.” [0021]);
saving the first ultrasound images, the location coordinates, and the stiffness as an architectural map (“The elastography image may include color or grayscale elastography information indicative of measured tissue stiffness, and the elastography information may be displayed as an overlay on a B-mode image.” [0041]) in an electronic record associated with the breast (“Non-transitory memory 206 may further store ultrasound image data 212, such as ultrasound images captured by the ultrasound imaging system 100 of FIG. 1. The ultrasound image data 212 may include both B-mode images and elastography images (whether obtained using shear-wave elastography or strain elastography). Further, ultrasound image data 212 may store ultrasound images, ground truth output, iterations of machine learning model output, and other types of ultrasound image data that may be used to train the A/B ratio model 208, when training module 210 is stored in non-transitory memory 206.” [0031]).
Although Tsymbalenko discloses identifying the region of interest, as stated above, Tsymbalenko fails to disclose wherein identifying the region of interest in the first ultrasound images comprises: receiving, at a lesion matching engine, a previously-captured set of images including a target lesion corresponding to the region of interest; receiving, at the lesion matching engine, the first ultrasound images, wherein the first ultrasound images include a potential lesion; analyzing the previously-captured set of images and the first ultrasound images; and determining that the potential lesion is the same as the target lesion.
However, Rijken teaches, in the same field of endeavor, wherein identifying the region of interest in the first ultrasound images (“lesion of interest” [0020]) comprises: receiving, at a lesion matching engine (“computer” [0061]), a previously-captured set of images including a target lesion corresponding to the region of interest (“one or more mammograms” in training steps [0026]); receiving, at the lesion matching engine, the first ultrasound images, wherein the first ultrasound images include a potential lesion (“By receiving multiple input images, radiologists can perform case-wise analysis for patients and substantially determine the likelihood of a malignant lesion after analysing multiple mammographic views.” [0010]); analyzing the previously-captured set of images (“The machine learning algorithm analyses the training data” [0065]) and the first ultrasound images (“the step of performing the first analysis on the plurality of mammograms is conducted using one or more trained convolutional neural network classifier” [0012]; also see [0017]); and determining that the potential lesion is the same as the target lesion (“The mammograms, pre-processed or not, are then fed into a convolutional neural network (CNN) classifier 30 which has been trained to analyse the images and assess whether the image shows a malignant lesion.” [0043]; also see [0046], [0062]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko such that identifying the region of interest in the first ultrasound images comprises: receiving, at a lesion matching engine, a previously-captured set of images including a target lesion corresponding to the region of interest; receiving, at the lesion matching engine, the first ultrasound images, wherein the first ultrasound images include a potential lesion; analyzing the previously-captured set of images and the first ultrasound images; and determining that the potential lesion is the same as the target lesion, as taught by Rijken, in order to provide a trained neural network that is representative of real-world data ([0065] of Rijken).
Although Tsymbalenko discloses recording location coordinates as shown above, Tsymbalenko fails to disclose recording location coordinates of the region of interest within the breast tissue of the breast.
However, Caluser teaches, in the same field of endeavor, recording location coordinates of the region of interest within the breast tissue of the breast (“The non-deformable surface in the medical image is registered to positional coordinates of anatomical reference point(s) within the reference state model. The position of a target pixel in the medical image is projected to the reference state model based on a relative location of the target pixel between the deformable and non-deformable surfaces.” Abstract).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above to record location coordinates of the region of interest within the breast tissue of the breast, as taught by Caluser, in order to track a target location during times of deformation ([0049] of Caluser).
Although Tsymbalenko suggests that the Doppler may be microflow Doppler (“As one example, the one or more modules may process color Doppler data, which may include traditional color flow Doppler, power Doppler, HD flow, and the like.” [0019]), Tsymbalenko fails to explicitly disclose that the Doppler imaging is microflow Doppler.
However, Shi teaches, in the same field of endeavor, the Doppler imaging is microflow Doppler (“The present disclosure is directed to systems and methods for multi-level vascular imaging for construction and display of vasculature from large to small vessels and micro-vessels using a combination of varying resolution CEUS flow imaging modalities. While one or more resolution flow imaging modes may be employed for imaging large to small vessels of a vascular tree within a large region of interest (ROI), a SRI mode is constructed for delineation of the microvascular morphology and directional microcirculation within one or more small ROIs placed in selected locations within a larger ROI. Examples of flow imaging modes include, but are not limited to CEUS, color Doppler, color power angiography (CPA), microflow imaging (MFI), CEUS-MFI, microvascular imaging (MVI), and high definition MVI (HD-MVI). In general, different vascular levels can be imaged with different modes for large vessels to small vessels to capillaries.” [0018]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above such that the Doppler imaging is microflow Doppler, as taught by Shi, in order to be able to measure vascularity of smaller vessels and capillaries ([0018] of Shi).
Although Tsymbalenko discloses measuring vascularity and saving the first ultrasound images, the location coordinates, and the stiffness as an architectural map of the target site, as shown above, Tsymbalenko fails to explicitly disclose saving the first ultrasound images, the location coordinates, vascularity, and stiffness as an architectural map of the region of interest.
However, Venkataraman teaches, in the same field of endeavor, saving the first ultrasound images, the location coordinates, vascularity, and stiffness as an architectural map of the region of interest (“As shown in FIG. 7A, in a first rotational scan a standard B-Mode ultrasound image is acquired. In the present example, the image is of a patient's prostate. After the first image set is acquired, the probe is re-rotated using a second imagining modality. In the present example, an elastography ultrasound is performed during the second scan. See FIG. 7B. As shown in the exemplary scan, a number of areas in the scan (A, B and C) having an elasticity above or below a predetermined threshold are illustrated. After the elastography image is obtained, the probe may again be re-rotated using a third imaging modality. In the present example, a Doppler image is performed during the third scan to identify areas (1, 2 and 3) of blood flow above or below a predetermined threshold. Additional or different scans may be performed. FIG. 7D illustrates the registration of the three images into a mpUS image. This image with multiple modes of information may then be analyzed to identify potential regions of interest.” [0098]; also see Figs. 7A-7D, reproduced below, and corresponding descriptions).
[Figs. 7A-7D of Venkataraman (media_image3.png, greyscale) reproduced here]
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above to save the first ultrasound images, the location coordinates, vascularity, and stiffness as an architectural map of the region of interest, as taught by Venkataraman, in order to provide enhanced detection of suspicious regions by providing an image with multiple modes of information (Abstract and [0098] of Venkataraman).
Tsymbalenko also fails to disclose during a second imaging procedure, capturing at least one second ultrasound image of the entire breast; receiving scan information associated with the at least one second ultrasound image; accessing the architectural map of the region of interest in the electronic record; and analyzing the scan information associated with the at least one second ultrasound image to identify the region of interest within the at least one second ultrasound image based on the architectural map.
However, Venkataraman further teaches in the same field of endeavor, during a second imaging procedure, capturing at least one second ultrasound image of the entire breast (“real-time image” [0053]); receiving scan information associated with the at least one second ultrasound image (“the image(s) taken by the probe 10 are output from the ultrasound imaging system 16 to an image registration system 2” [0054]); accessing the architectural map of the region of interest in the electronic record (“The mpUS images [architectural map] (i.e., 2D or 3D) may be used for enhanced and/or automated detection of one or more suspicious regions. After identifying one or more suspicious regions, the mpUS images may be utilized with a real-time image to guide biopsy or therapy the region(s).” [0053]); and analyzing the scan information associated with the at least one second ultrasound image to identify the region of interest within the at least one second ultrasound image based on the architectural map (“See FIG. 10. Of note in FIG. 10, rather than showing a live ultrasound image (e.g., B-mode) registered with the mpUS image, the real-time image simply includes a target point ROI that was identified from the mpUS image. In this example, the system may use a real time image or volume to provide guidance for an introducer (e.g., needle, trocar etc.) of a targeted focal therapy (TFT) device.” [0110]; also see [0104]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko as modified above to include, during a second imaging procedure, capturing at least one second ultrasound image of the entire breast; receiving scan information associated with the at least one second ultrasound image; accessing the architectural map of the region of interest in the electronic record; and analyzing the scan information associated with the at least one second ultrasound image to identify the region of interest within the at least one second ultrasound image based on the architectural map, as taught by Venkataraman, in order to provide real time guidance to a target region ([0110] of Venkataraman).
Tsymbalenko fails to disclose the architectural map comprising a confidence level indicator providing a likelihood that the target lesion and the potential lesion are a same lesion.
However, Dascal teaches, in the same field of endeavor, the architectural map comprising a confidence level indicator providing a likelihood that the target lesion and the potential lesion are a same lesion (“Generating a confidence score/figure of merit (FOM) is performed using one or more software modules 1501. In one embodiment, the confidence score or (FOM) is provided to a user by graphical representation on a computer monitor, for example by providing a color-code on the X-ray or OCT image indicating regions of the OCT pullback that have high or low confidence of being co-registered. Regions of low confidence may, for example, be indicated by a red strip or bar on the X-ray image near the vessel segment where low FOM values were obtained. The FOM/Score reflects a confidence measure in the returned results. The score is in the range of [0, 1] where 0 reflects the lowest confidence and 1 reflects the highest. A FOM threshold value can be selected to define a boundary between high confidence and low confidence co-registration results.” [0114]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Tsymbalenko such that the architectural map comprises a confidence level indicator providing a likelihood that the target lesion and the potential lesion are a same lesion, as taught by Dascal, in order to allow for user input or intervention when the result is not acceptable ([0157] of Dascal).
Regarding claim 28, Tsymbalenko modified by Rijken, Caluser, Shi, Venkataraman, and Dascal discloses the limitations of claim 27 as stated above and Tsymbalenko further discloses wherein the region of interest is a biopsy site (“a breast lesion, a clinician may evaluate the lesion using standard B-mode ultrasound imaging as well as elastography imaging, which is a mechanism for non-invasively measuring tissue stiffness. Certain properties of the lesion in the elastography image relative to the lesion in the B-mode image may facilitate semi-quantitative characterization of the lesion. For example, the width and/or area of the lesion in the elastography image relative to the width and/or area of the lesion in the B-mode image, which is referred to as an A/B ratio, may provide a semi-quantitative analysis of the malignancy of the lesion, as benign lesions typically have a smaller A/B ratio than malignant lesions.” [0010]).
Claim 29 is rejected under 35 U.S.C. 103 as being unpatentable over Tsymbalenko in view of Rijken, Caluser, Shi, Venkataraman, and Dascal as applied to claim 27 above and further in view of Ramsay.
Regarding claim 29, Tsymbalenko modified by Rijken, Caluser, Shi, Venkataraman, and Dascal discloses the limitations of claim 27 as stated above but fails to disclose wherein the second imaging procedure occurs at least one day after the first imaging procedure, and wherein the scan information is analyzed in comparison to the architectural map to determine if any changes to the breast tissue have occurred.
However, Ramsay suggests, in the same field of endeavor, wherein the second imaging procedure occurs at least one day after the first imaging procedure, and wherein the scan information is analyzed in comparison to the architectural map to determine if any changes to the breast tissue have occurred (“FIGS. 11a through 11d illustrate the consistency with which one embodiment of this application performs across different imaging modalities. The pattern responses for breast images reveal consistent colors and tissue characterizations for modalities 3D Tomosynthesis in FIG. 11a, synthetic 2D from 3D in FIG. 11b, Full Field Digital Mammography (FFDM) in FIG. 11c, and digitized film in FIG. 11d. This provides a radiologist and their patients the ability to compare changes over time using only one set of algorithms, even when a patient's images were generated historically using different imaging modalities.” [0342]; also see “The patterns can also be utilized to monitor changes in a tumor during and after medical treatments such as chemo therapy, hormone therapy, immunotherapy, and radiation.” [0343], examiner notes that because the changes are monitored after treatments such as chemo therapy, hormone therapy, etc. it is suggested that the later time would be at least one day). Tsymbalenko modified by Venkataraman and Ramsay does not explicitly teach wherein the second imaging procedure occurs at least one day after the first imaging procedure. However, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the time of the second imaging procedure taught by Venkataraman and Ramsay to at least one day after, because such a modification is part of routine experimentation and optimization, and applicant has not disclosed that this timing provides an unexpected advantage, is used for a particular purpose, or solves a stated problem.
Additionally, one of ordinary skill in the art would recognize that performing the second imaging procedure at least one day after the first would allow clinicians to track a patient over the course of treatment. Furthermore, it has been held that "it is not inventive to discover the optimum or workable ranges by routine experimentation" (see MPEP 2144.05.II.A). One of ordinary skill in the art could have made the modification with known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art before the effective filing date of the claimed invention.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMINAH ASGHAR whose telephone number is (571)272-0527. The examiner can normally be reached M-W, F 9am-5pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christopher Koharski can be reached at (571) 272-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/A.A./Examiner, Art Unit 3797
/CHRISTOPHER KOHARSKI/Supervisory Patent Examiner, Art Unit 3797