DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Status:
Claims 4, 12, and 15 are canceled.
Claims 1-3, 5-11, 13-14, and 16-21 are pending and examined below.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1, 2, 5, 11, 13, 14, 17, and 19-21 are rejected under 35 U.S.C. 103 as being unpatentable over the non-patent literature "Detection and Grading of Prostate Cancer Using Temporal Enhanced Ultrasound: Combining Deep Neural Networks and Tissue Mimicking Simulations" to Azizi et al. ("Azizi"), in view of US 2018/0042680 to DiMaio et al. ("DiMaio"), and further in view of US 2012/0209106 to Liang et al. ("Liang").
Regarding claim 1, Azizi discloses an ultrasound imaging system (Method section of Abstract, temporal enhanced ultrasound, TeUS) comprising:
an ultrasound transducer (Page 1295, top of right column, TRUS transducer) configured to acquire echo signals responsive to ultrasound pulses (Page 1295, left column, Section: Temporal enhanced ultrasound data, insonification of tissue over time, the tissue response to this prolonged insonification consists of reflected US echo-intensity values) transmitted along a biopsy plane within a target region (Page 1295, right column, Section: Preprocessing and region of interest, "each biopsy target, we analyze ... around the target location ... along the projected needle path in the US image and centered on the target location"); and
a processor in communication with the ultrasound transducer (Page 1295, bottom of left column, UroNav MR/US fusion system (Invivo Inc., a subsidiary of Philips Healthcare)) and configured to:
obtain a time series of sequential data frames associated with the echo signals (Page 1295, Section: Temporal enhanced ultrasound data, "Temporal Enhanced Ultrasound or TeUS is defined as the time series of US RF frames captured from insonification of tissue over time ... the tissue response to this prolonged insonification consists of reflected US echo-intensity values"; Fig. 1, TeUS data),
apply a neural network to the time series of sequential data frames (Page 1296, left column, Section: Feature learning, using Deep Belief Network, DBN, on the TeUS data; Page 1295, Section: Temporal enhanced ultrasound data, wherein the TeUS data is a time series of US RF frames captured from insonification of tissue over time), in which the neural network determines spatial locations (Fig. 6, showing, "Cancer likelihood maps ... [using] the TeUS data ... [and that the] red boundary shows the segmented prostate in MRI projected in TRUS coordinates") and identities (Page 1303, right column, Conclusion and future directions, grading PCa from TeUS data using a DBN and showing that the approach could successfully differentiate among aggressive PCa, clinically less significant PCa, and non-cancerous prostate tissue) of a plurality of tissue types in the sequential data frames (Page 1298, left column, providing that, "comparing the components activated for ROIs labeled as GS pattern of 3, 4 as well as non-cancerous tissue ... we can identify those frequency ranges that are different between two tissue types");
generate a spatial distribution map (Fig. 6, showing, "Cancer likelihood maps ... [using] the TeUS data ... [and that the] red boundary shows the segmented prostate in MRI projected in TRUS coordinates") in communication with the processor (Page 1295, bottom of left column, UroNav MR/US fusion system from which T2-weighted MR images were fused with the 3D TRUS [Transrectal Ultrasound] volume of the prostate),
the spatial distribution map labeling the coordinates of the plurality of tissue types identified within the target region (Fig. 6, showing, "Cancer likelihood maps overlaid on B-mode US image ... centered on the target. The ROIs for which we detect as Gleason grade of 4 and 3 are colored in red and yellow, respectively. The non-cancerous ROIs are colored as blue. The red boundary shows the segmented prostate in MRI projected in TRUS coordinates and the arrow pointer shows the target. a MRI lesion length = 27 mm, benign target, b MRI lesion length = 36 mm, GS ≤ 3 + 4, c MRI lesion length = 24 mm, GS ≤ 3 + 4 and d MRI lesion length = 17 mm, GS ≥ 4 + 3"), wherein the plurality of tissue types comprise various grades of cancerous tissue and benign tissue (Fig. 6, the labeled tissue types are benign, lesion with GS ≤ 3 + 4, and lesion with GS ≥ 4 + 3; Page 1303, a lesion with GS ≤ 3 + 4 has clinically less significant PCa, and a lesion with GS ≥ 4 + 3 has aggressive PCa, which would read on grades of cancerous tissue).
Azizi additionally discloses that the cancer colormap (i.e., the cancer likelihood map) provides orientation information regarding movement of the biopsy needle to other targeted biopsy regions (Page 1300, right column, "The colormap demonstrates using our approach, the clinician could have reoriented the needle to biopsy a more aggressive region").
However, Azizi does not explicitly disclose wherein the spatial distribution map is to be displayed on a user interface in communication with the processor, and wherein the ultrasound system receives a user input, via the user interface, indicating a targeted biopsy sample, wherein the user input indicating the targeted biopsy sample is based at least in part on the displayed spatial distribution map; and generate a corrected biopsy path based on the targeted biopsy sample and the spatial distribution map.
DiMaio teaches a similar minimally invasive surgical system (Abstract) that performs needle guidance into a marked lesion of a cancerous structure (Paragraph 0087), and includes an ultrasound probe (Paragraph 0045, Ref. 150, LUS probe) to acquire 2D ultrasound images of an anatomic structure (Paragraph 0045).
DiMaio teaches, in the method shown in Fig. 7, that at step 701 a 2D ultrasound image slice view of a cancerous structure is overlaid onto a 3D image taken from a different imaging modality (Paragraphs 0087-0088), and the overlay would read on the spatial distribution map since it displays where lesions are located on the cancerous structure (Paragraph 0090). Further, the overlay is displayed on a user interface (Paragraph 0090, "the surgeon marks lesions on the cancerous structure displayed as a result of process 701", which is interpreted as meaning that the overlay is generated in step 701 and then displayed, and that the surgeon marks the lesions on the displayed cancerous structure overlay, which would read on the spatial distribution map displayed on a user interface), and
wherein the ultrasound system receives a user input, via the user interface, indicating a targeted biopsy sample (Paragraph 0090, the surgeon marks lesions on the displayed cancerous structure with different designated colors; marking the displayed image at different lesion locations with different colors implies a user interface, and the marking of the lesions would read on an input by the user), wherein the user input indicating the targeted biopsy sample is based at least in part on the displayed spatial distribution map (Paragraph 0090, since the markings by the surgeon are on the displayed overlay image that reads on the spatial distribution map, the markings are based at least in part on the displayed overlay/spatial distribution map); and
generate a biopsy path based on the targeted biopsy sample and the spatial distribution map (Paragraph 0090, DiMaio describes the surgeon marking the lesions, each with a different color, and the location of each marked lesion being stored in memory; in step 703 of Fig. 7, the processor 102 determines an optimal needle tip path to the location of each lesion, and this optimal needle path would read on the claimed biopsy path).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Azizi's invention such that the spatial distribution map of Azizi is displayed on a user interface in communication with the processor, and such that the ultrasound system receives a user input, via the user interface, indicating a targeted biopsy sample, wherein the user input indicating the targeted biopsy sample is based at least in part on the displayed spatial distribution map, and generates a corrected biopsy path based on the targeted biopsy sample and the spatial distribution map, as taught by DiMaio, in order to allow the user to select one or more cancerous lesions of interest and guide the biopsy needle to each of the user-selected cancerous lesions (DiMaio, Paragraph 0090). Such a modification was even broadly suggested by Azizi. As stated above, Azizi discloses in its Figure 6(b) that the colormap demonstrates that the clinician could have reoriented the needle to biopsy a more aggressive region (i.e., a region that contained more aggressive grades of cancer) (Page 1300, right column). This suggests that, had the clinician been able to see the colormap, the needle path could have been corrected toward a region with more aggressive cancer. Therefore, in the combination of Azizi and DiMaio, the biopsy path generated based on the teachings of DiMaio would read on a corrected biopsy path: Azizi teaches an original biopsy path in Fig. 6b, and the clinician would use the colormap to see that there is a more aggressive region, would select this more aggressive region, and would use the teachings of DiMaio to generate a corrected biopsy path such that the biopsy needle procedure is redirected to the more aggressive region to obtain the biopsy sample.
However, the modifications of Azizi and DiMaio do not explicitly disclose wherein the corrected biopsy path is further based on pre-set preferences entered in the user interface.
Liang teaches a similar system for providing three-dimensional assistance to a user during a medical procedure involving a soft organ (Abstract), wherein the medical procedure can be an interventional ablation or a needle biopsy (Paragraph 0017).
Liang teaches that a margin is preset by the user, which defines a work zone (Paragraph 0044), implemented through a user interface (Paragraph 0073), such as a touch screen (Paragraph 0039). Liang further teaches that the system may determine in real time whether the tool tip is within the work zone, where the work zone is defined as the safe operation region with a confined offset from the pre-planned cutting surface. Liang additionally teaches that the system can provide guidance on, for example, how close the physician is to a critical anatomical structure that must be avoided during the operation, or what adjustments the physician has to make in order to follow his/her pre-operative planning (Paragraph 0037), wherein the adjustments would read on corrections to the current position or path. Therefore, Liang's teachings of presetting a margin that defines a work zone, determining whether the tool tip is within the work zone, and providing adjustments to the position of the tool so that the pre-operative plan can be followed would read on the claimed corrected biopsy path being further based on pre-set preferences.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the system as described by Azizi and DiMaio, wherein the corrected biopsy path is further based on pre-set preferences entered in the user interface, as taught by Liang, in order to predefine the margin or work zone of the targeted region of interest.
Regarding claim 2, the modifications of Azizi, DiMaio, and Liang teach all the features of claim 1 above.
Azizi teaches wherein the time series of sequential data frames embody radio frequency signals (Page 1295, left column, time series of US RF frames, where RF is radiofrequency). Azizi also discloses B-mode signals (Page 1300, left column, "Cancer colormaps overlaid on B-mode US image, along the projected needle path in the TeUS data and centered on the target.").
Regarding claim 5, the modifications of Azizi, DiMaio, and Liang teach all the features of claim 1 above.
Azizi teaches wherein the target region comprises a prostate gland (Page 1295, Data was obtained from fusion prostate biopsy procedures where the biopsy target locations were identified using mp-MRI information, and the biopsy was guided by TRUS).
Regarding claim 11, the modifications of Azizi, DiMaio, and Liang disclose all the features of claim 1 above.
Azizi discloses wherein the neural network is operatively associated with a training algorithm (Page 1296, left column, stating that a, "Deep Belief Network (DBN) ... [is used] to automatically learn a high-level latent feature representation of the TeUS ... [including the performance of a] pre-training step[, a] ... discriminative fine-tuning step ... [prior to testing] the trained DBN") configured to receive an array of known inputs and known outputs (Page 1297, right column, "TeUS test data is propagated through the trained DBN, and the activations of the last hidden layer (i.e., the learned latent features) are computed. We then use the [Gaussian Mixture Model] GMM along with the learned features, as explained in "Distribution learning" section to assign a PCa grades to each ROIs of the test dataset"), wherein the known inputs comprise ultrasound image frames (Page 1297, right column, providing that the, "TeUS test data is propagated through the trained DBN, and the activations of the last hidden layer (i.e., the learned latent features) are computed"; Page 1295, left column, "Temporal Enhanced Ultrasound or TeUS is defined as the time series of US RF frames captured from insonification of tissue over time") containing at least one tissue type (Page 1295, left column, clarifying that, "Temporal Enhanced Ultrasound or TeUS is defined as the time series of US RF frames captured from insonification of tissue over time"; Page 1293, Abstract, clarifying that in this study, "TeUS [is used] to address the problem of grading prostate cancer") and a histopathological classification associated with the at least one tissue type contained in the ultrasound image frames (Page 1298, right column, "a digital pathology dataset [14] [is used] to investigate this hypothesis in a histopathology-based simulation framework"; Page 1301, left column, discussing that, "Figure 7 (top) summarizes the biopsy target locations, distribution, and histopathology outcomes for the test data"; and Page 1293, Abstract, "TeUS [is used] to address the problem of grading prostate cancer").
Regarding claim 13, the modifications of Azizi, DiMaio, and Liang disclose all the features of claim 1 above.
Azizi teaches wherein the spatial distribution map is generated using multiparametric magnetic resonance imaging (mp-MRI) data of the target region (Page 1295, Data acquisition, "before the biopsy procedure, suspicious lesions were identified using mp-MRI and scored by two independent radiologists ... The consensus scores were grouped ... and referred to as the MR suspicious level assigned to the area. These scores are based on findings on each mp-MRI sequence ... which indicate[s] both the presence of prostate cancer and tumor grade ... mp-MRI lesions were delineated on the T2-weighted MR image as the biopsy targets. At the beginning of the procedure, a 3D US volume of the prostate was reconstructed by obtaining a series of electromagnetically (EM) tracked 2D TRUS images. Then, using UroNav MR/US fusion system (Invivo Inc., a subsidiary of Philips Healthcare), T2-weighted MR images were fused with the 3D TRUS volume of the prostate ... from which the target locations for biopsy were transformed to the EM coordinate frame. A clinician then navigated in the prostate volume towards the MR-identified target"; Figure 6, showing, "Cancer likelihood maps overlaid on B-mode US image ... centered on the target. The ROIs for which we detect as Gleason grade of 4 and 3 are colored in red and yellow, respectively. The non-cancerous ROIs are colored as blue. The red boundary shows the segmented prostate in MRI projected in TRUS coordinates and the arrow pointer shows the target. a MRI lesion length = 27 mm, benign target, b MRI lesion length = 36 mm, GS ≤ 3 + 4, c MRI lesion length = 24 mm, GS ≤ 3 + 4 and d MRI lesion length = 17 mm, GS ≥ 4 + 3").
Regarding claim 14, Azizi discloses a method of ultrasound imaging (Method section of Abstract, temporal enhanced ultrasound, TeUS), the method comprising:
acquiring echo signals responsive to ultrasound pulses (Page 1295, left column, Section: Temporal enhanced ultrasound data, insonification of tissue over time, the tissue response to this prolonged insonification consists of reflected US echo-intensity values) transmitted along a biopsy plane within a target region (Page 1295, right column, Section: Preprocessing and region of interest, "each biopsy target, we analyze ... around the target location ... along the projected needle path in the US image and centered on the target location");
obtaining a time series of sequential data frames associated with the echo signals (Page 1295, Section: Temporal enhanced ultrasound data, "Temporal Enhanced Ultrasound or TeUS is defined as the time series of US RF frames captured from insonification of tissue over time ... the tissue response to this prolonged insonification consists of reflected US echo-intensity values"; Fig. 1, TeUS data);
applying a neural network to the time series of sequential data frames (Page 1296, left column, Section: Feature learning, using Deep Belief Network, DBN, on the TeUS data; Page 1295, Section: Temporal enhanced ultrasound data, wherein the TeUS data is a time series of US RF frames captured from insonification of tissue over time), in which the neural network determines spatial locations (Fig. 6, showing, "Cancer likelihood maps ... [using] the TeUS data ... [and that the] red boundary shows the segmented prostate in MRI projected in TRUS coordinates") and identities (Page 1303, right column, Conclusion and future directions, grading PCa from TeUS data using a DBN and showing that the approach could successfully differentiate among aggressive PCa, clinically less significant PCa, and non-cancerous prostate tissue) of a plurality of tissue types in the sequential data frames (Page 1298, left column, providing that, "comparing the components activated for ROIs labeled as GS pattern of 3, 4 as well as non-cancerous tissue ... we can identify those frequency ranges that are different between two tissue types");
generating a spatial distribution map to be displayed on a user interface (Fig. 6, showing, "Cancer likelihood maps ... [using] the TeUS data ... [and that the] red boundary shows the segmented prostate in MRI projected in TRUS coordinates") in communication with a processor (Page 1295, bottom of left column, UroNav MR/US fusion system from which T2-weighted MR images were fused with the 3D TRUS [Transrectal Ultrasound] volume of the prostate),
the spatial distribution map labeling the coordinates of the plurality of tissue types identified within the target region (Fig. 6, showing, "Cancer likelihood maps overlaid on B-mode US image ... centered on the target. The ROIs for which we detect as Gleason grade of 4 and 3 are colored in red and yellow, respectively. The non-cancerous ROIs are colored as blue. The red boundary shows the segmented prostate in MRI projected in TRUS coordinates and the arrow pointer shows the target. a MRI lesion length = 27 mm, benign target, b MRI lesion length = 36 mm, GS ≤ 3 + 4, c MRI lesion length = 24 mm, GS ≤ 3 + 4 and d MRI lesion length = 17 mm, GS ≥ 4 + 3"), wherein the plurality of tissue types comprise various grades of cancerous tissue and benign tissue (Fig. 6, the labeled tissue types are benign, lesion with GS ≤ 3 + 4, and lesion with GS ≥ 4 + 3; Page 1303, a lesion with GS ≤ 3 + 4 has clinically less significant PCa, and a lesion with GS ≥ 4 + 3 has aggressive PCa, which would read on grades of cancerous tissue); and
Azizi additionally discloses that the cancer colormap (i.e., the cancer likelihood map) provides orientation information regarding movement of the biopsy needle to other targeted biopsy regions (Page 1300, right column, "The colormap demonstrates using our approach, the clinician could have reoriented the needle to biopsy a more aggressive region").
However, Azizi does not explicitly disclose wherein the spatial distribution map is to be displayed on a user interface in communication with the processor, and the method further includes receiving a user input, via the user interface, indicating a targeted biopsy sample, wherein the user input indicating the targeted biopsy sample is based at least in part on the displayed spatial distribution map; and generating a corrected biopsy path based on the targeted biopsy sample and the spatial distribution map.
DiMaio teaches a similar minimally invasive surgical system (Abstract) that performs needle guidance into a marked lesion of a cancerous structure (Paragraph 0087), and includes an ultrasound probe (Paragraph 0045, Ref. 150, LUS probe) to acquire 2D ultrasound images of an anatomic structure (Paragraph 0045).
DiMaio teaches, in the method shown in Fig. 7, that at step 701 a 2D ultrasound image slice view of a cancerous structure is overlaid onto a 3D image taken from a different imaging modality (Paragraphs 0087-0088), and the overlay would read on the spatial distribution map since it displays where lesions are located on the cancerous structure (Paragraph 0090). Further, the overlay is displayed on a user interface (Paragraph 0090, "the surgeon marks lesions on the cancerous structure displayed as a result of process 701", which is interpreted as meaning that the overlay is generated in step 701 and then displayed, and that the surgeon marks the lesions on the displayed cancerous structure overlay, which would read on the spatial distribution map displayed on a user interface), and
wherein the ultrasound system receives a user input, via the user interface, indicating a targeted biopsy sample (Paragraph 0090, the surgeon marks lesions on the displayed cancerous structure with different designated colors; marking the displayed image at different lesion locations with different colors implies a user interface, and the marking of the lesions would read on an input by the user), wherein the user input indicating the targeted biopsy sample is based at least in part on the displayed spatial distribution map (Paragraph 0090, since the markings by the surgeon are on the displayed overlay image that reads on the spatial distribution map, the markings are based at least in part on the displayed overlay/spatial distribution map); and
generating a biopsy path based on the targeted biopsy sample and the spatial distribution map (Paragraph 0090, DiMaio describes the surgeon marking the lesions, each with a different color, and the location of each marked lesion being stored in memory; in step 703 of Fig. 7, the processor 102 determines an optimal needle tip path to the location of each lesion, and this optimal needle path would read on the claimed biopsy path).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Azizi's invention such that the spatial distribution map of Azizi is displayed on a user interface in communication with the processor, and such that the method further includes receiving a user input, via the user interface, indicating a targeted biopsy sample, wherein the user input indicating the targeted biopsy sample is based at least in part on the displayed spatial distribution map, and generating a corrected biopsy path based on the targeted biopsy sample and the spatial distribution map, as taught by DiMaio, in order to allow the user to select one or more cancerous lesions of interest and guide the biopsy needle to each of the user-selected cancerous lesions (DiMaio, Paragraph 0090). Such a modification was even broadly suggested by Azizi. As stated above, Azizi discloses in its Figure 6(b) that the colormap demonstrates that the clinician could have reoriented the needle to biopsy a more aggressive region (i.e., a region that contained more aggressive grades of cancer) (Page 1300, right column). This suggests that, had the clinician been able to see the colormap, the needle path could have been corrected toward a region with more aggressive cancer. Therefore, in the combination of Azizi and DiMaio, the biopsy path generated based on the teachings of DiMaio would read on a corrected biopsy path: Azizi teaches an original biopsy path in Fig. 6b, and the clinician would use the colormap to see that there is a more aggressive region, would select this more aggressive region, and would use the teachings of DiMaio to generate a corrected biopsy path such that the biopsy needle procedure is redirected to the more aggressive region to obtain the biopsy sample.
However, the modifications of Azizi and DiMaio do not explicitly disclose wherein the corrected biopsy path is further based on pre-set preferences entered in the user interface.
Liang teaches a similar system for providing three-dimensional assistance to a user during a medical procedure involving a soft organ (Abstract), wherein the medical procedure can be an interventional ablation or a needle biopsy (Paragraph 0017). Liang teaches that a margin is preset by the user, which defines a work zone (Paragraph 0044), implemented through a user interface (Paragraph 0073), such as a touch screen (Paragraph 0039). Liang further teaches that the system may determine in real time whether the tool tip is within the work zone, where the work zone is defined as the safe operation region with a confined offset from the pre-planned cutting surface. Liang additionally teaches that the system can provide guidance on, for example, how close the physician is to a critical anatomical structure that must be avoided during the operation, or what adjustments the physician has to make in order to follow his/her pre-operative planning (Paragraph 0037), wherein the adjustments would read on corrections to the current position or path. Therefore, Liang's teachings of presetting a margin that defines a work zone, determining whether the tool tip is within the work zone, and providing adjustments to the position of the tool so that the pre-operative plan can be followed would read on the claimed corrected biopsy path being further based on pre-set preferences.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the system as described by Azizi and DiMaio, wherein the corrected biopsy path is further based on pre-set preferences entered in the user interface, as taught by Liang, in order to predefine the margin or work zone of the targeted region of interest.
Regarding claim 17, the modifications of Azizi, DiMaio, and Liang disclose all the features of claim 14 above.
Liang teaches that, based on the tip position and the orientation of the needle, the system can forecast the needle path by extending the needle forward along its current orientation. The system can then estimate whether the forecasted needle path will hit or miss the preset target and give warning signs/signals in case of a miss or an anticipated miss. As the needle penetrates the organ, if the forecast needle path indicates the user may hit a critical anatomy, the system can also highlight the said anatomy and give an audio or visual warning (Paragraph 0069). Liang further teaches that the system can provide guidance on what adjustments the physician has to make, based on the warnings, in order to follow his/her pre-operative planning (Paragraph 0037).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the system as described by Azizi, DiMaio, and Liang, wherein the method includes generating an instruction for adjusting an ultrasound transducer to align a biopsy needle with the corrected biopsy path, as taught by Liang, in order to prevent the surgical instrument from contacting critical structures during operation of the surgical instrument (Liang, Paragraph 0037).
Regarding claim 19, the modifications of Azizi, DiMaio, and Liang disclose all the features of claim 14 above.
As disclosed in the claim 14 rejection above, DiMaio teaches wherein the biopsy path is generated by direct user interaction with the spatial distribution map displayed on the user interface (Paragraph 0090, the surgeon marks lesions on the cancerous structure displayed in the overlay image with different designated colors; marking the displayed image at different lesion locations with different colors implies a user interface, and the marking of the lesions would read on an input by the user. DiMaio describes the surgeon marking the lesions, each with a different color, and the location of each marked lesion being stored in memory; in step 703 of Fig. 7, the processor 102 determines an optimal needle tip path to the location of each lesion, and this optimal needle path would read on the claimed biopsy path).
Regarding claim 20, the modifications of Azizi, DiMaio, and Liang disclose all the features of claim 14 above.
Azizi teaches wherein the identities of a plurality of tissue types are identified (Page 1298, left column, "comparing the components activated for ROIs labeled as GS pattern of 3, 4 as well as non-cancerous tissue ... we can identify those frequency ranges that are different between two tissue types") by recognizing ultrasound signatures (Page 1303, right column, stating that, "grading PCa [(prostate cancer) was achieved] ... from TeUS data using a DBN ... [which] showed that our approach could successfully differentiate among aggressive PCa (GS ≥ 4 + 3), clinically less significant PCa (GS ≤ 3 + 4), and non-cancerous prostate tissue") unique to histopathological classifications (Page 1298, stating that, "a digital pathology dataset [14] [is used] to investigate this hypothesis in a histopathology-based simulation framework"; Page 1301, left column, discussing that, "Figure 7 (top) summarizes the biopsy target locations, distribution, and histopathology outcomes for the test data"; and Page 1293, left column, clarifying that in this study, "TeUS [is used] to address the problem of grading prostate cancer") of each of the plurality of tissue types (Page 1298, left column, providing that, "comparing the components activated for ROIs labeled as GS pattern of 3, 4 as well as non-cancerous tissue ... we can identify those frequency ranges that are different between two tissue types").
Regarding claim 21, the modifications of Azizi, DiMaio, and Liang teach all the features of claim 14 above, including the corrected biopsy path.
As disclosed in the claim 14 rejection above, Azizi discloses obtaining a biopsy sample with the biopsy gun containing the biopsy needle (Page 1295, right column).
However, the modifications of Azizi and DiMaio do not explicitly disclose generating an instruction for adjusting an ultrasound transducer to align a biopsy needle with the corrected biopsy path; and adjusting an ultrasound transducer to align a biopsy needle with the corrected biopsy path.
Liang teaches that, based on the tip position and the orientation of the needle, the system can forecast the needle path by extending the needle forward along its current orientation. The system can then estimate whether the forecasted needle path will hit or miss the preset target and give warning signs/signals in case a miss is anticipated. As the needle penetrates the organ, if the forecast needle path indicates the user may hit a critical anatomy, the system can also highlight the said anatomy and give an audio or visual warning (Paragraph 0069). Liang further teaches that the system can provide guidance on what adjustments the physician has to make in order to follow his/her pre-operative planning based on the warnings (Paragraph 0037).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the system as described by Azizi, DiMaio, and Liang, wherein the method includes generating an instruction for adjusting an ultrasound transducer to align a biopsy needle with the corrected biopsy path, as taught by Liang, in order to prevent the surgical instrument from contacting critical structures during the operation of the surgical instrument (Liang, Paragraph 0037).
Claim(s) 3 is/are rejected under 35 U.S.C. 103 as being unpatentable over Azizi, in view of DiMaio, and further in view of Liang, as applied to claim 1 above, and further in view of US2003/0135115 to Burdette et al. “Burdette”.
Regarding claim 3, the modifications of Azizi, DiMaio, and Liang teach all the features of claim 1 above.
Liang teaches that, based on the tip position and the orientation of the needle, the system can forecast the needle path by extending the needle forward along its current orientation. The system can then estimate whether the forecasted needle path will hit or miss the preset target and give warning signs/signals in case a miss is anticipated. As the needle penetrates the organ, if the forecast needle path indicates the user may hit a critical anatomy, the system can also highlight the said anatomy and give an audio or visual warning (Paragraph 0069). Liang further teaches that the system can provide guidance on what adjustments the physician has to make in order to follow his/her pre-operative planning based on the warnings (Paragraph 0037).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the system as described by Azizi, DiMaio, and Liang, wherein the method includes generating an instruction for adjusting an ultrasound transducer to align a biopsy needle with the corrected biopsy path, as taught by Liang, in order to prevent the surgical instrument from contacting critical structures during the operation of the surgical instrument (Liang, Paragraph 0037).
However, the modifications of Azizi, DiMaio, and Liang do not teach wherein the ultrasound transducer is coupled with a biopsy needle.
Burdette teaches wherein the ultrasound transducer is coupled with a biopsy needle (Paragraph 0044, providing that the "biopsy needle 128 may be attached to the ultrasound probe via a biopsy needle guide 132 as shown in FIGS. 1-4"; Fig. 4, Ref. Chars. 100, 128, and 132).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the instant application to combine the teachings of Azizi, DiMaio, and Liang, as discussed above, with the ultrasound transducer being coupled with a biopsy needle, as taught by Burdette, in order to have a set spatial relationship between the position of the biopsy needle and the ultrasound probe (Burdette, Paragraph 0014).
Claim(s) 6 is/are rejected under 35 U.S.C. 103 as being unpatentable over Azizi, in view of DiMaio, and further in view of Liang, as applied to claim 1 above, and further in view of US2014/0233826 to Agaian et al. “Agaian”.
Regarding claim 6, the modifications of Azizi, DiMaio, and Liang teach all the features of claim 1 above.
As disclosed in the claim 1 rejection above, Azizi discloses that the cancer likelihood color maps (Fig. 6), which were generated from TeUS data analyzed by the Deep Belief Network (Page 1296, left column, and Page 1303, right column, Conclusion), displayed the regions of various grades of cancerous tissue based on the Gleason grading scale (Fig. 6 and caption), wherein the cancer likelihood maps read on the claimed spatial map. Additionally, in the claim 1 rejection above, DiMaio teaches an overlay of ultrasound imaging data with 3D imaging data from a different modality (Paragraphs 0087-0088), wherein the overlay image is displayed and interacted with by the user (Paragraph 0090), which would read on the spatial map being displayed on the user interface. Therefore, the combination of Azizi and DiMaio is interpreted as follows: the output of the Deep Belief Network, which analyzes the imaging data to assign Gleason grades to cancerous tissue depicted in the imaging data, is output in the form of the cancer likelihood map and displayed on the user interface.
However, Azizi, DiMaio and Liang do not disclose wherein the user interface includes information about the targeted biopsy sample, wherein the information comprises a maximum number of different tissue types, a maximum amount of a single tissue type, a particular tissue type, or combinations thereof.
Agaian teaches, in a similar field, screening for cancer (Title) using biopsy images. Agaian teaches using machine learning algorithms to classify features extracted from the biopsy images (Paragraphs 0218, 0255) to generate Gleason pattern grades (Paragraphs 0218 and 0255), which can then be used to create a localized cancer map including the grade of each cancerous patch in the biopsy image (Paragraph 0264). Therefore, the output of the learning algorithm of Agaian is very similar to that of Azizi, as both teach analyzing the imaging data with a neural network or machine learning algorithm to output a localized cancer map. However, Agaian outputs additional information (Paragraph 0264), such as the most frequent Gleason score in the slide/image and a percentage of each Gleason pattern in the biopsy specimen. The percentage of each Gleason pattern in the biopsy specimen would read on a maximum amount of each type of tissue (classified by Gleason pattern grades) in the biopsy specimen.
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the system as described by Azizi, DiMaio, and Liang, wherein the user interface includes information about the targeted biopsy sample, wherein the information comprises a maximum number of different tissue types, a maximum amount of a single tissue type, a particular tissue type, or combinations thereof, as taught by Agaian. The user interface of Azizi, DiMaio, and Liang already displays the output from the neural network/machine learning algorithm analysis in the form of the cancer likelihood map; therefore, it would have been obvious that the other information calculated using the machine learning algorithm analysis, such as a maximum amount of a single type of tissue or information about a particular tissue type, could also be displayed, in order to provide the user/operator with additional metrics supporting the classification of tissue based on the Gleason score.
Claim(s) 7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Azizi, in view of DiMaio, and further in view of Liang, as applied to claim 1 above, and further in view of US9,521,961 to Silverstein et al. “Silverstein”.
Regarding claim 7, the modifications of Azizi, DiMaio, and Liang teach all the features of claim 1 above.
However, the modifications of Azizi, DiMaio, and Liang do not disclose wherein the user input comprises a selection of a preset targeted biopsy sample option or a narrative description of the targeted biopsy sample.
Silverstein teaches wherein the user input comprises a selection of a preset targeted biopsy sample option (Col. 24, Lines 1-12, clarifying that by using, "position and orientation information determined by the system 1110, together with the length of the cannula 1202 and position of the magnetic element 1210 with respect to the distal needle tip as known by or input into the system, enable the system to accurately determine the location and orientation of the entire length of the needle 1200 with respect to the sensor array 1190. Optionally, the distance between the magnetic element 1210 and the distal needle tip is known by or input into the system 1110. This in turn enables the system 1110 to superimpose an image of the needle 1200 on to an image produced by the ultrasound beam 1222 of the probe 1140") or a narrative description of the targeted biopsy sample (Col. 14, Lines 7-9, providing that, "aural information, such as beeps, tones, etc., can also be employed by the system to assist the clinician during catheter placement").
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the instant application to combine the teachings of Azizi, DiMaio, and Liang to include wherein the user input comprises a selection of a preset targeted biopsy sample option or a narrative description of the targeted biopsy sample as taught by Silverstein. This is because Silverstein Col. 34, Lines 1-14 provides the motivation of using, "[t]he rigid medical device tracking system in many diagnostic medical procedures ... [to help] medical practitioners perform the procedure more efficiently and safely than previously known ... [including for use in] manual and automatic biopsy devices." Furthermore, Silverstein Col. 20, Lines 12-20 describes that, "the guidance system enables the position, orientation, and advancement of the needle to be superimposed in real-time atop the ultrasound image ... thus enabling a clinician to accurately guide the needle to the intended target. Furthermore, in one embodiment, the guidance system tracks the needle's position in five degrees of motion: x, y, and z spatial coordinate space, needle pitch, and needle yaw. Such tracking enables the needle to be guided and placed with relatively high accuracy."
Claim(s) 8-10 is/are rejected under 35 U.S.C. 103 as being unpatentable over Azizi, in view of DiMaio, and further in view of Liang, as applied to claim 1 above, and further in view of US2017/0258526 to “Lang”.
Regarding claim 8, the modifications of Azizi, DiMaio, and Liang disclose all the features of claim 1 above.
Liang teaches that the virtual surgical instruments, including their positions/locations and orientations, can be shown within the 3D scenes (Paragraph 0043), wherein the 3D scene is displayed on a screen that is manipulatable by the user, such as a touch screen (Paragraph 0039). The virtual surgical instruments can include a biopsy needle (Paragraph 0043) for a needle biopsy procedure (Paragraph 0042). Liang's teachings read on the claimed user interface comprising a touch screen configured to receive user input, and also read on displaying a virtual needle on the touch screen.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the system as described by Azizi, DiMaio, and Liang, wherein the user interface comprises a touch screen configured to receive the user input, and displays a virtual needle on the touch screen, as further taught by Liang, in order to reveal the relative spatial relationship of the actual instruments with respect to the operated objects and other reference objects (Liang, Paragraph 0043).
However, the modifications of Azizi, DiMaio, and Liang do not disclose the user input comprises movement of the virtual needle on the touch screen.
Lang teaches a similar device and method for performing a surgical procedure with visual guidance (Abstract). Lang teaches using the input to drag a virtual medical implement (such as a pedicle screw or needle) to virtually move or align the virtual tool in the interface (Paragraph 0900). Although Lang provides an exemplary holographic interface, Lang teaches that the movement of the virtual medical implement can be applied to other interfaces (Paragraph 0900), such as a touchpad (Paragraph 0079) with a screen (Paragraph 0910).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the system as described by Azizi, DiMaio, and Liang, wherein the touch screen of Liang is used for the user input, and the user input comprises movement of a virtual needle that is displayed, as suggested by Lang, in order to modify the virtual surgical plan based on the positioning of the virtual implement/instrument (Lang, Paragraph 0913).
Regarding claim 9, the modifications of Azizi, DiMaio, and Liang teach all the features of claim 1 above.
As disclosed in the claim 1 rejection above, Liang teaches a touch screen (Paragraph 0039).
Azizi further teaches wherein the processor is configured to acquire ultrasound images from the biopsy plane (Page 1295, providing that, "using UroNav MR/US fusion system (Invivo Inc., a subsidiary of Philips Healthcare), T2-weighted MR images were fused with the 3D TRUS volume of the prostate [23,33]. Following the registration of TRUS and MR volumes, the target locations for biopsy were transformed to the EM coordinate frame. A clinician then navigated in the prostate volume towards the MR-identified target. TRUS transducer was held steady for about 5 seconds to acquire 100 frames of temporal US data from the target, followed by firing the biopsy gun to acquire a tissue sample").
However, the modifications of Azizi, DiMaio, and Liang do not disclose that the ultrasound images are displayed live.
Lang teaches that both the live anatomic surface image data (Paragraph 0405) and the intraoperative images (Paragraph 0652) can be from ultrasound. Lang further teaches displaying the live data and digital representations of virtual data on a display (Abstract).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the system as described by Azizi, DiMaio, and Liang, wherein the ultrasound images are displayed live, as taught by Lang, in order to allow the live data to be simultaneously displayed with virtual data (Lang, Paragraph 0004).
Regarding claim 10, the modifications of Azizi, DiMaio, Liang, and Lang disclose all the features of claim 9 above.
As disclosed in the claim 9 rejection above, Lang teaches displaying the live ultrasound image.
Azizi further discloses wherein the processor is further configured to overlay the spatial distribution map on the ultrasound image (See Fig. 6, cancer likelihood maps overlaid on B-mode US image, along the projected needle path in the TeUS data and centered on the target).
Therefore, the combination of Lang and Azizi would read on the claimed overlaying of the spatial distribution map on the live ultrasound image.
Claim(s) 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Azizi, in view of DiMaio, and further in view of Liang, as applied to claim 14 above, and further in view of US2008/0183073 to Higgins et al. “Higgins”.
Regarding claim 16, the modifications of Azizi, DiMaio, and Liang disclose all the features of claim 14 above.
However, the modifications of Azizi, DiMaio, and Liang do not disclose applying a feasibility constraint against the corrected biopsy path, wherein the feasibility constraint is based on physical limitations of a biopsy.
Higgins teaches a similar apparatus and method for route planning (Abstract) for applications such as needle biopsy (Paragraph 0078). Higgins teaches applying a feasibility constraint against the biopsy path, wherein the feasibility constraint is based on physical limitations of a biopsy (Paragraph 0072, applying constraints, including constraints related to the procedure to be performed and constraints regarding the anatomy, to the routes determined by method 1, which are the routes closest to the ROI, Paragraph 0069; the procedural constraints include, for example, requiring that a sufficient fraction and/or minimal volume of ROI voxels lie within the "diagnostic field of view"; if an endoscopic device needs to interact with the ROI (such as in a needle biopsy), this constraint ensures that the device approaches the ROI in such a manner that the interaction is possible, which would read on a feasibility constraint based on the physical limitations of a biopsy). Higgins further teaches correcting the paths in a case where the routes do not adequately reach the ROI (Paragraphs 0083-0084, method 3) and then applying procedure-specific constraints (Method 5, Paragraphs 0097-0102). This reads on applying a feasibility constraint, based on physical limitations of a biopsy, to the corrected path.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the system as described by Azizi, DiMaio, and Liang, wherein the method includes applying a feasibility constraint against the corrected biopsy path, wherein the feasibility constraint is based on physical limitations of a biopsy, as taught by Higgins, in order to refine the originally calculated paths and provide physically meaningful navigation directions (Higgins, Paragraph 0097).
Claim(s) 18 is/are rejected under 35 U.S.C. 103 as being unpatentable over Azizi, in view of DiMaio, and further in view of Liang, as applied to claim 14 above, and further in view of Lang.
Regarding claim 18, the modifications of Azizi, DiMaio, and Liang disclose all the features of claim 14 above.
Azizi further discloses wherein the processor is further configured to overlay the spatial distribution map on the ultrasound image (See Fig. 6, cancer likelihood maps overlaid on B-mode US image, along the projected needle path in the TeUS data and centered on the target).
However, the modifications of Azizi, DiMaio, and Liang do not disclose that the displayed ultrasound image is a live image.
Lang teaches that both the live anatomic image data (Paragraph 0405) and the intraoperative images (Paragraph 0652) can be from ultrasound. Lang further teaches displaying the live data and digital representations of virtual data on a display (Abstract).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the system as described by Azizi, DiMaio, and Liang, wherein the ultrasound images are displayed live, as taught by Lang, in order to allow the live data to be simultaneously displayed with virtual data (Lang, Paragraph 0004).
Response to Arguments
Applicant’s arguments, see Pages 6-7, filed 12/19/2025, with respect to the 35 U.S.C. 112(b) rejections for claims 1-3, 5-11, 13-14, and 16-20 have been fully considered and are persuasive. The rejections of claims 1-3, 5-11, 13-14, and 16-20 have been withdrawn.
Applicant’s arguments, see Pages 7-12, filed 12/19/2025, with respect to the 35 U.S.C. 101 rejections have been fully considered and are persuasive. The rejections of claims 1-3, 5-11, 13-14, and 16-20 have been withdrawn.
Applicant’s arguments with respect to the 35 U.S.C. 103 rejections of claim(s) 1-3, 5-11, 13-14, and 16-20 have been considered but are moot because the new ground of rejection does not rely on any combination of references applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Milton Truong whose telephone number is (571)272-2158. The examiner can normally be reached 9AM - 5PM, MON-FRI.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Keith Raymond can be reached at (571) 270-1790. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MT/Examiner, Art Unit 3798
/KEITH M RAYMOND/Supervisory Patent Examiner, Art Unit 3798