DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-15 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. Independent claims 1-2 and 5 do not fall within at least one of the four categories of patent eligible subject matter because the claims are directed to a generic computer program product. Claims that are not directed to any statutory category include “Products that do not have a physical or tangible form, such as information (often referred to as "data per se") or a computer program per se (often referred to as "software per se") when claimed as a product without any structural recitations.” MPEP 2106.03. Dependent claims 3-4 and 6-15 are rejected by virtue of their dependency on a rejected independent claim under 35 U.S.C. 101.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 1 is rejected under 35 U.S.C. 103 as being unpatentable over Park et al. (U.S. 20070073155, March 29, 2007) (hereinafter, “Park”) in view of Tahmasebi et al. (U.S. 20160317119, November 3, 2016) (hereinafter, “Tahmasebi”) and Von Allmen et al. (U.S. 20170188990, July 6, 2017) (hereinafter, “VonAllmen”).
Regarding Claim 1, Park teaches: A computer program product, the computer program product comprising instructions that, when executed by one or more computing devices, cause the one or more computing devices to perform operations comprising (“…angular position data is processed separately from data from the transducer using software associated with monitor 20 or a separate computer connected to monitor 20. This software produces a real-time, continuous image of the needle orientation that is superimposed over the ultrasound image. In other embodiments, transducer and angle measuring data may be processed simultaneously so as to produce a single data stream that is fed to monitor 20. The software used to process needle position information may be incorporated into probe 30 or reside at a separate computer.” [0060]):
accessing image data acquired from the subject using an ultrasound probe on a surface of the subject, the image data including at least one image of a target structure of the subject (“Probe 30 includes an ultrasound transducer, which generates the image data transmitted to monitor 20. This image data is used to generate real time images of the patient's body below the skinline.” [0054]);
determining a location of the target structure within the subject (“A health professional first places the transducer such that a target of interest (e.g. a vessel's lumen) is visible on the screen. The location of the target is then estimated and a needle guide is selected such that the needle will pass closest to the target's location.” [0012]);
determining an insertion point location for the interventional device based upon the location of the target structure (“System 10 is configured to provide a health professional with a visual indication of the needle pathway needed to intersect plane B at the target 64 and the insertion depth needed to place the tip of the needle at the target 64.” [0049]);
guiding placement of the ultrasound probe coupled to a guide system, the placement of the ultrasound probe positioning the guide system at the insertion point location (“…reference to FIGS. 2, 3, 2A and 3A, accurate positioning of needle 50 with respect to its intended target 64 proceeds as follows. First, monitor 20 is positioned within the health professional's immediate field of view of the patient and ultrasound device, so as to avoid any occurrence of drift during the procedure. If, initially, monitor 20 displays a cross hair, e.g., cross hair 62a, above the target, then the needle pathway needs adjustment…The processed signal produces real-time angular positional information for the needle pathway which is represented on monitor 20 as a downwardly moving cross-hair. As the needle pathway is adjusted downward by rotating needle 50, cross-hair 62a moves downward and towards target 64 until it reaches target 64, which corresponds to cross-hair 62b. Once the cross-hair is centered on the target, the desired needle pathway is located...” [0067]); and
tracking the interventional device from the insertion point location to the target structure (“Angular adjustments to the needle are displayed on monitor 20 in real time with the ultrasonic image so that the needle position can be tracked and aligned precisely with the target located within the patient.” [0049]; “…needle guidance portion 40 includes a needle tracking device that tracks the angular position of the needle as it rotates about axis A…” [0060]).
Park does not explicitly teach: the computer program product encoded on one or more non-transitory computer storage media and segmenting the image data based upon appearance of the target structure where the determination of insertion is from the segmented image data.
Tahmasebi in the field of ultrasound needle guidance teaches: “…a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer readable storage medium can be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.” [0030];
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Park to include a non-transitory memory as taught in Tahmasebi to perform and store various computer implemented functions in connection with the system as a whole.
VonAllmen in the field of ultrasound needle guidance systems teaches: “The high level processor 18 executes various software subroutines or modules, including an image processing module 200 (FIG. 19), an image preprocessing module 250 (FIG. 20), a background and object segmentation module 300 (FIG. 22), a device or needle detection and tracking module 350 (FIG. 23), and a target detection and tracking module 400 (FIG. 24).” [0083]; “Once the position is confirmed by the operator, the coordinates of the target vein 60 relative to the ultrasound probe 26 are extracted from the ultrasound image 33.”; “The FCM segmentation of block 308 employs the entropy features previously described and the pixel position to produce a segmentation map. The segmentation provided by both methods is then fuse into a single representation at block 310.” [0117]; “FIG. 26 illustrates the ultrasound image after processing by the background and object segmentation (BOS) module 300, where background clutter has been removed and the image enhanced. The needle target 456, including cross-hair 458 received within circle 460, identifies the projected trajectory of the needle 78.” [0146].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of references to segment the image data based upon appearance of the target structure, where the determination of insertion is from the segmented image data, as taught in VonAllmen “…to decompose the ultrasound image into separate objects.” (VonAllmen, [0011]), “…where each object is labeled in a way that facilitates the description of the original image so that it can be interpreted by the system that handles the image.” (VonAllmen, [0112]), and “…background clutter has been removed and the image enhanced.” (VonAllmen, [0147]).
Claims 2-4 are rejected under 35 U.S.C. 103 as being unpatentable over Park in view of Tahmasebi and Burdette et al. (U.S. 20030135115, July 17, 2003) (hereinafter, “Burdette”).
Regarding Claim 2, Park teaches: A computer program product, the computer program product comprising instructions that, when executed by one or more computing devices, cause the one or more computing devices to perform operations comprising (“…angular position data is processed separately from data from the transducer using software associated with monitor 20 or a separate computer connected to monitor 20. This software produces a real-time, continuous image of the needle orientation that is superimposed over the ultrasound image. In other embodiments, transducer and angle measuring data may be processed simultaneously so as to produce a single data stream that is fed to monitor 20. The software used to process needle position information may be incorporated into probe 30 or reside at a separate computer.” [0060]):
accessing image data acquired from the subject using an ultrasound probe on a surface of the subject, the image data including a plurality of images of a target structure of the subject, the plurality of images including a plurality of views of the target structure (“Probe 30 includes an ultrasound transducer, which generates the image data transmitted to monitor 20. This image data is used to generate real time images of the patient's body below the skinline.” [0054]);
determining, from the plurality of views, a location of the target structure within the subject (“A health professional first places the transducer such that a target of interest (e.g. a vessel's lumen) is visible on the screen. The location of the target is then estimated and a needle guide is selected such that the needle will pass closest to the target's location.” [0012]);
determining an insertion point location for the interventional device where the interventional device reaches the target structure from the insertion point location (“System 10 is configured to provide a health professional with a visual indication of the needle pathway needed to intersect plane B at the target 64 and the insertion depth needed to place the tip of the needle at the target 64.” [0049]);
guiding placement of the ultrasound probe coupled to a guide system, the placement of the ultrasound probe positioning the guide system at the insertion point location (“…reference to FIGS. 2, 3, 2A and 3A, accurate positioning of needle 50 with respect to its intended target 64 proceeds as follows. First, monitor 20 is positioned within the health professional's immediate field of view of the patient and ultrasound device, so as to avoid any occurrence of drift during the procedure. If, initially, monitor 20 displays a cross hair, e.g., cross hair 62a, above the target, then the needle pathway needs adjustment…The processed signal produces real-time angular positional information for the needle pathway which is represented on monitor 20 as a downwardly moving cross-hair. As the needle pathway is adjusted downward by rotating needle 50, cross-hair 62a moves downward and towards target 64 until it reaches target 64, which corresponds to cross-hair 62b. Once the cross-hair is centered on the target, the desired needle pathway is located...” [0067]); and
tracking the interventional device from the insertion point location to the target structure (“Angular adjustments to the needle are displayed on monitor 20 in real time with the ultrasonic image so that the needle position can be tracked and aligned precisely with the target located within the patient.” [0049]; “…needle guidance portion 40 includes a needle tracking device that tracks the angular position of the needle as it rotates about axis A…” [0060]).
Park does not explicitly teach: the computer program product encoded on one or more non-transitory computer storage media and identifying a critical structure from the plurality of views which is considered in the point location to not penetrate the critical structure in the subject.
Tahmasebi in the field of ultrasound needle guidance teaches: “…a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer readable storage medium can be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.” [0030];
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Park to include a non-transitory memory as taught in Tahmasebi to perform and store various computer implemented functions in connection with the system as a whole.
Burdette in the field of image guidance for biopsy and needle positioning teaches: “…a target volume 110 is located within a working volume 102. In the invention's preferred application to prostate biopsies, the target volume 110 would be a patient's prostate or a portion thereof, and the working volume 102 would be the patient's pelvic area, which includes sensitive tissues such as the patient's rectum, urethra, and bladder.” [0024]; “…the present invention preferably includes determining the biopsy needle location corresponding to a biopsy sample extraction, wherein the graphically displayed target volume representation includes a graphical depiction of the determined biopsy needle location corresponding to the biopsy sample extraction.” [0011]; “The results of the tissue biopsy (i.e. malignant vs. benign) can be displayed in 3-D space registered with the appropriate surrounding anatomy of the target volume for easy evaluation by a clinician.” [0016]; “FIG. 5 illustrates an exemplary three-dimensional representation 500 of a target volume 110. The locations 130 of the biopsy sample extractions are also graphically depicted with the 3-D representation 500. Because the 3-D representation 500 is spatially registered, the three-dimensional coordinates of each biopsy sample location 130 is determinable.” [0042]. See reproduced Fig. 5 below.
[Reproduced: Burdette Fig. 5 (media_image1.png, greyscale, 547 × 406)]
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of references to identify a critical structure from the plurality of views, which is considered in the point location for the intended purpose of not penetrating the critical structure in the subject, as taught in Burdette, thereby increasing the likelihood of accurate and meaningful procedure performance (Burdette, [0015]).
Regarding Claim 3, the combination of Park, Tahmasebi and Burdette teach the claim limitations as noted above.
Park does not teach: wherein the critical structure includes at least one of a bone, an unintended blood vessel, a non-target organ, or a nerve.
Burdette in the field of image guidance for biopsy and needle positioning teaches: “…a target volume 110 is located within a working volume 102. In the invention's preferred application to prostate biopsies, the target volume 110 would be a patient's prostate or a portion thereof, and the working volume 102 would be the patient's pelvic area, which includes sensitive tissues such as the patient's rectum, urethra, and bladder.” [0024]; “…the present invention preferably includes determining the biopsy needle location corresponding to a biopsy sample extraction, wherein the graphically displayed target volume representation includes a graphical depiction of the determined biopsy needle location corresponding to the biopsy sample extraction.” [0011]; “The results of the tissue biopsy (i.e. malignant vs. benign) can be displayed in 3-D space registered with the appropriate surrounding anatomy of the target volume for easy evaluation by a clinician.” [0016]; “FIG. 5 illustrates an exemplary three-dimensional representation 500 of a target volume 110. The locations 130 of the biopsy sample extractions are also graphically depicted with the 3-D representation 500. Because the 3-D representation 500 is spatially registered, the three-dimensional coordinates of each biopsy sample location 130 is determinable.” [0042].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the critical structure in the combination of references to include at least a non-target organ as taught in Burdette “…for easy evaluation by a clinician…” (Burdette, [0016]).
Regarding Claim 4, the combination of Park, Tahmasebi and Burdette teach the claim limitations as noted above.
Park further teaches: wherein the plurality of images include images at a plurality of different timeframes (“…angular position data is processed separately from data from the transducer using software associated with monitor 20 or a separate computer connected to monitor 20. This software produces a real-time, continuous image of the needle orientation that is superimposed over the ultrasound image. In other embodiments, transducer and angle measuring data may be processed simultaneously so as to produce a single data stream that is fed to monitor 20. The software used to process needle position information may be incorporated into probe 30 or reside at a separate computer.” [0060]).
Claims 5-9 and 11-12 are rejected under 35 U.S.C. 103 as being unpatentable over Park in view of Tahmasebi.
Regarding Claim 5, Park teaches: A computer program product, the computer program product comprising instructions that, when executed by one or more computing devices, cause the one or more computing devices to perform operations comprising (“…angular position data is processed separately from data from the transducer using software associated with monitor 20 or a separate computer connected to monitor 20. This software produces a real-time, continuous image of the needle orientation that is superimposed over the ultrasound image. In other embodiments, transducer and angle measuring data may be processed simultaneously so as to produce a single data stream that is fed to monitor 20. The software used to process needle position information may be incorporated into probe 30 or reside at a separate computer.” [0060]):
accessing image data acquired from a subject using an ultrasound probe, wherein the image data including at least one image of a target structure of the subject (“Probe 30 includes an ultrasound transducer, which generates the image data transmitted to monitor 20. This image data is used to generate real time images of the patient's body below the skinline.” [0054]);
determining, from the image data, a location of the target structure within the subject (“A health professional first places the transducer such that a target of interest (e.g. a vessel's lumen) is visible on the screen. The location of the target is then estimated and a needle guide is selected such that the needle will pass closest to the target's location.” [0012]);
determining an insertion point location for the interventional device based upon the location of the target structure (“System 10 is configured to provide a health professional with a visual indication of the needle pathway needed to intersect plane B at the target 64 and the insertion depth needed to place the tip of the needle at the target 64.” [0049]);
guiding placement of the ultrasound probe coupled to a guide system, the placement of the ultrasound probe positioning the guide system at the insertion point location (“…reference to FIGS. 2, 3, 2A and 3A, accurate positioning of needle 50 with respect to its intended target 64 proceeds as follows. First, monitor 20 is positioned within the health professional's immediate field of view of the patient and ultrasound device, so as to avoid any occurrence of drift during the procedure. If, initially, monitor 20 displays a cross hair, e.g., cross hair 62a, above the target, then the needle pathway needs adjustment…The processed signal produces real-time angular positional information for the needle pathway which is represented on monitor 20 as a downwardly moving cross-hair. As the needle pathway is adjusted downward by rotating needle 50, cross-hair 62a moves downward and towards target 64 until it reaches target 64, which corresponds to cross-hair 62b. Once the cross-hair is centered on the target, the desired needle pathway is located...” [0067]); and
guiding the interventional device, via the guide system, into a field of view (FOV) of the ultrasound probe (“Probe 30 includes a needle guidance device [guide system], rotatably mounted to probe 30, which enables a health professional to make angular adjustments to a needle [interventional device] mounted to probe 30 during a procedure.” [0049], [0052] [0063]);
and sending, to a display, guidance of the interventional device from the insertion point location to the target structure (“Probe 30 includes a needle guidance device, rotatably mounted to probe 30, which enables a health professional to make angular adjustments to a needle mounted to probe 30 during a procedure. Angular adjustments to the needle are displayed on monitor 20 in real time with the ultrasonic image so that the needle position can be tracked and aligned precisely with the target located within the patient.” [0049];“Ultrasonic image data can be generated and processed for display on monitor 20 using any suitably chosen ultrasound system.” [0050];“The depth position is then combined with the ultrasonic image data and displayed on monitor 20, e.g., the cross-hair 62a illustrated in FIGS. 2 and 3.” [0061];“The processed signal produces real-time angular positional information for the needle pathway which is represented on monitor 20 as a downwardly moving cross-hair.” [0067]).
Park does not explicitly teach: the computer program product encoded on one or more non-transitory computer storage media.
Tahmasebi in the field of ultrasound needle guidance teaches: “…a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer readable storage medium can be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.” [0030];
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Park to include a non-transitory memory as taught in Tahmasebi to perform and store various computer implemented functions in connection with the system as a whole.
Regarding Claim 6, the combination of Park and Tahmasebi teach the claim limitations as noted above.
Park further teaches: wherein the instructions further include determining at least one of an angle for the interventional device from the insertion point location to the target structure, a rotational angle for the ultrasound probe with respect to the subject, or an insertion distance from the insertion point location to the target structure (“…angular position data is processed separately from data from the transducer using software associated with monitor 20 or a separate computer connected to monitor 20. This software produces a real-time, continuous image of the needle orientation that is superimposed over the ultrasound image.” [0060];“The insertion distance .delta. for needle 50 may be obtained from the insertion angle .theta..sub.2 and other known distances which may be stored with the X, Y Position Parameters discussed earlier. For example, the insertion depth may be determined from .theta..sub.2, the distance from surface 37 and target 64, the horizontal distance between the needle shaft centerline (at the needle clip) and the scanning plane B and the vertical distance between the needle shaft centerline (at the needle clip) and the bottom surface of probe 30.” [0066]).
Regarding Claim 7, the combination of Park and Tahmasebi teach the claim limitations as noted above.
Park further teaches: wherein sending guidance to the display further includes sending, to the display, an indicator of at least one of the angle for the interventional device, the insertion point location, the location of the target structure, or the insertion distance (“System 10 is configured to provide a health professional with a visual indication of the needle pathway needed to intersect plane B at the target 64 and the insertion depth needed to place the tip of the needle at the target 64 (insertion distance .delta.). FIGS. 2 and 3 show images 60a and 60b, respectively, generated on monitor 20 that correspond respectively to the position of probe 30 and needle 50 illustrated in FIGS. 2A and 3A. Cross hairs 62a and 62b indicate the point of intersection between the respective needle pathways D and E and scanning plane B. A cross section of a blood vessel wall is also shown in FIGS. 2 and 3 with a section of the vessel wall corresponding to target 64.” [0065]; “…reference to FIGS. 2, 3, 2A and 3A, accurate positioning of needle 50 with respect to its intended target 64 proceeds as follows. First, monitor 20 is positioned within the health professional's immediate field of view of the patient and ultrasound device, so as to avoid any occurrence of drift during the procedure. If, initially, monitor 20 displays a cross hair, e.g., cross hair 62a, above the target, then the needle pathway needs adjustment…The processed signal produces real-time angular positional information for the needle pathway which is represented on monitor 20 as a downwardly moving cross-hair. As the needle pathway is adjusted downward by rotating needle 50, cross-hair 62a moves downward and towards target 64 until it reaches target 64, which corresponds to cross-hair 62b. Once the cross-hair is centered on the target, the desired needle pathway is located...” [0067]).
Regarding Claim 8, the combination of Park and Tahmasebi teach the claim limitations as noted above.
Park further teaches: wherein the indicator includes the insertion point location projected proximate to the target structure, or the ultrasound probe position at the insertion point location (“System 10 is configured to provide a health professional with a visual indication of the needle pathway needed to intersect plane B at the target 64 and the insertion depth needed to place the tip of the needle at the target 64 (insertion distance .delta.). FIGS. 2 and 3 show images 60a and 60b, respectively, generated on monitor 20 that correspond respectively to the position of probe 30 and needle 50 illustrated in FIGS. 2A and 3A. Cross hairs 62a and 62b indicate the point of intersection between the respective needle pathways D and E and scanning plane B. A cross section of a blood vessel wall is also shown in FIGS. 2 and 3 with a section of the vessel wall corresponding to target 64.” [0065]; “…reference to FIGS. 2, 3, 2A and 3A, accurate positioning of needle 50 with respect to its intended target 64 proceeds as follows. First, monitor 20 is positioned within the health professional's immediate field of view of the patient and ultrasound device, so as to avoid any occurrence of drift during the procedure. If, initially, monitor 20 displays a cross hair, e.g., cross hair 62a, above the target, then the needle pathway needs adjustment…The processed signal produces real-time angular positional information for the needle pathway which is represented on monitor 20 as a downwardly moving cross-hair. As the needle pathway is adjusted downward by rotating needle 50, cross-hair 62a moves downward and towards target 64 until it reaches target 64, which corresponds to cross-hair 62b. Once the cross-hair is centered on the target, the desired needle pathway is located...” [0067]).
Regarding Claim 9, the combination of Park and Tahmasebi teach the claim limitations as noted above.
Park further teaches: wherein the instructions further include tracking the interventional device from the insertion point location to the target structure, wherein sending guidance to the display includes providing real-time feedback to a user, via the display, based on tracking the interventional device (“Angular adjustments to the needle are displayed on monitor 20 in real time with the ultrasonic image so that the needle position can be tracked and aligned precisely with the target located within the patient.” [0049]).
Regarding Claim 11, the combination of Park and Tahmasebi teach the claim limitations as noted above.
Park further teaches: wherein the target structure is one of an artery, a vein, a femoral artery, a femoral vein, a jugular vein, a peripheral vein, a subclavian vein, an airway, a lumen, a luminal organ, a body cavity, a fluid filled anatomic space, a location requiring biopsy, a breast, a kidney, a lymph node, a spinal canal, a location requiring nerve block, a peritoneal space or a pleural space (“ A health professional first places the transducer such that a target of interest (e.g. a vessel's lumen) is visible on the screen. The location of the target is then estimated and a needle guide is selected such that the needle will pass closest to the target's location.” [0012]; “A cross section of a blood vessel wall is also shown in FIGS. 2 and 3 with a section of the vessel wall corresponding to target 64.” [0065]).
Regarding Claim 12, the combination of Park and Tahmasebi teach the claim limitations as noted above.
Park further teaches: wherein accessing the image data includes receiving a plurality of images of the target structure of the subject acquired in real time (“…angular position data is processed separately from data from the transducer using software associated with monitor 20 or a separate computer connected to monitor 20. This software produces a real-time, continuous image of the needle orientation that is superimposed over the ultrasound image. In other embodiments, transducer and angle measuring data may be processed simultaneously so as to produce a single data stream that is fed to monitor 20. The software used to process needle position information may be incorporated into probe 30 or reside at a separate computer.” [0060]).
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Park in view of Tahmasebi as applied to claim 5 above, and further in view of VonAllmen.
Regarding Claim 10, the combination of Park and Tahmasebi teach the claim limitations as noted above.
The combination of references does not teach: wherein the instructions for determining the location of the target structure within the subject include segmenting the image data based upon appearance of the target structure to determine at least one of the location of the target structure or the identity of the target structure.
VonAllmen in the field of ultrasound needle guidance systems teaches: “ The high level processor 18 executes various software subroutines or modules, including an image processing module 200 (FIG. 19), an image preprocessing module 250 (FIG. 20), a background and object segmentation module 300 (FIG. 22), a device or needle detection and tracking module 350 (FIG. 23), and a target detection and tracking module 400 (FIG. 24).” [0083]; “Once the position is confirmed by the operator, the coordinates of the target vein 60 relative to the ultrasound probe 26 are extracted from the ultrasound image 33."; “The FCM segmentation of block 308 employs the entropy features previously described and the pixel position to produce a segmentation map. The segmentation provided by both methods is then fuse into a single representation at block 310.” [0117]; “FIG. 26 illustrates the ultrasound image after processing by the background and object segmentation (BOS) module 300, where background clutter has been removed and the image enhanced. The needle target 456, including cross-hair 458 received within circle 460, identifies the projected trajectory of the needle 78.” [0146].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the combination of references to segment the image data based upon the appearance of the target structure, as taught in VonAllmen, “…to decompose the ultrasound image into separate objects.” (VonAllmen, [0011]), “… where each object is labeled in a way that facilitates the description of the original image so that it can be interpreted by the system that handles the image.” (VonAllmen, [0112]) and “…background clutter has been removed and the image enhanced.” (VonAllmen, [0147]).
Claims 13-15 are rejected under 35 U.S.C. 103 as being unpatentable over Park in view of Tahmasebi as applied to claim 12 above, and further in view of Burdette.
Regarding Claim 13, the combination of Park and Tahmasebi teaches the claim limitations as noted above.
Park further teaches: wherein the plurality of images include a plurality of views of the target structure (“Probe 30 includes an ultrasound transducer, which generates the image data transmitted to monitor 20. This image data is used to generate real time images of the patient's body below the skinline.” [0054]).
Park does not teach: and the instructions further include assessing the plurality of views to identify a critical structure in the subject, and identifying a location on the subject where the interventional device reaches the target structure from the insertion point location without penetrating the critical structure in the subject.
Burdette in the field of image guidance for biopsy and needle positioning teaches: “…a target volume 110 is located within a working volume 102. In the invention's preferred application to prostate biopsies, the target volume 110 would be a patient's prostate or a portion thereof, and the working volume 102 would be the patient's pelvic area, which includes sensitive tissues such as the patient's rectum, urethra, and bladder.” [0024]; “…the present invention preferably includes determining the biopsy needle location corresponding to a biopsy sample extraction, wherein the graphically displayed target volume representation includes a graphical depiction of the determined biopsy needle location corresponding to the biopsy sample extraction.” [0011]; “The results of the tissue biopsy (i.e. malignant vs. benign) can be displayed in 3-D space registered with the appropriate surrounding anatomy of the target volume for easy evaluation by a clinician.” [0016]; “FIG. 5 illustrates an exemplary three-dimensional representation 500 of a target volume 110. The locations 130 of the biopsy sample extractions are also graphically depicted with the 3-D representation 500. Because the 3-D representation 500 is spatially registered, the three-dimensional coordinates of each biopsy sample location 130 is determinable.” [0042].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the combination of references to identify a critical structure from the plurality of views, which is considered when identifying the insertion point location so that the interventional device does not penetrate the critical structure in the subject, as taught in Burdette, thereby increasing the likelihood of accurate and meaningful procedure performance (Burdette, [0015]).
Regarding Claim 14, the combination of Park, Tahmasebi and Burdette teaches the claim limitations as noted above.
Park does not teach: wherein the critical structure includes at least one of a bone, an unintended blood vessel, a non-target organ, or a nerve.
Burdette in the field of image guidance for biopsy and needle positioning teaches: “…a target volume 110 is located within a working volume 102. In the invention's preferred application to prostate biopsies, the target volume 110 would be a patient's prostate or a portion thereof, and the working volume 102 would be the patient's pelvic area, which includes sensitive tissues such as the patient's rectum, urethra, and bladder.” [0024]; “…the present invention preferably includes determining the biopsy needle location corresponding to a biopsy sample extraction, wherein the graphically displayed target volume representation includes a graphical depiction of the determined biopsy needle location corresponding to the biopsy sample extraction.” [0011]; “The results of the tissue biopsy (i.e. malignant vs. benign) can be displayed in 3-D space registered with the appropriate surrounding anatomy of the target volume for easy evaluation by a clinician.” [0016]; “FIG. 5 illustrates an exemplary three-dimensional representation 500 of a target volume 110. The locations 130 of the biopsy sample extractions are also graphically depicted with the 3-D representation 500. Because the 3-D representation 500 is spatially registered, the three-dimensional coordinates of each biopsy sample location 130 is determinable.” [0042].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the critical structure in the combination of references to include at least a non-target organ as taught in Burdette “…for easy evaluation by a clinician…” (Burdette, [0016]).
Regarding Claim 15, the combination of Park, Tahmasebi and Burdette teaches the claim limitations as noted above.
Park further teaches: wherein the plurality of images include images at a plurality of different timeframes (“…angular position data is processed separately from data from the transducer using software associated with monitor 20 or a separate computer connected to monitor 20. This software produces a real-time, continuous image of the needle orientation that is superimposed over the ultrasound image. In other embodiments, transducer and angle measuring data may be processed simultaneously so as to produce a single data stream that is fed to monitor 20. The software used to process needle position information may be incorporated into probe 30 or reside at a separate computer.” [0060]).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMAL FARAG whose telephone number is (571)270-3432. The examiner can normally be reached 8:30 - 5:30 M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Keith Raymond can be reached at (571) 270-1790. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AMAL ALY FARAG/Primary Examiner, Art Unit 3798