DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
Claims 7 and 20 are objected to because of the following informalities:
Claim 7 recites “and/or”; the claim should instead recite “and” or “or”.
Claim 20 recites “and/or”; the claim should instead recite “and” or “or”.
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 11-12 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 11 recites the limitation “the interventional process guide does not provide information of an intervention angle of the interventional object” which renders the claim unclear. It is unclear how the process guide can provide distance information from the current starting point to the target point without providing some form of angle information.
Claim 12 recites the limitation “the endpoint of the interventional object, the minimum distance, and the maximum distance are displayed in a same linear direction” which renders the claim unclear. It is unclear how the maximum and minimum distance can be displayed along the same linear direction; the specification's example does not provide clarity in view of the claim's dependence from Claim 10.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-9 and 13-20 are rejected under 35 U.S.C. 102(a)(1) and 102(a)(2) as being anticipated by Cohn et al. (US20210153969A1; hereinafter referred to as Cohn).
Regarding Claim 1, Cohn discloses a method for guiding movement of a probe, so as to perform ultrasound imaging (“A method for navigating a probe to a location within a body of a patient” [Abstract]), comprising:
using the probe to acquire a real-time ultrasound image related to a tissue to be imaged (“a real-time ultrasound may be utilized to generate a more accurate image of the desired region or portion of the patient that the target was determined to rest in.” [0037], “In the figure, the determined (e.g., calculated) trajectory 615 to the center 614 of the tumor 304 along the predetermined path is seen beginning at the surface of the virtual patient 600 and ending in the ablation site within the tumor 304. The system itself is designed to allow for the real-time monitoring of the calculated path or trajectory 615 with an ultrasound 619 device illustrated in FIGS. 6B and 6C.” [0049]);
determining a current intervention starting point, and determining a target intervention region and a non-intervention region in the ultrasound image (“the ablation probe positioning, quantity, and trajectory are calculated by the physician by providing the ability to navigate the model in search of the best trajectory regions to avoid healthy tissues/regions, or a combination of the aforementioned.” [0036], “FIG. 7 illustrates point 717 as an example entry point for a probe 716. A calculated trajectory projection 718 may be determined based on the real-time location of the ablation probe 716 at the time of the procedure or during planning studies. The calculated trajectory projection 718 may be projected line calculated based on the angulation and location of the real-time ablation probe 716. As shown, the calculated trajectory projection 718 may be overlaid on an image such as an ultrasound image (e.g., image 621 in FIGS. 6B and 6C) to allow a user to visualize a projection of the calculated trajectory of the probe 716. In this example, the calculated trajectory projection 718 is shown missing the ablation site 714 which is the center of the tumor 304. Thus, a user may determine that based on the calculated trajectory projection 718, correction is necessary. As a further example, a planned trajectory 715 may be calculated and overlaid on the image to allow a user to compare the planned trajectory 715 with the calculated trajectory projection 718 and to make adjustments based on the same.” [0050]);
determining a current intervention path based on the target intervention region and the current intervention starting point, and determining a target intervention path based on the target intervention region and the non-intervention region (“the ablation probe positioning, quantity, and trajectory are calculated by the physician by providing the ability to navigate the model in search of the best trajectory regions to avoid healthy tissues/regions, or a combination of the aforementioned.” [0036], “FIG. 7 illustrates point 717 as an example entry point for a probe 716. A calculated trajectory projection 718 may be determined based on the real-time location of the ablation probe 716 at the time of the procedure or during planning studies. The calculated trajectory projection 718 may be projected line calculated based on the angulation and location of the real-time ablation probe 716. As shown, the calculated trajectory projection 718 may be overlaid on an image such as an ultrasound image (e.g., image 621 in FIGS. 6B and 6C) to allow a user to visualize a projection of the calculated trajectory of the probe 716. In this example, the calculated trajectory projection 718 is shown missing the ablation site 714 which is the center of the tumor 304. Thus, a user may determine that based on the calculated trajectory projection 718, correction is necessary. As a further example, a planned trajectory 715 may be calculated and overlaid on the image to allow a user to compare the planned trajectory 715 with the calculated trajectory projection 718 and to make adjustments based on the same.” [0050]);
and based on the current intervention path and the target intervention path, generating and displaying movement guidance related to the probe, the movement guidance being configured to guide the probe to move such that the current intervention path and the target intervention path coincide (“As an illustrative example, a user may align the calculated trajectory projection 718 with the planned trajectory 715 in order to follow the planned path to a particular location (e.g., the ablation site 714). In addition to the projection 718, the projection or trajectory item (or other elements in display) may changes colors or some significant element of its item when the ultrasound is creating a sliced image over the actual ablation probe 716. This additional indicator may aid the user in knowing the real-time location of the ablation probe 716 during any planning or insertion steps. All of the calculated trajectories, both pre-planned and real-time trajectories, all may be displayed on the a display device such as a screen or a visualization headset in addition to being represented in the overlaid ultrasound image.” [0050]).
[Image: media_image1.png (greyscale PNG, 597 × 411)]
Regarding Claim 2, Cohn discloses the current intervention starting point is configured to be close to a side edge of the probe and located at an upper edge of the ultrasound image (“The system itself is designed to allow for the real-time monitoring of the calculated path or trajectory 615 with an ultrasound 619 device illustrated in FIGS. 6B and 6C. FIGS. 6B and 6C illustrate the method of visualization for the calculated path or trajectory 615 relative to different ultrasound orientations; namely, perpendicular to the calculated path or trajectory 615 or along the calculated path or trajectory 615. In FIG. 6B, the ultrasound device 619 is oriented perpendicular to the calculated path or trajectory 615, creating an ultrasound slice 620 at some depth of the calculated path or trajectory 615. The resulting ultrasound image 621 has the overlaid calculated path or trajectory 615 shown as the perpendicular cross-section of a customizable shape, object, or image 622. In FIG. 6C, the ultrasound device 619 is oriented along the directional axis of the calculated path 615, creating an ultrasound slice 620 along a larger portion of the calculated path or trajectory 615, if not all of it. The resulting image 621 has the overlaid calculated path trajectory 615 shown as an in-line axial cross-section of the customizable shape, object, or image 623. In addition to the calculated path or trajectory 615 overlaid on the ultrasound image 621, the projected trajectory as well as the actual location of the ablation probe in real-time are also tracked on the image.” [0049]).
Regarding Claim 3, Cohn discloses the current intervention starting point moves in real time with the movement of the probe, and during said movement, the relative positions of the current intervention starting point and the probe remain unchanged (“The system itself is designed to allow for the real-time monitoring of the calculated path or trajectory 615 with an ultrasound 619 device illustrated in FIGS. 6B and 6C. FIGS. 6B and 6C illustrate the method of visualization for the calculated path or trajectory 615 relative to different ultrasound orientations; namely, perpendicular to the calculated path or trajectory 615 or along the calculated path or trajectory 615. In FIG. 6B, the ultrasound device 619 is oriented perpendicular to the calculated path or trajectory 615, creating an ultrasound slice 620 at some depth of the calculated path or trajectory 615. The resulting ultrasound image 621 has the overlaid calculated path or trajectory 615 shown as the perpendicular cross-section of a customizable shape, object, or image 622.” [0049], as seen in Figs. 6B and 6C the calculated trajectory and entry point remain consistent no matter the probe positioning or movement).
[Image: media_image2.png (greyscale PNG, 378 × 361)]
Regarding Claim 4, Cohn discloses further comprising: displaying the current intervention path and the target intervention path in real time during the movement of the probe (“The system itself is designed to allow for the real-time monitoring of the calculated path or trajectory 615 with an ultrasound 619 device illustrated in FIGS. 6B and 6C. FIGS. 6B and 6C illustrate the method of visualization for the calculated path or trajectory 615 relative to different ultrasound orientations; namely, perpendicular to the calculated path or trajectory 615 or along the calculated path or trajectory 615. In FIG. 6B, the ultrasound device 619 is oriented perpendicular to the calculated path or trajectory 615, creating an ultrasound slice 620 at some depth of the calculated path or trajectory 615. The resulting ultrasound image 621 has the overlaid calculated path or trajectory 615 shown as the perpendicular cross-section of a customizable shape, object, or image 622.” [0049]).
Regarding Claim 5, Cohn discloses the current intervention path is a line connecting a point in the target intervention region to the current intervention starting point (“The system itself is designed to allow for the real-time monitoring of the calculated path or trajectory 615 with an ultrasound 619 device illustrated in FIGS. 6B and 6C. FIGS. 6B and 6C illustrate the method of visualization for the calculated path or trajectory 615 relative to different ultrasound orientations; namely, perpendicular to the calculated path or trajectory 615 or along the calculated path or trajectory 615. In FIG. 6B, the ultrasound device 619 is oriented perpendicular to the calculated path or trajectory 615, creating an ultrasound slice 620 at some depth of the calculated path or trajectory 615. The resulting ultrasound image 621 has the overlaid calculated path or trajectory 615 shown as the perpendicular cross-section of a customizable shape, object, or image 622.” [0049], see Fig. 6B for the intervention path).
Regarding Claim 6, Cohn discloses the target intervention path passes through a point of the target intervention region and does not pass through the non-intervention region (“Additionally or alternatively, if the tumor(s) or any of the surrounding tissues, organs and/or blood vessels did move, the CT scan/ultrasound will be used with the algorithm and the software to measure the shift and the computational geometry algorithm will automatically calculate new ablation probe trajectories as well as any other relevant ablation probe information or said action may be achieved by the physician if desired.” [0038]).
Regarding Claim 7, Cohn discloses further comprising: simultaneously displaying a target intervention starting point and the current intervention starting point in real time during the movement of the probe; wherein the target intervention starting point is an intersection point of the target intervention path and the upper edge of the ultrasound image; and/or simultaneously displaying the current intervention path and the target intervention path in real time during the movement of the probe (“FIG. 7 illustrates point 717 as an example entry point for a probe 716. A calculated trajectory projection 718 may be determined based on the real-time location of the ablation probe 716 at the time of the procedure or during planning studies. The calculated trajectory projection 718 may be projected line calculated based on the angulation and location of the real-time ablation probe 716. As shown, the calculated trajectory projection 718 may be overlaid on an image such as an ultrasound image (e.g., image 621 in FIGS. 6B and 6C) to allow a user to visualize a projection of the calculated trajectory of the probe 716. In this example, the calculated trajectory projection 718 is shown missing the ablation site 714 which is the center of the tumor 304. Thus, a user may determine that based on the calculated trajectory projection 718, correction is necessary” [0050], see Fig. 7 for simultaneously displaying the current intervention path and the target intervention path).
Regarding Claim 8, Cohn discloses further comprising: controlling an ultrasound beam transmitted by the probe, causing the ultrasound beam to deflect in a direction perpendicular to the target intervention path (“The system itself is designed to allow for the real-time monitoring of the calculated path or trajectory 615 with an ultrasound 619 device illustrated in FIGS. 6B and 6C. FIGS. 6B and 6C illustrate the method of visualization for the calculated path or trajectory 615 relative to different ultrasound orientations; namely, perpendicular to the calculated path or trajectory 615 or along the calculated path or trajectory 615.” [0049]).
Regarding Claim 9, Cohn discloses further comprising: identifying an interventional object in the ultrasound image (“Systems and methods are described for navigating a probe to a location within a body of a patient. The probe may comprise a needle, introducer, catheter, stylet, or sheath.” [0024]);
generating and displaying an intervention process guide based on a positional relationship between the identified interventional object and the target intervention region (“FIG. 7 illustrates point 717 as an example entry point for a probe 716. A calculated trajectory projection 718 may be determined based on the real-time location of the ablation probe 716 at the time of the procedure or during planning studies. The calculated trajectory projection 718 may be projected line calculated based on the angulation and location of the real-time ablation probe 716. As shown, the calculated trajectory projection 718 may be overlaid on an image such as an ultrasound image (e.g., image 621 in FIGS. 6B and 6C) to allow a user to visualize a projection of the calculated trajectory of the probe 716. In this example, the calculated trajectory projection 718 is shown missing the ablation site 714 which is the center of the tumor 304.” [0050]).
Regarding Claim 13, Cohn discloses an ultrasound imaging system, comprising: a probe, the probe transmitting an ultrasound beam to tissue to be imaged, and receiving an echo signal (“A method for navigating a probe to a location within a body of a patient” [Abstract], “a real-time ultrasound may be utilized to generate a more accurate image of the desired region or portion of the patient that the target was determined to rest in.” [0037], “In the figure, the determined (e.g., calculated) trajectory 615 to the center 614 of the tumor 304 along the predetermined path is seen beginning at the surface of the virtual patient 600 and ending in the ablation site within the tumor 304. The system itself is designed to allow for the real-time monitoring of the calculated path or trajectory 615 with an ultrasound 619 device illustrated in FIGS. 6B and 6C.” [0049]);
a processor configured to (“As an illustrative example, once this is accomplished, the physician, an artificial intelligence (AI) module of the software implementing the methodology of the present disclosure in conjunction with an ablation system, and/or the physician guided by the AI may then tag and record discrete slices of the tumor and the surgical path vector as the fan beam of the ultrasound probe is passed across the both the surgical path vector and the full target tumor. As the ultrasound records the position of the tumor and other relevant anatomical structures in the surrounding space, the AI/software automatically adjusts the CT-overlay to match the patient's real-time anatomy via a three-dimensional, line-of-best fit optimization, and subsequently adjusts the optimized surgical path vector for ablation probe trajectories to account for any anatomical shifting that may have occurred since the initial formulation of the trajectories that may have been based on historical imaging data.” [0030]):
acquire a real-time ultrasound image obtained by the probe that is related to a tissue to be imaged (“a real-time ultrasound may be utilized to generate a more accurate image of the desired region or portion of the patient that the target was determined to rest in.” [0037], “In the figure, the determined (e.g., calculated) trajectory 615 to the center 614 of the tumor 304 along the predetermined path is seen beginning at the surface of the virtual patient 600 and ending in the ablation site within the tumor 304. The system itself is designed to allow for the real-time monitoring of the calculated path or trajectory 615 with an ultrasound 619 device illustrated in FIGS. 6B and 6C.” [0049]);
determine a current intervention starting point, and determining a target intervention region and a non-intervention region in the ultrasound image (“the ablation probe positioning, quantity, and trajectory are calculated by the physician by providing the ability to navigate the model in search of the best trajectory regions to avoid healthy tissues/regions, or a combination of the aforementioned.” [0036], “FIG. 7 illustrates point 717 as an example entry point for a probe 716. A calculated trajectory projection 718 may be determined based on the real-time location of the ablation probe 716 at the time of the procedure or during planning studies. The calculated trajectory projection 718 may be projected line calculated based on the angulation and location of the real-time ablation probe 716. As shown, the calculated trajectory projection 718 may be overlaid on an image such as an ultrasound image (e.g., image 621 in FIGS. 6B and 6C) to allow a user to visualize a projection of the calculated trajectory of the probe 716. In this example, the calculated trajectory projection 718 is shown missing the ablation site 714 which is the center of the tumor 304. Thus, a user may determine that based on the calculated trajectory projection 718, correction is necessary. As a further example, a planned trajectory 715 may be calculated and overlaid on the image to allow a user to compare the planned trajectory 715 with the calculated trajectory projection 718 and to make adjustments based on the same.” [0050]);
determine a current intervention path based on the target intervention region and the current intervention starting point, and determining a target intervention path based on the target intervention region and the non-intervention region (“the ablation probe positioning, quantity, and trajectory are calculated by the physician by providing the ability to navigate the model in search of the best trajectory regions to avoid healthy tissues/regions, or a combination of the aforementioned.” [0036], “FIG. 7 illustrates point 717 as an example entry point for a probe 716. A calculated trajectory projection 718 may be determined based on the real-time location of the ablation probe 716 at the time of the procedure or during planning studies. The calculated trajectory projection 718 may be projected line calculated based on the angulation and location of the real-time ablation probe 716. As shown, the calculated trajectory projection 718 may be overlaid on an image such as an ultrasound image (e.g., image 621 in FIGS. 6B and 6C) to allow a user to visualize a projection of the calculated trajectory of the probe 716. In this example, the calculated trajectory projection 718 is shown missing the ablation site 714 which is the center of the tumor 304. Thus, a user may determine that based on the calculated trajectory projection 718, correction is necessary. As a further example, a planned trajectory 715 may be calculated and overlaid on the image to allow a user to compare the planned trajectory 715 with the calculated trajectory projection 718 and to make adjustments based on the same.” [0050]);
and based on the current intervention path and the target intervention path, generating and displaying movement guidance related to the probe, the movement guidance being configured to guide the probe to move such that the current intervention path and the target intervention path coincide (“As an illustrative example, a user may align the calculated trajectory projection 718 with the planned trajectory 715 in order to follow the planned path to a particular location (e.g., the ablation site 714). In addition to the projection 718, the projection or trajectory item (or other elements in display) may changes colors or some significant element of its item when the ultrasound is creating a sliced image over the actual ablation probe 716. This additional indicator may aid the user in knowing the real-time location of the ablation probe 716 during any planning or insertion steps. All of the calculated trajectories, both pre-planned and real-time trajectories, all may be displayed on the a display device such as a screen or a visualization headset in addition to being represented in the overlaid ultrasound image.” [0050]).
[Image: media_image1.png (greyscale PNG, 597 × 411)]
Regarding Claim 14, Cohn discloses a non-transitory computer-readable medium, the non-transitory computer-readable medium having a computer program stored therein, the computer program having at least one code segment, and the at least one code segment being executable by a machine, to cause the machine to execute the steps of: (“A method for navigating a probe to a location within a body of a patient” [Abstract], “As an illustrative example, once this is accomplished, the physician, an artificial intelligence (AI) module of the software implementing the methodology of the present disclosure in conjunction with an ablation system, and/or the physician guided by the AI may then tag and record discrete slices of the tumor and the surgical path vector as the fan beam of the ultrasound probe is passed across the both the surgical path vector and the full target tumor. As the ultrasound records the position of the tumor and other relevant anatomical structures in the surrounding space, the AI/software automatically adjusts the CT-overlay to match the patient's real-time anatomy via a three-dimensional, line-of-best fit optimization, and subsequently adjusts the optimized surgical path vector for ablation probe trajectories to account for any anatomical shifting that may have occurred since the initial formulation of the trajectories that may have been based on historical imaging data.” [0030]), comprising:
using a probe to acquire a real-time ultrasound image related to a tissue to be imaged (“a real-time ultrasound may be utilized to generate a more accurate image of the desired region or portion of the patient that the target was determined to rest in.” [0037], “In the figure, the determined (e.g., calculated) trajectory 615 to the center 614 of the tumor 304 along the predetermined path is seen beginning at the surface of the virtual patient 600 and ending in the ablation site within the tumor 304. The system itself is designed to allow for the real-time monitoring of the calculated path or trajectory 615 with an ultrasound 619 device illustrated in FIGS. 6B and 6C.” [0049]);
determining a current intervention starting point, and determining a target intervention region and a non-intervention region in the ultrasound image (“the ablation probe positioning, quantity, and trajectory are calculated by the physician by providing the ability to navigate the model in search of the best trajectory regions to avoid healthy tissues/regions, or a combination of the aforementioned.” [0036], “FIG. 7 illustrates point 717 as an example entry point for a probe 716. A calculated trajectory projection 718 may be determined based on the real-time location of the ablation probe 716 at the time of the procedure or during planning studies. The calculated trajectory projection 718 may be projected line calculated based on the angulation and location of the real-time ablation probe 716. As shown, the calculated trajectory projection 718 may be overlaid on an image such as an ultrasound image (e.g., image 621 in FIGS. 6B and 6C) to allow a user to visualize a projection of the calculated trajectory of the probe 716. In this example, the calculated trajectory projection 718 is shown missing the ablation site 714 which is the center of the tumor 304. Thus, a user may determine that based on the calculated trajectory projection 718, correction is necessary. As a further example, a planned trajectory 715 may be calculated and overlaid on the image to allow a user to compare the planned trajectory 715 with the calculated trajectory projection 718 and to make adjustments based on the same.” [0050]);
determining a current intervention path based on the target intervention region and the current intervention starting point, and determining a target intervention path based on the target intervention region and the non-intervention region (“the ablation probe positioning, quantity, and trajectory are calculated by the physician by providing the ability to navigate the model in search of the best trajectory regions to avoid healthy tissues/regions, or a combination of the aforementioned.” [0036], “FIG. 7 illustrates point 717 as an example entry point for a probe 716. A calculated trajectory projection 718 may be determined based on the real-time location of the ablation probe 716 at the time of the procedure or during planning studies. The calculated trajectory projection 718 may be projected line calculated based on the angulation and location of the real-time ablation probe 716. As shown, the calculated trajectory projection 718 may be overlaid on an image such as an ultrasound image (e.g., image 621 in FIGS. 6B and 6C) to allow a user to visualize a projection of the calculated trajectory of the probe 716. In this example, the calculated trajectory projection 718 is shown missing the ablation site 714 which is the center of the tumor 304. Thus, a user may determine that based on the calculated trajectory projection 718, correction is necessary. As a further example, a planned trajectory 715 may be calculated and overlaid on the image to allow a user to compare the planned trajectory 715 with the calculated trajectory projection 718 and to make adjustments based on the same.” [0050]);
and based on the current intervention path and the target intervention path, generating and displaying movement guidance related to the probe, the movement guidance being configured to guide the probe to move such that the current intervention path and the target intervention path coincide (“As an illustrative example, a user may align the calculated trajectory projection 718 with the planned trajectory 715 in order to follow the planned path to a particular location (e.g., the ablation site 714). In addition to the projection 718, the projection or trajectory item (or other elements in display) may changes colors or some significant element of its item when the ultrasound is creating a sliced image over the actual ablation probe 716. This additional indicator may aid the user in knowing the real-time location of the ablation probe 716 during any planning or insertion steps. All of the calculated trajectories, both pre-planned and real-time trajectories, all may be displayed on the a display device such as a screen or a visualization headset in addition to being represented in the overlaid ultrasound image.” [0050]).
[Image: media_image1.png (greyscale PNG, 597 × 411)]
Regarding Claim 15, Cohn discloses the current intervention starting point is configured to be close to a side edge of the probe and located at an upper edge of the ultrasound image (“The system itself is designed to allow for the real-time monitoring of the calculated path or trajectory 615 with an ultrasound 619 device illustrated in FIGS. 6B and 6C. FIGS. 6B and 6C illustrate the method of visualization for the calculated path or trajectory 615 relative to different ultrasound orientations; namely, perpendicular to the calculated path or trajectory 615 or along the calculated path or trajectory 615. In FIG. 6B, the ultrasound device 619 is oriented perpendicular to the calculated path or trajectory 615, creating an ultrasound slice 620 at some depth of the calculated path or trajectory 615. The resulting ultrasound image 621 has the overlaid calculated path or trajectory 615 shown as the perpendicular cross-section of a customizable shape, object, or image 622. In FIG. 6C, the ultrasound device 619 is oriented along the directional axis of the calculated path 615, creating an ultrasound slice 620 along a larger portion of the calculated path or trajectory 615, if not all of it. The resulting image 621 has the overlaid calculated path trajectory 615 shown as an in-line axial cross-section of the customizable shape, object, or image 623. In addition to the calculated path or trajectory 615 overlaid on the ultrasound image 621, the projected trajectory as well as the actual location of the ablation probe in real-time are also tracked on the image.” [0049]).
Regarding Claim 16, Cohn discloses the current intervention starting point moves in real time with the movement of the probe, and during said movement, the relative positions of the current intervention starting point and the probe remain unchanged (“The system itself is designed to allow for the real-time monitoring of the calculated path or trajectory 615 with an ultrasound 619 device illustrated in FIGS. 6B and 6C. FIGS. 6B and 6C illustrate the method of visualization for the calculated path or trajectory 615 relative to different ultrasound orientations; namely, perpendicular to the calculated path or trajectory 615 or along the calculated path or trajectory 615. In FIG. 6B, the ultrasound device 619 is oriented perpendicular to the calculated path or trajectory 615, creating an ultrasound slice 620 at some depth of the calculated path or trajectory 615. The resulting ultrasound image 621 has the overlaid calculated path or trajectory 615 shown as the perpendicular cross-section of a customizable shape, object, or image 622.” [0049], as seen in Figs. 6B and 6C the calculated trajectory and entry point remain consistent no matter the probe positioning or movement).
[Greyscale image: media_image2.png, 378 × 361]
Regarding Claim 17, Cohn discloses further comprising: displaying the current intervention path and the target intervention path in real time during the movement of the probe (“The system itself is designed to allow for the real-time monitoring of the calculated path or trajectory 615 with an ultrasound 619 device illustrated in FIGS. 6B and 6C.” [0049], as quoted more fully above).
Regarding Claim 18, Cohn discloses the current intervention path is a line connecting a point in the target intervention region to the current intervention starting point (“In FIG. 6B, the ultrasound device 619 is oriented perpendicular to the calculated path or trajectory 615, creating an ultrasound slice 620 at some depth of the calculated path or trajectory 615.” [0049], as quoted more fully above; see Fig. 6B for the intervention path).
Regarding Claim 19, Cohn discloses the target intervention path passes through a point of the target intervention region and does not pass through the non-intervention region (“Additionally or alternatively, if the tumor(s) or any of the surrounding tissues, organs and/or blood vessels did move, the CT scan/ultrasound will be used with the algorithm and the software to measure the shift and the computational geometry algorithm will automatically calculate new ablation probe trajectories as well as any other relevant ablation probe information or said action may be achieved by the physician if desired.” [0038]).
Regarding Claim 20, Cohn discloses further comprising: simultaneously displaying a target intervention starting point and the current intervention starting point in real time during the movement of the probe; wherein the target intervention starting point is an intersection point of the target intervention path and the upper edge of the ultrasound image; and/or simultaneously displaying the current intervention path and the target intervention path in real time during the movement of the probe (“FIG. 7 illustrates point 717 as an example entry point for a probe 716. A calculated trajectory projection 718 may be determined based on the real-time location of the ablation probe 716 at the time of the procedure or during planning studies. The calculated trajectory projection 718 may be projected line calculated based on the angulation and location of the real-time ablation probe 716. As shown, the calculated trajectory projection 718 may be overlaid on an image such as an ultrasound image (e.g., image 621 in FIGS. 6B and 6C) to allow a user to visualize a projection of the calculated trajectory of the probe 716. In this example, the calculated trajectory projection 718 is shown missing the ablation site 714 which is the center of the tumor 304. Thus, a user may determine that based on the calculated trajectory projection 718, correction is necessary” [0050], see Fig. 7 for simultaneously displaying the current intervention path and the target intervention path).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 10-12 are rejected under 35 U.S.C. 103 as being unpatentable over Cohn in view of Bharat et al. (US20200390505A1; hereinafter referred to as Bharat).
Regarding Claim 10, Cohn discloses the information displayed in the intervention process guide comprises: an endpoint of the interventional object, and the intervention process guide is displayed independently of the ultrasound image (“FIG. 7 illustrates point 717 as an example entry point for a probe 716. A calculated trajectory projection 718 may be determined based on the real-time location of the ablation probe 716 at the time of the procedure or during planning studies.” [0050], as quoted more fully above).
Cohn does not specifically disclose displaying a minimum distance and a maximum distance from the target intervention region to the current intervention starting point.
However, in a similar field of endeavor, Bharat teaches an imaging probe being controlled to activate imaging elements to emit imaging signals to generate three or more imaging planes, to simultaneously capture an interventional device and anatomy targeted by the interventional device [Abstract].
Bharat also teaches displaying a minimum distance and a maximum distance from the target intervention region to the current intervention starting point (“At S360, a distance between the interventional device and anatomy targeted by the interventional device is determined and displayed. The distance may be determined in two dimensions, such as width (X)/height (Y), or may be determined in three dimensions such as width (X)/height (Y)/depth (Z).” [0046], “In FIG. 9, “Distance to target” embodiment: Display distance to anatomical target imaging plane on the interventional device X-plane.” [0073]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Cohn, as outlined above, to display a minimum distance and a maximum distance from the target intervention region to the current intervention starting point, as taught by Bharat, because doing so allows for visualization of tissue around a device and other quantitative navigation metrics without losing sight of targeted anatomy [0075].
Regarding Claim 11, Cohn discloses all limitations noted above, but does not specifically disclose that the interventional process guide does not provide information of an intervention angle of the interventional object.
However, in a similar field of endeavor, Bharat teaches that the interventional process guide does not provide information of an intervention angle of the interventional object (“At S360, a distance between the interventional device and anatomy targeted by the interventional device is determined and displayed.” [0046], “In FIG. 9, “Distance to target” embodiment: Display distance to anatomical target imaging plane on the interventional device X-plane.” [0073]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Cohn, as outlined above, such that the interventional process guide does not provide information of an intervention angle of the interventional object, as taught by Bharat, because doing so allows for visualization of tissue around a device and other quantitative navigation metrics without losing sight of targeted anatomy [0075].
Regarding Claim 12, Cohn does not specifically disclose that the endpoint of the interventional object, the minimum distance, and the maximum distance are displayed in a same linear direction.
However, in a similar field of endeavor, Bharat teaches that the endpoint of the interventional object, the minimum distance, and the maximum distance are displayed in a same linear direction (“At S360, a distance between the interventional device and anatomy targeted by the interventional device is determined and displayed.” [0046], “In FIG. 9, “Distance to target” embodiment: Display distance to anatomical target imaging plane on the interventional device X-plane.” [0073]; the same linear direction is being interpreted as the same imaging plane).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Cohn, as outlined above, such that the endpoint of the interventional object, the minimum distance, and the maximum distance are displayed in a same linear direction, as taught by Bharat, because doing so allows for visualization of tissue around a device and other quantitative navigation metrics without losing sight of targeted anatomy [0075].
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure (US 20240350208 A1; US 20240050061 A1; US 20210161612 A1; US 20220061803 A1).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to STEVEN MALDONADO, whose telephone number is 703-756-1421. The examiner can normally be reached 8:00 am-4:00 pm PST, M-Th. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at
http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christopher Koharski, can be reached at (571) 272-7230. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Steven Maldonado/
Patent Examiner, Art Unit 3797
/CHRISTOPHER KOHARSKI/Supervisory Patent Examiner, Art Unit 3797