DETAILED ACTION
This office action is in response to the communication received on 10/28/2025 concerning application no. 18/035,985 filed on 05/09/2023.
Claims 1, 4, 7, 9-10, and 13-16 are pending.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments filed 10/28/2025 have been fully considered but they are not persuasive.
Regarding the 103 rejection, Applicant argues that claim 1 enables a more accurate 3D Doppler map because it uses a pixel mask that enables the calculation of physically accurate vessel cross-sections. Applicant argues that Li does not teach the generation of a 3D Doppler vessel map and instead shows the characterization of a flow characteristic in an imaged field of view. Applicant argues that the Figures do not show a map. Furthermore, Applicant argues that the present claims operate in real time to generate the tracked imaging region.
Examiner respectfully disagrees. In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).
In response to applicant's argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., a more accurate 3D Doppler map because it uses a pixel mask that enables the calculation of physically accurate vessel cross-sections) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).
Regarding Li, Applicant’s remarks are contrary to Li’s teachings. Paragraph 0073 clearly teaches that the “velocity information can be visualized as maps of flow velocities in the entire field of view.” (emphasis added). These flow velocities are profiles of the multiple blood flow dimensions in a given spatial location, as noted in the Abstract of Li. Additionally, Applicant’s remarks about a map not being shown are unpersuasive. There is no requirement in a prior art rejection that a drawing must be shown. Assuming, arguendo, that a drawing were required, Applicant’s argument is still incorrect. Figs. 5-10 of Li clearly show a map of the anatomy with velocity vectors of the blood flow.
MPEP 2145 establishes: “If a prima facie case of obviousness is established, the burden shifts to the applicant to come forward with arguments and/or evidence to rebut the prima facie case. See, e.g., In re Dillon, 919 F.2d 688, 692, 16 USPQ2d 1897, 1901 (Fed. Cir. 1990) (en banc). Rebuttal evidence and arguments can be presented in the specification, In re Soni, 54 F.3d 746, 750, 34 USPQ2d 1684, 1687 (Fed. Cir. 1995), by counsel, In re Chu, 66 F.3d 292, 299, 36 USPQ2d 1089, 1094-95 (Fed. Cir. 1995), or by way of an affidavit or declaration under 37 CFR 1.132, e.g., Soni, 54 F.3d at 750, 34 USPQ2d at 1687; In re Piasecki, 745 F.2d 1468, 1474, 223 USPQ 785, 789-90 (Fed. Cir. 1984). However, arguments of counsel cannot take the place of factually supported objective evidence. See, e.g., In re Huang, 100 F.3d 135, 139-40, 40 USPQ2d 1685, 1689 (Fed. Cir. 1996); In re De Blauwe, 736 F.2d 699, 705, 222 USPQ 191, 196 (Fed. Cir. 1984).” Here, Li clearly shows the velocity map of the blood flow according to Doppler acquisition, which is counter to Applicant’s conclusory allegations.
Finally, regarding the real-time element in the claim, Stolka clearly teaches this in paragraph 0120, which teaches capture and digitization in real time. Furthermore, the images and the data are updated in real time and can be overlaid in real time. Again, Applicant is reminded that MPEP 2145 establishes that “arguments of counsel cannot take the place of factually supported objective evidence. See, e.g., In re Huang, 100 F.3d 135, 139-40, 40 USPQ2d 1685, 1689 (Fed. Cir. 1996); In re De Blauwe, 736 F.2d 699, 705, 222 USPQ 191, 196 (Fed. Cir. 1984).”
Examiner respectfully maintains the rejection.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:
“obtaining a first image of a surface acquired…by way of an image sensor” in claims 1, 7, and 9-10: Paragraphs 0084-87 teach “In an embodiment, the image sensor comprises one or more of: a camera; a 3D camera; and a LIDAR sensor.”
“obtaining a second motion component of the ultrasound probe acquired… by way of an inertial measurement unit” in claims 1, 7, and 9: The specification teaches, “In an embodiment, the inertial measurement unit comprises one or more of: an accelerometer; and a gyroscope.”
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 4, 7, 9-10, and 13-16 are rejected under 35 U.S.C. 103 as being unpatentable over Stolka et al. (PGPUB No. US 2013/0016185) in view of Li et al. (PGPUB No. US 2021/0100524) as supported by Roundhill (PGPUB No. US 2010/0262008).
Regarding claim 1, Stolka teaches a method for generating a tracked imaging region representing ultrasound data acquired from a subject, the method comprising:
obtaining ultrasound data acquired from an imaging region by way of an ultrasound probe (Paragraph 0057 teaches the use of an ultrasound probe for imaging. Paragraph 0054 teaches the imaging of an ROI);
obtaining a first image of a surface acquired during the acquisition of the ultrasound data by way of an image sensor coupled to the ultrasound probe (Paragraphs 0063-64 teach the use of cameras that are attached to the ultrasound probe. See Fig. 2. The optical system can have two cameras that operate via stereovision. Furthermore, the optical system detects the motion of the imaging component relative to the surface and can implement feature and device tracking algorithms to track the device, various surface features, or surface region patches over time, thereby performing trajectory and stereo surface reconstruction. Paragraph 0028 teaches that the operation can be in real time);
obtaining a second image of the surface acquired during the acquisition of the ultrasound data by way of the image sensor coupled to the ultrasound probe (Paragraphs 0063-64 teach the use of cameras that are attached to the ultrasound probe. See Fig. 2. The optical system can have two cameras that operate via stereovision. Furthermore, the optical system detects the motion of the imaging component relative to the surface and can implement feature and device tracking algorithms to track the device, various surface features, or surface region patches over time, thereby performing trajectory and stereo surface reconstruction. Paragraph 0028 teaches that the operation can be in real time);
comparing the first image and the second image (Paragraphs 0063-64 teach that the optical system detects the motion of the imaging component relative to the surface and can implement feature and device tracking algorithms to track the device, various surface features, or surface region patches over time, thereby performing trajectory and stereo surface reconstruction. Paragraph 0028 teaches that the operation can be in real time);
computing a first motion component of the ultrasound probe based on the comparison (Paragraphs 0063-64 teach that the optical system detects the motion of the imaging component relative to the surface and can implement feature and device tracking algorithms to track the device, various surface features, or surface region patches over time, thereby performing trajectory and stereo surface reconstruction. Paragraph 0028 teaches that the operation can be in real time);
obtaining a second motion component of the ultrasound probe acquired during the acquisition of the ultrasound data by way of an inertial measurement unit coupled to the image sensor (Paragraph 0063 teaches that an inertial sensor component can be used in the tracking of the position and orientation of the probe and can operate in conjunction with the optical sensor system);
combining the first motion component and the second motion component, thereby generating a calculated motion of the ultrasound probe across the surface (Paragraph 0064 teaches that the optical sensor system works with the inertial sensor component in the tracking of the probe. Paragraph 0088 teaches that local translation data across the scan surface, such as the skin, is tracked via internal probe tracking. This includes further consideration of absolute orientation and rotation motion data);
combining the ultrasound data from the imaging region based on the motion of the ultrasound probe, thereby generating the tracked imaging region (Paragraph 0052 teaches that the ultrasound probe’s current motion is tracked and the tracking information can be registered with the image data. Paragraph 0064 teaches the registration of the surface information with the ultrasound imaging); and
generating a 3D ultrasound volume comprising the tracked imaging region by combining the ultrasound data of the tracked imaging region based on the calculated motion of the ultrasound probe (Paragraphs 0037-43 teach that the disclosed system allows for generation of free-hand 3D ultrasound volumes without external tracking. Paragraphs 0063-64 teach the tracking of the ultrasound probe via the camera and the inertial sensor. Paragraph 0078 teaches that the camera units can reconstruct the 3D shape of the surface);
generating an ultrasound image based on the ultrasound data obtained from the tracked imaging region (Paragraphs 0037-43 teach that the disclosed system allows for generation of free-hand 3D ultrasound volumes without external tracking);
generating, in real-time, a representation of the tracked imaging region within the 3D ultrasound volume based on a combination of the ultrasound image, the 3D ultrasound volume and the motion of the ultrasound probe (Paragraphs 0037-43 teach that the disclosed system allows for generation of free-hand 3D ultrasound volumes without external tracking. Paragraphs 0063-64 teach the tracking of the ultrasound probe via the camera and the inertial sensor. Paragraph 0078 teaches that the camera units can reconstruct the 3D shape of the surface. Paragraph 0035 teaches the display of the imaging components being fused and the display of guidance overlays. Paragraph 0100 teaches the real-time display for the guidance of needle tracking and placement at an interventional site. Paragraph 0120 teaches capture and digitization in real time. Furthermore, the images and the data are updated in real time and can be overlaid in real time); and
displaying on a display device the representation of the tracked imaging region (Paragraph 0035 teaches the display of fused data and the display of navigation information via guidance overlays. See Fig. 12).
However, Stolka is silent regarding a method,
wherein the ultrasound data comprises Doppler ultrasound data;
generating a 3D Doppler vessel map based on the Doppler ultrasound data and the motion of the ultrasound probe,
deriving a blood flow measure from the 3D Doppler vessel map;
displaying on a display device an indication of the blood flow measure and the 3D Doppler vessel map.
In an analogous imaging field of endeavor, regarding ultrasound imaging of a patient, Li teaches a method,
wherein the ultrasound data comprises Doppler ultrasound data (Abstract teaches Doppler acquisition);
generating a 3D Doppler vessel map based on the Doppler ultrasound data and the motion of the ultrasound probe (Paragraphs 0031-38 teach Doppler acquisition and the generation of video and image data with temporal profiles of axial velocity, lateral velocity, magnitude of velocity and angle of velocity at any location in the field of view),
deriving a blood flow measure from the 3D Doppler vessel map (Paragraphs 0031-38 teach Doppler acquisition and the generation of video and image data with temporal profiles of axial velocity, lateral velocity, magnitude of velocity and angle of velocity at any location in the field of view);
displaying on a display device the tracked imaging region, an indication of the blood flow measure, and the 3D Doppler vessel map (Abstract teaches the display of the flow profiles and the dimensions. See Figs. 6 and 11).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Stolka with Li’s teaching of the use of a 3D Doppler map in the derivation of blood flow according to angular and cross-section information. It is well known in the art that “it is common practice to acquire Doppler data” in the context of ultrasound imaging of patient vasculature, and ultrasound transducers can be moved along the surface (Paragraph 0015 and Figs. 1-2 of Roundhill). Such a capability is further enabled because the transducer and ultrasound system are equipped to acquire three-dimensional (3D) image data (Paragraph 0015 of Roundhill). This modified method would allow the user to improve imaging with 3D characterization of blood flow dynamics in vivo (Paragraph 0012 of Li). Furthermore, the modification can be implemented and integrated into medical ultrasound scanners at low cost without major modification of hardware, and no customized hardware is needed at all (Paragraph 0013 of Li).
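For illustration only: the probe tracking mapped above computes a first motion component by comparing successive surface images and blends it with an inertial second motion component. A minimal sketch of that pattern follows, assuming FFT phase correlation for the image comparison and a fixed-weight blend for the fusion; it is not the implementation of Stolka, Li, or the claims, and all function names are illustrative.

import numpy as np

def optical_motion(img1, img2):
    # First motion component: signed (row, col) pixel shift of img2
    # relative to img1, estimated by FFT phase correlation.
    f1 = np.fft.fft2(img1.astype(float))
    f2 = np.fft.fft2(img2.astype(float))
    cross = f2 * np.conj(f1)
    cross /= np.abs(cross) + 1e-12          # normalized cross-power spectrum
    corr = np.abs(np.fft.ifft2(cross))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap the correlation peak into signed displacements.
    return np.array([p - s if p > s // 2 else p
                     for p, s in zip(peak, corr.shape)], dtype=float)

def fuse_motion(optical, inertial, w=0.7):
    # Calculated motion: fixed-weight blend of the optical (first) and
    # inertial (second) motion components; a production tracker would more
    # likely use a Kalman-style filter here.
    return w * np.asarray(optical, dtype=float) + (1.0 - w) * np.asarray(inertial, dtype=float)

Accumulating the fused per-frame displacements over a sweep yields the probe trajectory along which the acquired ultrasound frames can be placed.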
Regarding claim 4, modified Stolka teaches the method in claim 1, as discussed above.
Stolka further teaches a method, wherein the method further comprises:
generating guidance information for positioning an interventional device based on the 3D ultrasound volume (Paragraphs 0037-43 teach that the disclosed system allows for generation of free-hand 3D ultrasound volumes without external tracking. Paragraphs 0063-64 teach the tracking of the ultrasound probe via the camera and the inertial sensor. Paragraph 0078 teaches that the camera units can reconstruct the 3D shape of the surface. Paragraph 0035 teaches the display of the imaging components being fused and the display of guidance overlays. Paragraph 0100 teaches the real-time display for the guidance of needle tracking and placement at an interventional site); and
providing the guidance information to a user (Paragraph 0035 teaches the display of the imaging components being fused and the display of guidance overlays).
Regarding claim 7, Stolka teaches a non-transitory computer-readable storage medium comprising instructions which, when executed by a computer, cause the computer to carry out the steps of:
obtaining ultrasound data acquired from an imaging region by way of an ultrasound probe (Paragraph 0057 teaches the use of an ultrasound probe for imaging. Paragraph 0054 teaches the imaging of an ROI);
obtaining a first image of a surface acquired during the acquisition of the ultrasound data by way of an image sensor coupled to the ultrasound probe (Paragraphs 0063-64 teach the use of cameras that are attached to the ultrasound probe. See Fig. 2. The optical system can have two cameras that operate via stereovision. Furthermore, the optical system detects the motion of the imaging component relative to the surface and can implement feature and device tracking algorithms to track the device, various surface features, or surface region patches over time, thereby performing trajectory and stereo surface reconstruction. Paragraph 0028 teaches that the operation can be in real time);
obtaining a second image of the surface acquired during the acquisition of the ultrasound data by way of the image sensor coupled to the ultrasound probe (Paragraphs 0063-64 teach the use of cameras that are attached to the ultrasound probe. See Fig. 2. The optical system can have two cameras that operate via stereovision. Furthermore, the optical system detects the motion of the imaging component relative to the surface and can implement feature and device tracking algorithms to track the device, various surface features, or surface region patches over time, thereby performing trajectory and stereo surface reconstruction. Paragraph 0028 teaches that the operation can be in real time);
comparing the first image and the second image (Paragraphs 0063-64 teach that the optical system detects the motion of the imaging component relative to the surface and can implement feature and device tracking algorithms to track the device, various surface features, or surface region patches over time, thereby performing trajectory and stereo surface reconstruction. Paragraph 0028 teaches that the operation can be in real time);
computing a first motion component of the ultrasound probe based on the comparison (Paragraphs 0063-64 teach that the optical system detects the motion of the imaging component relative to the surface and can implement feature and device tracking algorithms to track the device, various surface features, or surface region patches over time, thereby performing trajectory and stereo surface reconstruction. Paragraph 0028 teaches that the operation can be in real time);
obtaining a second motion component of the ultrasound probe acquired during the acquisition of the ultrasound data by way of an inertial measurement unit coupled to the image sensor (Paragraph 0063 teaches that an inertial sensor component can be used in the tracking of the position and orientation of the probe and can operate in conjunction with the optical sensor system);
combining the first motion component and the second motion component, thereby generating a calculated motion of the ultrasound probe across the surface (Paragraph 0064 teaches that the optical sensor system works with the inertial sensor component in the tracking of the probe. Paragraph 0088 teaches that local translation data across the scan surface, such as the skin, is tracked via internal probe tracking. This includes further consideration of absolute orientation and rotation motion data);
combining the ultrasound data from the imaging region and the motion of the ultrasound probe, thereby generating a tracked imaging region (Paragraph 0052 teaches that the ultrasound probe’s current motion is tracked and the tracking information can be registered with the image data. Paragraph 0064 teaches the registration of the surface information with the ultrasound imaging); and
generating a 3D ultrasound volume comprising the tracked imaging region by combining the ultrasound data of the tracked imaging region based on the calculated motion of the ultrasound probe (Paragraphs 0037-43 teach that the disclosed system allows for generation of free-hand 3D ultrasound volumes without external tracking. Paragraphs 0063-64 teach the tracking of the ultrasound probe via the camera and the inertial sensor. Paragraph 0078 teaches that the camera units can reconstruct the 3D shape of the surface);
generating an ultrasound image based on the ultrasound data obtained from the tracked imaging region (Paragraphs 0037-43 teach that the disclosed system allows for generation of free-hand 3D ultrasound volumes without external tracking);
generating, in real-time, a representation of the tracked imaging region within the 3D ultrasound volume based on a combination of the ultrasound image, the 3D ultrasound volume and the motion of the ultrasound probe (Paragraphs 0037-43 teach that the disclosed system allows for generation of free-hand 3D ultrasound volumes without external tracking. Paragraphs 0063-64 teach the tracking of the ultrasound probe via the camera and the inertial sensor. Paragraph 0078 teaches that the camera units can reconstruct the 3D shape of the surface. Paragraph 0035 teaches the display of the imaging components being fused and the display of guidance overlays. Paragraph 0100 teaches the real-time display for the guidance of needle tracking and placement at an interventional site. Paragraph 0120 teaches capture and digitization in real time. Furthermore, the images and the data are updated in real time and can be overlaid in real time); and
displaying on a display device the representation of the tracked imaging region (Paragraph 0035 teaches the display of fused data and the display of navigation information via guidance overlays. See Fig. 12).
However, Stolka is silent regarding a non-transitory computer-readable storage medium,
wherein the ultrasound data comprises Doppler ultrasound data;
generating a 3D Doppler vessel map based on the Doppler ultrasound data and the motion of the ultrasound probe,
deriving a blood flow measure from the 3D Doppler vessel map,
displaying on a display device an indication of the blood flow measure and the 3D Doppler vessel map.
In an analogous imaging field of endeavor, regarding ultrasound imaging of a patient, Li teaches a non-transitory computer-readable storage medium,
wherein the ultrasound data comprises Doppler ultrasound data (Abstract teaches Doppler acquisition);
generating a 3D Doppler vessel map based on the Doppler ultrasound data and the motion of the ultrasound probe (Abstract teaches Doppler acquisition. Paragraphs 0031-38 teach Doppler acquisition and the generation of video and image data with temporal profiles of axial velocity, lateral velocity, magnitude of velocity and angle of velocity at any location in the field of view);
deriving a blood flow measure from the 3D Doppler vessel map (Paragraphs 0031-38 teach Doppler acquisition and the generation of video and image data with temporal profiles of axial velocity, lateral velocity, magnitude of velocity and angle of velocity at any location in the field of view),
displaying on a display device an indication of the blood flow measure and the 3D Doppler vessel map (Abstract teaches the display of the flow profiles and the dimensions. See Figs. 6 and 11).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Stolka with Li’s teaching of the use of a 3D Doppler map in the derivation of blood flow according to angular and cross-section information. It is well known in the art that “it is common practice to acquire Doppler data” in the context of ultrasound imaging of patient vasculature, and ultrasound transducers can be moved along the surface (Paragraph 0015 and Figs. 1-2 of Roundhill). Such a capability is further enabled because the transducer and ultrasound system are equipped to acquire three-dimensional (3D) image data (Paragraph 0015 of Roundhill). This modified apparatus would allow the user to improve imaging with 3D characterization of blood flow dynamics in vivo (Paragraph 0012 of Li). Furthermore, the modification can be implemented and integrated into medical ultrasound scanners at low cost without major modification of hardware, and no customized hardware is needed at all (Paragraph 0013 of Li).
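For illustration only: if a 3D Doppler vessel map is represented as a voxel velocity volume plus a binary vessel mask, a blood flow measure of the kind recited can be derived by integrating through-plane velocity over a vessel cross-section. This is a hedged sketch, not Li’s algorithm; the data layout, units, and names are assumptions.

import numpy as np

def flow_through_slice(velocity, vessel_mask, z, dy_mm, dx_mm):
    # Volumetric flow through axial slice z of a 3D Doppler vessel map.
    # velocity:    (Z, Y, X) through-plane velocities in mm/s
    # vessel_mask: (Z, Y, X) booleans marking voxels inside the vessel
    # dy_mm, dx_mm: in-plane voxel spacing in mm
    v = velocity[z][vessel_mask[z]]     # velocities inside the vessel lumen
    face_area_mm2 = dy_mm * dx_mm       # area of one voxel face
    flow_mm3_per_s = v.sum() * face_area_mm2
    return flow_mm3_per_s / 1000.0      # 1 mL = 1000 mm^3, so result is mL/s

The same representation yields simple geometric measures as well; for example, the vessel cross-sectional area at slice z is the masked voxel count times the voxel face area.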
Regarding claim 9, Stolka teaches a processing system for use in an ultrasound system and for generating a tracked imaging region representing ultrasound data acquired from an imaging region, the processing system comprising:
an input to receive (i) ultrasound data acquired from the imaging region by way of an ultrasound probe (Paragraph 0057 teaches the use of an ultrasound probe for imaging. Paragraph 0054 teaches the imaging of an ROI), (ii) a first image of a surface acquired during the acquisition of the ultrasound data by way of an image sensor coupled to the ultrasound probe (Paragraphs 0063-64 teach the use of cameras that are attached to the ultrasound probe. See Fig. 2. The optical system can have two cameras that operate via stereovision. Furthermore, the optical system detects the motion of the imaging component relative to the surface and can implement feature and device tracking algorithms to track the device, various surface features, or surface region patches over time, thereby performing trajectory and stereo surface reconstruction. Paragraph 0028 teaches that the operation can be in real time), and (iii) a second image of the surface acquired during the acquisition of the ultrasound data by way of the image sensor coupled to the ultrasound probe (Paragraphs 0063-64 teach the use of cameras that are attached to the ultrasound probe. See Fig. 2. The optical system can have two cameras that operate via stereovision. Furthermore, the optical system detects the motion of the imaging component relative to the surface and can implement feature and device tracking algorithms to track the device, various surface features, or surface region patches over time, thereby performing trajectory and stereo surface reconstruction. Paragraph 0028 teaches that the operation can be in real time); and
a processor coupled to the input to (It is inherent that a computational system will utilize a processor and memory for the performance of its computational functions):
compare the first image and the second image (Paragraphs 0063-64 teach that the optical system detects the motion of the imaging component relative to the surface and can implement feature and device tracking algorithms to track the device, various surface features, or surface region patches over time, thereby performing trajectory and stereo surface reconstruction. Paragraph 0028 teaches that the operation can be in real time);
compute a first motion component of the ultrasound probe based on the comparison (Paragraphs 0063-64 teach that the optical system detects the motion of the imaging component relative to the surface and can implement feature and device tracking algorithms to track the device, various surface features, or surface region patches over time, thereby performing trajectory and stereo surface reconstruction. Paragraph 0028 teaches that the operation can be in real time);
combine the first motion component and a second motion component, wherein the second motion component represents motion of the ultrasound probe acquired during the acquisition of the ultrasound data by way of an inertial measurement unit coupled to the image sensor, thereby generating a calculated motion of the ultrasound probe across the surface (Paragraph 0063 teaches that an inertial sensor component can be used in the tracking of the position and orientation of the probe and can operate in conjunction with the optical sensor system. Paragraph 0064 teaches that the optical sensor system works with the inertial sensor component in the tracking of the probe. Paragraph 0088 teaches that local translation data across the scan surface, such as the skin, is tracked via internal probe tracking. This includes further consideration of absolute orientation and rotation motion data);
combine the ultrasound data from the imaging region and the motion of the ultrasound probe, thereby generating the tracked imaging region (Paragraph 0052 teaches that the ultrasound probe’s current motion is tracked and the tracking information can be registered with the image data. Paragraph 0064 teaches the registration of the surface information with the ultrasound imaging);
generate a 3D ultrasound volume comprising the tracked imaging region by combining the ultrasound data of the tracked imaging region based on the calculated motion of the ultrasound probe (Paragraphs 0037-43 teach that the disclosed system allows for generation of free-hand 3D ultrasound volumes without external tracking. Paragraphs 0063-64 teach the tracking of the ultrasound probe via the camera and the inertial sensor. Paragraph 0078 teaches that the camera units can reconstruct the 3D shape of the surface);
generate an ultrasound image based on the ultrasound data obtained from the tracked imaging region (Paragraphs 0037-43 teach that the disclosed system allows for generation of free-hand 3D ultrasound volumes without external tracking);
generate, in real-time, a representation of the tracked imaging region within the 3D ultrasound volume based on a combination of the ultrasound image, the 3D ultrasound volume and the motion of the ultrasound probe (Paragraphs 0037-43 teach that the disclosed system allows for generation of free-hand 3D ultrasound volumes without external tracking. Paragraphs 0063-64 teach the tracking of the ultrasound probe via the camera and the inertial sensor. Paragraph 0078 teaches that the camera units can reconstruct the 3D shape of the surface. Paragraph 0035 teaches the display of the imaging components being fused and the display of guidance overlays. Paragraph 0100 teaches the real-time display for the guidance of needle tracking and placement at an interventional site. Paragraph 0120 teaches capture and digitization in real time. Furthermore, the images and the data are updated in real time and can be overlaid in real time); and
a display device configured to display the tracked imaging region (Paragraph 0035 teaches the display of fused data and the display of navigation information via guidance overlays. See Fig. 12).
However, Stolka is silent regarding a processing system,
wherein the ultrasound data comprises Doppler ultrasound data;
generate a 3D Doppler vessel map based on the Doppler ultrasound data and the calculated motion of the ultrasound probe across the surface;
derive a blood flow measure from the 3D Doppler vessel map; and
a display device configured to display the representation of an indication of the blood flow measure and the 3D Doppler vessel map.
In an analogous imaging field of endeavor, regarding ultrasound imaging of a patient, Li teaches a processing system,
wherein the ultrasound data comprises Doppler ultrasound data (Abstract teaches Doppler acquisition);
generate a 3D Doppler vessel map based on the Doppler ultrasound data and the calculated motion of the ultrasound probe across the surface (Abstract teaches Doppler acquisition. Paragraphs 0031-38 teach Doppler acquisition and the generation of video and image data with temporal profiles of axial velocity, lateral velocity, magnitude of velocity and angle of velocity at any location in the field of view);
derive a blood flow measure from the 3D Doppler vessel map (Paragraphs 0031-38 teach Doppler acquisition and the generation of video and image data with temporal profiles of axial velocity, lateral velocity, magnitude of velocity and angle of velocity at any location in the field of view),
a display device configured to display the representation of an indication of the blood flow measure and the 3D Doppler vessel map (Abstract teaches the display of the flow profiles and the dimensions. See Figs. 6 and 11).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Stolka with Li’s teaching of the use of a 3D Doppler map in the derivation of blood flow according to angular and cross-section information. It is well known in the art that “it is common practice to acquire Doppler data” in the context of ultrasound imaging of patient vasculature, and ultrasound transducers can be moved along the surface (Paragraph 0015 and Figs. 1-2 of Roundhill). Such a capability is further enabled because the transducer and ultrasound system are equipped to acquire three-dimensional (3D) image data (Paragraph 0015 of Roundhill). This modified apparatus would allow the user to improve imaging with 3D characterization of blood flow dynamics in vivo (Paragraph 0012 of Li). Furthermore, the modification can be implemented and integrated into medical ultrasound scanners at low cost without major modification of hardware, and no customized hardware is needed at all (Paragraph 0013 of Li).
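For illustration only: the volume-generation step recited across claims 1, 7, and 9 (combining the tracked ultrasound data into a 3D volume based on the calculated probe motion) admits a simple compounding sketch: scatter each 2D frame’s pixels into a voxel grid using the per-frame probe pose, then average where frames overlap. This assumes 4x4 homogeneous poses and isotropic voxels; it is not Stolka’s reconstruction, and all names are illustrative.

import numpy as np

def compound_volume(frames, poses, vol_shape, voxel_mm, pix_mm):
    # frames: list of 2D ultrasound frames indexed (row, col)
    # poses:  list of 4x4 transforms mapping frame-plane coordinates in mm,
    #         (row * pix_mm, col * pix_mm, 0, 1), into volume coordinates in mm
    # vol_shape: (Z, Y, X) voxel grid size; voxel_mm: isotropic voxel size
    acc = np.zeros(vol_shape)
    cnt = np.zeros(vol_shape, dtype=np.int64)
    for frame, pose in zip(frames, poses):
        rows, cols = np.indices(frame.shape)
        pts = np.stack([rows.ravel() * pix_mm, cols.ravel() * pix_mm,
                        np.zeros(frame.size), np.ones(frame.size)])
        vox = (pose @ pts)[:3] / voxel_mm          # mm -> voxel indices
        idx = np.round(vox).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(vol_shape)[:, None]), axis=0)
        np.add.at(acc, tuple(idx[:, ok]), frame.ravel()[ok])   # scatter-add
        np.add.at(cnt, tuple(idx[:, ok]), 1)
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)

Scattering Doppler velocity estimates instead of B-mode pixel values through the same poses would produce a motion-referenced volume of the kind the rejection refers to as a 3D Doppler vessel map.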
Regarding claim 10, modified Stolka teaches the processing system in claim 9, as discussed above.
Stolka further teaches an ultrasound imaging system, comprising:
the ultrasound probe adapted to acquire the ultrasound data (Paragraph 0057 teaches the use of an ultrasound probe for imaging. Paragraph 0054 teaches the imaging of an ROI);
the image sensor coupled to the ultrasound probe and adapted to acquire the images of the surface (Paragraphs 0063-64 teach the use of cameras that are attached to the ultrasound probe); and
the inertial measurement unit coupled to the image sensor and adapted to acquire the second motion component (Paragraph 0063 teaches that an inertial sensor component can be used in the tracking of the position and orientation of the probe and operate in conjunction with the optical sensor system).
Regarding claim 13, modified Stolka teaches the ultrasound imaging system in claim 10, as discussed above.
Stolka further teaches an ultrasound imaging system, wherein the processor is further adapted to generate guidance information for positioning an interventional device based on the 3D ultrasound volume (Paragraphs 0037-43 teach that the disclosed system allows for generation of free-hand 3D ultrasound volumes without external tracking. Paragraphs 0063-64 teach the tracking of the ultrasound probe via the camera and the inertial sensor. Paragraph 0078 teaches that the camera units can reconstruct the 3D shape of the surface. Paragraph 0035 teaches the display of the imaging components being fused and the display of guidance overlays. Paragraph 0100 teaches the real-time display for the guidance of needle tracking and placement at an interventional site).
Regarding claim 14, modified Stolka teaches the ultrasound imaging system in claim 10, as discussed above.
Stolka further teaches an ultrasound imaging system, wherein the system further comprises a display unit, and wherein the processor is further adapted to instruct the display unit to display one or more of:
the 3D ultrasound volume (Paragraphs 0037-43 teach that the disclosed system allows for generation of free-hand 3D ultrasound volumes without external tracking. Paragraph 0035 teaches the display of the imaging components being fused and the display of guidance overlays);
the 3D Doppler vessel map; and
the guidance information (Paragraph 0035 teaches the display of the imaging components being fused and the display of guidance overlays).
Regarding claim 15, modified Stolka teaches the ultrasound imaging system in claim 10, as discussed above.
Stolka further teaches an ultrasound imaging system, wherein the image sensor comprises one or more of:
a camera (Paragraphs 0063-64 teach the use of cameras that are attached to the ultrasound probe. See Fig. 2. The optical system can have two cameras that operate via stereovision);
a 3D camera (Paragraphs 0063-64 teach the use of cameras that are attached to the ultrasound probe. See Fig. 2. The optical system can have two cameras that operate via stereovision); and
a LIDAR sensor.
Regarding claim 16, modified Stolka teaches the ultrasound imaging system in claim 10, as discussed above.
Stolka further teaches an ultrasound imaging system, wherein the inertial measurement unit comprises one or more of:
an accelerometer (Paragraph 0063 teaches that the inertial sensor can be an accelerometer or gyroscope); and
a gyroscope (Paragraph 0063 teaches that the inertial sensor can be an accelerometer or gyroscope).
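For illustration only: under strong simplifying assumptions (gravity-compensated accelerometer samples and gyroscope-stabilized orientation, with no drift correction), the second motion component supplied by an inertial measurement unit of the kind recited in claim 16 reduces to a double integration of acceleration between frames. This sketch is not Stolka’s inertial processing; names and units are assumptions.

import numpy as np

def imu_displacement(accel_mm_s2, dt_s):
    # accel_mm_s2: (N, 3) gravity-compensated accelerations sampled between
    #              two camera frames, in mm/s^2; dt_s: sample period in s
    v = np.cumsum(accel_mm_s2 * dt_s, axis=0)   # velocity samples, mm/s
    return v.sum(axis=0) * dt_s                 # net displacement, mm

Error in such dead reckoning grows rapidly over time, which is consistent with the mapped combination pairing the inertial estimate with the optical one.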
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Tanaka et al. (PGPUB No. US 2013/0102903): Teaches tracking of the blood flow and the motion of the probe and generating three-dimensional volumes.
Lamata de la Orden et al. (PGPUB No. US 2019/0082970): Teaches tracking of the blood flow and generating three-dimensional volumes.
Chen et al. (PGPUB No. US 2017/0184714): Teaches a more accurate 3D Doppler map, as it uses a pixel mask that enables the calculation of physically accurate vessel cross-sections.
Nachtomy et al. (US Patent No. 6,095,976): Teaches a more accurate 3D Doppler map, as it uses a pixel mask that enables the calculation of physically accurate vessel cross-sections.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ADIL PARTAP S VIRK whose telephone number is (571)272-8569. The examiner can normally be reached Mon-Fri 8-5.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Pascal Bui-Pho can be reached on 571-272-2714. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ADIL PARTAP S VIRK/Primary Examiner, Art Unit 3798