DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/22/2025 has been entered.
Response to Amendment
The proposed reply filed on 10/21/2025 has been entered. Claims 14, 17, 19, 22-25, and 27-28 remain pending in the current application. The amendments to the claims have overcome the 35 USC 112 rejections.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 14, 17, 19, 23-25, and 28 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 16, 18-20, and 24-26 of copending Application No. 18/255,885 (reference application). Although the claims at issue are not identical, they are not patentably distinct from each other because they are obvious variations of each other. Both applications are directed to a tracking and visualization system to monitor a device inside a target.
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Current application
copending Application No. 18/255,885
Claim 14: A device tracking system configured to generate a 3D anatomical mapping, monitor a target body structure of a patient, and localize a device inside said target body structure by means of ultrasounds, the device tracking system comprising: a control unit with a memory configured to store at least one Ultrasound Localization Microscopy (ULM) ultrasound image of the target body structure, the control unit being configured to launch an acquisition of the at least one ULM ultrasound image, the control unit being an information emission/reception system configured to transform electric impulses into acoustic impulses (and vice versa) in order to enable an acoustic characterization of the target body structure; a device comprising: at least one steering element configured to be handled outside the target body structure, and at least one steerable element measuring between 0.5mm and 3mm in diameter, configured to be introduced inside the target body structure and to be handled manually by means of the steering element, the steerable element and the steering element being physically connected by at least one connection element, at least one probe configured to be brought in contact with a securing body part of the patient, the securing body part surrounding at least partially the target body structure, the at least one probe being configured to be in real time communication with the control unit, and the at least one probe being configured to be removably secured to the securing body part of the patient, and at least one tracker configured to be secured to the steerable element of the device, the at least one tracker comprising an object strongly reflecting ultrasounds waves, wherein the at least one probe and the at least one tracker are configured to communicate by means of ultrasounds, the control unit being thus configured to localize, in real time, the steerable element inside the target body structure, and wherein the control unit is configured to localize the 
steerable element by means of ULM localization, wherein the control unit is further configured to display, on a screen, the at least one stored ULM ultrasound image and display, in real time, the localization of the steerable element on said at least one stored ULM ultrasound image, and wherein the control unit is configured to combine the real time ultrasound information obtained from each probe regarding the at least one tracker and information of the stored ULM ultrasound image.
Claims 16+18: A micro-device tracking and visualization system configured to monitor a target body part of a patient and localizing a micro-device inside said target body part, the tracking system comprising: a micro-device designed to be remotely steered and controlled in a contactless manner from outside the target body part, at least one probe configured to be brought in contact with a securing body part of the patient, the securing body part surrounding at least partially the target body part, a control unit comprising a memory, the memory being configured to store at least one ultrasound image of the target body part, at least one tracker configured to be connected to the micro-device, at least a screen, wherein the at least one probe and the at least one tracker communicate by means of ultrasound technology, the control unit being thus able to localize, in real time, the at least one tracker inside the target body part within an internal referential defined with regards to the at least one probe, wherein the control unit is further designed to display, on the screen, the at least one ultrasound image stored inside the memory of the control unit and to display, in real time, the localization of the micro-device on said at least one ultrasound image, wherein the ultrasound tracking of the tracker is co-registered with an acquisition of the at least one ultrasound image acquisition within the internal referential. The system according to claim 17, wherein the at least one ultrasound image is an ULM image.
Claim 17: The system according to claim 14, wherein the at least one probe comprises at least one ultrasound transducer and the at least one tracker comprises at least one ultrasound sensor.
Claim 19: The system according to claim 16, wherein the at least one probe comprises at least one ultrasound transducer and the at least one tracker comprises at least one ultrasound sensor.
Claim 19: The system according to claim 14, wherein the at least one tracker comprises at least one ultrasound transducer and the at least one probe comprises at least one ultrasound sensor.
Claim 20: The system according to claim 16, wherein the at least one probe comprises at least one ultrasound sensor and the at least one tracker comprises at least one ultrasound transducer.
Claim 23: The system according to claim 14, wherein the memory of the control unit is configured to store a succession of ULM ultrasound images of the target body structure, each new ULM ultrasound image replacing the prior ULM ultrasound image.
Claim 24: The system according to claim 16, wherein the memory of the control unit is configured to store a succession of ultrasound images of the target body structure, each new ultrasound image replacing the prior one.
Claim 24: The system according to claim 23, wherein the ULM ultrasound image acquisition is done in real time, a new ULM ultrasound image acquisition being launched as soon as a prior ULM ultrasound image acquisition is terminated, each new ULM ultrasound image replacing the prior ULM ultrasound image as soon as its acquisition is terminated.
Claim 25: The system according to claim 24, wherein the ultrasound image acquisition is done in real time, a new ultrasound image acquisition being launched as soon as the prior ultrasound image acquisition is terminated, each new ultrasound image replacing the prior one as soon as its acquisition is terminated.
Claim 25: The system according to claim 14, wherein the target body structure is a vascular system of the brain.
Claim 26: The system according to claim 16, wherein the target body part is the patient's brain.
Claim 28: The system according to claim 14, wherein a new ULM ultrasound image acquisition is launched, by the control unit, as soon as the prior ULM ultrasound image acquisition is terminated.
Claim 25: The system according to claim 24, wherein the ultrasound image acquisition is done in real time, a new ultrasound image acquisition being launched as soon as the prior ultrasound image acquisition is terminated, each new ultrasound image replacing the prior one as soon as its acquisition is terminated.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 14, 17, 19, 22-25, and 27-28 are rejected under 35 U.S.C. 103 as being unpatentable over Vaidya et al. (JP2020506749; US 2021/0353362 is used for examination purposes) in view of Errico et al. (NPL: “Ultrafast ultrasound localization microscopy for deep super-resolution vascular imaging”).
Regarding claim 14, Vaidya teaches a device tracking system configured to generate a 3D anatomical image, monitor a target body structure of a patient, and localize a device inside said target body structure, the tracking system comprising (figures 1 and 3, paras. 0023 and 0030; an exemplary system 100 of the invention for imaging and tracking an interventional tool 112 as it is directed to a region of interest 114 within a subject or patient 108. A volume renderer 834 may generate an image of the 3D dataset as viewed from a given reference point):
a control unit with a memory configured to store at least one ultrasound image of the target body structure, the control unit being configured to launch an acquisition of the at least one ultrasound image, the control unit being an information emission/reception system configured to transform electric impulses into acoustic impulses (and vice versa) in order to enable an acoustic characterization of the target body structure (paras. 0029 and 0032; The transmission of ultrasonic pulses from transducers of the imaging elements 120, 122 may be directed by the transmit controller 820 coupled to the T/R switch 818 and the beamformer 822, which may receive input from the user's operation of a user interface 824. Output (e.g., images) from the scan converter 830, the multiplanar reformatter 832, and/or the volume renderer 834 may be coupled to an image processor 836 for further enhancement, buffering and temporary storage before being displayed on an image display 838. In certain embodiments, the image processor 836 is configured to register images from the processed signal data. The examiner notes that the system comprises a controller unit that comprises multiple elements responsible for controlling the ultrasound image acquisition by a transmit controller and stores acquired images in a memory before displaying them),
a device comprising (figures 1 and 3, element 112, paras. 0021 and 0038-0040):
at least one steering element configured to be handled outside the target body structure (figures 1 and 3, element 112, paras. 0021 and 0038-0040; a physician inserts a device or tool into a patient's body to, e.g., biopsy, monitor, diagnose or treat. Typical interventional tools include, for example, guidewires, guide catheters or sheaths, delivery catheters, ablation catheters, imaging catheters, catheter sheaths, needles, and implantable devices (sensors, stents, filters, etc.). The examiner notes that the device is a catheter that is known to comprise a handle for the user to hold for insertion and manipulation of the catheter.), and
at least one steerable element measuring between 0.5mm and 3mm in diameter, configured to be introduced inside the target body structure and to be handled manually by means of the steering element, the steerable element and the steering element being physically connected by at least one connection element (paras. 0021-0022 and 0038-0040; the interventional device is configured for entry into one or more body lumens, and are imaged by the imaging elements. Various biological lumens include blood vessels, vasculature of the lymphatic and nervous systems, various structures of the gastrointestinal tract including lumen of the small intestine, large intestine, stomach, esophagus, colon, pancreatic duct, bile duct, hepatic duct, lumen of the reproductive tract including the vas deferens, vagina, uterus and fallopian tubes, structures of the urinary tract including urinary collecting ducts, renal tubules, ureter, and bladder, and structures of the head, neck and pulmonary system including sinuses, parotid, trachea, bronchi, and lungs. a preferred location of the sensor is at or proximate to a distal end of the interventional device. The examiner notes that the catheter is known to have a handle and a tip, where the tip diameter is selected based on the interventional procedure. Vascular catheters are known to have a small diameter to enter the vessel and are conventionally sized within 0.5-3 mm outer diameter. Therefore, the claimed diameter is inherent.),
at least one probe configured to be brought in contact with a securing body part of the patient, the securing body part surrounding at least partially the target body structure, the at least one probe being configured to be in real time communication with the control unit, and the at least one probe being configured to be removably secured to the securing body part of the patient (figure 1, elements 120 and 122, para. 0023; The system 100 includes at least two imaging elements 122, 120. In some instances, systems of the invention include 2, 3, 4, 5, or more imaging elements, the imaging elements may be fixed or immobile with respect to each other or a reference point. In other embodiments, at least one of the probes may be held immobile with respect to the region of interest. FIG. 2 shows imaging probe 106 in a fixed-stand 130. As an alternative to the fixed-stand 130, at least one of the imaging elements may be held against the subject using an adhesive patch or a band. The imaging elements 122, 120 are coupled to wires 124, which connect the imaging elements 122, 120 to one or more processors of an imaging system (described hereinafter). The imaging elements 122, 120 are positioned on a patient 108 in order to image a region of interest 114 and such that the field of view of each imaging element differs from each other. The examiner notes that the ultrasound probes are secured to the target structure using a band that is at least partially surrounding the target body structure.), and
at least one tracker configured to be secured to the steerable element of the device, the at least one tracker comprising an object strongly reflecting ultrasounds waves (paras. 0040 and 0042; the positional sensor 110 of the interventional tool 112 is configured to receive signals transmitted from the imaging elements 120, 122. For example, the imaging elements 120, 122 transmit imaging signals as described above, and the positional sensor 110 passively listens to the signals transmitted from the imaging elements 120, 122. The received signals from the imaging elements 122, 120 by the positional sensor 110 can be used to determine the position of the positional sensor 110, and thus the position of the interventional tool 112 within the generated image. In addition to technique described above and shown in FIG. 5, it is also contemplated that the location of the positional sensor is tracked based on signals emitted from the positional sensor 110 and received by the two or more imaging elements 120, 122. The examiner notes that the position sensors fixed to the distal end of the catheter are ultrasound sensors/transducers),
wherein the at least one probe and the at least one tracker are configured to communicate by means of ultrasounds, the control unit being thus configured to localize, in real time, the steerable element inside the target body structure (para. 0041; The positional sensor 110 is configured to receive signals from imaging element 122. The positional sensor 110 and the imaging element 122 are connected to and in communication with an imaging system (e.g. the same processing system or separate processing systems in communication with each other). In order to determine the position of the positional sensor 110 (and thus the interventional tool 112), the imaging element 122 sends and receives signals to generate an image of a region of interest. When the signals are transmitted for imaging, a trigger is sent to the imaging system and/or the positional sensor 110 that indicates when the signal was sent (e.g. starts the clock for a particular signal at zero). The transmitted signal is then received by the positional sensor 110, and time delay between when the signal was transmitted and when the signal was received (e.g. the time of flight of the signal) is used to determine the position of the positional sensor 110 within the imaging beam. That is, the time from beam emission to reception by the positional sensor indicates the depth of the positional sensor 110 within the imaging beam. This can be repeated for a plurality of imaging signals for real-time tracking of the positional sensor and thus the interventional tool in images.), and
wherein the control unit is configured to localize the steerable element by means of ultrasound localization (para. 0041; This can be repeated for a plurality of imaging signals for real-time tracking of the positional sensor and thus the interventional tool in images.),
wherein the control unit is further configured to display, on a screen, the at least one stored ultrasound image and display, in real time, the localization of the steerable element on said at least one stored ultrasound image (paras. 0032 and 0044; Ideally, the techniques are continuously repeated by one or more imaging elements 120, 122 to provide real-time tracking of the positional sensor 110 within the registered images generated by the imaging elements 120, 122. With the location of the positional sensor 110 determined by at least one of the imaging elements 120, 122, its location can be registered to the fused image generated from the imaging elements 120, 120. For example, the location of the positional sensor 110 can be overlaid in the registered image from the imaging elements for enhanced visualization of the interventional tool 112. A graphical element is used to show the positional sensor 110 in the resulting image on the monitor. Output (e.g., images) from the scan converter 830, the multiplanar reformatter 832, and/or the volume renderer 834 may be coupled to an image processor 836 for further enhancement, buffering and temporary storage before being displayed on an image display 838. In certain embodiments, the image processor 836 is configured to register images from the processed signal data. The examiner notes that the images received from the probes are stored and registered before overlaying real time position information of the catheter), and
wherein the control unit is configured to combine the real time ultrasound information obtained from each probe regarding the at least one tracker and information of the stored ultrasound image (paras. 0032 and 0044; Ideally, the techniques are continuously repeated by one or more imaging elements 120, 122 to provide real-time tracking of the positional sensor 110 within the registered images generated by the imaging elements 120, 122. With the location of the positional sensor 110 determined by at least one of the imaging elements 120, 122, its location can be registered to the fused image generated from the imaging elements 120, 120. For example, the location of the positional sensor 110 can be overlaid in the registered image from the imaging elements for enhanced visualization of the interventional tool 112. A graphical element is used to show the positional sensor 110 in the resulting image on the monitor. Output (e.g., images) from the scan converter 830, the multiplanar reformatter 832, and/or the volume renderer 834 may be coupled to an image processor 836 for further enhancement, buffering and temporary storage before being displayed on an image display 838. In certain embodiments, the image processor 836 is configured to register images from the processed signal data. The examiner notes that the images received from the probes are stored and registered before overlaying real time position information of the catheter).
However, Vaidya fails to disclose that the ultrasound images are ULM images for generating 3D anatomical mapping.
Errico, in the same field of endeavor, teaches a ULM image (figure 2, pages 499-500; We demonstrate ultrafast ultrasound localization microscopy (uULM), which combines deep penetration and super-resolution imaging at unprecedented spatiotemporal resolution, by using clinically approved contrast agents: inert gas microbubbles. We were able to track each moving bubble according to its instantaneous position and in-plane velocity vector, leading to quantitative and localized maps of cerebral blood flow velocity. Hence, ultrafast imaging allows the reconstruction of entire organs within tens of seconds, a prerequisite for a preclinical and clinical modality.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the at least one ultrasound image of Vaidya to incorporate the ULM ultrasound image of Errico. This modification would allow detailed reconstruction of entire organs within tens of seconds and would allow small objects to be detected with high accuracy, as disclosed in Errico at page 500.
Regarding claim 17, Vaidya teaches the system according to claim 14, wherein the at least one probe comprises at least one ultrasound transducer and the at least one tracker comprises at least one ultrasound sensor (paras. 0026 and 0040; The imaging elements 120, 120 may include one or more ultrasound transducers. The ultrasound transducer may include piezoelectric transducer elements, capacitive micro-machined transducer elements, or any other suitable ultrasound transducer element. According to certain aspects, the positional sensor 110 of the interventional tool 112 is configured to receive signals transmitted from the imaging elements 120, 122. For example, the imaging elements 120, 122 transmit imaging signals as described above, and the positional sensor 110 passively listens to the signals transmitted from the imaging elements 120, 122. The received signals from the imaging elements 122, 120 by the positional sensor 110 can be used to determine the position of the positional sensor 110, and thus the position of the interventional tool 112 within the generated image.).
Regarding claim 19, Vaidya teaches the system according to claim 14, wherein the at least one tracker comprises at least one ultrasound transducer and the at least one probe comprises at least one ultrasound sensor (paras. 0026 and 0042; The imaging elements 120, 120 may include one or more ultrasound transducers. The ultrasound transducer may include piezoelectric transducer elements, capacitive micro-machined transducer elements, or any other suitable ultrasound transducer element. In addition to technique described above and shown in FIG. 5, it is also contemplated that the location of the positional sensor is tracked based on signals emitted from the positional sensor 110 and received by the two or more imaging elements 120, 122.).
Regarding claim 22, Vaidya teaches the system according to claim 14, wherein the device is a catheter, the at least one steerable element is a catheter tip, and the at least one steering element is a catheter handle (figure 1, paras. 0021 and 0038-0040; a physician inserts a device or tool into a patient's body to, e.g., biopsy, monitor, diagnose or treat. Typical interventional tools include, for example, guidewires, guide catheters or sheaths, delivery catheters, ablation catheters, imaging catheters, catheter sheaths, needles, and implantable devices (sensors, stents, filters, etc.). The examiner notes that the device is a catheter that is known to comprise a handle for the user to hold for insertion and manipulation of the catheter distal tip.).
Regarding claim 23, Vaidya teaches the system according to claim 14, wherein the memory of the control unit is configured to store a succession of ultrasound image of the target body structure, each new ultrasound image replacing a prior ultrasound image (paras. 0024, 0029, and 0032; The at least two imaging elements 120, 122 are configured to send imaging signals to and receive imaging signals from the region of interest within their respective field of views or a portion thereof. In certain embodiments, the imaging signals includes acoustic signals. In other embodiments, the imaging signals may be or also include photoacoustic signals. The received imaging signals of the region of interest can be used to generate one or more images. In certain embodiments, the received imaging signals generate a continuous imaging stream of the region of interest in real-time. Output (e.g., images) from the scan converter 830, the multiplanar reformatter 832, and/or the volume renderer 834 may be coupled to an image processor 836 for further enhancement, buffering and temporary storage before being displayed on an image display 838. The examiner notes that the imaging is a real time continuous imaging and the processor receives images and store them in real time before displaying them and continue to acquire new images.).
However, Vaidya fails to disclose that the ultrasound images are ULM images.
Errico, in the same field of endeavor, teaches a ULM image (figure 2, pages 499-500; We demonstrate ultrafast ultrasound localization microscopy (uULM), which combines deep penetration and super-resolution imaging at unprecedented spatiotemporal resolution, by using clinically approved contrast agents: inert gas microbubbles. We were able to track each moving bubble according to its instantaneous position and in-plane velocity vector, leading to quantitative and localized maps of cerebral blood flow velocity. Hence, ultrafast imaging allows the reconstruction of entire organs within tens of seconds, a prerequisite for a preclinical and clinical modality.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the at least one ultrasound image of Vaidya to incorporate the ULM ultrasound image of Errico. This modification would allow detailed reconstruction of entire organs within tens of seconds and would allow small objects to be detected with high accuracy, as disclosed in Errico at page 500.
Regarding claim 24, Vaidya teaches the system according to claim 23, wherein the ultrasound image acquisition is done in real time, a new ultrasound image acquisition being launched as soon as a prior ultrasound image acquisition is terminated, each new ultrasound image replacing the prior ultrasound image as soon as its acquisition is terminated (paras. 0024 and 0032; The at least two imaging elements 120, 122 are configured to send imaging signals to and receive imaging signals from the region of interest within their respective field of views or a portion thereof. In certain embodiments, the imaging signals includes acoustic signals. In other embodiments, the imaging signals may be or also include photoacoustic signals. The received imaging signals of the region of interest can be used to generate one or more images. In certain embodiments, the received imaging signals generate a continuous imaging stream of the region of interest in real-time. Output (e.g., images) from the scan converter 830, the multiplanar reformatter 832, and/or the volume renderer 834 may be coupled to an image processor 836 for further enhancement, buffering and temporary storage before being displayed on an image display 838. The examiner notes that the imaging is a real time continuous imaging and the processor receives images and store them in real time before displaying them. Therefore, each new image replaces the prior one).
However, Vaidya fails to disclose that the ultrasound images are ULM images.
Errico, in the same field of endeavor, teaches a ULM image (figure 2, pages 499-500; We demonstrate ultrafast ultrasound localization microscopy (uULM), which combines deep penetration and super-resolution imaging at unprecedented spatiotemporal resolution, by using clinically approved contrast agents: inert gas microbubbles. We were able to track each moving bubble according to its instantaneous position and in-plane velocity vector, leading to quantitative and localized maps of cerebral blood flow velocity. Hence, ultrafast imaging allows the reconstruction of entire organs within tens of seconds, a prerequisite for a preclinical and clinical modality.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the at least one ultrasound image of Vaidya to incorporate the ULM ultrasound image of Errico. This modification would allow detailed reconstruction of entire organs within tens of seconds and would allow small objects to be detected with high accuracy, as disclosed by Errico on page 500.
Regarding claim 25, Vaidya teaches the system according to claim 14, wherein the target body structure is a vascular system of the brain (paras. 0021-0022; systems and methods of the invention are applicable for imaging and tracking devices in a variety of interventional procedures. Interventional procedures may include any procedure in which a physician inserts a device or tool into a patient's body to, e.g., biopsy, monitor, diagnose or treat. Exemplary interventional procedures may include, but are not limited to: arteriovenous malformations, angioplasty, biliary drainage and stenting, catheter embolization, central venous access, chemoembolization, gastrostomy tube insertion, hemodialysis access maintenance, balloon catheterization, needle biopsy, ablation, grafting, thrombolysis, shunting (e.g. transjugular intrahepatic portosystemic shunt), urinary catheterization, uterine catheterization, filter or stent implantation (e.g. vena cava filter). For example, systems and methods of the invention are well-suited for monitoring treatment of cardiovascular disease. In certain embodiments, the interventional device is configured for entry into one or more body lumens, and are imaged by the imaging elements. Various biological lumens include blood vessels, vasculature of the lymphatic and nervous systems, various structures of the gastrointestinal tract including lumen of the small intestine, large intestine, stomach, esophagus, colon, pancreatic duct, bile duct, hepatic duct, lumen of the reproductive tract including the vas deferens, vagina, uterus and fallopian tubes, structures of the urinary tract including urinary collecting ducts, renal tubules, ureter, and bladder, and structures of the head, neck and pulmonary system including sinuses, parotid, trachea, bronchi, and lungs. The examiner notes that the system is used to guide interventional device entry to one or more body lumens such as vasculature of the nervous system and structures in the head.).
Regarding claim 27, Vaidya teaches the system according to claim 14, wherein the system is for use in thrombectomy (paras. 0021-0022; systems and methods of the invention are applicable for imaging and tracking devices in a variety of interventional procedures. Interventional procedures may include any procedure in which a physician inserts a device or tool into a patient's body to, e.g., biopsy, monitor, diagnose or treat. Exemplary interventional procedures may include, but are not limited to: arteriovenous malformations, angioplasty, biliary drainage and stenting, catheter embolization, central venous access, chemoembolization, gastrostomy tube insertion, hemodialysis access maintenance, balloon catheterization, needle biopsy, ablation, grafting, thrombolysis, shunting (e.g. transjugular intrahepatic portosystemic shunt), urinary catheterization, uterine catheterization, filter or stent implantation (e.g. vena cava filter). For example, systems and methods of the invention are well-suited for monitoring treatment of cardiovascular disease. In certain embodiments, the interventional device is configured for entry into one or more body lumens, and are imaged by the imaging elements. Various biological lumens include blood vessels, vasculature of the lymphatic and nervous systems, various structures of the gastrointestinal tract including lumen of the small intestine, large intestine, stomach, esophagus, colon, pancreatic duct, bile duct, hepatic duct, lumen of the reproductive tract including the vas deferens, vagina, uterus and fallopian tubes, structures of the urinary tract including urinary collecting ducts, renal tubules, ureter, and bladder, and structures of the head, neck and pulmonary system including sinuses, parotid, trachea, bronchi, and lungs.).
Regarding claim 28, Vaidya teaches the system according to claim 14, wherein a new ultrasound image acquisition is launched, by the control unit, as soon as a prior ultrasound image acquisition is terminated (paras. 0024, 0029, and 0032; The at least two imaging elements 120, 122 are configured to send imaging signals to and receive imaging signals from the region of interest within their respective field of views or a portion thereof. In certain embodiments, the imaging signals include acoustic signals. In other embodiments, the imaging signals may be or also include photoacoustic signals. The received imaging signals of the region of interest can be used to generate one or more images. In certain embodiments, the received imaging signals generate a continuous imaging stream of the region of interest in real-time. Output (e.g., images) from the scan converter 830, the multiplanar reformatter 832, and/or the volume renderer 834 may be coupled to an image processor 836 for further enhancement, buffering and temporary storage before being displayed on an image display 838. The examiner notes that the imaging is real-time continuous imaging and the processor receives images, stores them in real time before displaying them, and continues to acquire new images.).
However, Vaidya fails to disclose that the ultrasound images are ULM images.
Errico, in the same field of endeavor, teaches a ULM image (figure 2, pages 499-500; We demonstrate ultrafast ultrasound localization microscopy (uULM), which combines deep penetration and super-resolution imaging at unprecedented spatiotemporal resolution, by using clinically approved contrast agents: inert gas microbubbles. We were able to track each moving bubble according to its instantaneous position and in-plane velocity vector, leading to quantitative and localized maps of cerebral blood flow velocity. Hence, ultrafast imaging allows the reconstruction of entire organs within tens of seconds, a prerequisite for a preclinical and clinical modality.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the at least one ultrasound image of Vaidya to incorporate the ULM ultrasound image of Errico. This modification would allow detailed reconstruction of entire organs within tens of seconds and would allow small objects to be detected with high accuracy, as disclosed by Errico on page 500.
Response to Arguments
Applicant’s arguments with respect to the 35 USC 103 rejections of the claims have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZAINAB M ALDARRAJI whose telephone number is (571)272-8726. The examiner can normally be reached Monday-Thursday, 7 AM-5 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Carey Michael, can be reached at (571) 270-7235. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ZAINAB MOHAMMED ALDARRAJI/ Patent Examiner, Art Unit 3797