DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Election/Restrictions
Claims 14-19 are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to a nonelected device, there being no allowable generic or linking claim. Election was made without traverse in the reply filed on 12/19/2025.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-13 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 1 recites the limitation “a method for robotic positioning of a medical probe or instrument,” which renders the claim unclear. The limitations that follow disclose only an embodiment containing a medical instrument, making it unclear whether a medical instrument is required by the claim or not.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-13 are rejected under 35 U.S.C. 103 as being unpatentable over Zhang et al (US20210128103A1; hereinafter referred to as Zhang) in view of Crawford et al (US20230011428A1; hereinafter referred to as Crawford).
Regarding Claim 1, Zhang discloses receiving, from each of a plurality of sensing elements disposed in proximity to a medical instrument (“The array 160 is employed for a method for generating an image of scanned tissue, which includes receiving a set of signals from a circular array 160 of transducers 152, in which the circular array 160 is defined by the circular frame 150 having the transducers 152 disposed thereon.” [0023]),
a signal indicative of a distance to a treatment site of a patient (“the circular array 150 has a center axis 111 defining a radius 180 to each of the transducers 152. The plurality of transducers 152 is disposed in the circular frame 150 to define the circular array 160, such that the transducers 152 are centered around the needle insertion sheath 130 defining the needle axis 111. The tracking circuit 170 computes each pixel on the rendered image 140 from a value based on a distance 136 from the location on the reconstruction plane 132 to each respective transducer 152, such that the distance is computed based on the radius.” [0034]);
computing, based on each of the signals and an offset of the sensor from the medical instrument, a distance from each of the respective sensing elements to the treatment site (“as the device 304 is positioned and the needle 110 advanced, the transducers 152 emit and receiving a return signal at each emitting transducer or a combination of multiple transducers in proximity to the emitting transducer. Each transducer is a single element transducer operable for transmission and reception of US signals. The tracking circuit 170 computes, based on each of a plurality of positions on the reconstruction plane 132, a value for the corresponding pixel based on the return signal from a plurality of the transducers 152. In other words, the transducers emit 252 and receive signals 252′ in an iterative manner for the depth and width of the reconstruction plane 132. For each scanned or imaged position on the reconstruction plane, the tracking circuit receives and evaluates a return signal 252′ to compute a value of a corresponding pixel in the rendered image 140, as disclosed above with respect to FIG. 4. The tracking circuit 170 iterates over a plurality of positions on the reconstruction plane 132 for computing a value for a corresponding pixel of each pixel of the generated image 140. Each transducer 152 receives the return signal 252′ based on a depth, distance and angle to the corresponding location on the reconstruction plane 132 from the respective transducer 152.” [0037]);
and determining, based on the computed distances, an angle of the medical instrument relative to the treatment site (“After mounting the device 104, RF data is collected continuously from the ring-arrayed transducers 152, shown at step 602 while the needle posture is set by rotating the device 104, as depicted at step 603. The forward-viewing US images 140 are reconstructed and rendered based on the needle posture, as disclosed at step 604, based on emission of an ultrasonic (US) beam from each of the transducers 152 around the circular array 160. The rendered image 140 may be employed to evaluate an acceptable insertion path directly by changing the (rotation) needle posture or shifting the device on the body surface 112 in a slidable manner, depicted at step 605 and performed iteratively until an acceptable path axis 111 is found. Once an acceptable needle 110 insertion path is found, the needle angle will be fixed and insertion commenced, as shown at step 606. The rendered forward-viewing image 140 continually updates in real-time during the needle insertion for tracking the needle location 607 as the needle advances towards the target 134 along the axis 111, depicted at step 608. The tracking circuit 170 continues to render the generated image along a forward direction of needle insertion, until the target is attained at step 609.” [0036]).
Zhang does not specifically disclose a method for robotic positioning of a medical probe or instrument.
However, in a similar field of endeavor, Crawford teaches a method for robotic positioning of a medical probe or instrument (“a surgical robot system comprises a robot, a US transducer, and at least one processor. The robot has a robot base, a robot arm coupled to the robot base, and an end-effector coupled to the robot arm, such as explained above in accordance with some embodiments. The end-effector is configured to guide movement of a surgical instrument. The US transducer is coupled to the end-effector and operative to output US imaging data of anatomical structure proximately located to the end-effector.” [0158]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Zhang, as outlined above, with a method for robotic positioning of a medical probe or instrument as taught by Crawford, because robotic positioning allows tracking with a high degree of precision (Crawford, [0003]).
Regarding Claim 2, Zhang discloses further comprising identifying an axis of the medical instrument, the axis extending towards the treatment site, the angle based on an orientation of the axis relative to a plane defined by the treatment site (“After mounting the device 104, RF data is collected continuously from the ring-arrayed transducers 152, shown at step 602 while the needle posture is set by rotating the device 104, as depicted at step 603. The forward-viewing US images 140 are reconstructed and rendered based on the needle posture, as disclosed at step 604, based on emission of an ultrasonic (US) beam from each of the transducers 152 around the circular array 160. The rendered image 140 may be employed to evaluate an acceptable insertion path directly by changing the (rotation) needle posture or shifting the device on the body surface 112 in a slidable manner, depicted at step 605 and performed iteratively until an acceptable path axis 111 is found. Once an acceptable needle 110 insertion path is found, the needle angle will be fixed and insertion commenced, as shown at step 606. The rendered forward-viewing image 140 continually updates in real-time during the needle insertion for tracking the needle location 607 as the needle advances towards the target 134 along the axis 111, depicted at step 608. The tracking circuit 170 continues to render the generated image along a forward direction of needle insertion, until the target is attained at step 609.” [0036]).
Regarding Claim 3, Zhang discloses the axis defines an approach angle of the medical instrument, further comprising: disposing the medical instrument at the angle based on a target angle defined by intersection of the axis with the treatment site; and translating the surgical instrument along the axis (“After mounting the device 104, RF data is collected continuously from the ring-arrayed transducers 152, shown at step 602 while the needle posture is set by rotating the device 104, as depicted at step 603. The forward-viewing US images 140 are reconstructed and rendered based on the needle posture, as disclosed at step 604, based on emission of an ultrasonic (US) beam from each of the transducers 152 around the circular array 160. The rendered image 140 may be employed to evaluate an acceptable insertion path directly by changing the (rotation) needle posture or shifting the device on the body surface 112 in a slidable manner, depicted at step 605 and performed iteratively until an acceptable path axis 111 is found. Once an acceptable needle 110 insertion path is found, the needle angle will be fixed and insertion commenced, as shown at step 606. The rendered forward-viewing image 140 continually updates in real-time during the needle insertion for tracking the needle location 607 as the needle advances towards the target 134 along the axis 111, depicted at step 608. The tracking circuit 170 continues to render the generated image along a forward direction of needle insertion, until the target is attained at step 609.” [0036]).
Regarding Claim 4, Zhang discloses identifying a surgical target, the surgical target disposed on an opposed side of the plane defining the treatment surface; and disposing the medical instrument for aligning the axis with the treatment site; and advancing the medical instrument along the axis aligned with the treatment site (“After mounting the device 104, RF data is collected continuously from the ring-arrayed transducers 152, shown at step 602 while the needle posture is set by rotating the device 104, as depicted at step 603. The forward-viewing US images 140 are reconstructed and rendered based on the needle posture, as disclosed at step 604, based on emission of an ultrasonic (US) beam from each of the transducers 152 around the circular array 160. The rendered image 140 may be employed to evaluate an acceptable insertion path directly by changing the (rotation) needle posture or shifting the device on the body surface 112 in a slidable manner, depicted at step 605 and performed iteratively until an acceptable path axis 111 is found. Once an acceptable needle 110 insertion path is found, the needle angle will be fixed and insertion commenced, as shown at step 606. The rendered forward-viewing image 140 continually updates in real-time during the needle insertion for tracking the needle location 607 as the needle advances towards the target 134 along the axis 111, depicted at step 608. The tracking circuit 170 continues to render the generated image along a forward direction of needle insertion, until the target is attained at step 609.” [0036]).
Regarding Claim 5, Zhang discloses further comprising: identifying a probe plane defined by the plurality of sensors; determining an orientation of the medical instrument to the probe plane; identifying a patient plane defined by the treatment site; and computing an orientation of the probe plane relative to the patient plane based on the computed distances (“The monitoring device 142 allows rendering of the image 140 of the surgical target 134, such that the surgical target 134 is located on the reconstruction plane 132 and based on an insertion site aligned with a needle on a trajectory defined by the needle insertion sheath 130. Since the needle path is centered among the transducers, the reconstructed plane image 132 includes the path at any rotation of the reconstructed plane image 132. The surgical target 134 may be, for example, a region or growth for retrieving a biopsy sample, or the reconstructed plane 132 may simply define a diagnostic region for further imaging.” [0021], “the circular array 150 has a center axis 111 defining a radius 180 to each of the transducers 152. The plurality of transducers 152 is disposed in the circular frame 150 to define the circular array 160, such that the transducers 152 are centered around the needle insertion sheath 130 defining the needle axis 111. The tracking circuit 170 computes each pixel on the rendered image 140 from a value based on a distance 136 from the location on the reconstruction plane 132 to each respective transducer 152, such that the distance is computed based on the radius. Each location corresponding to a pixel also has an angle 137 from the transducer 152 and a depth 138, which is a function of the angle 137 and distance 136, which define a location on the reconstruction plane 132.” [0034]).
Regarding Claim 6, Zhang discloses all limitations noted above except further comprising: positioning the sensing elements in a predetermined orientation with a robotic actuator; engaging the medical instrument with the robotic actuator; and disposing the robotic actuator based on the determined angle of the medical instrument.
However, in a similar field of endeavor, Crawford teaches further comprising: positioning the sensing elements in a predetermined orientation with a robotic actuator; engaging the medical instrument with the robotic actuator; and disposing the robotic actuator based on the determined angle of the medical instrument (“a surgical robot system comprises a robot, a US transducer, and at least one processor. The robot has a robot base, a robot arm coupled to the robot base, and an end-effector coupled to the robot arm, such as explained above in accordance with some embodiments. The end-effector is configured to guide movement of a surgical instrument. The US transducer is coupled to the end-effector and operative to output US imaging data of anatomical structure proximately located to the end-effector.” [0158], “once the end-effector 112 control by the surgical robot 102 approaches the target location, the surgical robot 102 will adjust the arm 104 orientation to match the desired trajectory orientation” [0194]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Zhang, as outlined above, to include positioning the sensing elements in a predetermined orientation with a robotic actuator; engaging the medical instrument with the robotic actuator; and disposing the robotic actuator based on the determined angle of the medical instrument, as taught by Crawford, because robotic positioning allows tracking with a high degree of precision (Crawford, [0003]).
Regarding Claim 7, Zhang discloses further comprising: receiving a location of a surgical target; computing the angle of the medical instrument based on an intersection with the surgical target; and advancing the medical instrument along the computed angle for attaining the surgical target (“After mounting the device 104, RF data is collected continuously from the ring-arrayed transducers 152, shown at step 602 while the needle posture is set by rotating the device 104, as depicted at step 603. The forward-viewing US images 140 are reconstructed and rendered based on the needle posture, as disclosed at step 604, based on emission of an ultrasonic (US) beam from each of the transducers 152 around the circular array 160. The rendered image 140 may be employed to evaluate an acceptable insertion path directly by changing the (rotation) needle posture or shifting the device on the body surface 112 in a slidable manner, depicted at step 605 and performed iteratively until an acceptable path axis 111 is found. Once an acceptable needle 110 insertion path is found, the needle angle will be fixed and insertion commenced, as shown at step 606. The rendered forward-viewing image 140 continually updates in real-time during the needle insertion for tracking the needle location 607 as the needle advances towards the target 134 along the axis 111, depicted at step 608. The tracking circuit 170 continues to render the generated image along a forward direction of needle insertion, until the target is attained at step 609.” [0036]).
Regarding Claim 8, Zhang discloses all limitations noted above except further comprising: engaging the medical instrument with a robotic actuator for advancing the medical instrument.
However, in a similar field of endeavor, Crawford teaches further comprising: engaging the medical instrument with a robotic actuator for advancing the medical instrument (“once the end-effector 112 control by the surgical robot 102 approaches the target location, the surgical robot 102 will adjust the arm 104 orientation to match the desired trajectory orientation” [0194]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Zhang, as outlined above, to include engaging the medical instrument with a robotic actuator for advancing the medical instrument, as taught by Crawford, because robotic positioning allows tracking with a high degree of precision (Crawford, [0003]).
Regarding Claim 9, Zhang discloses the distance sensor is configured for at least one of optical, ultrasonic, or visual sensing (“the circular array 150 has a center axis 111 defining a radius 180 to each of the transducers 152. The plurality of transducers 152 is disposed in the circular frame 150 to define the circular array 160, such that the transducers 152 are centered around the needle insertion sheath 130 defining the needle axis 111. The tracking circuit 170 computes each pixel on the rendered image 140 from a value based on a distance 136 from the location on the reconstruction plane 132 to each respective transducer 152, such that the distance is computed based on the radius.” [0034]).
Regarding Claim 10, Zhang discloses further comprising receiving, from the plurality of sensing elements, a set of points, each point of the set of points having a position and corresponding distance to the treatment site (“the circular array 150 has a center axis 111 defining a radius 180 to each of the transducers 152. The plurality of transducers 152 is disposed in the circular frame 150 to define the circular array 160, such that the transducers 152 are centered around the needle insertion sheath 130 defining the needle axis 111. The tracking circuit 170 computes each pixel on the rendered image 140 from a value based on a distance 136 from the location on the reconstruction plane 132 to each respective transducer 152, such that the distance is computed based on the radius.” [0034]).
Regarding Claim 11, Zhang discloses the signal is a video signal and the set of points defines a pixelated grid, the pixelated grid having a two dimensional representation of the position of a respective point in the set of points (“The circular array 150 has a center axis 111 defining a radius 180 to each of the transducers 152. The plurality of transducers 152 is disposed in the circular frame 150 to define the circular array 160, such that the transducers 152 are centered around the needle insertion sheath 130 defining the needle axis 111. The tracking circuit 170 computes each pixel on the rendered image 140 from a value based on a distance 136 from the location on the reconstruction plane 132 to each respective transducer 152, such that the distance is computed based on the radius. Each location corresponding to a pixel also has an angle 137 from the transducer 152 and a depth 138, which is a function of the angle 137 and distance 136, which define a location on the reconstruction plane 132.” [0034]).
Regarding Claim 12, Zhang discloses the plurality of sensing elements are arranged in a plane, the offset indicative of a relative position from the medical treatment (“the circular array 150 has a center axis 111 defining a radius 180 to each of the transducers 152. The plurality of transducers 152 is disposed in the circular frame 150 to define the circular array 160, such that the transducers 152 are centered around the needle insertion sheath 130 defining the needle axis 111. The tracking circuit 170 computes each pixel on the rendered image 140 from a value based on a distance 136 from the location on the reconstruction plane 132 to each respective transducer 152, such that the distance is computed based on the radius.” [0034]).
Regarding Claim 13, Zhang discloses the medical instrument has an axis passing through a longitudinal dimension of the medical instrument, the axis extending towards the treatment site, the angle based on an orientation of the axis relative to a plane defined by the treatment site (“After mounting the device 104, RF data is collected continuously from the ring-arrayed transducers 152, shown at step 602 while the needle posture is set by rotating the device 104, as depicted at step 603. The forward-viewing US images 140 are reconstructed and rendered based on the needle posture, as disclosed at step 604, based on emission of an ultrasonic (US) beam from each of the transducers 152 around the circular array 160. The rendered image 140 may be employed to evaluate an acceptable insertion path directly by changing the (rotation) needle posture or shifting the device on the body surface 112 in a slidable manner, depicted at step 605 and performed iteratively until an acceptable path axis 111 is found. Once an acceptable needle 110 insertion path is found, the needle angle will be fixed and insertion commenced, as shown at step 606. The rendered forward-viewing image 140 continually updates in real-time during the needle insertion for tracking the needle location 607 as the needle advances towards the target 134 along the axis 111, depicted at step 608. The tracking circuit 170 continues to render the generated image along a forward direction of needle insertion, until the target is attained at step 609.” [0036]).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to STEVEN MALDONADO, whose telephone number is 703-756-1421. The examiner can normally be reached 8:00 am-4:00 pm PST, M-Th.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christopher Koharski, can be reached at (571) 272-7230.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Steven Maldonado/
Patent Examiner, Art Unit 3797