DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1-20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
After further review of the primary prior art reference, Najafi et al., it was determined that the amended limitations in question are taught by Najafi. Refer to the new rejections below for the interpretation of Najafi that covers the newly amended limitations.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-9 and 11-19 are rejected under 35 U.S.C. 103 as being unpatentable over Najafi et al (M. Najafi, P. Abolmaesumi, and R. Rohling, “Single-camera closed-form real-time needle tracking for ultrasound-guided needle insertion,” Ultrasound in Medicine & Biology, vol. 41, no. 10, pp. 2663–2676, Oct. 2015; hereinafter referred to as Najafi) in view of Halmann et al (US 2023/0131115 A1; hereinafter referred to as Halmann).
Regarding Claim 1, Najafi teaches an augmented reality system (“a needle insertion procedure with ultrasound guidance, real-time calculation and visualization of the needle trajectory can help to guide the choice of puncture site and needle angle to reach the target depicted in the ultrasound image” [Abstract]) comprising:
a needle (“Epidural needle with 1-cm markings is projected into the camera plane using the standard camera model.” [Mathematical Framework]), comprising:
a first marking area, wherein the first marking area has a first grayscale color; and a second marking area, wherein the second marking area has a second grayscale color, the first grayscale color and the second grayscale color are different from each other, and the second grayscale color is arranged after the first marking area (“the first grayscale color and the second grayscale color are different from each other,” [Mathematical Framework]; as seen in Fig. 1, there are alternating gray marking areas on the needle, each having a different gray color);
Figure 1
a camera (“The camera was an FL2G-13S2M-C (Point Grey Research, Richmond, BC, Canada) with a DF6HA-1B lens (Fujifilm Group, Fujinon, Saitama, Japan). The camera had a mono 1.3-MP Sony ICX204 CCD, 1/3-in., 4.65-μm sensor. The frame rate was 30 FPS and the image size was 1,288 × 964 pixels.” [Methods]);
an ultrasound generator, configured to form an ultrasound field of view based on a measuring object (“A SonixTOUCH ultrasound machine (Ultrasonix Medical, Richmond, BC, Canada) was used for ultrasound imaging. An L14-5 transducer with 7.2-MHz center frequency was used for ultrasound imaging.” [Methods]);
a memory, configured to store a plurality of commands; a processor, configured to perform following steps according to the plurality of commands of the memory (“The algorithm was implemented in both MATLAB and C++. In the C++ implementation, the OpenCV library was used, and the overall method runs in real time (50 ms) on a standard computer workstation with a Core 2 Duo CPU at 2.93 GHz and 4 GB of RAM.” [Real-time Implementation]):
capturing the first marking area, and the second marking area of the needle to perform positioning or a marking pose estimation by the camera (“To estimate the pose of the needle, marking edge points should be extracted from each image of the needle. These edge points are defined as the intersection of the centerline of the needle with the marking edge lines (Fig. 4a). The accuracy of the pose estimation is directly related to the accuracy of this segmentation procedure.” [Automatic feature extraction]; see Figure 2 for the captured marking areas);
Figure 2
confirming the first marking area and the second marking area of the needle are located on a straight line for a feature pose estimation according to a linear regression algorithm (“To estimate the pose of the needle, marking edge points should be extracted from each image of the needle. These edge points are defined as the intersection of the centerline of the needle with the marking edge lines (Fig. 4a). The accuracy of the pose estimation is directly related to the accuracy of this segmentation procedure.” [Automatic feature extraction] “The algorithm starts with an initial estimate for the needle centerline. The estimate can come from the previous frame, if the current frame is not the first frame. For the first frame, the estimate is provided by the Hough transform (using the Canny edge detector). Then, random lines are generated around this estimate by adding random variables to the slope and line intercept. Pixel values on each line are examined to determine if it is an “on-the-needle” line. In that case, the intersection points of the line with the marking edges will be stored.” [Collecting marking edge points]);
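The cited linear-regression collinearity check can be sketched as follows. This is illustrative only and not part of Najafi's disclosure; the function name and the `tol` pixel tolerance are hypothetical, loosely motivated by the roughly 2.2-pixel segmentation error Najafi reports:

```python
import numpy as np

def markings_collinear(points, tol=2.5):
    """Fit a line to 2-D marking edge points by least squares and
    report whether the points lie on a straight line within a pixel
    tolerance (the linear-regression step of the feature pose check)."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    m, b = np.polyfit(x, y, 1)          # least-squares line y = m*x + b
    residuals = y - (m * x + b)          # vertical distance to the fit
    return float(np.sqrt(np.mean(residuals ** 2))) <= tol

# Points lying exactly on y = 0.5x + 3 are judged collinear
print(markings_collinear([(0, 3), (10, 8), (20, 13), (30, 18)]))  # True
```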
comparing the first marking area, the second marking area, and a matching template to confirm a direction of the needle (“A novel mathematical–geometrical formulation is devised here for needle trajectory calculation using the centimeter-spaced black markings on many needles,” [Single Camera Pose Estimation]);
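The template-comparison step for confirming the needle's direction can be illustrated with the sketch below. It is purely hypothetical: Najafi's actual formulation is geometric, and the marking-length sequences here are invented for illustration:

```python
def needle_direction(observed, template):
    """Compare a sequence of observed marking lengths against a known
    marking template, tried forward and reversed, to decide which way
    the needle points. Returns 'forward', 'reversed', or None."""
    def err(a, b):
        # Sum-of-squares mismatch between two length sequences
        return sum((x - y) ** 2 for x, y in zip(a, b))
    fwd, rev = err(observed, template), err(observed, template[::-1])
    if fwd == rev:
        return None           # ambiguous (e.g., symmetric markings)
    return "forward" if fwd < rev else "reversed"

print(needle_direction([2, 2, 5, 5], [2, 2, 5, 5]))   # forward
```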
obtaining a feature points number according to the first marking area and the second marking area in the needle, wherein the feature points number is a positive integer greater than or equal to 2 (“The feature extraction procedure, described in the next section, finds the projected points in the image, and is calculated in the camera calibration procedure. Actual marker distances are assumed to be known. For an epidural needle, in particular, the marking distances are all equal. However, the proposed method is not limited to the equal spacing constraint and can be generalized for needles with non-equal marking spacings.” [Mathematical framework]);
obtaining a range prediction interval of the ultrasound field of view according to the feature points number of the needle, wherein the processor projects the needle into the ultrasound field of view according to the feature points number, and a projection range in the ultrasound field of view of the needle is the range prediction interval, wherein the range prediction interval has a width, and the width of the range prediction interval related to the feature points number (“To measure the effect of various parameters such as depth, needle tilt angle, needle yaw angle and number of edge points on accuracy, a sensitivity analysis test was performed by simulating the pose estimation process. The mathematical formulation of the system and the standard camera model was used with a noise source model as input. As discussed under Precision of the Feature Extraction, the root mean squared segmentation error was measured as 2.2 pixels, and hence, here, the noise was modeled as a normal random variable with a standard deviation of 2.5 pixels For the base case, the needle's pose and the other parameters were chosen similar to those in the Overall System Accuracy section. The angle between the needle and the ultrasound image plane was 45°, the intersection depth was 20 mm and three edge points were used… Finally, the number of edge points was varied from 3 to 7 to investigate if using more edge points increases the accuracy because of the averaging effect. Results illustrated in Figure 12d confirm that the estimation error decreases by using more edge points.” [Sensitivity analysis of the needle pose estimation], Najafi discloses an estimation error (range prediction interval) between the projected needle on the ultrasound vs the actual coordinates of the needle, this estimation error is dependent on multiple factors; however, when the number of edge points (feature points) is increased the estimation error is shown to decrease in size (width) as seen in Fig. 12 D below);
[Najafi Fig. 12d (reproduced): estimation error versus number of edge points]
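The averaging effect Najafi reports (estimation error shrinking as more edge points are used) can be reproduced in a minimal Monte Carlo sketch. Only the 2.5-pixel noise figure comes from the cited passage; everything else is a hypothetical stand-in for the paper's sensitivity analysis:

```python
import numpy as np

def simulated_rms_error(n_points, noise_px=2.5, trials=5000, seed=0):
    """Estimate a scalar from n_points observations corrupted by
    zero-mean Gaussian noise (sigma = 2.5 px, per the cited passage)
    and return the RMS error of the averaged estimate over many trials."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, noise_px, size=(trials, n_points))
    estimates = noise.mean(axis=1)    # averaging over edge points
    return float(np.sqrt(np.mean(estimates ** 2)))

# Error falls roughly as 1/sqrt(n): more feature points, narrower interval
for n in (3, 5, 7):
    print(n, round(simulated_rms_error(n), 2))
```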
determining the feature points number is greater than a feature points number threshold, and reducing the width of the range interval (“Finally, the number of edge points was varied from 3 to 7 to investigate if using more edge points increases the accuracy because of the averaging effect. Results illustrated in Figure 12d confirm that the estimation error decreases by using more edge points.” [Sensitivity analysis of the needle pose estimation]).
Najafi does not specifically teach a location code; and capturing the location code.
However, in a similar field of endeavor, Halmann teaches a system and method is provided for providing an indication of viewable and non-viewable parts of an interventional device in an ultrasound image [Abstract].
Halmann also teaches a location code; and capturing the location code (“The detection and recognition system 200 can then identify the pattern 47 for the needle 32 selected by the user and operate to locate that pattern 47 within the image data/ultrasound image 202. This information on the needle 32 to be used can be supplied to the detection and recognition system 200 by the user in various manners, such as through the input device 124, such as by manually entering identifying information on the needle 32, or by scanning a barcode or RFID located on packaging for the needle 32 including the identifying information.” [0027]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Najafi as outlined above with a location code; and capturing the location code, as taught by Halmann, because it is desirable to develop a system that can provide the user with an accurate indication of the location of the tip of the needle [0006].
Regarding Claim 2, Najafi teaches a length of the first marking area is smaller than a length of the second marking area (“The feature extraction procedure, described in the next section, finds the projected points in the image, and is calculated in the camera calibration procedure. Actual marker distances are assumed to be known. For an epidural needle, in particular, the marking distances are all equal. However, the proposed method is not limited to the equal spacing constraint and can be generalized for needles with non-equal marking spacings.” [Mathematical framework]).
Regarding Claim 3, Najafi teaches a length of the first marking area is smaller than a length of the second marking area (“The feature extraction procedure, described in the next section, finds the projected points in the image, and is calculated in the camera calibration procedure. Actual marker distances are assumed to be known. For an epidural needle, in particular, the marking distances are all equal. However, the proposed method is not limited to the equal spacing constraint and can be generalized for needles with non-equal marking spacings.” [Mathematical framework]).
Regarding Claim 4, Najafi teaches the processor is further configured to perform the following steps according to the plurality of commands of the memory: confirming the first marking area and the second marking area of the needle are located on the straight line according to at least one of the linear regression algorithm (“To estimate the pose of the needle, marking edge points should be extracted from each image of the needle. These edge points are defined as the intersection of the centerline of the needle with the marking edge lines (Fig. 4a). The accuracy of the pose estimation is directly related to the accuracy of this segmentation procedure.” [Automatic feature extraction] “The algorithm starts with an initial estimate for the needle centerline. The estimate can come from the previous frame, if the current frame is not the first frame. For the first frame, the estimate is provided by the Hough transform (using the Canny edge detector). Then, random lines are generated around this estimate by adding random variables to the slope and line intercept. Pixel values on each line are examined to determine if it is an “on-the-needle” line. In that case, the intersection points of the line with the marking edges will be stored.” [Collecting marking edge points]).
Najafi does not specifically teach a needle bending detection and a bending relational expression wherein the bending relational expression has a distance error mean value; setting a bending threshold according to the bending relational expression; and when the distance error mean value is smaller than the bending threshold, it is determined that the needle approaches the straight line.
However, in a similar field of endeavor, Halmann teaches a needle bending detection and a bending relational expression wherein the bending relational expression has a distance error mean value; setting a bending threshold according to the bending relational expression; and when the distance error mean value is smaller than the bending threshold, it is determined that the needle approaches the straight line (“in the situation where the detection and recognition system 200 determines that less than the entire length of the first echogenic portion 50′ is represented or viewable within the ultrasound image 202, illustrating that the tip 42 has been directed and/or deflected out of the imaging plane for the image 202, the system 200 will position the boundary lines 404 along each side of the needle 32 represented in the ultrasound image 202. However, in this situation the boundary lines 404 presented by the system 202 will extend beyond the length of the needle 32 represented in the ultrasound image 202 to correspond in length and position to the actual position of the tip 42 of the needle 32 as determined by the detection and recognition system 200. As shown in the exemplary illustrated embodiment of FIG. 7, the boundary lines 404 extend past the representation of the needle 32 to the estimated point where the tip 42 of the needle 32 is actually positioned as determined by the system 200, thus visually illustrating the position of the entire needle 32 including the tip 42 relative to the actual position of the first portion of the needle 32 that remains viewable in the ultrasound image 202 and bounded by a first portion 411 of the boundary lines 404.
In this manner, the detection and recognition system 200 not only provides the user with a position of the needle tip 42 based on the position of the ends 410 of the boundary lines 404 in the ultrasound image 202 based on their alignment with the tip 42 as determined by the system 202, but additionally indicates an estimation of the length of the needle 32 that extends out of the image plane for the ultrasound image 202 as represented by the second portion 412 of the boundary lines 402 extending between the ends 410 of the boundary lines 404 and the foremost viewable part of the needle 32 within the ultrasound image 202.” [0035]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Najafi as outlined above with a needle bending detection and a bending relational expression wherein the bending relational expression has a distance error mean value; setting a bending threshold according to the bending relational expression; and when the distance error mean value is smaller than the bending threshold, it is determined that the needle approaches the straight line, as taught by Halmann, because it is desirable to develop a system that can provide the user with an accurate indication of the location of the tip of the needle [0006].
Regarding Claim 5, Najafi teaches that the processor is further configured to perform the following steps according to the plurality of commands of the memory: obtaining a depth of field length between the camera and the measuring object (“Proper values for these parameters should be chosen based on the experimental setup. Specifically, minpeakdistance is related to the length of the markings in pixels and, therefore, depends on the image resolution, lens field of view and distance of the needle to the camera.” [Collecting marking edge points]);
and obtaining a horizontal image range according to a horizontal viewing angle and the depth of field length of the camera (“By use of a linear stage, the needle was moved in 2-mm steps laterally at approximately 10- and 20-mm axial depths (26 points in total), as illustrated in Figure 11a. The tilt angle of the needle, the angle between the needle and the ultrasound image plane, was approximately 42°, and the angle between the needle and the optical axis of the camera was about 67°. The average angle between the needle and the camera image plane was 26.9°.” [Overall system accuracy]).
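The claimed horizontal image range, obtained from a horizontal viewing angle and a depth of field length, follows simple pinhole-camera geometry. The sketch below is illustrative only; the function name and the numeric values are hypothetical and do not appear in the record:

```python
import math

def horizontal_image_range(h_fov_deg, depth_mm):
    """Width of the camera's horizontal footprint at a given depth of
    field length, from the horizontal viewing angle:
        range = 2 * depth * tan(fov / 2)
    """
    return 2.0 * depth_mm * math.tan(math.radians(h_fov_deg) / 2.0)

# A hypothetical 40-degree horizontal FOV at 100 mm depth covers ~72.8 mm
print(round(horizontal_image_range(40.0, 100.0), 1))
```

The vertical image range of claim 6 is the same relation with the vertical viewing angle substituted in.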
Regarding Claim 6, Najafi teaches that the processor is further configured to perform the following steps according to the plurality of commands of the memory: obtaining a vertical image range according to a vertical viewing angle and the depth of field length of the camera (“By use of a linear stage, the needle was moved in 2-mm steps laterally at approximately 10- and 20-mm axial depths (26 points in total), as illustrated in Figure 11a. The tilt angle of the needle, the angle between the needle and the ultrasound image plane, was approximately 42°, and the angle between the needle and the optical axis of the camera was about 67°. The average angle between the needle and the camera image plane was 26.9°.” [Overall system accuracy]).
Regarding Claim 7, Najafi teaches all limitations noted above except that the processor is further configured to perform the following steps according to the plurality of commands of the memory: obtaining a horizontal elongation according to the horizontal image range and an effective length of the needle.
However, in a similar field of endeavor, Halmann teaches the processor is further configured to perform the following steps according to the plurality of commands of the memory: obtaining a horizontal elongation according to the horizontal image range and an effective length of the needle. (“This is accomplished by the recognition and detection system 200 by comparing the location and/or dimensions (e.g., length) of the echogenic portion(s) 50,50′ and associated pattern(s) 47,47′ detected in the ultrasound image 202 and associated with a particular needle 32 with the dimensions of the needle 32 stored in the memory unit 122. For example, if the needle 32 detected in the ultrasound image 202 includes two echogenic portions 50,50′ spaced from one another by a band 52, with the echogenic portion(s) 50,50′ and the band 52 each having a specified length, the recognition and detection system 200 can determine the length of the body 40 of the needle 32 that is present within the image 202 based on the length of the echogenic portion(s) 50,50′ and band(s) 52 visible in the image 202.” [0029], “the detection and recognition system 200 not only provides the user with a position of the needle tip 42 based on the position of the ends 410 of the boundary lines 404 in the ultrasound image 202 based on their alignment with the tip 42 as determined by the system 202, but additionally indicates an estimation of the length of the needle 32 that extends out of the image plane for the ultrasound image 202 as represented by the second portion 412 of the boundary lines 402 extending between the ends 410 of the boundary lines 404 and the foremost viewable part of the needle 32 within the ultrasound image 202.” [0035]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Najafi as outlined above such that the processor is further configured to perform the following steps according to the plurality of commands of the memory: obtaining a horizontal elongation according to the horizontal image range and an effective length of the needle, as taught by Halmann, because it is desirable to develop a system that can provide the user with an accurate indication of the location of the tip of the needle [0006].
Regarding Claim 8, Najafi teaches all limitations noted above except that the effective length is greater than or equal to a sum of a first length of the first marking area and a second length of the second marking area.
However, in a similar field of endeavor, Halmann teaches that the effective length is greater than or equal to a sum of a first length of the first marking area and a second length of the second marking area (“This is accomplished by the recognition and detection system 200 by comparing the location and/or dimensions (e.g., length) of the echogenic portion(s) 50,50′ and associated pattern(s) 47,47′ detected in the ultrasound image 202 and associated with a particular needle 32 with the dimensions of the needle 32 stored in the memory unit 122. For example, if the needle 32 detected in the ultrasound image 202 includes two echogenic portions 50,50′ spaced from one another by a band 52, with the echogenic portion(s) 50,50′ and the band 52 each having a specified length, the recognition and detection system 200 can determine the length of the body 40 of the needle 32 that is present within the image 202 based on the length of the echogenic portion(s) 50,50′ and band(s) 52 visible in the image 202.” [0029], “the detection and recognition system 200 not only provides the user with a position of the needle tip 42 based on the position of the ends 410 of the boundary lines 404 in the ultrasound image 202 based on their alignment with the tip 42 as determined by the system 202, but additionally indicates an estimation of the length of the needle 32 that extends out of the image plane for the ultrasound image 202 as represented by the second portion 412 of the boundary lines 402 extending between the ends 410 of the boundary lines 404 and the foremost viewable part of the needle 32 within the ultrasound image 202.” [0035]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Najafi as outlined above with the effective length being greater than or equal to a sum of a first length of the first marking area and a second length of the second marking area, as taught by Halmann, because it is desirable to develop a system that can provide the user with an accurate indication of the location of the tip of the needle [0006].
Regarding Claim 9, Najafi teaches all limitations noted above except that the processor is further configured to perform the following steps according to the plurality of commands of the memory: obtaining a vertical elongation according to the vertical image range and the effective length of the needle.
However, in a similar field of endeavor, Halmann teaches the processor is further configured to perform the following steps according to the plurality of commands of the memory: obtaining a vertical elongation according to the vertical image range and the effective length of the needle (“This is accomplished by the recognition and detection system 200 by comparing the location and/or dimensions (e.g., length) of the echogenic portion(s) 50,50′ and associated pattern(s) 47,47′ detected in the ultrasound image 202 and associated with a particular needle 32 with the dimensions of the needle 32 stored in the memory unit 122. For example, if the needle 32 detected in the ultrasound image 202 includes two echogenic portions 50,50′ spaced from one another by a band 52, with the echogenic portion(s) 50,50′ and the band 52 each having a specified length, the recognition and detection system 200 can determine the length of the body 40 of the needle 32 that is present within the image 202 based on the length of the echogenic portion(s) 50,50′ and band(s) 52 visible in the image 202.” [0029], “the detection and recognition system 200 not only provides the user with a position of the needle tip 42 based on the position of the ends 410 of the boundary lines 404 in the ultrasound image 202 based on their alignment with the tip 42 as determined by the system 202, but additionally indicates an estimation of the length of the needle 32 that extends out of the image plane for the ultrasound image 202 as represented by the second portion 412 of the boundary lines 402 extending between the ends 410 of the boundary lines 404 and the foremost viewable part of the needle 32 within the ultrasound image 202.” [0035]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Najafi as outlined above such that the processor is further configured to perform the following steps according to the plurality of commands of the memory: obtaining a vertical elongation according to the vertical image range and the effective length of the needle, as taught by Halmann, because it is desirable to develop a system that can provide the user with an accurate indication of the location of the tip of the needle [0006].
Regarding Claim 11, Najafi teaches an augmented reality method, comprising: forming an ultrasound field of view based on a measuring object by an ultrasound generator (“a needle insertion procedure with ultrasound guidance, real-time calculation and visualization of the needle trajectory can help to guide the choice of puncture site and needle angle to reach the target depicted in the ultrasound image” [Abstract], “A SonixTOUCH ultrasound machine (Ultrasonix Medical, Richmond, BC, Canada) was used for ultrasound imaging. An L14-5 transducer with 7.2-MHz center frequency was used for ultrasound imaging.” [Methods]);
capturing a first marking area, and a second marking area of a needle to perform positioning or a marking pose estimation by a camera (“Epidural needle with 1-cm markings is projected into the camera plane using the standard camera model.” [Mathematical Framework], “The camera was an FL2G-13S2M-C (Point Grey Research, Richmond, BC, Canada) with a DF6HA-1B lens (Fujifilm Group, Fujinon, Saitama, Japan). The camera had a mono 1.3-MP Sony ICX204 CCD, 1/3-in., 4.65-μm sensor. The frame rate was 30 FPS and the image size was 1,288 × 964 pixels.” [Methods], “To estimate the pose of the needle, marking edge points should be extracted from each image of the needle. These edge points are defined as the intersection of the centerline of the needle with the marking edge lines (Fig. 4a). The accuracy of the pose estimation is directly related to the accuracy of this segmentation procedure.” [Automatic feature extraction], see Figure 2 for the captured marking areas);
Figure 3
wherein the first marking area has a first grayscale color, the second marking area has a second grayscale color, the first grayscale color and the second grayscale color are different from each other, and the second grayscale color is arranged after the first marking area (“the first grayscale color and the second grayscale color are different from each other,” [Mathematical Framework]; as seen in Fig. 1, there are alternating gray marking areas on the needle, each having a different gray color);
Figure 4
confirming the first marking area and the second marking area of the needle are located on a straight line for a feature pose estimation according to a linear regression algorithm (“To estimate the pose of the needle, marking edge points should be extracted from each image of the needle. These edge points are defined as the intersection of the centerline of the needle with the marking edge lines (Fig. 4a). The accuracy of the pose estimation is directly related to the accuracy of this segmentation procedure.” [Automatic feature extraction] “The algorithm starts with an initial estimate for the needle centerline. The estimate can come from the previous frame, if the current frame is not the first frame. For the first frame, the estimate is provided by the Hough transform (using the Canny edge detector). Then, random lines are generated around this estimate by adding random variables to the slope and line intercept. Pixel values on each line are examined to determine if it is an “on-the-needle” line. In that case, the intersection points of the line with the marking edges will be stored.” [Collecting marking edge points]);
comparing the first marking area, the second marking area, and a matching template to confirm a direction of the needle (“A novel mathematical–geometrical formulation is devised here for needle trajectory calculation using the centimeter-spaced black markings on many needles,” [Single Camera Pose Estimation]);
obtaining a feature points number according to the first marking area and the second marking area in the needle, wherein the feature points number is a positive integer greater than or equal to 2 (“The feature extraction procedure, described in the next section, finds the projected points in the image, and is calculated in the camera calibration procedure. Actual marker distances are assumed to be known. For an epidural needle, in particular, the marking distances are all equal. However, the proposed method is not limited to the equal spacing constraint and can be generalized for needles with non-equal marking spacings.” [Mathematical framework]);
obtaining a range prediction interval of the ultrasound field of view according to the feature points number of the needle, wherein the processor projects the needle into the ultrasound field of view according to the feature points number, and a projection range in the ultrasound field of view of the needle is the range prediction interval, wherein the range prediction interval has a width, and the width of the range prediction interval related to the feature points number (“To measure the effect of various parameters such as depth, needle tilt angle, needle yaw angle and number of edge points on accuracy, a sensitivity analysis test was performed by simulating the pose estimation process. The mathematical formulation of the system and the standard camera model was used with a noise source model as input. As discussed under Precision of the Feature Extraction, the root mean squared segmentation error was measured as 2.2 pixels, and hence, here, the noise was modeled as a normal random variable with a standard deviation of 2.5 pixels For the base case, the needle's pose and the other parameters were chosen similar to those in the Overall System Accuracy section. The angle between the needle and the ultrasound image plane was 45°, the intersection depth was 20 mm and three edge points were used… Finally, the number of edge points was varied from 3 to 7 to investigate if using more edge points increases the accuracy because of the averaging effect. Results illustrated in Figure 12d confirm that the estimation error decreases by using more edge points.” [Sensitivity analysis of the needle pose estimation], Najafi discloses an estimation error (range prediction interval) between the projected needle on the ultrasound vs the actual coordinates of the needle, this estimation error is dependent on multiple factors; however, when the number of edge points (feature points) is increased the estimation error is shown to decrease in size (width) as seen in Fig. 12 D below);
(Najafi, Fig. 12d: estimation error versus number of edge points, showing the error decreasing as more edge points are used.)
determining the feature points number is greater than a feature points number threshold, and reducing the width of the range prediction interval (“Finally, the number of edge points was varied from 3 to 7 to investigate if using more edge points increases the accuracy because of the averaging effect. Results illustrated in Figure 12d confirm that the estimation error decreases by using more edge points.” [Sensitivity analysis of the needle pose estimation]).
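Najafi's reported trend — the estimation error decreasing as more edge points are used, attributed to an averaging effect — can be illustrated with a short Monte Carlo sketch. The code below is not from either reference; the marker spacing and trial count are arbitrary illustrative values, with only the 2.5-pixel noise standard deviation taken from the passage quoted above:

```python
import math
import random

def pose_error(num_points, noise_sd=2.5, trials=2000, spacing=50.0):
    """Mean absolute angular error (degrees) of a least-squares line fit
    to `num_points` equally spaced collinear markers whose pixel
    coordinates are corrupted by Gaussian noise of `noise_sd` pixels."""
    random.seed(0)  # reproducible trials
    total = 0.0
    for _ in range(trials):
        xs = [i * spacing for i in range(num_points)]
        ys = [random.gauss(0.0, noise_sd) for _ in range(num_points)]
        mx = sum(xs) / num_points
        my = sum(ys) / num_points
        # least-squares slope; the true (noise-free) slope is zero
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
            (x - mx) ** 2 for x in xs)
        total += abs(math.degrees(math.atan(slope)))
    return total / trials

# error shrinks as the number of edge points grows, mirroring Fig. 12d
errs = [pose_error(n) for n in (3, 5, 7)]
```

Consistent with the quoted sensitivity analysis, the simulated error is strictly decreasing as the number of edge points grows from 3 to 7.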
Najafi does not specifically teach a location code; and capturing the location code.
However, in a similar field of endeavor, Halmann teaches a system and method is provided for providing an indication of viewable and non-viewable parts of an interventional device in an ultrasound image [Abstract].
Halmann also teaches a location code; and capturing the location code (“The detection and recognition system 200 can then identify the pattern 47 for the needle 32 selected by the user and operate to locate that pattern 47 within the image data/ultrasound image 202. This information on the needle 32 to be used can be supplied to the detection and recognition system 200 by the user in various manners, such as through the input device 124, such as by manually entering identifying information on the needle 32, or by scanning a barcode or RFID located on packaging for the needle 32 including the identifying information.” [0027]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Najafi as outlined above with a location code; and capturing the location code, as taught by Halmann, because it is desirable to develop a system that can provide the user with an accurate indication of the location of the tip of the needle [0006].
Regarding Claim 12, Najafi teaches a length of the first marking area is smaller than a length of the second marking area (“The feature extraction procedure, described in the next section, finds the projected points in the image, and is calculated in the camera calibration procedure. Actual marker distances are assumed to be known. For an epidural needle, in particular, the marking distances are all equal. However, the proposed method is not limited to the equal spacing constraint and can be generalized for needles with non-equal marking spacings.” [Mathematical framework]).
Regarding Claim 13, Najafi teaches a length of the first marking area is smaller than a length of the second marking area (“The feature extraction procedure, described in the next section, finds the projected points in the image, and is calculated in the camera calibration procedure. Actual marker distances are assumed to be known. For an epidural needle, in particular, the marking distances are all equal. However, the proposed method is not limited to the equal spacing constraint and can be generalized for needles with non-equal marking spacings.” [Mathematical framework]).
Regarding Claim 14, Najafi does not specifically teach setting a bending threshold by the processor according to the bending relational expression, wherein the bending relational expression has a distance error mean value; and when the distance error mean value is smaller than the bending threshold, it is determined that the needle approaches the straight line by the processor.
However, in a similar field of endeavor, Halmann teaches setting a bending threshold by the processor according to the bending relational expression, wherein the bending relational expression has a distance error mean value; and when the distance error mean value is smaller than the bending threshold, it is determined that the needle approaches the straight line by the processor (“in the situation where the detection and recognition system 200 determines that less than the entire length of the first echogenic portion 50′ is represented or viewable within the ultrasound image 202, illustrating that the tip 42 has been directed and/or deflected out of the imaging plane for the image 202, the system 200 will position the boundary lines 404 along each side of the needle 32 represented in the ultrasound image 202. However, in this situation the boundary lines 404 presented by the system 202 will extend beyond the length of the needle 32 represented in the ultrasound image 202 to correspond in length and position to the actual position of the tip 42 of the needle 32 as determined by the detection and recognition system 200. As shown in the exemplary illustrated embodiment of FIG. 7 , the boundary lines 404 extend past the representation of the needle 32 to the estimated point where the tip 42 of the needle 32 is actually positioned as determined by the system 200, thus visually illustrating the position of the entire needle 32 including the tip 42 relative to the actual position of the first portion of the needle 32 that remains viewable in the ultrasound image 202 and bounded by a first portion 411 of the boundary lines 404. 
In this manner, the detection and recognition system 200 not only provides the user with a position of the needle tip 42 based on the position of the ends 410 of the boundary lines 404 in the ultrasound image 202 based on their alignment with the tip 42 as determined by the system 202, but additionally indicates an estimation of the length of the needle 32 that extends out of the image plane for the ultrasound image 202 as represented by the second portion 412 of the boundary lines 402 extending between the ends 410 of the boundary lines 404 and the foremost viewable part of the needle 32 within the ultrasound image 202.” [0035]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Najafi as outlined above with setting a bending threshold by the processor according to the bending relational expression, wherein the bending relational expression has a distance error mean value; and when the distance error mean value is smaller than the bending threshold, it is determined that the needle approaches the straight line by the processor, as taught by Halmann, because it is desirable to develop a system that can provide the user with an accurate indication of the location of the tip of the needle [0006].
Regarding Claim 15, Najafi teaches obtaining a depth of field length between the camera and the measuring object by the processor (“Proper values for these parameters should be chosen based on the experimental setup. Specifically, minpeakdistance is related to the length of the markings in pixels and, therefore, depends on the image resolution, lens field of view and distance of the needle to the camera.” [Collecting marking edge points]);
and obtaining a horizontal image range according to a horizontal viewing angle and the depth of field length of the camera by the processor (“By use of a linear stage, the needle was moved in 2-mm steps laterally at approximately 10- and 20-mm axial depths (26 points in total), as illustrated in Figure 11a. The tilt angle of the needle, the angle between the needle and the ultrasound image plane, was approximately 42°, and the angle between the needle and the optical axis of the camera was about 67°. The average angle between the needle and the camera image plane was 26.9°.” [Overall system accuracy]).
(Najafi, Fig. 11a: experimental setup with the needle translated laterally in 2-mm steps at approximately 10- and 20-mm axial depths.)
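Neither reference is quoted with an explicit formula, but the relationship the rejection relies on for Claims 15 and 16 — an image range determined by a viewing angle and a depth of field length — follows from simple pinhole-camera geometry. The sketch below is an illustration only; the 90° field of view and 100-mm depth are hypothetical values, not parameters from Najafi:

```python
import math

def image_range(depth_mm, fov_deg):
    """Linear extent (mm) covered at a given depth by a camera with the
    given angular field of view: range = 2 * depth * tan(fov / 2).
    Applies equally to the horizontal and vertical viewing angles."""
    return 2.0 * depth_mm * math.tan(math.radians(fov_deg) / 2.0)

# hypothetical example: 90-degree horizontal field of view at 100 mm depth
horizontal_range = image_range(100.0, 90.0)  # 200 mm
```

The same function with the vertical viewing angle yields the vertical image range of Claim 16; the range grows with both the depth of field length and the viewing angle.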
Regarding Claim 16, Najafi teaches obtaining a vertical image range by the processor according to a vertical viewing angle and the depth of field length of the camera (“By use of a linear stage, the needle was moved in 2-mm steps laterally at approximately 10- and 20-mm axial depths (26 points in total), as illustrated in Figure 11a. The tilt angle of the needle, the angle between the needle and the ultrasound image plane, was approximately 42°, and the angle between the needle and the optical axis of the camera was about 67°. The average angle between the needle and the camera image plane was 26.9°.” [Overall system accuracy]).
Regarding Claim 17, Najafi teaches all limitations noted above except obtaining a horizontal elongation by the processor according to the horizontal image range and an effective length of the needle.
However, in a similar field of endeavor, Halmann teaches obtaining a horizontal elongation by the processor according to the horizontal image range and an effective length of the needle (“This is accomplished by the recognition and detection system 200 by comparing the location and/or dimensions (e.g., length) of the echogenic portion(s) 50,50′ and associated pattern(s) 47,47′ detected in the ultrasound image 202 and associated with a particular needle 32 with the dimensions of the needle 32 stored in the memory unit 122. For example, if the needle 32 detected in the ultrasound image 202 includes two echogenic portions 50,50′ spaced from one another by a band 52, with the echogenic portion(s) 50,50′ and the band 52 each having a specified length, the recognition and detection system 200 can determine the length of the body 40 of the needle 32 that is present within the image 202 based on the length of the echogenic portion(s) 50,50′ and band(s) 52 visible in the image 202.” [0029], “the detection and recognition system 200 not only provides the user with a position of the needle tip 42 based on the position of the ends 410 of the boundary lines 404 in the ultrasound image 202 based on their alignment with the tip 42 as determined by the system 202, but additionally indicates an estimation of the length of the needle 32 that extends out of the image plane for the ultrasound image 202 as represented by the second portion 412 of the boundary lines 402 extending between the ends 410 of the boundary lines 404 and the foremost viewable part of the needle 32 within the ultrasound image 202.” [0035]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Najafi as outlined above with obtaining a horizontal elongation by the processor according to the horizontal image range and an effective length of the needle, as taught by Halmann, because it is desirable to develop a system that can provide the user with an accurate indication of the location of the tip of the needle [0006].
Regarding Claim 18, Najafi teaches all limitations noted above except that the effective length is greater than or equal to a sum of a first length of the first marking area and a second length of the second marking area.
However, in a similar field of endeavor, Halmann teaches that the effective length is greater than or equal to a sum of a first length of the first marking area and a second length of the second marking area (“This is accomplished by the recognition and detection system 200 by comparing the location and/or dimensions (e.g., length) of the echogenic portion(s) 50,50′ and associated pattern(s) 47,47′ detected in the ultrasound image 202 and associated with a particular needle 32 with the dimensions of the needle 32 stored in the memory unit 122. For example, if the needle 32 detected in the ultrasound image 202 includes two echogenic portions 50,50′ spaced from one another by a band 52, with the echogenic portion(s) 50,50′ and the band 52 each having a specified length, the recognition and detection system 200 can determine the length of the body 40 of the needle 32 that is present within the image 202 based on the length of the echogenic portion(s) 50,50′ and band(s) 52 visible in the image 202.” [0029], “the detection and recognition system 200 not only provides the user with a position of the needle tip 42 based on the position of the ends 410 of the boundary lines 404 in the ultrasound image 202 based on their alignment with the tip 42 as determined by the system 202, but additionally indicates an estimation of the length of the needle 32 that extends out of the image plane for the ultrasound image 202 as represented by the second portion 412 of the boundary lines 402 extending between the ends 410 of the boundary lines 404 and the foremost viewable part of the needle 32 within the ultrasound image 202.” [0035]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Najafi as outlined above with the effective length being greater than or equal to a sum of a first length of the first marking area and a second length of the second marking area, as taught by Halmann, because it is desirable to develop a system that can provide the user with an accurate indication of the location of the tip of the needle [0006].
Regarding Claim 19, Najafi teaches all limitations noted above except obtaining a vertical elongation by the processor according to the vertical image range and the effective length of the needle.
However, in a similar field of endeavor, Halmann teaches obtaining a vertical elongation by the processor according to the vertical image range and the effective length of the needle (“This is accomplished by the recognition and detection system 200 by comparing the location and/or dimensions (e.g., length) of the echogenic portion(s) 50,50′ and associated pattern(s) 47,47′ detected in the ultrasound image 202 and associated with a particular needle 32 with the dimensions of the needle 32 stored in the memory unit 122. For example, if the needle 32 detected in the ultrasound image 202 includes two echogenic portions 50,50′ spaced from one another by a band 52, with the echogenic portion(s) 50,50′ and the band 52 each having a specified length, the recognition and detection system 200 can determine the length of the body 40 of the needle 32 that is present within the image 202 based on the length of the echogenic portion(s) 50,50′ and band(s) 52 visible in the image 202.” [0029], “the detection and recognition system 200 not only provides the user with a position of the needle tip 42 based on the position of the ends 410 of the boundary lines 404 in the ultrasound image 202 based on their alignment with the tip 42 as determined by the system 202, but additionally indicates an estimation of the length of the needle 32 that extends out of the image plane for the ultrasound image 202 as represented by the second portion 412 of the boundary lines 402 extending between the ends 410 of the boundary lines 404 and the foremost viewable part of the needle 32 within the ultrasound image 202.” [0035]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Najafi as outlined above with obtaining a vertical elongation by the processor according to the vertical image range and the effective length of the needle, as taught by Halmann, because it is desirable to develop a system that can provide the user with an accurate indication of the location of the tip of the needle [0006].
Claims 10 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Najafi in view of Halmann as applied to Claims 9 and 19 above, and further in view of Dein (US 20090317002 A1; hereinafter referred to as Dein).
Regarding Claim 10, Najafi in view of Halmann teaches all limitations noted above except that the processor is further configured to perform the following steps according to the plurality of commands of the memory: obtaining an area elongation according to the horizontal image range, the vertical image range, and the effective length of the needle.
However, in a similar field of endeavor, Dein teaches intra-operative systems for identifying surgical sharp objects.
Dein also teaches that the processor is further configured to perform the following steps according to the plurality of commands of the memory: obtaining an area elongation according to the horizontal image range, the vertical image range, and the effective length of the needle (“Any convenient machine vision and image processing technique may be employed. To distinguish between specific surgical sharp objects such as a scalpel and a needle, the module may use variables such as the size, shape, aspect ratio, outline, or color of the of the sharp object, certain angles or curves, the two-dimensional projection that is unique to a given surgical sharp object, etc. For example, the curve formed by a certain size of surgical needle can be used to identify the needle via automated shape recognition protocols.” [0045]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Najafi and Halmann as outlined above such that the processor is further configured to perform the following steps according to the plurality of commands of the memory: obtaining an area elongation according to the horizontal image range, the vertical image range, and the effective length of the needle, as taught by Dein, because there is a need for improved systems and methods for identifying and tracking surgical items, including needles [0005].
Regarding Claim 20, Najafi in view of Halmann teaches all limitations noted above except obtaining an area elongation by the processor according to the horizontal image range, the vertical image range, and the effective length of the needle.
However, in a similar field of endeavor, Dein teaches intra-operative systems for identifying surgical sharp objects.
Dein also teaches obtaining an area elongation by the processor according to the horizontal image range, the vertical image range, and the effective length of the needle (“Any convenient machine vision and image processing technique may be employed. To distinguish between specific surgical sharp objects such as a scalpel and a needle, the module may use variables such as the size, shape, aspect ratio, outline, or color of the of the sharp object, certain angles or curves, the two-dimensional projection that is unique to a given surgical sharp object, etc. For example, the curve formed by a certain size of surgical needle can be used to identify the needle via automated shape recognition protocols.” [0045]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Najafi and Halmann as outlined above with obtaining an area elongation by the processor according to the horizontal image range, the vertical image range, and the effective length of the needle, as taught by Dein, because there is a need for improved systems and methods for identifying and tracking surgical items, including needles [0005].
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to STEVEN MALDONADO whose telephone number is 703-756-1421. The examiner can normally be reached 8:00 am-4:00 pm PST, M-Th. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at
http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christopher Koharski can be reached on (571) 272-7230. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Steven Maldonado/
Patent Examiner, Art Unit 3797
/CHRISTOPHER KOHARSKI/Supervisory Patent Examiner, Art Unit 3797