DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 09/15/2025 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Status
Claim(s) 1, 3, 9-15 and 17 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Konofagou et al (US 20070049824 A1; Konofagou).
Claim(s) 2 and 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Konofagou et al (US 20070049824 A1; Konofagou), and in view of Rigney et al (US 6,985,172 B1; Rigney).
Claim(s) 4-8 and 18-20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Konofagou et al (US 20070049824 A1; Konofagou), and in view of Lysyansky et al (US 20040143189 A1; Lysyansky).
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claim(s) 1, 3, 9-15 and 17 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Konofagou et al (US 20070049824 A1; Konofagou).
Regarding claim 1, Konofagou discloses a method (Paragraph 51: “The system and methods described herein are useful for analyzing data obtained by an image generating device, such as an ultrasound transducer”) comprising:
receiving a set of temporal images of a target area (the heart), wherein the set of temporal images of the target area comprise a sequence of images of the target area taken over a period of time; (Fig.2 and Paragraph 54: “In an early stage in the procedure, raw imaging data of the body structure is acquired by image acquisition equipment such as the ultrasound probe 102 and scanner 104. In the exemplary embodiment, a set of N frames of raw ultrasound data of the heart is acquired during a cardiac cycle at high frame rate, e.g., higher than 100 fps, although frame rates of about 56 fps and 170 fps, etc., yield useful results (step 202).”; Paragraph 74: “This setup allows the real-time acquisition of more than one thousand 2D RF-data, e.g., images.”; Paragraph 76: “addition to the real-time scanning mode, a high frame rate acquisition mode (EKV) was provided on the scanner in the exemplary embodiment in order to allow detailed visualization of the heart contraction. The equipment operate as quickly as 8000 frames per minute,”)
generating a set of temporal difference images (N-1 displacement 2D maps) from the set of temporal images; (Paragraphs 56-57: “At step 206, the raw data received from the image acquisition equipment is processed. In the exemplary embodiment, the data processing step computes an estimation of the displacement of particular objects in the images, such as the myocardium, between consecutive frames. … N-1 displacement 2D maps (also referred to as correlation matrices) are computed through the correlation of two consecutive frames i and i+1 (1<i<N-1). Each frame is represented by a matrix of pixel values. The displacement maps provide an indication of the local axial movements between frames. … a matlab program Multiframe (see Appendix) is used to compute the displacement maps for the complete sequence of frames obtained at step 202, above. Multiframe calls the matlab routine FunCalculDispl (see Appendix) to compute the displacements for the sequence of frames. … ”)
generating a regional motion display from the set of temporal difference images comprising a representative line through the target area along a y-axis and an x-axis representing time; (Paragraph 59: “a video of the sequence of N-1 displacement maps may be assembled to create a video of the displacements of the body structure or tissue (step 212). In the exemplary embodiment, a video of the myocardium displacements is created by this technique. … The video of the displacement map of the myocardium will depict the propagation of the electromechanical wave. A next step in the procedure may be observation and tracking of the wave propagation (step 214). … The parameters of the electromechanical wave, e.g., velocity, amplitude, attenuation, frequency, etc., may be analyzed at step 216. For example, the velocity of the electromechanical wave may be computed as a function of its position in the myocardium.”; Fig.20a-20b: “ Paragraph 85: “A temporal analysis of the motion was performed for single RF lines of the image. The axial displacement along one central line of the image (indicated by the white, dotted vertical line 2050 on FIG. 20(b)) is shown as a function of time in FIG. 21(a) with the corresponding ECG signal (FIG. 21(d)). (FIGS. 2(a)-(d) are aligned on a temporal basis.) On this line 2050, the displacements of the septum, the papillary muscle and the posterior wall are shown in a M-mode format over two cardiac cycles. It shows the successive main phases of the cardiac cycle: the contraction of the myocardium (systole) indicated by arrow 2120 initiated at the R-wave peak of the ECG, followed by the relaxation phase (diastole) indicated by arrow 2130. The duration of the active contraction was approximately 50 ms, and that of the relaxation 35 ms. In addition to this slow and large motion, some rapid transient variations of a few ms were observed at the beginning and at the end of the systolic phase, in the septum and the posterior wall.”) and
displaying the regional motion display. (Figs. 20A-D,Fig. 21A-D, 22-25; and Paragraph 53: “ Output files 112 may include the displacement maps, videos of myocardium displacements, or computed data, such as electromechanical wave properties. … Typically, an output device, such as monitor 114, and an input device, such as keyboard 116, are also components of the system.”)
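Examiner's note: the following is a minimal, non-limiting sketch (in Python with numpy, which is assumed here; Konofagou's cited Multiframe, FunCalculDispl, and Correlation2D.cpp routines are MATLAB and C programs that are not reproduced) of how consecutive-frame displacement-style difference maps and an M-mode-like regional motion display, with position along a representative line on the y-axis and frame index (time) on the x-axis, could be assembled. The function names, window and search sizes, and the simple column-wise correlation search are illustrative assumptions only, not the method of the reference or of the claims.

import numpy as np

def displacement_maps(frames, win=16, search=8):
    """Estimate axial displacement between consecutive frames by 1-D
    cross-correlation of short windows along each image column.
    frames: (N, H, W) array of grayscale/RF frames.
    Returns an (N-1, H//win, W) array of per-window axial shifts."""
    n, h, w = frames.shape
    n_win = h // win
    maps = np.zeros((n - 1, n_win, w))
    for i in range(n - 1):
        a, b = frames[i], frames[i + 1]
        for c in range(w):
            for k in range(n_win):
                ref = a[k * win:(k + 1) * win, c]
                best, best_shift = -np.inf, 0
                for s in range(-search, search + 1):
                    lo, hi = k * win + s, (k + 1) * win + s
                    if lo < 0 or hi > h:
                        continue
                    cand = b[lo:hi, c]
                    score = float(np.dot(ref - ref.mean(), cand - cand.mean()))
                    if score > best:
                        best, best_shift = score, s
                maps[i, k, c] = best_shift
    return maps

def regional_motion_display(maps, column):
    """M-mode-style display: one image column (the representative line)
    plotted along y, frame index (time) along x."""
    return maps[:, :, column].T   # shape (H//win, N-1)

For example, regional_motion_display(displacement_maps(frames), column=frames.shape[2] // 2) would produce a depth-versus-time map along the central image line, analogous to the M-mode presentation of FIG. 21(a) of Konofagou.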
Regarding claim 3, Konofagou discloses generating the regional motion display from the set of temporal difference images comprises: calculating, for each y-axis pixel of the regional motion display, an integral of all pixels along a line perpendicular to the target area, a non-perpendicular line to the target area, a contour along the target area, or a surface of the target area in a temporally corresponding difference image of the set of temporal difference images; (Paragraph 57: “N-1 displacement 2D maps (also referred to as correlation matrices) are computed through the correlation of two consecutive frames i and i+1 (1<i<N-1). Each frame is represented by a matrix of pixel values. The displacement maps provide an indication of the local axial movements between frames. … Multiframe calls the matlab routine FunCalculDispl (see Appendix) to compute the displacements for the sequence of frames. FunCalculDispl in turn calls the routine Correlation2D.cpp (see Appendix) which is a C program that computes the displacement map between consecutive frames. … a video of the sequence of N-1 displacement maps may be assembled to create a video of the displacements of the body structure or tissue (step 212). In the exemplary embodiment, a video of the myocardium displacements is created by this technique.”; Paragraph 14: “A correlation calculation is performed on the image frames to generate a matrix with the location of correlation maxima representing the relative displacement between the first and second image frames, also referred to as a displacement map. A video is generated comprising a series of displacement maps.”; this shows that the correlation calculation is performed on all pixel values, i.e., the matrix of pixel values, of the image frames) and
assigning, for each y-axis pixel of the regional motion display, the integral of all pixels along the line perpendicular to the target area, the non-perpendicular line to the target area, the contour along the target area, or the surface of the target area as a pixel value for a temporally corresponding portion of the representative line through the target area along the y-axis. (Figs. 20(a)-(d); Fig. 21(a)-(d) and Paragraphs 84-85: “FIG. 20(a) and 20(b) show the color-coded axial displacements overlaid onto the grayscale B-mode image for two different phases of the cardiac cycle. … A temporal analysis of the motion was performed for single RF lines of the image. The axial displacement along one central line of the image (indicated by the white, dotted vertical line 2050 on FIG. 20(b)) is shown as a function of time in FIG. 21(a) with the corresponding ECG signal (FIG. 21(d)).”; Paragraph 14: “A correlation calculation is performed on the image frames to generate a matrix with the location of correlation maxima representing the relative displacement between the first and second image frames, also referred to as a displacement map. A video is generated comprising a series of displacement maps.”; this shows that the y-axis pixel values are represented in Figs. 20 and 21 as depth (mm) plotted against time (ms) along the x-axis.)
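Examiner's note: as a non-limiting illustration of the claim 3 limitations mapped above, the sketch below (Python/numpy assumed) integrates, for each y-axis pixel, the pixel values of the temporally corresponding difference image along a line segment and assigns the result as one column of the display. The band-selection parameters and function names are illustrative assumptions, not taken from the reference.

import numpy as np

def motion_column(diff_img, x0=None, width=None):
    """For one temporal difference image, compute, for each y, the integral
    (sum) of pixel values along a horizontal line segment perpendicular to a
    vertical target line. x0/width optionally restrict the integration to a
    band around the target; the defaults integrate the full row."""
    h, w = diff_img.shape
    if x0 is None or width is None:
        band = diff_img
    else:
        band = diff_img[:, max(0, x0 - width):min(w, x0 + width)]
    return band.sum(axis=1)           # one value per y-axis pixel

def build_display(diff_stack, x0=None, width=None):
    """Stack the per-image columns so the y-axis is position along the
    representative line and the x-axis is time (one column per difference image)."""
    cols = [motion_column(d, x0, width) for d in diff_stack]
    return np.stack(cols, axis=1)     # shape (H, N-1)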
Regarding claim 9, The method of claim 1, wherein the regional motion display further comprises one or more time-synchronization elements displayed along the x-axis. (Paragraphs 33-34: “ FIG. 21(c) is a time plot illustrating the temporal variation of the axial displacements after bandpass filtering of the plot illustrated in FIG. 21(a) showing the transient and high frequency components … FIG. 21(d) illustrates the ECG signal acquired simultaneously with the data illustrated in FIGS. 21(a)-(c)”)
Regarding claim 10, Konofagou discloses the target area is a heart of a patient, wherein the one or more time-synchronization elements is a temporally corresponding electrocardiogram displayed along the x-axis. (Paragraph 42: “ FIGS. 26(a)-(e) illustrate a sequence of axial displacement maps overlaid to the grayscale image (0.12 ms between successive frames) indicating an electromechanical wave propagating in the posterior wall of the mouse from the apex towards the base during pacing in the right atrium close to the sinoatrial node”; Fig. 27(a)-(e)”)
Regarding claim 11, Konofagou discloses the set of temporal images of the target area comprises at least 250 images per second. (Paragraph 77: “The data were then processed off-line, RF-lines were synchronized using the R-wave peak of the ECG signal, and a complete set of 2D ultrasound RF-data was reconstructed at 8000 fps for one complete cardiac cycle (approximately 150 ms).”)
Regarding claim 12, Konofagou discloses further comprising adjusting the regional motion display by temporal scaling. (Paragraph 79: “The axial displacements were analyzed in the frequency domain as a function of the time. … The displacement estimates were temporally filtered using an FIR band-pass filter with cut-frequencies of f.sub.1=50 Hz and f.sub.2=500 Hz, which allows the removal of the low frequency components but also the high frequency noise.”)
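Examiner's note: a brief, non-limiting sketch (Python with scipy assumed) of the temporal FIR band-pass filtering of the displacement traces described in the cited Paragraph 79 (cut-frequencies of 50 Hz and 500 Hz); the tap count and function names are illustrative assumptions.

import numpy as np
from scipy.signal import firwin, filtfilt

def bandpass_displacements(disp_time, fs, f1=50.0, f2=500.0, numtaps=101):
    """Temporally filter an (H, T) displacement-versus-time map with an FIR
    band-pass, as in the 50-500 Hz filtering Konofagou describes; fs is the
    effective frame rate in Hz. Applied along the time axis for every depth sample."""
    taps = firwin(numtaps, [f1, f2], pass_zero=False, fs=fs)
    return filtfilt(taps, [1.0], disp_time, axis=1)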
Regarding claim 13, Konofagou discloses further comprising automatically detecting a propagating event. (Paragraph 69: “On the displacement video, two electromechanical waves were clearly detected, propagating in the posterior wall of the left ventricular, from the septum (left side of the images) to the lateral wall (right side). The propagation of the mechanical wave corresponds to the electrical activity shown on an associated EKG.”; Paragraphs 87-88: “FIG. 24 shows a wave propagating in the posterior wall (see the white arrows W). The displacements were initiated at the apex (left side of the images) and then propagated towards the base (right side). The phase velocity was determined using the method previously described at the frequency of 80 Hz. The distance of propagation was plotted in FIG. 25 as a function of the phase of the wave divided by the angular frequency. The phase velocity of the wave was obtained using a linear regression fit and was estimated to be 0.44 m/s.”)
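Examiner's note: the sketch below (Python/numpy assumed) illustrates one simple way a propagating event could be detected automatically, by marking the first threshold crossing at each y-axis position of the regional motion display; the thresholding approach is an illustrative assumption, not the detection method of the reference.

import numpy as np

def onset_times(display, fs, threshold):
    """For each y-axis position of the regional motion display (H, T),
    return the time (in seconds) of the first sample whose magnitude
    exceeds a threshold, or NaN if it never does -- a crude way to mark
    when a propagating event reaches each position."""
    h, t = display.shape
    onsets = np.full(h, np.nan)
    for y in range(h):
        idx = np.flatnonzero(np.abs(display[y]) > threshold)
        if idx.size:
            onsets[y] = idx[0] / fs
    return onsets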
Regarding claim 14, Konofagou discloses further comprising measuring at least one of onset timing of the propagating event, duration of the propagating event, and velocity of the propagating event. (Paragraph 90: “FIG. 24 shows a wave propagating in the posterior wall (see the white arrows W). The displacements were initiated at the apex (left side of the images) and then propagated towards the base (right side). The phase velocity was determined using the method previously described at the frequency of 80 Hz. The distance of propagation was plotted in FIG. 25 as a function of the phase of the wave divided by the angular frequency. The phase velocity of the wave was obtained using a linear regression fit and was estimated to be 0.44 m/s.”)
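Examiner's note: a non-limiting sketch (Python/numpy assumed) of estimating propagation velocity by a linear regression fit of propagation distance against arrival time, analogous to the linear-regression phase-velocity estimate (approximately 0.44 m/s) described in the cited Paragraph 90; the distance-versus-onset-time formulation is an illustrative simplification of the reference's distance-versus-phase/angular-frequency plot.

import numpy as np

def propagation_velocity(positions_m, onset_times_s):
    """Fit distance = v * time + c by least squares and return the slope v
    (m/s). positions_m are locations along the propagation path and
    onset_times_s the corresponding arrival times (NaN entries are ignored)."""
    mask = np.isfinite(onset_times_s)
    slope, _ = np.polyfit(onset_times_s[mask], positions_m[mask], 1)
    return slope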
Regarding claim 15, Konofagou discloses a computing device (Paragraph 51: “The system and methods described herein are useful for analyzing data obtained by an image generating device, such as an ultrasound transducer”) comprising: a processor, memory, and instructions stored in the memory that, when executed by the processor, (Fig.1 and Paragraph 53: “The raw data produced by the scanner 104 may be transferred to a computer 106 having a CPU 108 for processing the data. In the exemplary embodiment, the computer and CPU would be Dell PC with a 2 GHz processor.”) direct the computing device to:
receive a set of temporal images of a target area, wherein the set of temporal images of the target area comprise a sequence of images of the target area taken over a period of time; (Fig.2 and Paragraph 54: “In an early stage in the procedure, raw imaging data of the body structure is acquired by image acquisition equipment such as the ultrasound probe 102 and scanner 104. In the exemplary embodiment, a set of N frames of raw ultrasound data of the heart is acquired during a cardiac cycle at high frame rate, e.g., higher than 100 fps, although frame rates of about 56 fps and 170 fps, etc., yield useful results (step 202).”; Paragraph 74: “This setup allows the real-time acquisition of more than one thousand 2D RF-data, e.g., images.”; Paragraph 76: “addition to the real-time scanning mode, a high frame rate acquisition mode (EKV) was provided on the scanner in the exemplary embodiment in order to allow detailed visualization of the heart contraction. The equipment operate as quickly as 8000 frames per minute,”)
generate a set of temporal difference images (N-1 displacement 2D maps) from the set of temporal images; (Paragraphs 56-57: “At step 206, the raw data received from the image acquisition equipment is processed. In the exemplary embodiment, the data processing step computes an estimation of the displacement of particular objects in the images, such as the myocardium, between consecutive frames. … N-1 displacement 2D maps (also referred to as correlation matrices) are computed through the correlation of two consecutive frames i and i+1 (1<i<N-1). Each frame is represented by a matrix of pixel values. The displacement maps provide an indication of the local axial movements between frames. … a matlab program Multiframe (see Appendix) is used to compute the displacement maps for the complete sequence of frames obtained at step 202, above. Multiframe calls the matlab routine FunCalculDispl (see Appendix) to compute the displacements for the sequence of frames. … ”)
generate a regional motion display from the set of temporal difference images comprising a representative line through the target area along a y-axis and an x-axis representing time; (Paragraph 59: “a video of the sequence of N-1 displacement maps may be assembled to create a video of the displacements of the body structure or tissue (step 212). In the exemplary embodiment, a video of the myocardium displacements is created by this technique. … The video of the displacement map of the myocardium will depict the propagation of the electromechanical wave. A next step in the procedure may be observation and tracking of the wave propagation (step 214). … The parameters of the electromechanical wave, e.g., velocity, amplitude, attenuation, frequency, etc., may be analyzed at step 216. For example, the velocity of the electromechanical wave may be computed as a function of its position in the myocardium.”; Fig.20a-20b: “ Paragraph 85: “A temporal analysis of the motion was performed for single RF lines of the image. The axial displacement along one central line of the image (indicated by the white, dotted vertical line 2050 on FIG. 20(b)) is shown as a function of time in FIG. 21(a) with the corresponding ECG signal (FIG. 21(d)). (FIGS. 2(a)-(d) are aligned on a temporal basis.) On this line 2050, the displacements of the septum, the papillary muscle and the posterior wall are shown in a M-mode format over two cardiac cycles. It shows the successive main phases of the cardiac cycle: the contraction of the myocardium (systole) indicated by arrow 2120 initiated at the R-wave peak of the ECG, followed by the relaxation phase (diastole) indicated by arrow 2130. The duration of the active contraction was approximately 50 ms, and that of the relaxation 35 ms. In addition to this slow and large motion, some rapid transient variations of a few ms were observed at the beginning and at the end of the systolic phase, in the septum and the posterior wall.”) and
cause to display the regional motion display. (Figs. 20A-D,Fig. 21A-D, 22-25; and Paragraph 53: “ Output files 112 may include the displacement maps, videos of myocardium displacements, or computed data, such as electromechanical wave properties. … Typically, an output device, such as monitor 114, and an input device, such as keyboard 116, are also components of the system.”)
Regarding claim 17, Konofagou discloses generating the regional motion display from the set of temporal difference images comprises: calculating, for each y-axis pixel of the regional motion display, an integral of all pixels along a line perpendicular to the target area, a non-perpendicular line to the target area, a contour along the target area, or a surface of the target area in a temporally corresponding difference image of the set of temporal difference images; (Paragraph 57: “N-1 displacement 2D maps (also referred to as correlation matrices) are computed through the correlation of two consecutive frames i and i+1 (1<i<N-1). Each frame is represented by a matrix of pixel values. The displacement maps provide an indication of the local axial movements between frames. … Multiframe calls the matlab routine FunCalculDispl (see Appendix) to compute the displacements for the sequence of frames. FunCalculDispl in turn calls the routine Correlation2D.cpp (see Appendix) which is a C program that computes the displacement map between consecutive frames. … a video of the sequence of N-1 displacement maps may be assembled to create a video of the displacements of the body structure or tissue (step 212). In the exemplary embodiment, a video of the myocardium displacements is created by this technique.”; Paragraph 14: “A correlation calculation is performed on the image frames to generate a matrix with the location of correlation maxima representing the relative displacement between the first and second image frames, also referred to as a displacement map. A video is generated comprising a series of displacement maps.”; this shows that the correlation calculation is performed on all pixel values, i.e., the matrix of pixel values, of the image frames) and assigning, for each y-axis pixel of the regional motion display, the integral of all pixels along the line perpendicular to the target area, the non-perpendicular line to the target area, the contour along the target area, or the surface of the target area as a pixel value for a temporally corresponding portion of the representative line through the target area along the y-axis. (Figs. 20(a)-(d); Fig. 21(a)-(d) and Paragraphs 84-85: “FIG. 20(a) and 20(b) show the color-coded axial displacements overlaid onto the grayscale B-mode image for two different phases of the cardiac cycle. … A temporal analysis of the motion was performed for single RF lines of the image. The axial displacement along one central line of the image (indicated by the white, dotted vertical line 2050 on FIG. 20(b)) is shown as a function of time in FIG. 21(a) with the corresponding ECG signal (FIG. 21(d)).”; Paragraph 14: “A correlation calculation is performed on the image frames to generate a matrix with the location of correlation maxima representing the relative displacement between the first and second image frames, also referred to as a displacement map. A video is generated comprising a series of displacement maps.”; this shows that the y-axis pixel values are represented in Figs. 20 and 21 as depth (mm) plotted against time (ms) along the x-axis.)
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim(s) 2 and 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Konofagou et al (US 20070049824 A1; Konofagou), and in view of Rigney et al (US 6,985,172 B1; Rigney).
Regarding claim 2, Konofagou discloses the claimed invention except wherein generating the set of temporal difference images from the set of temporal images comprises, for two temporally sequential images of the set of temporal images: calculating an absolute difference in detected brightness value of each pixel and assigning the absolute difference in the detected brightness value as a brightness value of a corresponding pixel in a difference image of the set of temporal difference images.
Rigney discloses generating the set of temporal difference images from the set of temporal images comprises, for two temporally sequential images of the set of temporal images: calculating an absolute difference in detected brightness value of each pixel (Fig. 4-5; Col 7 – lines 4-11: “ The temporal difference image may be computed as the absolute value of the difference between the sensor image 23 and the reference image 22 … for the special cases of detecting motion objects brighter than or darker than the reference image (scene background).”) and assigning the absolute difference in the detected brightness value as a brightness value of a corresponding pixel in a difference image of the set of temporal difference images. (Figs.4-5 and Col 7 – lines 4-31: “Temporal difference filtering involves detecting motion in a sensor image by computing a temporal difference image from image 23 and reference image 22, which is absent of motion objects. The temporal difference image may be computed as the absolute value of the difference between the sensor image 23 and the reference image 22. Alternatively, the temporal difference image may be computed as the positive or negative signed difference between the sensor image 23 and the reference image 22, for the special cases of detecting motion objects brighter than or darker than the reference image (scene background). The result of the temporal difference computation is a temporal difference image 24. … FIG. 5 illustrates the computation of temporal difference statistics, obtained from a set of temporal difference images 24. … Another statistic is an intensity of Nth Percentile. Another statistic is a measure of mean and standard deviation. Computations are evaluated at each pixel over the set of temporal difference images. )
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Konofagou by including the temporal processing and model-based analysis of video images taught by Rigney, in order to achieve motion detection in video images; thus, one of ordinary skill in the art would have been motivated to combine the references since doing so would improve the detection of moving objects as well as enhance image processing techniques applied to infrared and visible light spectrum images in a time sequence.
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
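Examiner's note: a minimal, non-limiting sketch (Python/numpy assumed) of the absolute-value temporal difference attributed to Rigney above, here applied to temporally sequential frames as recited in the claim rather than to a fixed reference image; the function name is an illustrative assumption.

import numpy as np

def abs_difference_images(frames):
    """For each pair of temporally sequential frames, assign
    |brightness(i+1) - brightness(i)| as the brightness of the corresponding
    pixel of the difference image (cf. Rigney's absolute-value temporal
    difference computation)."""
    frames = frames.astype(np.float64)
    return np.abs(frames[1:] - frames[:-1])   # shape (N-1, H, W)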
Regarding claim 16, Konofagou discloses the claimed invention except that generating the set of temporal difference images from the set of temporal images comprises, for two temporally sequential images of the set of temporal images: calculating an absolute difference in detected brightness value of each pixel; and assigning the absolute difference in the detected brightness value as a brightness value of a corresponding pixel in a difference image of the set of temporal difference images.
Rigney discloses generating the set of temporal difference images from the set of temporal images comprises, for two temporally sequential images of the set of temporal images: calculating an absolute difference in detected brightness value of each pixel (Fig. 4-5; Col 7- lines 4-11: “ The temporal difference image may be computed as the absolute value of the difference between the sensor image 23 and the reference image 22 … for the special cases of detecting motion objects brighter than or darker than the reference image (scene background).”) and assigning the absolute difference in the detected brightness value as a brightness value of a corresponding pixel in a difference image of the set of temporal difference images. (Figs.4-5 and Col 7 – lines 4-31: “Temporal difference filtering involves detecting motion in a sensor image by computing a temporal difference image from image 23 and reference image 22, which is absent of motion objects. The temporal difference image may be computed as the absolute value of the difference between the sensor image 23 and the reference image 22. Alternatively, the temporal difference image may be computed as the positive or negative signed difference between the sensor image 23 and the reference image 22, for the special cases of detecting motion objects brighter than or darker than the reference image (scene background). The result of the temporal difference computation is a temporal difference image 24. … FIG. 5 illustrates the computation of temporal difference statistics, obtained from a set of temporal difference images 24. … Another statistic is an intensity of Nth Percentile. Another statistic is a measure of mean and standard deviation. Computations are evaluated at each pixel over the set of temporal difference images. )
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Konofagou by including the temporal processing and model-based analysis of video images taught by Rigney, in order to achieve motion detection in video images; thus, one of ordinary skill in the art would have been motivated to combine the references since doing so would improve the detection of moving objects as well as enhance image processing techniques applied to infrared and visible light spectrum images in a time sequence.
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
Claim(s) 4-8 and 18-20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Konofagou et al (US 20070049824 A1; Konofagou), and in view of Lysyansky et al (US 20040143189 A1; Lysyansky).
Regarding claim 4, Konofagou discloses the claimed invention except for further comprising selecting a corresponding region of interest from the target area across the set of temporal difference images; wherein the integral is calculated for all pixels along a line perpendicular to the corresponding region of interest, a non-perpendicular line to the corresponding region of interest, a contour along the corresponding region of interest, or a surface of the corresponding region of interest; and wherein the integral calculated for all pixels along the line perpendicular to the corresponding region of interest, the non-perpendicular line to the corresponding region of interest, the contour along the corresponding region of interest, or the surface of the corresponding region of interest is assigned as the pixel value for the temporally corresponding portion of the representative line through the corresponding region of interest along the y-axis.
Lysyansky discloses further comprising selecting a corresponding region of interest from the target area across the set of temporal difference images; (Paragraph 36-37: “ Visual observation by an echo-cardiographer for the purpose of diagnosing myocardial tissue abnormalities may be based on an observation of the features 136. Therefore, an image processing algorithm was developed to recognize and select the features 136 inside the ROI 132 … FIG. 4 illustrates the portion 140 of the ultrasound image frame 130 of FIG. 3.First locations of a set of features 142 are illustrated with small bullets, and represent features 136 identified (selected) on a current frame 130. Second locations of a set of features 144 are illustrated with large bullets, and represent the same features 136 which are found (tracked) on a next consecutive frame 130.”) wherein the integral is calculated for all pixels along a line perpendicular to the corresponding region of interest, a non-perpendicular line to the corresponding region of interest, a contour along the corresponding region of interest, or a surface of the corresponding region of interest; (Figs.5-6 and Paragraphs 39: “The geometric shift of the feature 136 illustrated by the first and second locations 152, 154 represents local tissue movement. The local tissue velocity can be calculated as a location shift divided by the time between B-mode frames: (Vx,Vy)=(dX,dY)*FR Equation 1 “) and wherein the integral calculated for all pixels along the line perpendicular to the corresponding region of interest, the non-perpendicular line to the corresponding region of interest, the contour along the corresponding region of interest, or the surface of the corresponding region of interest is assigned as the pixel value for the temporally corresponding portion of the representative line through the corresponding region of interest along the y-axis. (Figs.5-6 and Paragraphs 39-41: “The geometric shift of the feature 136 illustrated by the first and second locations 152, 154 represents local tissue movement. The local tissue velocity can be calculated as a location shift divided by the time between B-mode frames: (Vx,Vy)=(dX,dY)*FR Equation 1 … Equation 1 is repeated to calculate the 2D velocity vector 156 for each feature 136 identified on consecutive frames 130, resulting in a discrete set of tissue velocity vectors 156. … Each bullet on graph 160 represents the velocity 162 of a single feature 136 calculated between two consecutive image frames 130. The position of each feature 136 may be defined relative to the main axis L of the ROI 132 (FIG. 2), such as along (longitudinal to) and across (transverse to) the main axis L. L(pixels) represents the coordinate position of each feature 136 along the main axis L. The transverse coordinate axis is not illustrated, but is used as a second dimension when calculating two-dimensional tissue dynamics.”)
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Konofagou by including an image processing algorithm that recognizes and selects features inside a region of interest in an ultrasound image frame, as taught by Lysyansky, in order to achieve an ultrasonic system for assessing tissue motion and deformation; thus, one of ordinary skill in the art would have been motivated to combine the references since doing so would improve the accuracy of heart muscle assessment by making the assessment more objective and quantitative, as well as enhance the assessment of myocardium performance.
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
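Examiner's note: a non-limiting sketch of Lysyansky's Equation 1, (Vx, Vy) = (dX, dY)*FR, which converts the tracked shift of a feature between consecutive frames into a local tissue velocity; plain Python is assumed and the argument names are illustrative.

def tissue_velocity(p_current, p_next, frame_rate_hz):
    """Local tissue velocity from a feature tracked between consecutive
    frames, per Lysyansky's Equation 1: (Vx, Vy) = (dX, dY) * FR.
    p_current / p_next are (x, y) positions in physical units (e.g., mm),
    so the result is in those units per second."""
    dx = p_next[0] - p_current[0]
    dy = p_next[1] - p_current[1]
    return dx * frame_rate_hz, dy * frame_rate_hz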
Regarding claim 5, Konofagou as modified by Lysyansky discloses the claimed invention. Lysyansky further discloses that selecting the corresponding region of interest across the set of temporal difference images is received via manual input. (Paragraph 33: “The ROI 132 may be entered via user input 124 or can be defined using any automatic myocardial outline procedure such as edge detection. The ROI 132 is selected in accordance with anatomical structure of the heart muscle.”; Fig. 7 and Paragraph 44: “In Step 172, a region of interest, such as ROI 132, is defined on the current frame 130. An echo-cardiographer may identify the ROI 132 through the user input 124.”)
Regarding claim 6, Konofagou as modified by Lysyansky discloses the claimed invention. Lysyansky further discloses that selecting the corresponding region of interest across the set of temporal difference images is performed by automatic anatomical object recognition. (Paragraph 33: “The ROI 132 may be entered via user input 124 or can be defined using any automatic myocardial outline procedure such as edge detection. The ROI 132 is selected in accordance with anatomical structure of the heart muscle.”; Fig. 7 and Paragraph 44: “In Step 172, a region of interest, such as ROI 132, is defined on the current frame 130. An echo-cardiographer may identify the ROI 132 through the user input 124. Alternatively, the ROI 132 may be defined by software algorithms stored in the processor 116.”)
Regarding claim 7, Konofagou discloses the claimed invention except wherein generating the regional motion display from the set of temporal difference images further comprises applying, for each y-axis pixel of the regional motion display, a brightness transfer function to the integral of all pixels along the line perpendicular to the target area, the non-perpendicular line to the target area, the contour along the target area, or the surface of the target area, wherein a result of the brightness transfer function is assigned as the pixel value for the temporally corresponding portion of the representative line through the target area along the y-axis.
Lysyansky discloses generating the regional motion display from the set of temporal difference images further comprises applying, for each y-axis pixel of the regional motion display, a brightness transfer function to the integral of all pixels along the line perpendicular to the target area, the non-perpendicular line to the target area, the contour along the target area, or the surface of the target area, wherein a result of the brightness transfer function is assigned as the pixel value for the temporally corresponding portion of the representative line through the target area along the y-axis. (Paragraph 33: “Points 134, illustrated as bullets, define a main axis L, or a mid line along the ROI 132, and will be discussed further below. Tissue motions may be described in terms of longitudinal (along) and transverse (across) movements relative to the main axis L. The ultrasound data representing blood is typically displayed as a darker or black pixel, while heart tissue is displayed as a lighter or white pixel. Alternatively, the blood and heart tissue may be displayed using color.”; the mapping of blood to darker/black pixels and heart tissue to lighter/white pixels is interpreted as a “brightness transfer function”)
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Konofagou by including the display and image processing techniques taught by Lysyansky, in order to achieve an ultrasonic system for assessing tissue motion and deformation; thus, one of ordinary skill in the art would have been motivated to combine the references since doing so would improve the detection and display of tissue motion as well as enhance the image processing techniques applied to the ultrasound image sequence.
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
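Examiner's note: neither reference gives an explicit formula for a brightness transfer function; the sketch below (Python/numpy assumed) shows one possible mapping, normalization followed by a gamma curve to 8-bit display levels, applied to the integrated values before they are written into the display column. The specific gamma mapping and function name are assumptions for illustration only.

import numpy as np

def brightness_transfer(values, gamma=0.7, out_max=255):
    """One possible brightness transfer function: normalize the integrated
    values to [0, 1], apply a gamma curve, and scale to 8-bit display levels."""
    v = np.asarray(values, dtype=np.float64)
    rng = v.max() - v.min()
    v = (v - v.min()) / rng if rng > 0 else np.zeros_like(v)
    return (out_max * v ** gamma).astype(np.uint8)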
Regarding claim 8, Konofagou discloses the claimed invention except wherein the integral of all pixels along the line perpendicular to the target area, the non-perpendicular line to the target area, the contour along the target area, or the surface of the target area in a temporally corresponding difference image of the set of temporal difference images is calculated using a weighted function.
Lysyansky discloses that the integral of all pixels along the line perpendicular to the target area, the non-perpendicular line to the target area, the contour along the target area, or the surface of the target area in a temporally corresponding difference image of the set of temporal difference images is calculated using a weighted function. (Paragraph 41: “FIG. 6 illustrates the transition from randomly distributed tissue velocities 162 to a continuous velocity distribution … position of each feature 136 may be defined relative to the main axis L of the ROI 132 (FIG. 2), such as along (longitudinal to) and across (transverse to) the main axis L. L(pixels) represents the coordinate position of each feature 136 along the main axis L. … This approximation allows an accurate presentation of the continuous velocity distribution within the ROI 132 area. As was mentioned above, correlation values are used as confidential weights in the fitting procedure, thus the resulting velocity representative function is closer to the velocities 162 with higher correlation coefficient (confidence level) and farther from the velocities 162 with lower correlation coefficient.”; the use of correlation values as confidence weights in the fitting procedure is interpreted as the claimed weighted function)
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Konofagou by including the display and image processing techniques taught by Lysyansky, in order to achieve an ultrasonic system for assessing tissue motion and deformation; thus, one of ordinary skill in the art would have been motivated to combine the references since doing so would improve the detection and display of tissue motion as well as enhance the image processing techniques applied to the ultrasound image sequence.
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
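Examiner's note: a non-limiting sketch (Python/numpy assumed) of a weighted version of the per-y line integral, in which per-pixel confidence weights, such as the correlation values Lysyansky uses as confidence weights in its fitting procedure, replace the unweighted sum; the weighting scheme and band parameters shown are illustrative assumptions.

import numpy as np

def weighted_line_integral(diff_img, weights, x0, width):
    """Weighted per-y line integral: pixel values in a band around the
    representative line are combined with per-pixel weights (e.g.,
    correlation-confidence values) instead of an unweighted sum."""
    h, w = diff_img.shape
    sl = slice(max(0, x0 - width), min(w, x0 + width))
    band = diff_img[:, sl].astype(np.float64)
    wt = weights[:, sl].astype(np.float64)
    denom = wt.sum(axis=1)
    denom[denom == 0] = 1.0
    return (band * wt).sum(axis=1) / denom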
Regarding claim 18, Konofagou discloses the claimed invention except for further comprising selecting a corresponding region of interest from the target area across the set of temporal images or temporal difference images; wherein the integral is calculated for all pixels along a line perpendicular to the corresponding region of interest, a non-perpendicular line to the corresponding region of interest, a contour along the corresponding region of interest, or a surface of the corresponding region of interest; and wherein the integral calculated for all pixels along the line perpendicular to the corresponding region of interest, the non-perpendicular line to the corresponding region of interest, the contour along the corresponding region of interest, or the surface of the corresponding region of interest is assigned as the pixel value for the temporally corresponding portion of the representative line through the corresponding region of interest along the y-axis.
Lysyansky discloses further comprising selecting a corresponding region of interest from the target area across the set of temporal difference images; (Paragraph 36-37: “ Visual observation by an echo-cardiographer for the purpose of diagnosing myocardial tissue abnormalities may be based on an observation of the features 136. Therefore, an image processing algorithm was developed to recognize and select the features 136 inside the ROI 132 … FIG. 4 illustrates the portion 140 of the ultrasound image frame 130 of FIG. 3.First locations of a set of features 142 are illustrated with small bullets, and represent features 136 identified (selected) on a current frame 130. Second locations of a set of features 144 are illustrated with large bullets, and represent the same features 136 which are found (tracked) on a next consecutive frame 130.”) wherein the integral is calculated for all pixels along a line perpendicular to the corresponding region of interest, a non-perpendicular line to the corresponding region of interest, a contour along the corresponding region of interest, or a surface of the corresponding region of interest; (Figs.5-6 and Paragraphs 39: “The geometric shift of the feature 136 illustrated by the first and second locations 152, 154 represents local tissue movement. The local tissue velocity can be calculated as a location shift divided by the time between B-mode frames: (Vx,Vy)=(dX,dY)*FR Equation 1 “) and wherein the integral calculated for all pixels along the line perpendicular to the corresponding region of interest, the non-perpendicular line to the corresponding region of interest, the contour along the corresponding region of interest, or the surface of the corresponding region of interest is assigned as the pixel value for the temporally corresponding portion of the representative line through the corresponding region of interest along the y-axis. (Figs.5-6 and Paragraphs 39-41: “The geometric shift of the feature 136 illustrated by the first and second locations 152, 154 represents local tissue movement. The local tissue velocity can be calculated as a location shift divided by the time between B-mode frames: (Vx,Vy)=(dX,dY)*FR Equation 1 … Equation 1 is repeated to calculate the 2D velocity vector 156 for each feature 136 identified on consecutive frames 130, resulting in a discrete set of tissue velocity vectors 156. … Each bullet on graph 160 represents the velocity 162 of a single feature 136 calculated between two consecutive image frames 130. The position of each feature 136 may be defined relative to the main axis L of the ROI 132 (FIG. 2), such as along (longitudinal to) and across (transverse to) the main axis L. L(pixels) represents the coordinate position of each feature 136 along the main axis L. The transverse coordinate axis is not illustrated, but is used as a second dimension when calculating two-dimensional tissue dynamics.”)
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Konofagou by including an image processing algorithm that recognizes and selects features inside a region of interest in an ultrasound image frame, as taught by Lysyansky, in order to achieve an ultrasonic system for assessing tissue motion and deformation; thus, one of ordinary skill in the art would have been motivated to combine the references since doing so would improve the accuracy of heart muscle assessment by making the assessment more objective and quantitative, as well as enhance the assessment of myocardium performance.
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
Regarding claim 19, Konofagou as modified by Lysyansky discloses the claimed invention. Lysyansky further discloses that selecting the corresponding region of interest across the set of temporal difference images is received via manual input. (Paragraph 33: “The ROI 132 may be entered via user input 124 or can be defined using any automatic myocardial outline procedure such as edge detection. The ROI 132 is selected in accordance with anatomical structure of the heart muscle.”; Fig. 7 and Paragraph 44: “In Step 172, a region of interest, such as ROI 132, is defined on the current frame 130. An echo-cardiographer may identify the ROI 132 through the user input 124.”)
Regarding claim 20, Konofagou as modified by Lysyansky discloses the claimed invention. Lysyansky further discloses that selecting the corresponding region of interest across the set of temporal difference images is performed by automatic anatomical object recognition. (Paragraph 33: “The ROI 132 may be entered via user input 124 or can be defined using any automatic myocardial outline procedure such as edge detection. The ROI 132 is selected in accordance with anatomical structure of the heart muscle.”; Fig. 7 and Paragraph 44: “In Step 172, a region of interest, such as ROI 132, is defined on the current frame 130. An echo-cardiographer may identify the ROI 132 through the user input 124. Alternatively, the ROI 132 may be defined by software algorithms stored in the processor 116.”)
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Bussadori et al (US 20090028404 A1), “Method and Corresponding Apparatus for Quantitative Measurements on Sequences of Images, Particularly Ultrasonic Images”, teaches a method for assessing motion, including deformation, of a structure from a sequence of at least two consecutive image frames of such structure, which images are separated by a certain time interval. The method includes the steps of defining a certain number of reference points at least on one image frame, and determining the velocity of motion of such reference points between two successive image frames.
Kasahara et al (US 20100262006 A1), “Ultrasound Diagnostic Apparatus”, teaches an ultrasound diagnostic apparatus for forming display images of an object in periodic motion. It also teaches that a scanning plane is displaced over a plurality of periods of the motion so as to form a plurality of scanning planes within a three-dimensional region; that a plurality of base images is searched for from an image string constituted of a plurality of images corresponding to the plurality of scanning planes, according to a feature amount relating to the periodicity of the motion; that the image string is divided into a plurality of image groups using the respective base images as dividing units, with a plurality of images that correspond to one another on a periodic basis extracted from the respective image groups; and that a display image forming unit forms a display image of the object based on the plurality of images which correspond to one another on a periodic basis.
Sakaguchi (US 20120155737 A1), “Image Processing Apparatus and Image Processing Method”, teaches an image processing apparatus in which a difference image generating unit generates a difference image by calculating the difference of a second X-ray transmission image from a first X-ray transmission image, the second X-ray transmission image being an image in which a myocardial tissue of an examined subject is not opacified and the first X-ray transmission image being an image in which the myocardial tissue of the examined subject is opacified with a contrast agent that has been injected into a coronary artery, and in which a display controlling unit exercises control so that a predetermined display unit displays the difference image that has been generated by the difference image generating unit.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Duy A Tran whose telephone number is (571)272-4887. The examiner can normally be reached Monday-Friday 8:00 am - 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, ONEAL R MISTRY can be reached at (313)-446-4912. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DUY TRAN/Examiner, Art Unit 2674
/ONEAL R MISTRY/Supervisory Patent Examiner, Art Unit 2674