DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 112
Claims 8, 12 and 20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
The limitations in claims 8 and 12 are the same, and both depend from claim 1. Claim 12 contains minor antecedent-basis informalities: “a target” vs. “the target” and “a region of interest” vs. “the region of interest.” Claim 1 defines a region of interest but does not define a target region. The metes and bounds of the claims are therefore unclear.
Claim 20 recites the limitation "a region of interest". There is insufficient antecedent basis for this limitation in the claim. It is unclear whether this region of interest is the same as or different from the region of interest recited in claim 14.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 6-16 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Takeda (U.S. 20130274608, Oct. 17, 2013) (hereinafter, “Takeda”) in view of Arai et al. (U.S. 20170196535, July 13, 2017) (hereinafter, “Arai”).
Regarding Claim 1, Takeda teaches: A method performed by a processor in an ultrasound system (Figs. 3 and 6), the method comprising: receiving ultrasound signals from a region of interest during a needle injection procedure (Fig. 3, element 22, ultrasound probe, element 22a, transducer, element 24, puncture needle [0061-0062]);
converting the ultrasound signals into image frames of ultrasound data (Fig. 3, element 203, receiving unit, [0067] and Fig. 4 element 204, image processor [0073]);
storing the image frames in a buffer memory (Fig. 3, element 205, image memory, [0075]);
obtaining a first subset of the image frames from the buffer memory (“The ultrasound image data as generated above is sent from the image memory 205 to the DSC 206 by one frame every predetermined time by being controlled by the control unit 208.” [0077]);
identifying, from the first subset, a first image frame indicating a first injection event using a first image processing technique (“When the sampling memory 203c stores the received signals which are obtained from the reflected ultrasound of the puncture needle searching beam as transmitted above, the puncture needle position detection unit 203e analyzes these received signals to generate puncture needle echo information which indicates the angle and position of the puncture needle 24 inserted in the subject.” [0072]; “…the control unit 208 executes a puncture needle recognition processing which puts the puncture needle position detection unit 203e into operation, and thus obtains the puncture needle echo information (step S101).” [0086]);
obtaining a second subset of the image frames from the buffer memory, wherein the second subset of the image frames have been generated and stored subsequent to the first subset in the buffer memory (“The sampling memory 203c has a memory area of multiple channels corresponding to the respective transducers…The sampling memory 203c thus stores the received signals in chronological order.” [0070]; “…the control unit 208 creates the puncture video data which enables to reproduce multiple frames of the composite image data obtained between the beginning and end of generating the puncture video data in the form of video where they are displayed one after another in chronological order (step S406).” [0143]);
identifying, from the second subset, a second image frame (“The puncture needle image frame buffer 205a stores the puncture image data on a frame basis. The biological tissue image frame buffer 205b stores biological tissue image data on the biological tissue in the subject on a frame basis…The composite image frame buffer 205c stores a composite image data on a frame basis, which is a composite ultrasound image data of the puncture needle image data and biological tissue image data which are respectively read out from the puncture needle image frame buffer 205a and biological tissue image frame buffer 205b.” [0076]),
Takeda does not teach: indicating a second injection event using a second image processing technique, wherein the first image processing technique and the second image processing technique are different image processing techniques to reduce processing burden for the system.
Arai in the field of medical systems for puncture needles insertion assistance teaches: “…two or three puncture needles are used simultaneously during treatment to apply a high-frequency cauterization treatment to a target (the target tissue to be treated).” [0071]; “FIG. 3 shows a puncture needle array. Specifically, the drawing shows a view in which three puncture needles (treatment tools) 18A, 18B, 18C are used to treat a target 70. In the depicted example, the three puncture needles 18A, 18B, 18C are arranged such that the target 70 is surrounded by the tips of the three puncture needles 18A, 18B, 18C...” [0072]; “…processes such as a binarization process and segmentation process are applied to the sectional image 86 on the left. As a result of these image processes, a sectional image 86A is generated. The sectional image 86A includes a first needle image 106A extracted by the image processes.” [0092]; “By the registration methods…the information of the actual insertion length in the insertion path is registered in the insertion history storage unit...” [0093].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the injection event in Takeda to indicate an additional second injection event using a second image processing technique, wherein the first image processing technique and the second image processing technique are different image processing techniques as taught in Arai “…to assist insertion of two or more puncture needles…to enable an appropriate and easy setting of an insertion path of a puncture needle for subsequent insertion relative to a puncture needle after insertion completion.” (Arai, [0017]).
Regarding Claim 2, the combination of Takeda and Arai teach the claim limitations as noted above.
Takeda further teaches: saving at least one of the first image frame and the second image frame into a first memory (“The puncture needle image frame buffer 205a stores the puncture image data on a frame basis. The biological tissue image frame buffer 205b stores biological tissue image data on the biological tissue in the subject on a frame basis…The composite image frame buffer 205c stores a composite image data on a frame basis, which is a composite ultrasound image data of the puncture needle image data and biological tissue image data which are respectively read out from the puncture needle image frame buffer 205a and biological tissue image frame buffer 205b.” [0076]; “…the control unit 208 creates the puncture video data which enables to reproduce multiple frames of the composite image data obtained between the beginning and end of generating the puncture video data in the form of video where they are displayed one after another in chronological order (step S406).” [0143]).
Regarding Claim 3, the combination of Takeda and Arai teach the claim limitations as noted above.
Takeda further teaches: wherein the first imaging technique is an image frames comparing technique (“The control unit 208 determines a distance from the insert position to the tip position of the puncture needle 24 with respect to each puncture needle image data developed on the x-y space, and compares them. Regarding the target of the comparison, the integrals about the x-axis may be compared to specify the deepest puncture needle image data. Further, the lengths of the puncture needle may be determined by means of trigonometric function and compared.” [0145]),
Takeda does not explicitly teach: and the second image processing technique is a machine learning algorithm image processing technique.
Arai in the field of medical systems for puncture needles insertion assistance teaches: “…two or three puncture needles are used simultaneously during treatment to apply a high-frequency cauterization treatment to a target (the target tissue to be treated).” [0071]; “FIG. 3 shows a puncture needle array. Specifically, the drawing shows a view in which three puncture needles (treatment tools) 18A, 18B, 18C are used to treat a target 70. In the depicted example, the three puncture needles 18A, 18B, 18C are arranged such that the target 70 is surrounded by the tips of the three puncture needles 18A, 18B, 18C...” [0072]; “…processes such as a binarization process and segmentation process are applied to the sectional image 86 on the left. As a result of these image processes, a sectional image 86A is generated. The sectional image 86A includes a first needle image 106A extracted by the image processes.” [0092]; “By the registration methods…the information of the actual insertion length in the insertion path is registered in the insertion history storage unit...” [0093];
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the combination such that the second image processing technique is a machine learning algorithm image processing technique as taught in Arai “…to assist insertion of two or more puncture needles…to enable an appropriate and easy setting of an insertion path of a puncture needle for subsequent insertion relative to a puncture needle after insertion completion.” (Arai, [0017]).
Regarding Claim 6, the combination of Takeda and Arai teach the claim limitations as noted above.
Takeda further teaches: further comprising marking, in the buffer memory, at least one of the first image frame and the second image frame for inclusion into a patient record (“The ultrasound diagnostic imaging apparatus 20 also generates supplementary information of the generated ultrasound image data on the basis of the image capturing order information. The ultrasound diagnostic imaging apparatus 20 may add the supplementary information to the ultrasound image data to generate an image file of a DICOM (digital imaging and communication in medicine) image data which meets the DICOM standard, and may send it to the PACS 30.” [0059]; “…the received signals obtained from the ultrasound reflected on the puncture needle 24 show up as seen in the area A surrounded by the dotted line in FIG. 10A. Thus, there is no specific received signals, and the puncture needle 24 is difficult to detect. On the contrary, if the transmitted ultrasound beam is a plane wave, the ultrasound reflected on the puncture needle 24 forms a plane wave. The receives signals obtained from the reflected ultrasound beams on the puncture needle 24 show up as seen in the area B surrounded by the dotted line in FIG. 10B. As a result, the rectilinear received signals are obtained, and the puncture needle 24 can be detected with such signals.” [0088]).
Regarding Claim 7, the combination of Takeda and Arai teach the claim limitations as noted above.
Takeda further teaches: further comprising: comparing the first subset of the image frames to each other to detect a motion, wherein the motion is associated with at least one of the needle, a fluid delivered by the needle, or a material removed by the needle, and wherein the first image frame is identified based on the comparing (“When the sampling memory 203c stores the received signals which are obtained from the reflected ultrasound of the puncture needle searching beam as transmitted above, the puncture needle position detection unit 203e analyzes these received signals to generate puncture needle echo information which indicates the angle and position of the puncture needle 24 inserted in the subject. Based on the generated puncture needle echo information, the puncture needle position detection unit 203e also generates puncture access information which specifies the actual insert angle and depth of the puncture needle 24 inserted in the subject. Specific methods of generating the puncture needle echo information and puncture access information are described below. Based on the generated puncture access information, the puncture needle position detection unit 203e then tells the phasing addition unit 203d the channel which corresponds to the receiving aperture center in the phasing addition in order that it generates the sound ray data which includes the puncture needle image data described below.” [0072]; Fig. 6, element 208, control unit, [0085-0086][0139-0140]; “If the control unit 208 determines that it extracts the composite image data to generate the video data, in which the composite image data are multiple frames of data consisting of the composite image data which shows the puncture needle 24 at the deepest position and other composite image data within the predetermined time period before and after it and the video data is data which displays these data one after another in chronological order (step S411, Y), it extracts the composite image data which shows the puncture needle 24 at the deepest position, as well as other composite image data within the predetermined time period before and after it (step S412). The control unit 208 then creates the deepest puncture video data file for reproducing these composite image data in the form of video which displays them one after another in chronological order (step S413). The control unit 208 saves the deepest puncture picture motion data file as created above in the storage unit 209 (step S407), and ends the processing” [0148]).
Regarding Claim 8, the combination of Takeda and Arai teach the claim limitations as noted above.
Claim 8 further recites: wherein the first injection event indicates that a needle has approached to a target location within a region of interest and the second injection event indicates a tip of the needle.
Takeda further teaches indicating a tip of the needle: “The control unit 208 determines a distance from the insert position to the tip position of the puncture needle 24 with respect to each puncture needle image data developed on the x-y space, and compares them.” [0145].
Takeda does not explicitly teach an injection event that indicates a needle has approached to a target location within a region of interest.
Arai in the field of medical systems for puncture needles insertion assistance teaches: “A target cross section 14 has appeared on the scan plane 12…The puncture adapter 16 is mounting hardware which guides a puncture needle 18 at a certain distance and angle relative to the probe body 10A. In FIG. 1, the insertion direction, in other words, the insertion path, is represented by a reference numeral 20. In FIG. 1, the insertion path 20 passes through the target cross section 14. The puncture needle 18 is supported by the puncture adapter 16 such that the actual insertion path is within the scan plane 12, in other words, the puncture needle advances on the scan plane 12. The puncture adapter 16 may include a sensor which senses the amount of insertion.” [0054]
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Takeda to include an injection event that indicates a needle has approached to a target location within a region of interest as taught in Arai, “…to assist insertion of two or more puncture needles…to enable an appropriate and easy setting of an insertion path of a puncture needle for subsequent insertion relative to a puncture needle after insertion completion.” (Arai, [0017]).
Regarding Claim 9, the combination of Takeda and Arai teach the claim limitations as noted above.
Takeda further teaches: further comprising: generating an indication to an operator that the first image frame depicts the first injection event (“When the sampling memory 203c stores the received signals which are obtained from the reflected ultrasound of the puncture needle searching beam as transmitted above, the puncture needle position detection unit 203e analyzes these received signals to generate puncture needle echo information which indicates the angle and position of the puncture needle 24 inserted in the subject.” [0072]; “…the control unit 208 executes a puncture needle recognition processing which puts the puncture needle position detection unit 203e into operation, and thus obtains the puncture needle echo information (step S101).” [0086]).
Regarding Claim 10, the combination of Takeda and Arai teach the claim limitations as noted above.
Takeda further teaches: further comprising: displaying the image frames on a display device (“The display unit 207 can be a display device such as LCD (liquid crystal display), CRT (cathode-ray tube) display, organic EL (electronic luminescence) display, inorganic EL display and plasma display. The display unit 207 displays the ultrasound image on a display screen according to the image signal output from the DSC 206. The embodiment employs a 15-inch LCD with a white or full-color LED (light-emitting diode) backlight as the display unit 207. The LCD with a white backlight may have a function of adjusting the brightness of the LED, for example, by analyzing the ultrasound image data. In this case, the screen may be divided into a plurality of areas and the brightness of the LED may be adjusted in each of the areas.” [0079]).
Regarding Claim 11, the combination of Takeda and Arai teach the claim limitations as noted above.
Takeda further teaches: further comprising: transmitting, in two or more pulses, the ultrasound signals to the region of interest during the needle injection procedure (“The pulse generator circuit generates a pulse signal as the driving signal with a predetermined period. The transmission unit 202 as configured above drives, for example, a certain contiguous part (e.g. 64 pieces) of the n (e.g. 192) transducers arrayed in the ultrasound probe 22, so as to generate the transmission ultrasound. Such ultrasound beam to be focused may be called scanning beam.” [0066]).
Regarding Claim 12, the combination of Takeda and Arai teach the claim limitations as noted above.
Claim 12 further recites: wherein the first injection event indicates that a needle has approached to the target location within the region of interest and the second injection event indicates a tip of the needle.
Takeda further teaches indicating a tip of the needle: “The control unit 208 determines a distance from the insert position to the tip position of the puncture needle 24 with respect to each puncture needle image data developed on the x-y space, and compares them.” [0145].
Takeda does not explicitly teach an injection event that indicates a needle has approached to a target location within a region of interest.
Arai in the field of medical systems for puncture needles insertion assistance teaches: “A target cross section 14 has appeared on the scan plane 12…The puncture adapter 16 is mounting hardware which guides a puncture needle 18 at a certain distance and angle relative to the probe body 10A. In FIG. 1, the insertion direction, in other words, the insertion path, is represented by a reference numeral 20. In FIG. 1, the insertion path 20 passes through the target cross section 14. The puncture needle 18 is supported by the puncture adapter 16 such that the actual insertion path is within the scan plane 12, in other words, the puncture needle advances on the scan plane 12. The puncture adapter 16 may include a sensor which senses the amount of insertion.” [0054]
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Takeda to include an injection event that indicates a needle has approached to a target location within a region of interest as taught in Arai, “…to assist insertion of two or more puncture needles…to enable an appropriate and easy setting of an insertion path of a puncture needle for subsequent insertion relative to a puncture needle after insertion completion.” (Arai, [0017]).
Regarding Claim 13, the combination of Takeda and Arai teach the claim limitations as noted above.
Takeda further teaches: further comprising: marking a region of at least one of the first subset of image frames that likely depicts the first injection event (“The ultrasound diagnostic imaging apparatus 20 also generates supplementary information of the generated ultrasound image data on the basis of the image capturing order information. The ultrasound diagnostic imaging apparatus 20 may add the supplementary information to the ultrasound image data to generate an image file of a DICOM (digital imaging and communication in medicine) image data which meets the DICOM standard, and may send it to the PACS 30.” [0059]; “…the received signals obtained from the ultrasound reflected on the puncture needle 24 show up as seen in the area A surrounded by the dotted line in FIG. 10A. Thus, there is no specific received signals, and the puncture needle 24 is difficult to detect. On the contrary, if the transmitted ultrasound beam is a plane wave, the ultrasound reflected on the puncture needle 24 forms a plane wave. The receives signals obtained from the reflected ultrasound beams on the puncture needle 24 show up as seen in the area B surrounded by the dotted line in FIG. 10B. As a result, the rectilinear received signals are obtained, and the puncture needle 24 can be detected with such signals.” [0088]).
Regarding Claim 14, Takeda teaches: A non-transitory machine readable medium comprising instructions that cause a data processing system to perform operations comprising (“The storage unit 209 is made of, for example, a high-capacity record medium such as HDD (hard disk drive) and SSD (solid state drive), and is capable of storing the ultrasound image data as generated above. The storage unit 209 is capable of storing a frame of the ultrasound image data on a one-frame still image as well as a video data in which the ultrasound image data on several frames are displayed as a video. Besides the above record medium, a portable record medium such as DVD-R (digital versatile disk-recordable) and CD-R (compact disk-recordable) and a data reading/writing device such as DVD-R drive or CD-R drive for recording data thereon may be provided to the storage unit 209. The storage unit 209 may be capable of storing an image file of the DICOM image data as generated above.” [0082]):
receiving ultrasound signals from a region of interest during a needle injection procedure (Fig. 3, element 22, ultrasound probe, element 22a, transducer, element 24, puncture needle [0061-0062]);
converting the ultrasound signals into image frames of ultrasound data (Fig. 3, element 203, receiving unit, [0067] and Fig. 4 element 204, image processor [0073]);
storing the image frames in a buffer memory (Fig. 3, element 205, image memory, [0075]);
obtaining a first subset of the image frames from the buffer memory (“The ultrasound image data as generated above is sent from the image memory 205 to the DSC 206 by one frame every predetermined time by being controlled by the control unit 208.” [0077]);
identifying, from the first subset, a first image frame indicating a first injection event using a first image processing technique (“When the sampling memory 203c stores the received signals which are obtained from the reflected ultrasound of the puncture needle searching beam as transmitted above, the puncture needle position detection unit 203e analyzes these received signals to generate puncture needle echo information which indicates the angle and position of the puncture needle 24 inserted in the subject.” [0072]; “…the control unit 208 executes a puncture needle recognition processing which puts the puncture needle position detection unit 203e into operation, and thus obtains the puncture needle echo information (step S101).” [0086]);
obtaining a second subset of the image frames from the buffer memory, wherein the second subset of the image frames have been generated and stored subsequent to the first subset in the buffer memory (“The sampling memory 203c has a memory area of multiple channels corresponding to the respective transducers…The sampling memory 203c thus stores the received signals in chronological order.” [0070]; “…the control unit 208 creates the puncture video data which enables to reproduce multiple frames of the composite image data obtained between the beginning and end of generating the puncture video data in the form of video where they are displayed one after another in chronological order (step S406).” [0143]);
identifying, from the second subset, a second image frame (“The puncture needle image frame buffer 205a stores the puncture image data on a frame basis. The biological tissue image frame buffer 205b stores biological tissue image data on the biological tissue in the subject on a frame basis…The composite image frame buffer 205c stores a composite image data on a frame basis, which is a composite ultrasound image data of the puncture needle image data and biological tissue image data which are respectively read out from the puncture needle image frame buffer 205a and biological tissue image frame buffer 205b.” [0076]),
Takeda does not teach: indicating a second injection event using a second image processing technique, wherein the first image processing technique and the second image processing technique are different image processing techniques to reduce processing burden for the system.
Arai in the field of medical systems for puncture needles insertion assistance teaches: “…two or three puncture needles are used simultaneously during treatment to apply a high-frequency cauterization treatment to a target (the target tissue to be treated).” [0071]; “FIG. 3 shows a puncture needle array. Specifically, the drawing shows a view in which three puncture needles (treatment tools) 18A, 18B, 18C are used to treat a target 70. In the depicted example, the three puncture needles 18A, 18B, 18C are arranged such that the target 70 is surrounded by the tips of the three puncture needles 18A, 18B, 18C...” [0072]; “…processes such as a binarization process and segmentation process are applied to the sectional image 86 on the left. As a result of these image processes, a sectional image 86A is generated. The sectional image 86A includes a first needle image 106A extracted by the image processes.” [0092]; “By the registration methods…the information of the actual insertion length in the insertion path is registered in the insertion history storage unit...” [0093].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the injection event in Takeda to indicate an additional second injection event using a second image processing technique, wherein the first image processing technique and the second image processing technique are different image processing techniques as taught in Arai “…to assist insertion of two or more puncture needles…to enable an appropriate and easy setting of an insertion path of a puncture needle for subsequent insertion relative to a puncture needle after insertion completion.” (Arai, [0017]).
Regarding Claim 15, the combination of Takeda and Arai teach the claim limitations as noted above.
Claim 15 further recites limitations: further comprising instructions that cause the data processing system to perform operations comprising saving at least one of the first image frame and the second image frame into a first memory. These limitations are present in claim 2 and are therefore rejected under the same rationale.
Regarding Claim 16, the combination of Takeda and Arai teach the claim limitations as noted above.
Claim 16 further recites limitations: wherein the first imaging technique is an image frames comparing technique, and the second image processing technique is a machine learning algorithm image processing technique. These limitations are present in claim 3 and are therefore rejected under the same rationale.
Regarding Claim 19, the combination of Takeda and Arai teach the claim limitations as noted above.
Claim 19 further recites limitations: further comprising instructions that cause the data processing system to perform operations comprising marking, in the buffer memory, at least one of the first image frame and the second image frame for inclusion into a patient record. These limitations are present in claim 6 and are therefore rejected under the same rationale.
Regarding Claim 20, the combination of Takeda and Arai teach the claim limitations as noted above.
Claim 20 further recites limitations: wherein the first injection event indicates that a needle has approached to a target location within a region of interest and the second injection event indicates a tip of the needle. These limitations are present in claim 8 and are therefore rejected under the same rationale.
Claims 4 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Takeda in view of Arai and Fosodeder (U.S. 20160173770, June 16, 2016) (hereinafter, “Fosodeder”).
Regarding Claim 4, the combination of Takeda and Arai teach the claim limitations as noted above.
Takeda does not teach: further comprising: obtaining a third subset of the image frames from the buffer memory, wherein the third subset of the image frames have been generated and stored subsequent to the second subset in the buffer memory; identifying, from the third subset, a third image frame indicating a third injection event using a third image processing technique that is different from the first image processing technique and the second image processing technique.
Fosodeder in the field of ultrasound visualization enhancement systems teaches: “An image buffer 136 is included for storing processed frames of acquired ultrasound scan data that are not scheduled to be displayed immediately. Preferably, the image buffer 136 is of sufficient capacity to store at least several seconds worth of frames of ultrasound scan data. The frames of ultrasound scan data are stored in a manner to facilitate retrieval thereof according to its order or time of acquisition.” [0029]; “…as illustrated in FIG. 3, the signal processor 132 may retrieve raw ultrasound image data 300 from the cine buffer 138 and generate 208 a cine sequence 310 having a plurality of frames 312.” [0036].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the buffer memory in Takeda to obtain a third subset of the image frames as taught in Fosodeder by using a cine memory, which allows various numbers of frames to be retrieved and stored at a time so that additional subsets of frames can be stored while a subset is being retrieved; the selection process allows for enhanced visualization and super-resolution image processing (Fosodeder, [0027]).
Fosodeder does not teach identifying a third injection event using a third image processing technique.
Arai, in the field of medical systems for puncture needle insertion assistance, teaches: “…two or three puncture needles are used simultaneously during treatment to apply a high-frequency cauterization treatment to a target (the target tissue to be treated).” [0071]; “FIG. 3 shows a puncture needle array. Specifically, the drawing shows a view in which three puncture needles (treatment tools) 18A, 18B, 18C are used to treat a target 70. In the depicted example, the three puncture needles 18A, 18B, 18C are arranged such that the target 70 is surrounded by the tips of the three puncture needles 18A, 18B, 18C...” [0072]; “…processes such as a binarization process and segmentation process are applied to the sectional image 86 on the left. As a result of these image processes, a sectional image 86A is generated. The sectional image 86A includes a first needle image 106A extracted by the image processes.” [0092]; “By the registration methods…the information of the actual insertion length in the insertion path is registered in the insertion history storage unit...” [0093].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the third subset of image frames of the combination of references such that a third injection event is identified using a third image processing technique, as taught in Arai, “…to assist insertion of two or more puncture needles…to enable an appropriate and easy setting of an insertion path of a puncture needle for subsequent insertion relative to a puncture needle after insertion completion.” (Arai, [0017]).
Regarding Claim 17, the combination of Takeda and Arai teaches the claim limitations as noted above.
Claim 17 further recites the limitations: further comprising instructions that cause the data processing system to perform operations comprising obtaining a third subset of the image frames from the buffer memory, wherein the third subset of the image frames have been generated and stored subsequent to the second subset in the buffer memory; identifying, from the third subset, a third image frame indicating a third injection event using a third image processing technique that is different from the first image processing technique and the second image processing technique. These limitations are present in claim 4, and claim 17 is therefore rejected under the same rationale.
Claims 5 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Takeda in view of Arai and Fosodeder as applied to claims 4 and 17 above, and further in view of Averkiou et al. (U.S. 20160343134, November 24, 2016) (hereinafter, “Averkiou”).
Regarding Claim 5, the combination of Takeda, Arai, and Fosodeder teaches the claim limitations as noted above.
Takeda does not teach: wherein the third injection event is an injection taking place within the region of interest and the third image processing technique is a color flow imaging technique.
Arai, in the field of medical systems for puncture needle insertion assistance, teaches: “…two or three puncture needles are used simultaneously during treatment to apply a high-frequency cauterization treatment to a target (the target tissue to be treated).” [0071]; “FIG. 3 shows a puncture needle array. Specifically, the drawing shows a view in which three puncture needles (treatment tools) 18A, 18B, 18C are used to treat a target 70. In the depicted example, the three puncture needles 18A, 18B, 18C are arranged such that the target 70 is surrounded by the tips of the three puncture needles 18A, 18B, 18C...” [0072]; “…processes such as a binarization process and segmentation process are applied to the sectional image 86 on the left. As a result of these image processes, a sectional image 86A is generated. The sectional image 86A includes a first needle image 106A extracted by the image processes.” [0092]; “By the registration methods…the information of the actual insertion length in the insertion path is registered in the insertion history storage unit...” [0093].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the combination such that the third injection event is an injection taking place within the region of interest, as taught in Arai, “…to assist insertion of two or more puncture needles…to enable an appropriate and easy setting of an insertion path of a puncture needle for subsequent insertion relative to a puncture needle after insertion completion.” (Arai, [0017]).
Arai does not teach that the third image processing technique is a color flow imaging technique.
Averkiou, in the field of ultrasound systems and methods for image acquisition during the delivery of a contrast agent, teaches: “The patients were injected with 2 ml of Sonovue (Bracco s.p.a., Milan, Italy), an ultrasonic microbubble contrast agent, and one minute ultrasound loops were acquired and saved as the contrast agent flowed through the carotid artery and the microvasculature (neovessels) of the plaque. The ultrasound images can be acquired by B mode imaging which shows the increasing signal intensity from microbubbles which perfuse the plaque. Preferably, the images are acquired by colorflow imaging so that motion of the microbubbles can simultaneously be detected along with signal intensity. Signals exhibiting a high intensity harmonic return together with Doppler-detected motion at the same location are indicative of moving microbubbles at that location. This correlation can be used to distinguish over and reject signal returns from static bright reflectors which are often artifacts. The result is the detection of dynamic contrast agent microflow in the plaque vasculature.” [0003]; see also Fig. 6, [0029].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the third image processing technique in the combination to be a colorflow imaging technique, as taught in Averkiou, “…to distinguish over and reject signal returns from static bright reflectors which are often artifacts” and to provide a detection result of “…dynamic contrast agent microflow in the plaque vasculature.” (Averkiou, [0003]).
Regarding Claim 18, the combination of Takeda, Arai, and Fosodeder teaches the claim limitations as noted above.
Claim 18 further recites the limitations: wherein the third injection event is an injection taking place within the region of interest and the third image processing technique is a color flow imaging technique. These limitations are present in claim 5, and claim 18 is therefore rejected under the same rationale.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMAL FARAG whose telephone number is (571)270-3432. The examiner can normally be reached 8:30 - 5:30 M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Keith Raymond can be reached at (571) 270-1790. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AMAL ALY FARAG/ Primary Examiner, Art Unit 3798