DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
Claims 1, 6-8, 11, 14, 15, and 18-20 have been amended.
Claims 1-20 are pending.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception without significantly more.
The claims recite or similarly recite the following limitations:
[a] adapting, by a feedback loop, parameters of the one or more processes based on a cost function of the feedback data; (claims 1 and 20)
[b] generating an ultrasound image; (claims 1 and 20)
[c] generating an updated ultrasound image, the updated ultrasound image having updated color-flow data based on the parameter values; (claims 1, 7, 19, and 20)
[d] determine a cost function associated with the ultrasound image and evaluate the cost function associated with the ultrasound image, the evaluation of the cost function based on feedback data from a feedback loop based on one or more properties of ultrasound data from one or more processes; (claim 8)
[e] determine values for parameters based on the evaluation; (claim 8)
[f] generating similarity scores, the similarity scores generated based on comparing the ultrasound image to a plurality of additional ultrasound images; (claims 6, 18, 20)
[g] initializing an adaptation process with the parameter values and adapting the parameter values through the adaptation process by using parameter values from the target ultrasound image (claims 1, 8, 18, and 20).
Claim limitation [a], as drafted and under its broadest reasonable interpretation, recites a mathematical concept (i.e., mathematical formulas or calculations) because it involves applying a cost function (see, e.g., [0043], top of page 12). In some cases, real-valued scalars are selected to tune the relative costs of total SNR. (Id.). The parameters may be optimized individually (see, e.g., [0053]) or two or more parameters may be optimized simultaneously (see, e.g., [0054]).
Claim limitations [b] and [c], as drafted and under their broadest reasonable interpretation, recite mathematical concepts (i.e., mathematical formulas or calculations) because they involve indexing the raw signal by time to determine depth, filtering, and applying a depth-dependent gain.
Claim limitation [d], as drafted and under its broadest reasonable interpretation, recites a mathematical concept (i.e., mathematical formula or calculations) because it involves determining the cost function by selecting soft or hard constraints (see, e.g., [0081]) for weights of a cost function. ([0043]). The cost function is then evaluated over multiple parameters by using one or more equations (see, e.g., [0082]) or by generating similarity scores for corresponding ultrasound images and then ranking and selecting the image with the highest score. (Id.).
Claim limitation [e], as drafted and under its broadest reasonable interpretation, recites a mathematical concept (i.e., mathematical formula or calculations) because it can involve maximizing a cost function. (see, e.g., [0084]). “For example, the ultrasound system can determine a value of a first parameter that maximizes the evaluated cost function, fix the first parameter value, and then determine a second parameter that maximizes the evaluated cost function, using the determined first parameter value.” (Id.).
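For illustration only (this sketch is not part of the claims or the cited disclosure), the coordinate-wise maximization quoted from [0084] can be modeled as follows; the cost function and candidate grids here are hypothetical stand-ins.

```python
# Illustrative sketch of the coordinate-wise maximization described in
# [0084]: determine a first parameter value that maximizes the cost
# function, fix it, then determine a second parameter value using the
# fixed first value. The cost function and grids are hypothetical.

def maximize_coordinatewise(cost, grid1, grid2):
    # Step 1: maximize over the first parameter (second parameter held
    # at an arbitrary initial value from its grid).
    p1 = max(grid1, key=lambda a: cost(a, grid2[0]))
    # Step 2: fix the first parameter value and maximize over the second.
    p2 = max(grid2, key=lambda b: cost(p1, b))
    return p1, p2

# Hypothetical separable cost with its maximum at (2, 3).
cost = lambda a, b: -((a - 2) ** 2) - ((b - 3) ** 2)
best = maximize_coordinatewise(cost, [0, 1, 2, 3, 4], [0, 1, 2, 3, 4])
# best == (2, 3) for this separable example
```

This sketch captures only the fix-then-optimize structure the specification describes; it does not reflect any particular cost function disclosed in the application.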
Claim limitation [f], as drafted and under its broadest reasonable interpretation, recites a mathematical concept and/or mental process. It recites a mental process because generating similarity scores can include comparing images and determining scores based on the comparisons. These comparisons and determinations are examples of “observations, evaluations, judgments, and opinions” that are considered to be mental processes. (MPEP 2106.04(a)(2), III). Claim limitation [f] can also recite a mathematical concept because, in some embodiments, generating similarity scores involves implementing any suitable “function,” such as a machine-learned model (e.g., a neural network) or a stochastic or deterministic signal processing model, to determine the scores. (see, e.g., [0056]).
Claim limitation [g], as drafted and under its broadest reasonable interpretation, recites a mathematical concept. It recites a mathematical concept because it involves initializing and using an adaptation process, which includes optimizing cost functions as discussed above. ([0057]). For example, the cost function can be optimized over multiple parameters by using one or more equations (see, e.g., [0082]).
This judicial exception is not integrated into a practical application. Additional elements include receiving a first user selection that selects a process from a process group and a second user selection that selects a feedback data from a data group. These elements are examples of insignificant pre-solution activity (i.e., gathering information necessary when implementing the abstract idea). Likewise, the additional element of generating an ultrasound image having color-flow data based on the parameters is an example of insignificant post-solution activity (i.e., an output generated after applying the abstract idea). These do not impose a meaningful limitation on the judicial exception.
Other additional elements include an ultrasound scanner, a display device, one or more processors, and one or more computer-readable media. But each of these is recited at a high-level of generality in the claims and only serves to generally link the use of the judicial exception to a particular technological environment or field of use. (MPEP 2106.04(d), I). Other additional elements include the one or more processors being caused to (1) determine weights for regions of a color box associated with an ultrasound image and (2) generate, based on the values of the parameters, an additional ultrasound image with color-flow data. However, these elements are examples of insignificant extra-solution activity. Determining weights for regions of a color box is an example of pre-solution activity (i.e., gathering information necessary when implementing the abstract idea). Generating an additional ultrasound image based on the values is an example of post-solution activity (i.e., an output generated using the abstract idea). None of these additional elements impose a meaningful limitation on the judicial exception.
Other additional elements include (1) generating an ultrasound image having color-flow data; (2) selecting a target ultrasound image based on similarity scores; (3) obtaining parameter values; and (4) generating an updated ultrasound image with updated color-flow data based on the adapted parameter values. However, each of these elements is insignificant extra-solution activity. For example, in order to update an image, it is necessary to first generate that image. In order to adapt the parameter values, it is necessary to identify parameter values from an ultrasound image with the highest similarity score. Each of these is an example of pre-solution activity. Generating the updated image with updated color-flow data based on the adapted parameter values is an example of post-solution activity.
The dependent claims do not render the claims patent eligible.
For example, dependent claims 2-5 include additional elements that only generally link the use of the judicial exception to a particular technological environment or field of use and/or recite insignificant extra-solution activity. Dependent claim 2, for instance, only specifies what the user may select for a process group. In other words, claim 2 only clarifies the data obtained during the pre-solution activity. Likewise, dependent claim 3 only specifies the type of feedback data the user may select. In other words, claim 3 only clarifies the feedback data that will be used by the abstract idea.
Dependent claims 4 and 5 recite additional pre-solution activities. For example, in response to selections by the user, the method displays certain panels so that the user may provide additional data for carrying out the abstract idea.
Dependent claim 6 recites steps similar to those recited in claim 20. More specifically, claim 6 recites additional abstract ideas that include generating similarity scores, the similarity scores generated based on comparing the ultrasound image to a plurality of additional ultrasound images and initializing an adaptation process with the parameter values.
Dependent claim 9 recites wherein the ultrasound-control panel includes selectable controls for gain, depth, image-saving, examination presets, and configuration-storage.
Dependent claim 10 recites wherein the ultrasound-image panel displays ultrasound images.
Dependent claim 11 recites wherein the process-selection panel includes selectable controls for the process group and the data group, and the selectable controls are configured to cause, responsive to selection of the selectable controls, the display device to display the process group panel and the data group panel. These are additional elements that only generally link the use of the judicial exception to a particular technological environment or field of use and/or provide a means for the user to provide information that will be used by the method. Moreover, many of these are well-understood, routine, and conventional graphical user-interface (GUI) elements found within ultrasound systems.
Dependent claims 12-14 relate to the cost function and, as such, recite a mathematical concept. Dependent claims 15-17 relate to the parameters or how the values of the parameters recited in [c] are determined.
Dependent claim 18 recites steps similar to those recited in claim 20. More specifically, dependent claim 18 recites that the processors are configured to generate similarity scores based on a comparison between the ultrasound image and a plurality of additional ultrasound images; select a target ultrasound image from the plurality of additional ultrasound images, the target ultrasound image selected based on the similarity scores; obtain parameter values of the one or more processes from the process group; and initialize an adaptation process with the parameter values. For the same reasons as those discussed with respect to claim 20, claim 18 does not render the subject matter patent eligible.
Dependent claim 19 also recites steps similar to those recited in claim 20. More specifically, dependent claim 19 recites that the processors are further configured to: adapt the parameter values through the adaptation process by using parameter values from the target ultrasound image; and generate an updated ultrasound image, the updated ultrasound image having updated color-flow data based on the parameter values. For the same reasons as those discussed with respect to claim 20, claim 19 does not render the subject matter patent eligible.
Moreover, the additional elements are not sufficient to amount to significantly more than the judicial exception. As explained above, each of the additional elements either serve only to generally link the use of the judicial exception to a particular technological environment or field of use or are insignificant extra-solution activities. They are not meaningful limitations that transform the exception into a patent-eligible application, such that the claim does not amount to significantly more than the exception itself.
Accordingly, claims 1-20 are not patent eligible.
RESPONSE TO APPLICANT’S ARGUMENTS
Applicant did not provide any arguments regarding Examiner’s analysis of the previously-pending claims and does not provide any arguments as to why the amended claims recite patent-eligible subject matter. As discussed above, claims 1-20 are not patent eligible.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Appl. Publ. No. 2007/0232915 A1 to Pelissier et al. (hereinafter “PELISSIER”) and U.S. Patent Appl. Publ. No. 2002/0169378 A1 to Mo et al. (hereinafter “MO”).
With respect to claim 1, PELISSIER concerns an ultrasound machine that is configured for developing new modes for obtaining images or other useful information from ultrasound signals. (Abstract). “In some embodiments, apparatus 10 can be selectively operated either as a research platform, as described above, or as a diagnostic ultrasound machine.” ([0063]). PELISSIER teaches a method comprising receiving a first user selection that selects one or more processes from a process group. “A user can operate design mode application software to change the manner in which the RF data is processed to yield images or other useful information.” If the user finds the result unacceptable ([0044], [0045] and elements 68 and 69 in Figure 2), “then the user has the option of…modifying the data processing parameters 46 at block 72B or modifying transmit parameters 39 at block 72C.” ([0045]).
PELISSIER also teaches receiving a second user selection that selects feedback data from a data group. The user is enabled to use previously-acquired RF data, (see, e.g., [0046]: “without acquiring new RF data 24”), or to “acquire fresh RF data 24.” ([0046]). In addition, PELISSIER teaches that the “data processing functions” may access various sets of data, including raw RF data, pre-scan converted data, or post-scan converted data (Doppler data stream or an echo data stream). The user is able to interact with PELISSIER’s system by way of a suitable user interface 50. ([0024]).
PELISSIER teaches generating an ultrasound image, the ultrasound image having color-flow data based on the parameters. “Data processor 30 is also configured to process RF data from RF data memory area 28 to yield images that can be displayed on a display 40.” ([0021]; see also PELISSIER’s claims 1 and 3, in which the “result” includes an image that is displayed). PELISSIER also teaches generating an updated ultrasound image, the updated ultrasound image having updated color-flow data based on the parameter values. After changing parameters, PELISSIER teaches displaying and checking the result. ([0045] and Abstract). PELISSIER’s claim 6 refers to this as a “fresh result.” The new selected parameters may be applied to the original RF data or to new RF data. ([0046]). NOTE: With respect to “color-flow data,” the data-processing functions may use a “Doppler data stream 85.” ([0062]). One example of computed information that may be obtained in PELISSIER includes “maximum blood flow velocity obtained by processing the RF data to yield information regarding the Doppler shift of reflected ultrasound signals.” ([0022]). Moreover, Applicant does not define “color-flow data.” Examiner is interpreting color-flow data to include information regarding “maximum blood flow velocity” and “the Doppler shift of reflected ultrasound signals.”
However, PELISSIER does not explicitly teach that the feedback data is provided by a feedback loop based on one or more properties of ultrasound data from the one or more processes. PELISSIER does not explicitly teach adapting, by the feedback loop, parameters of the one or more processes based on a cost function of the feedback data. However, at least one goal of PELISSIER is to modify parameters to determine how the result is changed. “The user can readily test and modify the ultrasound mode until the user is satisfied with the results obtained.” ([0042]; see also, e.g., PELISSIER’s claim 6, in which the user can select “modified values” of “transmit parameters” to obtain “fresh RF data to produce a fresh result.”). PELISSIER also does not explicitly teach obtaining parameter values of the one or more processes from the process group and initializing an adaptation process with the parameter values. As discussed above, however, at least one goal of PELISSIER is to modify parameters to determine how the result is changed. (see, e.g., [0042] and PELISSIER’s claim 6).
In the same field of endeavor, MO teaches, in the context of ultrasound color flow imaging, an adaptive clutter filtering process that includes an iterative algorithm used to optimize the clutter filter. (Abstract). MO teaches that the feedback data is provided by a feedback loop based on one or more properties of ultrasound data from the one or more processes and also teaches adapting, by the feedback loop, parameters of the one or more processes based on a cost function of the feedback data. “In an embodiment, the HPF cutoff frequency setting is automatically adjusted for each I/Q data packet on a packet by packet basis (i.e., one packet per acoustic point). An iterative search algorithm may be performed to find the optimal HPF cutoff frequency that is just high enough to remove all the clutter.” ([0081]). “A criterion for determining and/or detecting the presence of clutter is when the magnitude of the filtered I/Q data is greater than a certain threshold and/or whether the absolute value of the mean frequency shift (the average phase change per PRI) is lower than a certain clutter frequency threshold.” ([0082]). With the optimized clutter filter, MO generates an ultrasound image with color-flow data. (see, e.g., [0013] and [0076]-[0078]). “Video processing 764 includes combining the color and B-mode data into a single image frame based on predetermined write priority rules, and applying selected color maps (e.g. red, green, blue values) to represent the flow signal parameter (e.g., mean velocity) in color, and selected gray maps to represent the B-mode background echo data.” ([0078]).
MO also teaches obtaining parameter values of the one or more processes from the process group and initializing an adaptation process with the parameter values. The process uses “pre-selected” filter coefficients that are stored in memory. “The sets of HPF coefficients for a range of Nf cutoff frequency settings (or parameters) is predetermined and stored in memory buffer 732 for each color flow imaging setup, which may include, but is not limited to, probe type, transmit frequency, application type, packet size and PRF. The HPF may be implemented using standard FIR or IIR filters. The HPF coefficients may range from a simple FIR filter [1,-1] to higher order filters involving 10 or more real and/or complex coefficients.” ([0080]).
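For illustration only, the iterative search MO describes in [0081]-[0082] (try cutoff settings until the clutter criterion is no longer met) can be sketched roughly as below. The crude moving-average high-pass, the signal values, and the threshold are hypothetical stand-ins, not MO's stored FIR/IIR filter implementation ([0080]).

```python
def select_cutoff(mags, cutoff_settings, mag_threshold):
    """Illustrative sketch (not MO's implementation) of the iterative
    search in [0081]-[0082]: try HPF cutoff settings from lowest to
    highest and return the first one whose filtered output no longer
    satisfies the clutter criterion (mean filtered magnitude above a
    threshold)."""
    n = len(mags)
    for fc in sorted(cutoff_settings):
        # Stand-in high-pass: subtract a local mean whose window shrinks
        # as the cutoff rises; dividing by the full window mimics
        # zero-padded filtering, so edge samples leave a residual.
        window = max(1, int(n * (1.0 - fc)))
        residual = 0.0
        for i in range(n):
            lo = max(0, i - window // 2)
            hi = min(n, i - window // 2 + window)
            baseline = sum(mags[lo:hi]) / window
            residual += abs(mags[i] - baseline)
        if residual / n <= mag_threshold:
            return fc  # lowest cutoff that suppresses the clutter
    return max(cutoff_settings)
```

For a constant ("clutter only") packet, the search climbs to a higher cutoff as the threshold tightens, mirroring MO's goal of a cutoff "just high enough to remove all the clutter."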
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the PELISSIER method to include the adaptive clutter filtering algorithm as taught in MO. More specifically, PELISSIER seeks to enable a user to develop new ways to process ultrasound data by, for example, choosing different processing parameters to improve images and provide more useful information. ([0025], [0044], [0045] of PELISSIER). One would have been motivated to include MO’s iterative algorithm that optimizes the clutter filter as it furthers PELISSIER’s goal. Moreover, an optimized clutter filter can suppress color flash artifacts without compromising low flow detection. ([0032] of MO). There would have been a reasonable expectation of success as MO teaches that it can be implemented by modifying a conventional ultrasound system.
With respect to claim 2, PELISSIER teaches that the process group includes a transmitting process, a receiving process, and a color-imaging process. “In the illustrated embodiment, block 62 permits the user to select: transmit parameters 39 at block 62A; a data processing function 44 at block 62B; and data processing parameters 46 for the selected data processing function 44 at block 62C.” ([0043]). NOTE: Although Applicant does not define “color-imaging process,” Applicant does acknowledge that color-image data can be improved by adjusting parameters within the transmitting or receiving processes. (see [0035] of Applicant’s specification). Moreover, the scope of a “color-imaging process” at least includes the parameters of “bandpass filter, a wall filter, a temporal filter, a spatial filter.” ([0036]). These filters can be performed by digital filtering, which is taught by PELISSIER at [0065].
With respect to claim 3, PELISSIER teaches wherein the data group includes radiofrequency data and color-image data. PELISSIER teaches that the data processed in PELISSIER can include raw RF data as well as Doppler data (i.e., color-image data) that is derived from the RF data. (see, e.g., [0062]). The RF data may be processed to provide information regarding “maximum blood flow velocity” and “the Doppler shift of reflected ultrasound signals.” ([0022]). This is consistent with Applicant’s specification in which the color-image data 312 is produced after the color-imaging process is applied to the RF data. (see, e.g., [0037] of Applicant’s disclosure).
Claims 4 and 5 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Appl. Publ. No. 2007/0232915 A1 to Pelissier et al. (hereinafter “PELISSIER”) and U.S. Patent Appl. Publ. No. 2002/0169378 A1 to Mo et al. (hereinafter “MO”) as applied to claim 1 above, and further in view of U.S. Patent No. 5,161,535 A to Short et al. (hereinafter “SHORT”).
With respect to claim 4, neither PELISSIER nor MO teaches, in response to receiving the first user selection, displaying a process group panel, the process group panel having parameters associated with the one or more processes.
SHORT teaches “[a]n ultrasound imaging system and method for controlling the system” in which the system has “a control panel which includes menu items divided into system mode menu items for selecting a system mode, and control set menu items for selecting functions corresponding to a selected system mode menu item.” (Abstract). As described in the Background section, SHORT is concerned with a selection process in which the user interface provides options to the user after the user makes a selection. (see, e.g., col. 1, lines 44-57). SHORT teaches displaying several categories or “modes” that are always available for selection. (Abstract). Once a mode is selected, a separate menu of items for the selected mode is displayed. (Id., see also col. 1, line 66 to col. 2, line 8).
Figures 2-5, 7, and 8 illustrate different plan views of a control panel. Each plan view shows an arrangement of icons that appears in response to the user making a selection. (see, e.g., col. 3, lines 5-7 in which Figures 7(a) through 7(e) include “plan views of the control panel…displaying menu changes in response to ultrasound system mode selections.”) (emphasis added). For example, the view of Figure 4(a) appears after the user selects “COLOR” to indicate the “color flow imaging mode” menu. (col. 5, lines 26-31). In response to the user selecting “VELOCITY TAG” in the color flow imaging mode, the view of Figure 4(b) will appear providing options regarding the velocity, “TAG 1” or “TAG 2.” (col. 5, lines 51-65).
It would have been obvious to one skilled in the art to modify the PELISSIER-MO system to display a process group panel, responsive to a user selection, in which the process group panel has parameters associated with the one or more processes. One would have been motivated to add a user interface, as described in SHORT, that is responsive to user selections regarding ultrasound options. More specifically, one would have been motivated to provide available options (i.e., parameters) that are possible for the selected process(es) because, once a person decides upon the processes to be investigated, they would like to see the available parameters associated with the process(es).
With respect to claim 5, neither PELISSIER nor MO teaches, in response to receiving the second user selection, displaying a data group panel, the data group panel having properties associated with the feedback data.
As described above with respect to claim 4, SHORT is concerned with a selection process in which the user interface provides options to the user after the user makes a selection, (see, e.g., col. 1, lines 44-57 or col. 5, lines 26-31), and then provides additional options for the selected option. (col. 5, lines 32-65).
It would have been obvious to one skilled in the art to modify the PELISSIER-MO system to display a data group panel, responsive to a user selection, in which the data group panel has properties associated with the feedback data. One would have been motivated to add a user interface, as described in SHORT, that is responsive to user selections regarding ultrasound options. In this case, one would have been motivated to provide available options (i.e., data groups, such as RF data or a more particular data set) that are possible for the selected process(es) because, once a person decides upon the processes to be investigated, they would like to see the data that may be used with the process(es).
Claims 6, 7, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Appl. Publ. No. 2007/0232915 A1 to Pelissier et al. (hereinafter “PELISSIER”) and U.S. Patent Appl. Publ. No. 2002/0169378 A1 to Mo et al. (hereinafter “MO”) as applied to claim 1 above, and further in view of U.S. Patent Appl. Publ. No. 2021/0093301 A1 to Wang et al. (hereinafter “WANG”).
With respect to claim 6, neither PELISSIER nor MO teaches generating similarity scores, the similarity scores generated based on comparing the ultrasound image to a plurality of additional ultrasound images or selecting a target ultrasound image from the plurality of additional ultrasound images, the target ultrasound image selected based on the similarity scores.
WANG discloses a system and method that uses an artificial neural network to retrieve imaging parameters. “In order to suitably adapt the image acquisition for a specific patient, organ, or a specific disease, numerous imaging parameters of the imaging system may need to be set appropriately. These parameters are related to the transmission and reception of the ultrasound signals, the processing of the acquired signals, image reconstruction, image display, and image storage.” ([0002]). WANG teaches matching a current image with a prior image and then using the settings of the prior image. As WANG explains, “[t]he view matching process in effect determines a probability that any given prior image was taken with the probe oriented and beam directed toward the tissue the same manner as those of the current image, such that measurements of a region of interest, for example measurements of a size of a lesion, may be more accurately performed over time by being taken at the same location. Furthermore, because the acquisition settings affect the image ultimately produced by the system and thus possibly the measurements obtainable from that image, the same or substantially the same acquisition parameters should be used in longitudinal studies. Thus, in accordance with the present disclosure, if a view (or image) matching the current image is found in the prior image data, the acquisition settings associated with that prior matching view are used to reconfigure the scanner 12 and acquire an updated current view (as shown in block 124).” (emphasis added) ([0029]).
In particular, WANG teaches that the neural network may be trained “to compute a similarity score…for each of a plurality of input image pairs.” ([0043]). Figure 5 demonstrates the neural network generating similarity scores, the similarity scores generated based on comparing the ultrasound image to a plurality of additional ultrasound images. In Figure 5, several “newly acquired” images and “prior images” are compared to one another to generate a score.
WANG also teaches selecting a target ultrasound image from the plurality of additional ultrasound images, the target ultrasound image selected based on the similarity scores. “[T]he neural network 228 may output a similarity score for any given pair of images and/or may additionally automatically output determination of a match if the similarity score exceed a certain value (e.g., a similarity score greater than 0.7 on a normalized scale of 0 to 1). In other examples, the neural network 228…may be configured simply to determine if there is a match or no match, and if a match is found to proceed to the automatic settings of parameters or if no match if found to continue searching through the database of previous images.” (emphasis added) ([0044]). To this end, WANG teaches a settings retrieval engine. “The controller 224 may be communicatively coupled to the settings retrieval engine 227 for the automatic setting of imaging parameters as described herein. The controller 224 may be…further configured to adjust the settings e.g., responsive to user input or to input(s) provided from the settings retrieval engine 227.” (emphasis added) ([0036]).
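For illustration only, the match-selection step quoted from [0044] (declare a match when a pair's similarity score exceeds a value such as 0.7, otherwise continue searching) can be sketched as below; the scoring function and the toy "images" are hypothetical stand-ins for WANG's trained neural network and image data.

```python
def find_matching_prior(current, priors, score_fn, threshold=0.7):
    # Illustrative sketch of WANG's selection logic ([0044]): score each
    # (current, prior) pair and return the first prior image whose
    # normalized similarity score exceeds the threshold; return None when
    # no match is found (the system would continue searching the database
    # or fall back to user-set parameters). score_fn stands in for the
    # neural network; it is not WANG's actual model.
    for prior in priors:
        if score_fn(current, prior) > threshold:
            return prior  # matched: its acquisition settings would be reused
    return None

# Hypothetical 1-D "images" and a toy similarity score in [0, 1].
score = lambda a, b: 1.0 - abs(a - b)
match = find_matching_prior(0.5, [0.0, 0.45, 0.9], score)
# match == 0.45 (its score of 0.95 exceeds 0.7; 0.0 scores only 0.5)
```

In WANG, the returned match's acquisition settings would then be passed to the settings retrieval engine to reconfigure the scanner ([0029], [0036]).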
It would have been obvious to one skilled in the art to modify the PELISSIER-MO system to generate similarity scores and identify a target image that is similar to the ultrasound image. One would have been motivated to identify the similar target image in order to retrieve the imaging parameters associated with the target image, as taught in WANG, so that consistent images may be acquired for analysis. There would have been a reasonable expectation of success because, as taught in WANG, newly-acquired and previously-acquired images can be compared and assigned a similarity metric.
With respect to claim 7, PELISSIER teaches, as discussed above, generating a further updated ultrasound image, the further updated ultrasound image having updated color-flow data based on the parameter values. After changing parameters, PELISSIER teaches displaying and checking the result. ([0045] and Abstract). PELISSIER’s claim 6 refers to this as a “fresh result.” The new selected parameters may be applied to the original RF data or to new RF data. ([0046]).
As discussed above, WANG teaches adapting the parameter values through the adaptation process by using parameter values from the target ultrasound image. “The controller 224 may be communicatively coupled to the settings retrieval engine 227 for the automatic setting of imaging parameters as described herein. The controller 224 may be…further configured to adjust the settings e.g., responsive to user input or to input(s) provided from the settings retrieval engine 227.” (emphasis added) ([0036]; see also [0029] as discussed above with respect to claim 6).
It would have been obvious to one skilled in the art to modify the PELISSIER-MO system to adapt the parameter values through the adaptation process by using parameter values from the target ultrasound image. One would have been motivated to use the parameter values from the selected target image in order to, as taught in WANG, achieve consistent images for subsequent analysis. There would have been a reasonable expectation of success because, as taught in WANG, newly-acquired and previously-acquired images can be compared and assigned a similarity metric.
With respect to claim 20, PELISSIER concerns an ultrasound machine that is configured for developing new modes for obtaining images or other useful information from ultrasound signals. (Abstract). “In some embodiments, apparatus 10 can be selectively operated either as a research platform, as described above, or as a diagnostic ultrasound machine.” ([0063]). PELISSIER teaches a method comprising receiving a first user selection that selects one or more processes from a process group. “A user can operate design mode application software to change the manner in which the RF data is processed to yield images or other useful information.” If the user finds the result unacceptable ([0044], [0045] and elements 68 and 69 in Figure 2), “then the user has the option of…modifying the data processing parameters 46 at block 72B or modifying transmit parameters 39 at block 72C.” ([0045]).
PELISSIER also teaches receiving a second user selection that selects feedback data from a data group. The user is enabled to use previously-acquired RF data, (see, e.g., [0046]: “without acquiring new RF data 24”), or to “acquire fresh RF data 24.” ([0046]). In addition, PELISSIER teaches that the “data processing functions” may access various sets of data, including raw RF data, pre-scan converted data, or post-scan converted data (Doppler data stream or an echo data stream). The user is able to interact with PELISSIER’s system by way of a suitable user interface 50. ([0024]).
PELISSIER teaches generating an ultrasound image, the ultrasound image having color-flow data. “Data processor 30 is also configured to process RF data from RF data memory area 28 to yield images that can be displayed on a display 40.” ([0021]; see also claims 1 and 3 in which the “result” includes an image that is displayed). PELISSIER also teaches generating an updated ultrasound image, the updated ultrasound image having updated color-flow data based on the parameter values. After changing parameters, PELISSIER teaches displaying and checking the result. ([0045] and Abstract). PELISSIER’s claim 6 refers to this as a “fresh result.” The newly selected parameters may be applied to the original RF data or to new RF data. ([0046]). NOTE: With respect to “color-flow data,” the data-processing functions may use a “Doppler data stream 85.” ([0062]). One example of computed information that may be obtained in PELISSIER includes “maximum blood flow velocity obtained by processing the RF data to yield information regarding the Doppler shift of reflected ultrasound signals.” ([0022]). Moreover, Applicant does not define “color-flow data.” Examiner is interpreting color-flow data to include information regarding “maximum blood flow velocity” and “the Doppler shift of reflected ultrasound signals.”
PELISSIER, however, does not explicitly teach that the feedback data is provided by a feedback loop based on one or more properties of ultrasound data from the one or more processes, nor does PELISSIER explicitly teach adapting, by the feedback loop, parameters of the one or more processes based on a cost function of the feedback data. Nevertheless, at least one goal of PELISSIER is to modify parameters to determine how the result is changed. “The user can readily test and modify the ultrasound mode until the user is satisfied with the results obtained.” ([0042]; see also, e.g., claim 6 in which the user can select “modified values” of “transmit parameters” to obtain “fresh RF data to produce a fresh result.”). PELISSIER also does not explicitly teach obtaining parameter values of the one or more processes from the process group and initializing an adaptation process with the parameter values. As discussed above, however, at least one goal of PELISSIER is to modify parameters to determine how the result is changed. (see, e.g., [0042] and claim 6).
In the same field of endeavor, LIN teaches a method for automatic image optimization in ultrasound imaging of an object. (Abstract). “The [optimization] technique includes an image quality cost function that dynamically enables optimization of image quality by adjusting ultrasound imaging parameters such as beamforming parameters and signal processing parameters.” ([0013]). LIN teaches the feedback data is provided by a feedback loop based on one or more properties of ultrasound data from the one or more processes. The feedback data in LIN is an image quality metric that is sought to be maximized. LIN teaches that “when the image quality metric has not been maximized, the entire process is repeated until a maximized image quality is reached.” (emphasis added) ([0016]; see also [0015] describing that the image quality metric is maximized when “a plot of successive image quality metrics stops increasing” or when “a threshold of maximal number of iterations allowed is reached.”).
LIN also teaches obtaining parameter values of the one or more processes from the process group and initializing an adaptation process with the values of the parameters.
The above limitations are taught by claim 1 of LIN. NOTE: The “feedback data” in LIN includes image quality factors of the ultrasound images. More specifically, claim 1 of LIN provides:
receiving a first set of electrical signals representing reflections of the first ultrasound signals from the object;
processing the first set of electrical signals into a first image;
evaluating an image quality cost function for the first image to produce a first image quality metric;
determining a second plurality of signal parameters based upon the first image quality metric; [i.e., obtaining parameter values of the one or more processes from the process group]
transmitting a second ultrasound signal into the object, the signal having the second plurality of signal parameters; [i.e., initializing an adaptation process with the parameter values]
receiving a second set of electrical signals representing reflections of the second ultrasound signal from the object;
processing the second set of electrical signals into a second image;
evaluating an image quality cost function for the second image to produce a second image quality metric;
comparing the first image quality metric and the second image quality metric to determine whether a maximized image quality metric has been reached; [NOTE: After initializing the adaptation process, the process of imaging and adjusting parameters continues until an image quality metric is maximized. “[W]hen the image quality metric has not been maximized, the entire process is repeated until a maximized image quality is reached” ([0016])]
assigning a plurality of signal parameters that produced the maximized image quality metric as optimum parameters;
imaging the object using an ultrasound signal having the optimum parameters; and
displaying a resulting image of the object based upon the ultrasound signal with the optimum parameters.
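For illustration, the closed-loop flow recited in LIN’s claim 1 and described at [0015]-[0016] may be sketched as follows. This is a hypothetical sketch only: the names `optimize`, `acquire`, `quality_metric`, and `adjust` are the undersigned’s illustrative stand-ins, and LIN discloses no source code.

```python
# Hypothetical sketch of LIN's iterative optimization (claim 1; [0015]-[0016]).
# acquire(), quality_metric(), and adjust() stand in for the scanner,
# the image quality cost function, and the parameter-update step.

def optimize(initial_params, acquire, quality_metric, adjust, max_iters=10):
    """Repeat imaging and parameter adjustment until the image quality
    metric stops increasing or the iteration cap is reached."""
    params = initial_params
    image = acquire(params)              # transmit, receive, form first image
    best_metric = quality_metric(image)  # evaluate the cost function
    for _ in range(max_iters):
        candidate = adjust(params, best_metric)  # next plurality of parameters
        image = acquire(candidate)               # acquire the next image
        metric = quality_metric(image)
        if metric <= best_metric:        # metric "stops increasing": maximized
            break
        params, best_metric = candidate, metric
    return params                        # the "optimum parameters"
```

With toy stand-ins (e.g., a single scalar parameter and a concave quality metric), the loop converges to the parameter value that maximizes the metric, mirroring LIN’s repeat-until-maximized behavior.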
It would have been obvious to one skilled in the art to modify the PELISSIER system by enabling the user to adapt the imaging parameters of the one or more processes based on a cost function of the feedback data as taught in LIN. One would have been motivated to modify the PELISSIER system because LIN’s technique provides “a convenient and efficient means for optimizing image quality.” ([0024]). There would have been a reasonable expectation of success because the “intelligent optimization algorithm” of LIN “may be implemented using a framework of decision tree or dynamic programming that is well known by experts in those areas.” ([0021]).
However, neither PELISSIER nor LIN explicitly teaches generating similarity scores, the similarity scores generated based on comparing the ultrasound image to a plurality of additional ultrasound images; selecting a target ultrasound image from the plurality of additional ultrasound images, the target ultrasound image selected based on the similarity scores; and adapting, by the feedback loop, the parameter values through the adaptation process by using parameter values from the target ultrasound image.
WANG discloses a system and method that uses an artificial neural network to retrieve imaging parameters. “Initially, the imaging system (e.g., scanner 12) loads default acquisition settings (as show in block 110), which may be a set of acquisition parameter settings determined (e.g., through optimization and/or in consultation with expert users) to be the preferred settings for a particular imaging mode (e.g., B-mode, M-mode, Doppler, etc.) and/or for a specific clinical imaging application (e.g., cardiac, breast, maternal-fetal imaging, etc.).” ([0027]).
“In order to suitably adapt the image acquisition for a specific patient, organ, or a specific disease, numerous imaging parameters of the imaging system may need to be set appropriately. These parameters are related to the transmission and reception of the ultrasound signals, the processing of the acquired signals, image reconstruction, image display, and image storage.” ([0002]). WANG teaches matching a current image with a prior image and then using the settings of the prior image. As WANG explains, “[t]he view matching process in effect determines a probability that any given prior image was taken with the probe oriented and beam directed toward the tissue the same manner as those of the current image, such that measurements of a region of interest, for example measurements of a size of a lesion, may be more accurately performed over time by being taken at the same location. Furthermore, because the acquisition settings affect the image ultimately produced by the system and thus possibly the measurements obtainable from that image, the same or substantially the same acquisition parameters should be used in longitudinal studies. Thus, in accordance with the present disclosure, if a view (or image) matching the current image is found in the prior image data, the acquisition settings associated with that prior matching view are used to reconfigure the scanner 12 and acquire an updated current view (as shown in block 124).” (emphasis added) ([0029]).
In particular, WANG teaches that the neural network may be trained “to compute a similarity score…for each of a plurality of input image pairs.” ([0043]). Figure 5 demonstrates the neural network generating similarity scores, the similarity scores generated based on comparing the ultrasound image to a plurality of additional ultrasound images. In Figure 5, several “newly acquired” images and “prior images” are compared to one another to generate a score.
WANG teaches selecting a target ultrasound image from the plurality of additional ultrasound images, the target ultrasound image selected based on the similarity scores. “[T]he neural network 228 may output a similarity score for any given pair of images and/or may additionally automatically output determination of a match if the similarity score exceed a certain value (e.g., a similarity score greater than 0.7 on a normalized scale of 0 to 1). In other examples, the neural network 228…may be configured simply to determine if there is a match or no match, and if a match is found to proceed to the automatic settings of parameters or if no match if found to continue searching through the database of previous images.” (emphasis added) ([0044]). To this end, WANG teaches a settings retrieval engine. “The controller 224 may be communicatively coupled to the settings retrieval engine 227 for the automatic setting of imaging parameters as described herein. The controller 224 may be…further configured to adjust the settings e.g., responsive to user input or to input(s) provided from the settings retrieval engine 227.” (emphasis added) ([0036]).
WANG also teaches adapting, by the feedback loop, the parameter values through the adaptation process by using parameter values from the target ultrasound image. “The controller 224 may be communicatively coupled to the settings retrieval engine 227 for the automatic setting of imaging parameters as described herein. The controller 224 may be…further configured to adjust the settings e.g., responsive to user input or to input(s) provided from the settings retrieval engine 227.” (emphasis added) ([0036]; see also [0029] as discussed above with respect to claim 6).
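The similarity-based settings retrieval that WANG describes at [0043]-[0044] may be sketched as follows. This is a hypothetical illustration: `similarity()` stands in for WANG’s trained neural network 228, and none of the function or variable names below come from WANG’s disclosure.

```python
# Hypothetical sketch of WANG's view matching and settings retrieval:
# score each (current, prior) image pair and, if the best score exceeds a
# threshold (e.g., 0.7 on a normalized 0-to-1 scale), reuse that prior
# image's acquisition settings to reconfigure the scanner.

def retrieve_settings(current, prior_db, similarity, threshold=0.7):
    """Return the settings of the best-matching prior image, or None if no
    pair scores above the threshold (i.e., no match is found)."""
    best_score, best_settings = threshold, None
    for prior_image, settings in prior_db:      # database of prior image data
        score = similarity(current, prior_image)
        if score > best_score:                  # match only above threshold
            best_score, best_settings = score, settings
    return best_settings
```

A `None` return corresponds to WANG’s no-match case, in which the system continues searching the database of previous images or retains its default acquisition settings.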
It would have been obvious to one skilled in the art to modify the PELISSIER-LIN system to select a target image based on similarity scores and adapt the parameter values, by the feedback loop, through the adaptation process by using parameter values from the target ultrasound image. One skilled in the art would be motivated to use WANG’s method to identify parameters after the first (or subsequent) image is acquired using pre-set imaging parameters, as taught in WANG and LIN. To be clear, the first image would be acquired using pre-set imaging parameters. These pre-set imaging parameters would initialize the adaptation process as taught in LIN. Alternatively, the parameter values after the first iteration would initialize the subsequent iteration in the adaptation process as taught in LIN. A second or subsequent image in the adaptation process would be acquired using imaging parameters retrieved from a matched target image (as taught in WANG) and the subsequent iteration would acquire an image using LIN’s automatic optimization technique. The final acquired image (i.e., the one with maximized image quality) could then be appended to the database of prior image data, as taught in WANG, for future use. “The newly acquired images and settings may be formatted similarly to the remaining entries in the database of prior image data so they can be appended thereto for future use in automatic reconfiguring of the scanner 12 during a subsequent imaging session.” ([0032]).
Claims 8-10 and 12-17 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent No. 6,162,176 to Washburn et al. (hereinafter “WASHBURN”), U.S. Patent Appl. Publ. No. 2010/0305441 A1 to Lin et al. (hereinafter “LIN”), and ARIETTA 65, Fujifilm 2021 (hereinafter “ARIETTA”).
With respect to claim 8, WASHBURN teaches an ultrasound system. “An ultrasound color flow imaging system is programmed to optimize display images of power and velocity by automatically adjusting thresholds and data compression by using histograms and samplings of color flow data.” (Abstract). WASHBURN’s system includes an ultrasound scanner configured to generate ultrasound images based on echoes of ultrasound signals transmitted by the ultrasound scanner into a subject at one or more anatomical targets of interest. Figure 1 shows a transducer array 2, a beamformer 4, a demodulator 6, and processors 8C, 8G, scan converter 14, video processor 16, and display monitor 18. “One aspect of the invention is useful in an ultrasound imaging system generating color flow signals in response to ultrasound signals backscattered from a subject under study.” (col. 2, lines 44-46). WASHBURN also teaches a display device (Figure 1, display monitor 18).
WASHBURN also discloses one or more processors (see, e.g., mid-processor 8C for color processing shown in Figures 1 and 2; mid-processor 8G shown in Figures 1 and 3; and master controller 26 having a CPU 30 shown in Figure 4) and one or more computer-readable storage media having instructions stored thereon (see, e.g., “[t]he CPU 30 has random access memory for storing routines used in…” (col. 7, lines 55-62)) that, responsive to execution by the one or more processors, cause the one or more processors to determine weights for regions of a color box associated with an ultrasound image. With respect to “[t]he Auto B/Color Priority Threshold Selection Algorithm part of the Auto Color Flow Post Processing Mode,” (col. 10, lines 31-32), WASHBURN teaches establishing an initial or baseline count of the number of B-mode holes. “Since the threshold is 100%, the baseline count is a minimum count. Then the B/color priority threshold is incrementally lowered, and the filter is again applied each time for each new B/color priority threshold, establishing a new count of the B-mode ‘holes.’ This process of lowering the threshold and establishing a new count is continued until the number of B-mode ‘holes’ is some factor k greater than the minimum number of baseline B-mode ‘holes’ in the baseline count. Factor k is preset differently as the user selects tissue type and flow type via keyboard 29. Preferably the baseline count at the 100% threshold is equal to or greater than 1. The factor k can be an offset or a multiple of the baseline count.” (col. 11, lines 2-13).
WASHBURN does not teach one or more processors that are configured to determine a cost function associated with the ultrasound image; evaluate the cost function associated with the ultrasound image, the evaluation of the cost function based on feedback data from a feedback loop based on one or more properties of ultrasound data from one or more processes; determine values for parameters based on the evaluation; initialize an adaptation process with the values of the parameters; and generate, based on the values of the parameters, an additional ultrasound image with color-flow data.
In the same field of endeavor, LIN teaches a method for automatic image optimization in ultrasound imaging of an object. (Abstract). “The [optimization] technique includes an image quality cost function that dynamically enables optimization of image quality by adjusting ultrasound imaging parameters such as beamforming parameters and signal processing parameters.” ([0013]). LIN teaches determining a cost function associated with the ultrasound image. “In general, the cost function may be any function of image quality factors, representing the complicated relations within different quality factors.” ([0017]). “An input may affect multiple outputs in a complicated manner. In a particular embodiment, a higher transmit frequency may improve spatial resolution, but reduce SNR, penetration, and frame rate. In another embodiment, increasing an aperture size may improve the spatial resolution, but results in degradation of image uniformity and reduction in frame rate.” ([0019]).
LIN also teaches evaluating the cost function associated with the ultrasound image, wherein the evaluation of the cost function is based on feedback data from a feedback loop based on one or more properties of ultrasound data from one or more processes; determining values for parameters based on the evaluation; initializing an adaptation process with the values of the parameters; and generating, based on the values of the parameters, an additional ultrasound image with color-flow data.
The above limitations are taught by claim 1 of LIN. NOTE: The “feedback data” in LIN includes image quality factors of the ultrasound images. More specifically, claim 1 of LIN provides:
receiving a first set of electrical signals representing reflections of the first ultrasound signals from the object;
processing the first set of electrical signals into a first image;
evaluating an image quality cost function for the first image to produce a first image quality metric; [i.e., evaluating the cost function associated with the ultrasound image, wherein the evaluation of the cost function is based on feedback data from a feedback loop based on one or more properties of ultrasound data from one or more processes]
determining a second plurality of signal parameters based upon the first image quality metric; [i.e., determining values for parameters based on the evaluation]
transmitting a second ultrasound signal into the object, the signal having the second plurality of signal parameters; [i.e., initializing an adaptation process with the values of the parameters]
receiving a second set of electrical signals representing reflections of the second ultrasound signal from the object;
processing the second set of electrical signals into a second image;
evaluating an image quality cost function for the second image to produce a second image quality metric;
comparing the first image quality metric and the second image quality metric to determine whether a maximized image quality metric has been reached; [NOTE: After initializing the adaptation process, the process of imaging and adjusting parameters continues until an image quality metric is maximized. “[W]hen the image quality metric has not been maximized, the entire process is repeated until a maximized image quality is reached” ([0016])]
assigning a plurality of signal parameters that produced the maximized image quality metric as optimum parameters;
imaging the object using an ultrasound signal having the optimum parameters; and
displaying a resulting image of the object based upon the ultrasound signal with the optimum parameters [i.e., generating, based on the values of the parameters, an additional ultrasound image with color-flow data].
It would have been obvious to one skilled in the art to modify the WASHBURN system by enabling the user to adapt the imaging parameters of the one or more processes based on a cost function of the feedback data as taught in LIN. One would have been motivated to modify the WASHBURN system because LIN’s technique provides “a convenient and efficient means for optimizing image quality.” ([0024]). There would have been a reasonable expectation of success because the “intelligent optimization algorithm” of LIN “may be implemented using a framework of decision tree or dynamic programming that is well known by experts in those areas.” ([0021]).
Neither WASHBURN nor LIN teaches that the display device is configured to display a user interface including an ultrasound-control panel, an ultrasound-image panel, and a process-selection panel.
In the same field of endeavor, ARIETTA 65 teaches a user interface having multiple panels configured for “reducing examiner fatigue and facilitating examinations in a variety of clinical settings.” (p.2). As shown on page 3 (left side), the user interface includes a primary monitor and a secondary monitor. The secondary monitor teaches an ultrasound-control panel. For example, on page 6, under “Easy Operation,” the secondary monitor provides touch-sensitive controls, including “TGC sliders,” which the reference states “makes it easier to customize imaging parameters.”
The primary monitor teaches an ultrasound-image panel and a process-selection panel. (top of page 7, under “Protocol Assistant”). More specifically, the right portion of the primary monitor includes an ultrasound image, and the left side provides a list of options to the user. The Protocol Assistant “[p]rompts you through the exam following your previously registered protocols and automatically prepares the next tool or window as dictated for each step in the exam.” (Id.).
It would have been obvious to one skilled in the art to modify the system of WASHBURN to include a user interface having multiple panels as recited in claim 8. One would have been motivated to add the multiple panels to reduce “examiner fatigue and facilitat[e] examinations in a variety of clinical settings.” (p.2). There would have been a reasonable expectation of success because, as taught in ARIETTA, user interfaces can display multiple panels.
With respect to claim 9, ARIETTA teaches wherein the ultrasound-control panel includes selectable controls for gain (see, e.g., Auto Optimizer on page 7, which provides “gain adjustment”), depth (controls for depth are an inherent feature of ultrasound imaging systems), and image-saving (see, e.g., Protocol Assistant on page 7: “This significantly reduces keystrokes and prevents duplications or omissions as you store images, take measurements, and add body marks or annotations.”).
It would have been obvious to one skilled in the art to modify the system of WASHBURN to include a user interface having multiple panels as taught in ARIETTA, including a panel with selectable controls. One would have been motivated to add the multiple panels to reduce “examiner fatigue and facilitat[e] examinations in a variety of clinical settings.” (p.2). There would have been a reasonable expectation of success because, as taught in ARIETTA, user interfaces can display multiple panels.
With respect to examination presets and configuration-storage, these are well-known features of ultrasound imaging systems. (See, e.g., [0004] of LIN: “A common practice is to pre-set imaging parameters for each ultrasound probe and each clinical application.”)
With respect to claim 10, ARIETTA teaches wherein the ultrasound-image panel displays ultrasound images. (see, e.g., primary monitor on page 7).
It would have been obvious to one skilled in the art to modify the system of WASHBURN to include a user interface having multiple panels as taught in ARIETTA, including a panel that displays ultrasound images. One would have been motivated to add the multiple panels to reduce “examiner fatigue and facilitat[e] examinations in a variety of clinical settings.” (p.2). There would have been a reasonable expectation of success because, as taught in ARIETTA, user interfaces can display multiple panels.
With respect to claim 12, WASHBURN and LIN teach that the weights are determined based on a cost function of the weights. After determining first and second quality metrics in LIN, “[t]he processor further compares the first image quality metric and the second image quality metric to determine whether a maximized image quality metric has been reached. Multiple parameters that produce a maximized image quality metric are assigned as optimum parameters.” (Abstract). LIN would necessarily determine the weights based on the cost function of the weights.
With respect to claim 13, WASHBURN and LIN teach that the cost function associated with the ultrasound image is determined based on ultrasound data associated with the ultrasound image or on color-image data generated from the ultrasound data. “The image optimization problem is transformed to a mathematical problem of maximizing a cost function.” ([0017]). “Various input-output relations exist in terms of ultrasound image optimization. In a particular embodiment, varying an input (imaging parameter) may affect multiple outputs (image quality factors).” ([0018]). Claim 5 of LIN recites computing an image quality cost function in which “the image quality cost function [is] defined as a weighted sum of a plurality of image quality factors.” In other words, the cost function associated with the ultrasound image is determined based on ultrasound data associated with the ultrasound image.
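The weighted-sum cost function recited in LIN’s claim 5 may be illustrated as follows. This is a hypothetical sketch only: the factor names and weights below are examples chosen by the undersigned, not values disclosed in LIN.

```python
# Hypothetical illustration of LIN claim 5's cost function, "a weighted sum
# of a plurality of image quality factors." Factor names are examples only.

def image_quality_cost(factors, weights):
    """Weighted sum over named image quality factors (e.g., resolution,
    SNR, penetration, frame rate)."""
    return sum(weights[name] * value for name, value in factors.items())
```

Consistent with LIN’s [0018]-[0019], a single imaging parameter (e.g., transmit frequency) can move several of these factors in opposite directions, which is why the factors are combined into one scalar cost to be maximized.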
With respect to claim 14, LIN teaches wherein the cost function associated with the ultrasound image is evaluated over multiple values of parameters associated with the process group. “The image optimization problem is transformed to a mathematical problem of maximizing a cost function.” ([0017]). “Various input-output relations exist in terms of ultrasound image optimization. In a particular embodiment, varying an input (imaging parameter) may affect multiple outputs (image quality factors).” ([0018]). Different images are acquired based on different signal parameters (first signal parameters vs. second signal parameters). (Abstract). The image quality metric for each image is determined, and the metrics are then compared. In other words, the cost function is evaluated over multiple values of parameters associated with a process group.
With respect to claim 15, WASHBURN and LIN teach that the parameters are used to generate the ultrasound data associated with the ultrasound image. “The method 80 includes transmitting a first ultrasound signal into the object in step 82, wherein the signal has a first set of signal parameters. In a particular embodiment, the ultrasound signal is transmitted via an ultrasound transducer. A first set of electrical signals representing reflections of the first ultrasound signals are received from the object in step 84. The first set of electrical signals are processed into a first image in step 86.” (emphasis added).
With respect to claim 16, WASHBURN and LIN teach that the values for the parameters are determined sequentially. After using first signal parameters to acquire a first image, “[t]he method evaluates an image quality cost function for the first image to produce a first image quality metric and determines a second plurality of signal parameters based upon the first image quality metric.” (Abstract of LIN).
With respect to claim 17, WASHBURN and LIN teach that the values for the parameters are determined simultaneously, the values are based on a vector, and the vector includes two or more parameter values. The cost function in LIN includes a “vector X” that represents image parameters that affect image quality. ([0017]). See also, regarding the first and second signal parameters: “In one embodiment, the first set of signal parameters and the second set of signal parameters include multiple beamforming parameters and multiple image processing parameters.” ([0021]).
Claims 18 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent No. 6,162,176 to Washburn et al. (hereinafter “WASHBURN”), U.S. Patent Appl. Publ. No. 2010/0305441 A1 to Lin et al. (hereinafter “LIN”), and ARIETTA 65, Fujifilm 2021 (hereinafter “ARIETTA”) as applied to claim 8 above, and further in view of U.S. Patent Appl. Publ. No. 2021/0093301 A1 to Wang et al. (hereinafter “WANG”).
With respect to claim 18, LIN teaches initializing the adaptation process with the parameter values as discussed above with respect to claim 8.
However, none of WASHBURN, LIN, or ARIETTA teaches generating similarity scores, the similarity scores generated based on comparing the ultrasound image to a plurality of additional ultrasound images; selecting a target ultrasound image from the plurality of additional ultrasound images, the target ultrasound image selected based on the similarity scores; and obtaining parameter values of one or more processes from a process group.
WANG discloses a system and method that uses an artificial neural network to retrieve imaging parameters. “Initially, the imaging system (e.g., scanner 12) loads default acquisition settings (as show in block 110), which may be a set of acquisition parameter settings determined (e.g., through optimization and/or in consultation with expert users) to be the preferred settings for a particular imaging mode (e.g., B-mode, M-mode, Doppler, etc.) and/or for a specific clinical imaging application (e.g., cardiac, breast, maternal-fetal imaging, etc.).” ([0027]).
“In order to suitably adapt the image acquisition for a specific patient, organ, or a specific disease, numerous imaging parameters of the imaging system may need to be set appropriately. These parameters are related to the transmission and reception of the ultrasound signals, the processing of the acquired signals, image reconstruction, image display, and image storage.” ([0002]). WANG teaches matching a current image with a prior image and then using the settings of the prior image. As WANG explains, “[t]he view matching process in effect determines a probability that any given prior image was taken with the probe oriented and beam directed toward the tissue the same manner as those of the current image, such that measurements of a region of interest, for example measurements of a size of a lesion, may be more accurately performed over time by being taken at the same location. Furthermore, because the acquisition settings affect the image ultimately produced by the system and thus possibly the measurements obtainable from that image, the same or substantially the same acquisition parameters should be used in longitudinal studies. Thus, in accordance with the present disclosure, if a view (or image) matching the current image is found in the prior image data, the acquisition settings associated with that prior matching view are used to reconfigure the scanner 12 and acquire an updated current view (as shown in block 124).” (emphasis added) ([0029]).
In particular, WANG teaches that the neural network may be trained “to compute a similarity score…for each of a plurality of input image pairs.” ([0043]). Figure 5 demonstrates the neural network generating similarity scores, the similarity scores generated based on comparing the ultrasound image to a plurality of additional ultrasound images. In Figure 5, several “newly acquired” images and “prior images” are compared to one another to generate a score.
WANG teaches selecting a target ultrasound image from the plurality of additional ultrasound images, the target ultrasound image selected based on the similarity scores and obtaining parameter values of one or more processes from a process group. “[T]he neural network 228 may output a similarity score for any given pair of images and/or may additionally automatically output determination of a match if the similarity score exceed a certain value (e.g., a similarity score greater than 0.7 on a normalized scale of 0 to 1). In other examples, the neural network 228…may be configured simply to determine if there is a match or no match, and if a match is found to proceed to the automatic settings of parameters or if no match if found to continue searching through the database of previous images.” (emphasis added) ([0044]). To this end, WANG teaches a settings retrieval engine. “The controller 224 may be communicatively coupled to the settings retrieval engine 227 for the automatic setting of imaging parameters as described herein. The controller 224 may be…further configured to adjust the settings e.g., responsive to user input or to input(s) provided from the settings retrieval engine 227.” (emphasis added) ([0036]).
It would have been obvious to one skilled in the art to modify the WASHBURN-LIN system to generate similarity scores and identify a target image that is similar to the ultrasound image. One skilled in the art would be motivated to use WANG’s method to identify parameters after the first (or subsequent) image is acquired using pre-set imaging parameters, as taught in WANG and LIN. To be clear, the first image would be acquired using pre-set imaging parameters. These pre-set imaging parameters would initialize the adaptation process as taught in LIN. Alternatively, the parameter values after the first iteration would initialize the subsequent iteration in the adaptation process as taught in LIN. A second or subsequent image in the adaptation process would be acquired using imaging parameters retrieved from a matched target image (as taught in WANG) and the subsequent iteration would acquire an image using LIN’s automatic optimization technique. The final acquired image (i.e., the one with maximized image quality) could then be appended to the database of prior image data, as taught in WANG, for future use. “The newly acquired images and settings may be formatted similarly to the remaining entries in the database of prior image data so they can be appended thereto for future use in automatic reconfiguring of the scanner 12 during a subsequent imaging session.” ([0032]).
With respect to claim 19, the combined teachings of WASHBURN and LIN teach generating an updated ultrasound image, the updated ultrasound image having updated color-flow data based on the parameter values. As discussed with respect to claim 8 above, LIN’s claim 1 teaches:
imaging the object using an ultrasound signal having the optimum parameters; and
displaying a resulting image of the object based upon the ultrasound signal with the optimum parameters [i.e., generating, based on the values of the parameters, an additional ultrasound image with color-flow data].
It would have been obvious to one skilled in the art to modify the WASHBURN system to generate an updated ultrasound image, the updated ultrasound image having updated color-flow data based on the parameter values. One would have been motivated to modify the WASHBURN system because LIN’s technique provides “a convenient and efficient means for optimizing image quality” ([0024]) and one would desire to view the updated images. There would have been a reasonable expectation of success because LIN teaches that images may be updated with new data.
However, none of WASHBURN, LIN, or ARIETTA teach adapting the parameter values through the adaptation process by using parameter values from the target ultrasound image. WANG teaches retrieving the settings from images that match with the current image. “The controller 224 may be communicatively coupled to the settings retrieval engine 227 for the automatic setting of imaging parameters as described herein. The controller 224 may be…further configured to adjust the settings e.g., responsive to user input or to input(s) provided from the settings retrieval engine 227.” (emphasis added) ([0036]; see also [0029] as discussed above with respect to claim 6).
It would have been obvious to one skilled in the art to modify the WASHBURN-LIN system to adapt the parameter values through the adaptation process by using parameter values from the target ultrasound image. One skilled in the art would be motivated to use WANG’s method to identify parameters after the first (or subsequent) image is acquired using pre-set imaging parameters, as taught in WANG and LIN. To be clear, the first image would be acquired using pre-set imaging parameters. These pre-set imaging parameters would initialize the adaptation process as taught in LIN. Alternatively, the parameter values after the first iteration would initialize the subsequent iteration in the adaptation process as taught in LIN. A second or subsequent image in the adaptation process would be acquired using imaging parameters retrieved from a matched target image (as taught in WANG) and the subsequent iteration would acquire an image using LIN’s automatic optimization technique. The final acquired image (i.e., the one with maximized image quality) could then be appended to the database of prior image data, as taught in WANG, for future use. “The newly acquired images and settings may be formatted similarly to the remaining entries in the database of prior image data so they can be appended thereto for future use in automatic reconfiguring of the scanner 12 during a subsequent imaging session.” ([0032]).
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent No. 6,162,176 to Washburn et al. (hereinafter “WASHBURN”) and U.S. Patent Appl. Publ. No. 2010/0305441 A1 to Lin et al. (hereinafter “LIN”) and ARIETTA 65, Fujifilm 2021 as applied to claim 8 above, and further in view of U.S. Patent No. 5,161,535 to Short et al. (hereinafter “SHORT”).
With respect to claim 11, none of the cited art teaches wherein the process-selection panel includes selectable controls for the process group and the data group, and the selectable controls are configured to cause, responsive to selection of the selectable controls, the display device to display the process group panel and the data group panel.
However, in the same field of endeavor, SHORT teaches “[a]n ultrasound imaging system and method for controlling the system” in which the system has “a control panel which includes menu items divided into system mode menu items for selecting a system mode, and control set menu items for selecting functions corresponding to a selected system mode menu item.” (Abstract). As described in the Background section, SHORT is concerned with a selection process in which the user interface provides options to the user after the user makes a selection. (see, e.g., col. 1, lines 44-57). SHORT teaches displaying several categories or “modes” that are always available for selection. (Abstract). Once a mode is selected, a separate menu of items for the selected mode is displayed. (Id., see also col. 1, line 66 to col. 2, line 8).
Figures 2-5, 7, and 8 illustrate different plan views of a control panel. Each plan view shows an arrangement of icons that appears in response to the user making a selection. (see, e.g., col. 3, lines 5-7 in which Figures 7(a) through 7(e) include “plan views of the control panel…displaying menu changes in response to ultrasound system mode selections.”) (emphasis added). For example, the view of Figure 4(a) appears after the user selects “COLOR” to indicate the “color flow imaging mode” menu. (col. 5, lines 26-31). In response to the user selecting “VELOCITY TAG” in the color flow imaging mode, the view of Figure 4(b) will appear providing options regarding the velocity, “TAG 1” or “TAG 2.” (col. 5, lines 51-65).
It would have been obvious to one skilled in the art to modify the process-selection panel to include selectable controls for the process group and the data group, with the selectable controls configured to cause, responsive to selection of the selectable controls, the display device to display the process group panel and the data group panel. One would have been motivated to add such a user interface, as described in SHORT, that is responsive to user selections regarding ultrasound options and includes such selectable controls. More specifically, one would have been motivated to provide all of the available options (i.e., parameters) that are possible for the selected process(es). There would be a reasonable expectation of success because such user interfaces are known, as taught in SHORT.
RESPONSE TO APPLICANT’S ARGUMENTS
Claims 1-7
Applicant’s arguments with respect to claims 1-7 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. More specifically, newly cited reference MO is relied upon for teaching the claim limitations added by Applicant in the Response dated October 8, 2025.
Claims 8-19
With respect to claims 8-19, Applicant argues, at pages 14-15 of the Remarks, that WASHBURN and/or LIN do not describe “the evaluation of the cost function based on feedback data from a feedback loop based on one or more properties of ultrasound data from the one or more processes; determine values for parameters based on the evaluation; initialize an adaptation process with the determined values for parameters; and generate, based on the adaptation process, an additional ultrasound image with color-flow data.”
Examiner respectfully disagrees. LIN teaches each of these claim limitations. First, LIN teaches evaluating a cost function based on feedback data from a feedback loop based on one or more properties of ultrasound data. Claim 1 teaches “evaluating an image quality cost function for the first image to produce a first image quality metric.” Notably, this image quality metric is the feedback data.
Second, LIN teaches determining values for parameters based on the evaluation. Again, claim 1 teaches “determining a second plurality of signal parameters based upon the first image quality metric.”
Third, LIN teaches initializing an adaptation process with the values of the parameters. Here, the initialization of the adaptation process is the start of the second evaluation. While claim 1 only describes a single loop, LIN further teaches that “when the image quality metric has not been maximized, the entire process is repeated until a maximized image quality is reached.” (emphasis added) ([0016]; see also [0015] describing that the image quality metric is maximized when “a plot of successive image quality metrics stops increasing” or when “a threshold of maximal number of iterations allowed is reached.”).
Fourth, LIN teaches generating, based on the values of the parameters, an additional ultrasound image with color-flow data. According to claim 1: “imaging the object using an ultrasound signal having the optimum parameters; and displaying a resulting image of the object based upon the ultrasound signal with the optimum parameters.”
Applicant also argues, at page 15 of the Remarks, that there is no suggestion or motivation to combine these references because WASHBURN optimizes downstream display settings whereas the claimed system, like LIN, performs upstream optimization. Even if this is true, there is no reason why one skilled in the art would be discouraged from applying LIN’s upstream image optimization method in addition to WASHBURN’s downstream optimization method. In fact, LIN suggests that both can be applied. “Automatic gain optimization has also been widely implemented in ultrasound scanners. The acquired images are analyzed and local amplitude is adjusted to obtain optimal image brightness, contrast, and uniformity. However, such a technique only addresses a portion of the image optimization issues and does not account for fundamental beamforming parameters, such as, for example, frequency, aperture size, which are also critical to image quality.” (emphasis added) ([0005]). One having ordinary skill in the art would be motivated to use LIN’s technique because the technique “dynamically enables optimization of image quality by adjusting ultrasound imaging parameters such as beamforming parameters and signal processing parameters.” ([0013]).
Claim 20
With respect to claim 20, Applicant argues, at pages 16-17 of the Remarks, that the references do not describe that “the feedback data is provided by a feedback loop based on one or more properties of ultrasound data from the one or more processes” and “adapting, by the feedback loop, the parameter values through the adaptation process by using parameter values from the target ultrasound image.”
Examiner respectfully disagrees. With respect to the “the feedback data” limitation, Applicant only provides a conclusory statement that PELISSIER, LIN, and/or WANG do not teach this limitation. As discussed above, LIN teaches using an image quality metric as feedback data that is maximized in order to determine optimal imaging parameters.
With respect to the “adapting, by the feedback loop” limitation, Applicant only provides a conclusory statement that PELISSIER, LIN, and/or WANG do not teach this limitation. However, as described above, WANG teaches adapting, by the feedback loop, the parameter values through the adaptation process by using parameter values from the target ultrasound image. “The controller 224 may be communicatively coupled to the settings retrieval engine 227 for the automatic setting of imaging parameters as described herein. The controller 224 may be…further configured to adjust the settings e.g., responsive to user input or to input(s) provided from the settings retrieval engine 227.” (emphasis added) ([0036]; see also [0029] as discussed above with respect to claim 6).
Lastly, Applicant argues that there is no suggestion or motivation to combine these references. In particular, Applicant argues that “Wang's method of replacing settings is an alternative to Lin's cost-function optimization, not a complementary feature.” (page 17 of Remarks). Examiner respectfully disagrees. One skilled in the art would be motivated to use WANG’s method to identify parameters after the first (or subsequent) image is acquired using pre-set imaging parameters, as taught in WANG and LIN. To be clear, the first image would be acquired using pre-set imaging parameters. These pre-set imaging parameters would initialize the adaptation process as taught in LIN. Alternatively, the parameter values after the first iteration would initialize the subsequent iteration in the adaptation process as taught in LIN. A second or subsequent image in the adaptation process would be acquired using imaging parameters retrieved from a matched target image (as taught in WANG) and the subsequent iteration would acquire an image using LIN’s automatic optimization technique. The final acquired image (i.e., the one with maximized image quality) could then be appended to the database of prior image data, as taught in WANG, for future use. “The newly acquired images and settings may be formatted similarly to the remaining entries in the database of prior image data so they can be appended thereto for future use in automatic reconfiguring of the scanner 12 during a subsequent imaging session.” ([0032]).
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JASON P GROSS whose telephone number is (571)272-1386. The examiner can normally be reached Monday-Friday 9:00-5:00 CT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne M. Kozak can be reached at (571) 270-5284. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JASON P GROSS/Examiner, Art Unit 3797
/JOSEPH M SANTOS RODRIGUEZ/Primary Examiner, Art Unit 3797