DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 10/01/2025 was filed in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Drawings
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they include the following reference character(s) not mentioned in the description:
FIG. 1: Although this figure includes the labels 120 and 122, these labels do not appear in the specification.
FIG. 3: Although this figure includes the labels 300 and 306, these labels do not appear in the specification.
FIG. 6: Although this figure includes the label 602, this label does not appear in the specification. The examiner notes that paragraph [0069] recites “Additionally, this right-hand image is further enhanced to display annotations and includes indications of where the patient’s ribs are located 620, the pleural line 604 of the patient and an A line 608 within the lung of the patient as well”. The examiner believes the “602” in FIG. 6 is intended to be “620” as disclosed in [0069]. If this is accurate, the examiner recommends updating either the drawings or the specification accordingly.
FIG. 11C: Although this figure includes the label 1114, this label does not appear in the specification.
FIG. 13: Although this figure includes the label 1304, this label does not appear in the specification.
Corrected drawing sheets in compliance with 37 CFR 1.121(d), or amendment to the specification to add the reference character(s) in the description in compliance with 37 CFR 1.121(b) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Specification
The disclosure is objected to because of the following informalities:
[0018]: As written it reads “The system and methods described herein further provide, in one aspect, ultrasound imaging devices, whether portable, wearable, cart-based, PMUT, CMUT or otherwise, have controllers that have a real-time enhanced image display mode where the system may continuously sweep back and forth in the elevational dimension”. However, this is the first indication of the acronym PMUT; therefore, the term should be spelled out to provide clarity.
[0048]: As written it reads “In operation, the transducer head 106 detects ultrasonic waves returning from the patient and these waves may be processed by processing circuitry formed on the same chip as the transducers, a signal processor, a CPU, an FPGA, or any suitable type of processing device, or any combination thereof, which may process the returned ultrasound waves to construct image data”. However, these are the first indications of the acronyms CPU and FPGA; therefore, the terms should be spelled out to provide clarity.
[0052]: As written it reads “Additionally, as described below, the user interface may allow the clinician to select an auto fan mode that will sequentially present a portion of the images to simulate viewing the anatomical area as fi the clinician were manually fanning or rocking the probe during imaging”. However, to correct the typo, “fi” should be “if”.
[0055]: As written it reads “In one embodiment, the array 230 includes or is connected to an ASIC or microcontroller that can collect operating parameters from the control module and load then into the ASIC in a manner that will allow the parameters to control operation of the individual elements of the transducer array 230”. However, this is the first instance of the acronym “ASIC”; therefore, the term should be spelled out to provide clarity.
[0057]: As written it reads “The beamforming process implemented by module 219 may include applying different timing/phase delays to the transmitted and received ultrasound waves/data from different portions of the ultrasound transducer array 230 such that there are different delays for different portions of the ultrasound transducer array 230 such that there are different delays for different elevational rows, where a row refers to the transducer elements spaced along an line extending along the long axis of the ultrasound transducer array”. However, to correct the typo, “an” should be “a”.
[0068]: As written it reads “Again, although FIG. 5 presents the sweep/slice mode moving images as still images it is to be understood by those of skill in the art that all three are moving images, like gifs, or bouncing live images of an smartphone display, and made from a composite of slices taken by simulated rocking of the transducer back and forth by about 20 degrees about the center angle”. However, to be grammatically correct, “an” should be “a”.
[0081]: As written it reads “In particular. In Figure 12A, the border of the target organ 1210 is outlined to indicate where the ultrasound device would determine the cross-sectional shape of the target organ is at the point”. However, the examiner believes that “particular. In” should be “particular, in”.
Appropriate correction is required.
Claim Objections
Claims 2-11, 13-14 and 16 are objected to because of the following informalities:
Regarding claims 2-10, as written the preambles read “The system of claim […]”. However, the examiner believes that they should read “The ultrasound system of claim […]” as set forth in claim 1 on which these claims depend.
Regarding claims 3 and 4, as written they read “The system of claim 1 wherein the image processor analyzes the ultrasound image data stored in memory to apply a polar coordinate reference frame to the ultrasound image data and converts the ultrasound image data into image data in cartesian coordinates” (Claim 3) “The system of claim 3 where the image processor applying a polar coordinate reference frame comprises comparing an image angle of each image and a distance from the array of each element identified in the image and comparing the corresponding values in each image in the series of images” (Claim 4). However, claim 4 depends on claim 3, which ultimately depends on claim 1. The examiner notes that claim 1 does not involve identifying elements within the image; rather, this concept is discussed within claim 2. Therefore, to correct the antecedent basis issue, the examiner would recommend updating the dependency of claim 3 to be dependent from claim 2.
Regarding claim 11, as written it reads “a handheld ultrasound imaging device including […] a processor […] the processor of the ultrasound imaging device processes”. The examiner believes that “the ultrasound imaging device” is referring to the “handheld ultrasound imaging device”. If this assumption is correct, the examiner would recommend amending the claim to recite “the handheld ultrasound imaging device” to maintain proper antecedent basis.
Regarding claims 13 and 14, as written they read “The method of claim 11 where processing the ultrasound image data comprises applying a polar coordinate reference frame to the ultrasound image data and converting the ultrasound image data into image data in cartesian coordinates” (Claim 13) “The method of claim 13 where applying a polar coordinate reference frame comprises comparing an imaging angle of each image and a distance from the array of each element identified in the image and comparing the corresponding values in each image in the series of ultrasound images” (Claim 14). However, claim 14 depends on claim 13, which ultimately depends on claim 11. The examiner notes that claim 11 does not involve identifying elements within the image; rather, this concept is discussed within claim 12. Therefore, to correct the antecedent basis issue, the examiner would recommend updating the dependency of claim 13 to be dependent from claim 12.
Regarding claim 16, the claim reads “The method of claim 15 where the image processor displays the key frame as a default still image, or as a central frame in the displayed video or cine”. However, this is the first indication of the term “the displayed video or cine” within the set of method claims. The examiner notes that this limitation is similar to that of claim 6 which recites “a displayed video or cine”. The examiner would recommend amending claim 16 to recite “a displayed video or cine” instead of “the displayed video or cine” to avoid any potential antecedent basis issues.
Appropriate correction is required.
Claim Rejections - 35 USC § 112
Claims 1-20 are rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor regards as the invention.
Regarding claim 1, as written it reads “a handheld computing device coupled to the handheld ultrasound imaging device having a memory capable of storing ultrasound image data, and wherein the handheld ultrasound imaging device is configured to transmit ultrasound image data to the handheld computing device and store the data in the memory”. However, when stated in this way, it is unclear whether 1) the handheld ultrasound imaging device includes the memory, 2) the handheld computing device includes the memory or 3) whether each of the handheld ultrasound imaging device and the handheld computing device include separate memories. The examiner recommends amending the claim to clarify which of the components (i.e. handheld ultrasound imaging device, handheld computing device or both) include a memory.
Additionally, there is a lack of antecedent basis for the term “the data” within the claim. The examiner believes it is intended to refer to the “ultrasound image data”. If this assumption is correct, the examiner would recommend updating the claim language accordingly.
Regarding claims 2-10, due to their dependence on claim 1, either directly or indirectly, these claims are subject to the reasoning provided therein. Thus, these claims are subject to rejection under 35 U.S.C. 112(b) for the reasons stated above.
Regarding claim 8, this claim reads “The system of claim 1 wherein the image processor is configured to join together the series of ultrasound images taken along an elevational direction of the 2D array to reconstruct the three-dimensional (3D) model of the imaging target as a model having data of the imaging target throughout a three-dimensional space”. However, it is unclear whether the “elevational direction of the 2D array” in claim 8 is the same as or different from the elevational direction recited in claim 1 (i.e. an elevational direction of the 2D array, claim 1, line 5). The examiner recommends clarifying whether these two elevational directions are the same or different from each other. If they are the same, the examiner would recommend amending claim 8 to recite “the elevational direction of the 2D array” in order to maintain proper antecedent basis.
Regarding claim 11, this claim reads “a handheld ultrasound imaging device including a two-dimensional (2D) array of micromachined ultrasound transducers (MUTs), a processor capable of controlling the array, and a handheld computing device including an image processor, where the handheld ultrasound imaging device takes a series of ultrasound images along an elevational dimension of the 2D array where each image in the series of ultrasound images has a different angle of imaging relative to an axis parallel to the elevational dimension of the array by beam steering ultrasonic signals produced by the MUTs, and the processor of the ultrasound imaging device processes each image in the series of ultrasound images to generate ultrasound image data and transmits the ultrasound image data to the handheld computing device, and where the image processor processes the ultrasound image data to generate a three-dimensional (3D) model of elements imaged by the series of ultrasound images, and displays the ultrasound image data or the 3D model to the user”.
However, the above limitations appear to describe the device used to perform the process, which creates confusion as to when direct infringement occurs. A single claim which claims both an apparatus and the method steps of using the apparatus is indefinite under 35 U.S.C. 112, second paragraph, see In re Katz Interactive Call Processing Patent Litigation, 639 F.3d 1303 (Fed. Cir. 2011). It is unclear "whether infringement … occurs when one creates a system that allows the user [to use the input means], or whether infringement occurs when the user actually uses the input means." See IPXL Holdings v. Amazon.com, Inc., 430 F.3d 1377, 1384, 77 USPQ2d 1140, 1145 (Fed. Cir. 2005). See MPEP 2173.05(p).
It appears that the method steps are: 1) take a series of ultrasound images along an elevational dimension of the 2D array where each image in the series of ultrasound images has a different angle of imaging relative to an axis parallel to the elevational dimension of the array by beam steering ultrasonic signals produced by the MUTs; 2) process each image in the series of ultrasound images to generate ultrasound image data and transmit the ultrasound image data to the handheld computing device; 3) process the ultrasound image data to generate a three-dimensional (3D) model of elements imaged by the series of ultrasound images; and 4) display the ultrasound image data or the 3D model to the user.
The examiner recommends using language such as “the processor is configured to […]” in order to clarify which of the processors (i.e. the processor of the handheld ultrasound imaging device or the image processor of the handheld computing device) is designed to perform these steps.
Regarding claims 12-20, due to their dependence on claim 11, these claims are subject to the reasoning provided therein.
Regarding claim 20, as written the claim reads “where displaying the 3D model optionally includes displaying the model from a different angle than the angle of imaging”. However, the phrase “optionally” renders the claim indefinite because it is unclear whether the limitations following the phrase are part of the claimed invention. See MPEP § 2173.05(d). If the limitation following the word “optionally” is part of the claimed invention, the examiner would recommend updating the claim language accordingly.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claim(s) 1-2, 5-8, 10-12, 15-18, and 20 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Howell et al., US 2020/0320694 A1 (“Howell”).
Regarding claims 1 and 11, Howell teaches “An ultrasound system comprising” (Claim 1) (“FIG. 24 illustrates a schematic block diagram of an example ultrasound system 2400, in accordance with certain embodiments described herein. The ultrasound system 2400 includes an ultrasound device 2402 and a processing device 2404” [0050]. Therefore, the ultrasound system 2400, represented in FIG. 24 constitutes an ultrasound system.);
“A method of imaging an organ comprising” (Claim 11) (“FIG. 23 illustrates a process 2300 for collection of visualization of ultrasound data, in accordance with certain embodiments described herein. The process 2300 is performed by a processing device” [0040]; “While the above description has focused on imaging, measurement and visualization of a bladder, it should be appreciated that other anatomical structures (e.g., the left ventricle of the heart) may be imaged, measured, and visualized in the same manner” [0048]. As shown in FIGS. 12-22, the graphical user interface (GUI) shows images from a 3D bladder scan, the bladder being an organ. Therefore, the method shown in FIG. 23 represents a method of imaging an organ.);
“a handheld ultrasound imaging device including; a flat two-dimensional (2D) array of micromachined ultrasound transducers (MUTs), and a processor configured to” (Claim 1); “a handheld ultrasound imaging device including a two-dimensional (2D) array of micromachined ultrasound transducers (MUTs), a processor capable of controlling the array” (Claim 11) (“The ultrasound device may use a two-dimensional array of ultrasound transducers on a chip to perform the three-dimensional ultrasound imaging sweep while the user maintains the ultrasound device at the same position and orientation it was at when the ultrasound device collected the ultrasound image 212” [0025]; “The ultrasound device and the processing device may communicate over a wired communication link (e.g., over Ethernet, a Universal Serial Bus (USB) cable or a Lightning cable) or over a wireless communication link (e.g., over a BLUETOOTH, WiFi, or ZIGBEE wireless communication link)” [0040]; “The ultrasound device 2402 may be configured to generate ultrasound data that may be employed to generate an ultrasound image. […] The ultrasonic transducers may include, for example, one or more capacitive micromachined ultrasonic transducers (CMUTs), one or more CMOS (complementary metal-oxide-semiconductor) ultrasonic transducers (CUTs), one or more piezoelectric micromachined ultrasonic transducers (PMUTs), and/or one or more other suitable ultrasonic transducer cells […] In some embodiments, the ultrasonic transducers may be formed on the same chip as other electronic components in the ultrasound circuitry 2410 (e.g., transmit circuitry, receive circuitry, control circuitry, power management circuitry, and processing circuitry) to form a monolithic ultrasound device” [0052]. In order for the ultrasound device 2402 (i.e. hand-held) to communicate with the processing device 2404 (i.e. hand-held) over a wired or wireless communication link, the ultrasound circuitry 2410 (i.e. see FIG. 24), has to be present. 
The ultrasound circuitry 2410 represents processing circuitry i.e. a processor (see [0052]). Therefore, the ultrasound system comprises a handheld ultrasound imaging device which includes a flat (i.e. on a chip) two-dimensional (2D) array (see [0025]) of micromachined ultrasound transducers (MUTs) (i.e. either CMUT or PMUT, see [0052]) and a processor (i.e. processing circuitry in the form of the ultrasound circuitry 2410, see [0052]) capable of controlling the array (see [0052]: transmit circuitry, control circuitry).);
“control the 2D array to take a series of ultrasound images along an elevational direction of the 2D array where each image is taken at a different angle relative to an axis parallel to the elevational direction of the array by beam steering ultrasonic signals produced by the MUTs” (Claim 1); “a handheld computing device including an image processor, where the handheld ultrasound imaging device takes a series of ultrasound images along an elevational dimension of the 2D array where each image in the series of ultrasound images has a different angle of imaging relative to an axis parallel to the elevational dimension of the array by beam steering ultrasonic signals produced by the MUTs” (Claim 11) (See [0052] above and “The 3D sweep may be an elevational sweep. In other words, during the 3D sweep, the ultrasound device may collect multiple ultrasound images, each ultrasound image collected along a different imaging slice at a different angle along the elevational dimension of the ultrasound device's transducer array. The processing device may configure the ultrasound device and/or itself to use beamforming to focus an ultrasound beam along a different direction at each stage of the 3D sweep” [0025], and “The processing device 2404 may be configured to perform certain of the processes (e.g., the process 2300) described herein using the processor 2414 (e.g., one or more computer hardware processors) and one or more articles of manufacture that include non-transitory computer-readable storage media such as the memory 2416. The processor 2414 may control writing data to and reading data from the memory 2416 in any suitable manner” [0054].
Therefore, the processor is configured to control the 2D array (see [0025]) to take a series of ultrasound images along an elevational direction of the 2D array where each image is taken at a different angle relative to an axis parallel to the elevational direction of the array by beam steering (i.e. beamforming to focus along a different direction, see [0025]) ultrasonic signals produced by the MUTs. Furthermore, the method carried out by the system operates a handheld computing device (i.e. 2404) including an image processor (i.e. processor 2414), where the handheld ultrasound imaging device takes a series of ultrasound images along an elevational dimension of the 2D array where each image in the series of ultrasound images has a different angle of imaging relative to an axis parallel to the elevational dimension of the array by beam steering ultrasonic signals produced by the MUTs (See [0025]).); and
“store the series of ultrasound images in a memory as a series of ultrasound image data” (Claim 1) (“Additionally, or alternatively, the ultrasound data may be stored temporarily in a buffer during a scanning session and processed in less than real-time” [0053]. Therefore, since ultrasound data may be stored temporarily (i.e. before data is transmitted from the ultrasound device 2402 to the processing device 2404) the processor is configured to store the series of ultrasound images in a memory as a series of ultrasound image data.), and
“a handheld computing device coupled to the handheld ultrasound imaging device having a memory capable of storing ultrasound image data” (Claim 1) (See [0054] above and “The processor 2414 may also be adapted to control the acquisition of ultrasound data with the ultrasound device 2402” [0053]; “For example, the processing device 2404 may be implemented as a handheld device such as a mobile smartphone or a tablet. Thereby, a user of the ultrasound device 2402 may be able to operate the ultrasound device 2402 with one hand and hold the processing device 2404 with another hand” [0055] and processing device 2404 in FIG. 24. Therefore, the system includes a handheld computing device (i.e. 2404) coupled to the handheld ultrasound imaging device (i.e. 2402) having a memory (i.e. 2416) capable of storing ultrasound image data.); and
“wherein the handheld ultrasound imaging device is configured to transmit ultrasound image data to the handheld computing device and store the data in the memory, the handheld computing device includes an image processor analyzing the ultrasound image data stored in memory to reconstruct a three-dimensional (3D) model of an imaging target” (Claim 1); “the processor of the ultrasound imaging device processes each image in the series of ultrasound images to generate ultrasound image data and transmits the ultrasound image data to the handheld computing device, and where the image processor processes the ultrasound image data to generate a three-dimensional (3D) model of elements imaged by the series of ultrasound images” (Claim 11) (See [0040], [0052] and [0053] above, and “In some embodiments, the 3D visualization 2140 may be generated from the ultrasound images collected during the 3D sweep and segmented portions from the ultrasound images. Each ultrasound image may be generated from an imaging slice arranged at a different angle relative to the ultrasound device. […] The processing device may then produce the 3D visualization 2140 by volume rendering the grid of voxels” [0036]. Thus, since the ultrasound device 2402 and the processing device 2404 may communicate over a wired or wireless communication link (see [0040]), the handheld ultrasound imaging device (i.e. 2402) is configured to transmit ultrasound image data to the handheld computing device (i.e. 2404) and store the data in the memory (i.e. buffer, see [0053]), the handheld computing device (i.e. 2404) includes an image processor (i.e. 2414) analyzing the ultrasound image data stored in memory (i.e. 2416) to reconstruct a three-dimensional (3D) model of an imaging target (see [0036]).
Furthermore, the processor of the ultrasound imaging device (i.e. the ultrasound circuitry 2410, see [0052]) processes each image in the series of ultrasound images to generate ultrasound image data and transmits the ultrasound image data (see [0040]) to the handheld computing device (i.e. 2404), and where the image processor processes the ultrasound image data to generate a three-dimensional (3D) model of elements imaged by the series of ultrasound images.); and
“a display for displaying ultrasound image data or the 3D model as any one of a still image, a video, or a cine” (Claim 1); “and displays the ultrasound image data or the 3D model to the user” (Claim 11) (“In act 2308, the processing device displays a cine including the ultrasound images and segmented portions of the ultrasound images that were collected during the 3D sweep. Further description of act 2308 may be found with reference to FIGS. 12-20. The process 2300 proceeds from act 2308 to act 2310.” [0044]; “In act 2310, the processing device displays a three-dimensional visualization based on the segmented portions of the ultrasound images collected during the 3D sweep. Further description of act 2310 may be found with reference to FIGS. 21-22” [0045]. As shown in FIG. 12, for example, buttons 1236 (i.e. Cine) and 1238 (i.e. 3D) are displayed to the user such that a selection can be made. Therefore, the system includes a display for displaying ultrasound image data or the 3D model as any one of a still image, a video or a cine. Furthermore, the method carried out by the system displays the ultrasound image data or the 3D model to the user.).
Regarding claims 2 and 12, Howell discloses all features of the claimed invention as discussed with respect to claims 1 and 11 above, and Howell further teaches “where the image processor analyzes the ultrasound image data stored in memory to identify an angle of imaging for each image in the series of ultrasound images, and identify elements of the imaging target within the ultrasound image” (Claim 2); “where processing the ultrasound image data comprises identifying the angle of imaging of each image in the series of ultrasound images, and identifying elements of an imaging target within the ultrasound image” (Claim 12) (“In some embodiments, the 3D visualization 2140 may be generated from the ultrasound images collected during the 3D sweep and segmented portions from the ultrasound images. Each ultrasound image may be generated from an imaging slice arranged at a different angle relative to the ultrasound device. The processing device may arrange data from both the segmented portions of the ultrasound images […] and the B-mode ultrasound images themselves at the corresponding angle of the ultrasound image relative to the ultrasound device, and convert these angled images into a grid of voxels. The processing device may then produce the 3D visualization 2140 by volume rendering the grid of voxels. More particularly, in some embodiments, the 3D visualization 2140 may be a combination (e.g., a linear combination) of data from the segmented portions of the ultrasound images and the ultrasound images themselves. […] When displayed, the 3D visualization 2140 generated as described above may include a 3D bladder visualization 2146 portion that may depict the 3D volume of the bladder and as well as a 3D environment visualization 2148 portion that may depict surrounding tissue. 
The 3D environment visualization 2148 may highlight the boundary of the bladder and provide orientation in three-dimensional space of the bladder by depicting surrounding landmarks (e.g., the pubic bone) using the ultrasound image component of the 3D visualization 2140” [0036].
In this case, in order for the 3D visualization to be produced, the processing device must arrange data from the segmented portions of the ultrasound images and the B-mode ultrasound images at the corresponding angle of the ultrasound image relative to the ultrasound device and convert these angled images into a grid of voxels. Therefore, the image processor (i.e. 2414 in processing device 2404) analyzes the ultrasound image data stored in memory (i.e. memory 2416) to identify an angle of imaging for each image in the series of ultrasound images, such that the 3D visualization can be rendered. Furthermore, since the 3D visualization 2140 includes a 3D bladder visualization 2146 portion (i.e. which depicts the 3D volume of the bladder) and a 3D environment visualization 2148 (i.e. which highlights the boundary of the bladder and its orientation in three-dimensional space by depicting surrounding landmarks), the image processor identifies elements of the imaging target (i.e. the bladder) within the ultrasound image. Furthermore, the method involves processing the ultrasound image data, which comprises identifying the angle of imaging of each image in the series of ultrasound images, and identifying elements of an imaging target within the ultrasound image.).
Regarding claims 5 and 15, Howell discloses all features of the claimed invention as discussed with respect to claims 1 and 11 above, and Howell further teaches “where the image processor is configured to select an image from the series of ultrasound images and designates it as a key-frame” (Claim 5); “where the image processor selects an image from the series of ultrasound images and designates it as a key-frame” (Claim 15) (“FIG. 12 illustrates another example GUI 1200, in accordance with certain embodiments described herein. The GUI 1200 includes a cine 1228 […] The cine view indicator 1236 is highlighted in FIG. 12, indicating that the GUI 1200 is showing the 3D ultrasound data collected during the 3D sweep in the form of the cine 1228. In some embodiments, the cine 1228 may depict the ultrasound images that were collected during the 3D sweep (i.e., the ultrasound images 312-1112). In FIG. 12, the cine 1228 depicts the ultrasound image 312, namely the first ultrasound image collected during the 3D sweep. […] The cine control/information bar 1230 may control and provide information about the cine 1228. For example, the cine control/information bar 1230 may provide information about how much time has elapsed during playback of the cine 1228, how much time remains for playback of the cine 1228, and may control playing, pausing, or changing to a different point in the cine 1228” [0030]; “FIGS. 14-20 illustrate further examples of the GUI 1200, in accordance with certain embodiments described herein. In each figure, the cine 1228 depicts the ultrasound image 512, 612, 712, 812, 912, 1012, or 1112, respectively. Each of the ultrasound images 512-1112 is one of the ultrasound images collected during the 3D sweep and after the ultrasound image depicted in the previous figure” [0034].
In this case, the user selects which ultrasound image (i.e. 312-1112) is depicted on the cine 1228 by positioning the slider icon within the cine control/information bar 1230. Therefore, the image processor (i.e. 2414 in FIG. 24, and the GUIs shown in FIGS. 12-20) is configured to select an image from the series of ultrasound images and designate it as a key-frame (i.e. the frame that is displayed). Furthermore, the method involves causing the image processor to select an image from the series of ultrasound images and designate it as a key-frame (i.e. via the cine control/information bar 1230).).
Regarding claims 6 and 16, Howell discloses all features of the claimed invention as discussed with respect to claims 5 and 15 above, and Howell further teaches “where the image processor is configured to display the key frame as a default still image, or as a central frame in a displayed video or cine” (Claim 6); “where the image processor displays the key frame as a default still image, or as a central frame in the displayed video or cine” (Claim 16) (See [0030] as discussed with respect to claims 5 and 15 above. Therefore, the image processor is configured to display the key frame (i.e. any one of 312-1112) as a default still image or as a central frame in a displayed video or cine. Additionally, the method carried out by the system involves causing the image processor to display the key frame as a default still image, or as a central frame in the displayed video or cine.).
Regarding claims 7 and 17, Howell discloses all features of the claimed invention as discussed with respect to claims 5 and 15 above, and Howell further teaches “further comprising a user interface to allow the user to select a different image from the series of ultrasound images and designate it as the key-frame” (Claim 7); “where the user may select a different image from the series of ultrasound images and designate it as the key-frame” (Claim 17) (See [0030] and [0034] as discussed with respect to claims 5 and 15 above. In this case, the user selects which ultrasound image (i.e. 312-1112) is depicted on the cine 1228 by positioning the slider icon within the cine control/information bar 1230. Therefore, the ultrasound system further comprises a user interface (i.e. cine control/information bar 1230) to allow the user to select a different image from the series of ultrasound images and designate it as the key-frame. Furthermore, the method involves allowing the user to select a different image from the series of ultrasound images and designate it as the key-frame.).
Regarding claim 8, Howell discloses all features of the claimed invention as discussed with respect to claim 1 above, and Howell further teaches “wherein the image processor is configured to join together the series of ultrasound images taken along an elevational direction of the 2D array to reconstruct the three-dimensional (3D) model of the imaging target as a model having data of the imaging target throughout a three-dimensional space” (See [0036] as discussed with respect to claim 2 above. Therefore, since the processing device must arrange data from the segmented portions of the ultrasound images and the B-mode ultrasound images at the corresponding angle of the ultrasound image relative to the ultrasound device and convert these angled images into a grid of voxels, in order to generate the 3D visualization, the image processor is configured to join together the series of ultrasound images taken along an elevational direction of the 2D array (i.e. linear combination of data, see [0036]) to reconstruct the three-dimensional (3D) model of the imaging target as a model having data of the imaging target (i.e. volume, border, etc.) throughout a three-dimensional space.).
Regarding claim 18, Howell discloses all features of the claimed invention as discussed with respect to claim 11 above, and Howell further teaches “where displaying the ultrasound image data or the 3D model includes the ability for the user to manually pause, fast-forward, and rewind a displayed video or cine” (“FIG. 12 illustrates another example GUI 1200, in accordance with certain embodiments described herein. The GUI 1200 includes a cine 1228, a cine control/information bar 1230, a measurement value indicator 1232, a cine view indicator 1236, a 3D view indicator 1238, and a bladder overlay option 1250 […] The cine control/information bar 1230 may control and provide information about the cine 1228. For example, the cine control/information bar 1230 may provide information about how much time has elapsed during playback of the cine 1228, how much time remains for playback of the cine 1228, and may control playing, pausing, or changing to a different point in the cine 1228” [0030]. Therefore, the step of displaying the ultrasound image data or the 3D model includes the ability for the user to manually pause, fast-forward, and rewind a displayed video or cine (i.e. using the cine control/information bar 1230).).
Regarding claims 10 and 20, Howell discloses all features of the claimed invention as discussed with respect to claims 1 and 11 above, and Howell further teaches “further comprising a user interface configured to display the 3D model from a different angle than the angle of imaging” (Claim 10); “where displaying the 3D model optionally includes displaying the model from a different angle than the angle of imaging” (Claim 20) (“FIG. 21 illustrates another example GUI 2100, in accordance with certain embodiments described herein. The GUI 2100 includes the cine view indicator 1236, the 3D view indicator 1238, the measurement value indicator 1232, a 3D visualization 2140, a first orientation indicator 2142, and a second orientation indicator 2144. The 3D visualization 2140 includes a 3D bladder visualization 2146 and a 3D environment visualization 2148” [0035]; “In some embodiments, the first orientation indicator 2142 may be an indicator of the position of the ultrasound device that performed the 3D sweep relative to the bladder depicted by the 3D visualization 2140. In some embodiments, the second orientation indicator 2144 may be an indicator of the position of the bottom plane of the ultrasound images collected during the 3D sweep relative to the bladder depicted by the 3D visualization 2140. Thus, the positions of the first orientation indicator 2142 and/or the second orientation indicator 2144 relative to the 3D visualization 2140 in the GUI 2100 may provide information about the orientation of the 3D visualization 2140 as depicted in the GUI 2100” [0037].
In this case, the GUI shown in FIG. 21 includes the first and second orientation indicators 2142 and 2144, respectively, wherein the second orientation indicator 2144 indicates the position of the bottom plane of the ultrasound images collected during the 3D sweep. Therefore, the second orientation indicator 2144 corresponds to a different angle than the angle of imaging (i.e. corresponding to the first orientation indicator 2142). Thus, the ultrasound system further comprises a user interface configured to display the 3D model from a different angle (i.e. the second orientation indicator 2144) than the angle of imaging (i.e. the first orientation indicator 2142). Additionally, the method step of displaying the 3D model optionally includes displaying the model from a different angle (i.e. corresponding to the second orientation indicator 2144) than the angle of imaging (i.e. corresponding to the first orientation indicator 2142).).
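As an illustrative sketch only (not drawn from Howell's disclosure), displaying a 3D model from a different angle than the angle of imaging amounts to applying a rotation to the model's coordinates before rendering; the function name and axis convention below are hypothetical.

```python
import numpy as np

def rotate_view(points, yaw_deg):
    """Hypothetical sketch: rotate 3D model points about the vertical (z)
    axis so the model is viewed from a different angle than the imaging angle.

    points: (N, 3) array of model coordinates.
    """
    t = np.radians(yaw_deg)
    R = np.array([[np.cos(t), -np.sin(t), 0.0],
                  [np.sin(t),  np.cos(t), 0.0],
                  [0.0,        0.0,       1.0]])
    return points @ R.T  # rotate every point; the renderer then projects as usual
```

For example, a 90-degree yaw maps a point on the x-axis onto the y-axis, i.e. the same model seen from the side.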
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim(s) 3-4 and 13-14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Howell et al. US 2020/0320694 A1 “Howell” as applied to claims 1 and 11 above, and further in view of Steen US 2005/0283078 A1 “Steen”.
Regarding claims 3 and 13, Howell discloses all features of the claimed invention as discussed with respect to claims 1 and 11 above; however, Howell does not teach “where the image processor analyzes the ultrasound image data stored in memory to apply a polar coordinate reference frame to the ultrasound image data and converts the ultrasound image data into image data in cartesian coordinates” (Claim 3); “where processing the ultrasound image data comprises applying a polar coordinate reference frame to the ultrasound image data and converting the ultrasound image data into image data in cartesian coordinates” (Claim 13).
Steen is within the same field of endeavor as the claimed invention because it involves an ultrasound system that includes a display processor which converts from a polar coordinate system to a Cartesian coordinate system (see [Abstract] and [0020]).
Steen teaches “where the image processor analyzes the ultrasound image data stored in memory to apply a polar coordinate reference frame to the ultrasound image data and converts the ultrasound image data into image data in cartesian coordinates” (Claim 3); “where processing the ultrasound image data comprises applying a polar coordinate reference frame to the ultrasound image data and converting the ultrasound image data into image data in cartesian coordinates” (Claim 13) (“A 2D display processor, for example, the processor 116 may perform filtering of the data slice information received from the image buffer 114 as well as processing of the data slice to produce a processed image frame. […] The display processor 116 may then perform scan conversion to map data from a polar to Cartesian coordinate system for display on a computer display 124” [0020]. Therefore, the processor 116 accesses data slice information (i.e. ultrasound image data) from the image buffer 114 (i.e. memory), performs filtering and then performs scan conversion from a polar coordinate system to a Cartesian coordinate system such that it can be displayed. Therefore, the image processor (i.e. 116) analyzes the ultrasound image data stored in memory to apply a polar coordinate reference frame to the ultrasound image data and converts the ultrasound image data into image data in cartesian coordinates. Furthermore, the method step of processing the ultrasound image data comprises applying a polar coordinate reference frame to the ultrasound image data and converting the ultrasound image data into image data in cartesian coordinates.).
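As an illustrative sketch only (not Steen's actual implementation), scan conversion from a polar (depth, beam-angle) sampling grid to a Cartesian display grid can be approximated as follows; the function name and nearest-neighbour lookup are hypothetical, and real scan converters typically interpolate between samples.

```python
import numpy as np

def scan_convert(polar_img, depths, angles, out_shape=(200, 200)):
    """Hypothetical sketch: map a sector image sampled in (depth, angle)
    onto a Cartesian pixel grid with the transducer at the top centre.

    polar_img[i, j] is the echo amplitude at depths[i] along beam angles[j]
    (angles in radians, measured from the straight-down axis).
    """
    h, w = out_shape
    max_depth = depths[-1]
    out = np.zeros(out_shape)
    for row in range(h):
        for col in range(w):
            # Cartesian pixel -> polar (r, theta)
            x = (col - w / 2) / (w / 2) * max_depth
            z = row / h * max_depth
            r = np.hypot(x, z)
            theta = np.arctan2(x, z)
            # keep only pixels inside the imaged sector
            if r <= max_depth and angles[0] <= theta <= angles[-1]:
                i = np.searchsorted(depths, r)
                j = np.searchsorted(angles, theta)
                out[row, col] = polar_img[min(i, len(depths) - 1),
                                          min(j, len(angles) - 1)]
    return out
```

Pixels outside the sector remain zero, producing the familiar wedge-shaped B-mode display.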
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the ultrasound system and method of Howell such that the image processor analyzes the ultrasound image data stored in memory to apply a polar coordinate reference frame to the ultrasound image data and converts the ultrasound image data into image data in cartesian coordinates, and such that the method step of processing the ultrasound image data comprises applying a polar coordinate reference frame to the ultrasound image data and converting the ultrasound image data into image data in cartesian coordinates, as disclosed in Steen, in order to perform scan conversion such that images are displayed in a Cartesian coordinate system that is easily understood by a user. A Cartesian coordinate system is one of a finite number of coordinate systems to which ultrasound image data can be converted such that it is displayed in a recognizable manner, with a reasonable expectation of success. Thus, modifying the ultrasound system and method of Howell such that the image processor analyzes the ultrasound image data stored in memory to apply a polar coordinate reference frame to the ultrasound image data and converts the ultrasound image data into image data in cartesian coordinates, and such that the method step of processing the ultrasound image data comprises applying a polar coordinate reference frame to the ultrasound image data and converting the ultrasound image data into image data in cartesian coordinates, as disclosed in Steen, would yield the predictable result of performing scan conversion such that ultrasound images are displayed in a Cartesian coordinate system which is easily understood by a user.
Regarding claims 4 and 14, Howell in view of Steen discloses all features of the claimed invention as discussed with respect to claims 3 and 13 above, and Steen further teaches “where the image processor applying a polar coordinate reference frame comprises comparing an imaging angle of each image and a distance from the array of each element identified in the image and comparing the corresponding values in each image in the series of ultrasound images” (Claim 4); “where applying a polar coordinate reference frame comprises comparing an imaging angle of each image and a distance from the array of each element identified in the image and comparing the corresponding values in each image in the series of ultrasound images” (Claim 14) (See [0020] as discussed with respect to claims 3 and 13 above. In this case, to perform scan conversion to map from a polar coordinate system to a Cartesian coordinate system (see [0020]), the image data has to first be in a polar coordinate system (i.e. a polar coordinate system had to have been applied to the ultrasound images). In order for scan data in the polar coordinate system to be scan converted to the Cartesian coordinate system (i.e. such that an image is subsequently displayed), under broadest reasonable interpretation, the processor 116 had to have first applied a polar coordinate reference frame by comparing an imaging angle of each image and a distance from the array of each element identified in the image and comparing the corresponding values in each image in the series of ultrasound images.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the ultrasound system and method of Howell such that the image processor analyzes the ultrasound image data stored in memory to apply a polar coordinate reference frame to the ultrasound image data, the step of applying a polar coordinate reference frame comprising comparing an imaging angle of each image and a distance from the array of each element identified in the image and comparing the corresponding values in each image in the series of ultrasound images, and converts the ultrasound image data into image data in cartesian coordinates, and such that the method step of processing the ultrasound image data comprises applying a polar coordinate reference frame to the ultrasound image data, as described above, and converting the ultrasound image data into image data in cartesian coordinates, as disclosed in Steen, in order to perform scan conversion such that images are displayed in a Cartesian coordinate system that is easily understood by a user. A Cartesian coordinate system is one of a finite number of coordinate systems to which ultrasound image data can be converted such that it is displayed in a recognizable manner, with a reasonable expectation of success.
Thus, modifying the ultrasound system and method of Howell such that the image processor analyzes the ultrasound image data stored in memory to apply a polar coordinate reference frame to the ultrasound image data and converts the ultrasound image data into image data in cartesian coordinates and the method step of processing the ultrasound data comprises applying a polar coordinate reference frame to the ultrasound image data and converting the ultrasound image data into image data in cartesian coordinates as disclosed in Steen would yield the predictable result of performing scan conversion such that ultrasound images are displayed in a Cartesian coordinate system which is easily understood by a user.
Claim(s) 9 and 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Howell et al. US 2020/0320694 A1 “Howell” as applied to claims 1 and 11 above, and further in view of Kommu CHS US 2017/0238907 A1 “Kommu CHS”.
Regarding claims 9 and 19, Howell discloses all features of the claimed invention as discussed with respect to claims 1 and 11 above, however, Howell does not teach “further comprising a single button on the ultrasound imaging device to activate the processor to control the array” (Claim 9); “where the user activates the processor to control the array with a single button” (Claim 19).
Kommu CHS is within the same field of endeavor as the claimed invention because it involves systems and methods for generating ultrasound images and acquiring 3D ultrasound data (see [Abstract]).
Kommu CHS