DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Election/Restrictions
Claims 3, 7-8 and 12-19 are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to a nonelected invention (Group II, claims 12-19) and to nonelected species A, B and D (claims 3 and 7-8), there being no allowable generic or linking claim. Claim 20, while indicated as ‘withdrawn’ in the instant claim set, is examined on the merits as being drawn to the elected invention (Group I). Election was made without traverse in the reply filed on 3/12/2026.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 5-6 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. Claim 6 is also rejected at least by virtue of its dependency upon a rejected base claim.
Claim 5 recites the limitation “an AI output comprises a segmented imaged tendon which is displayed on a screen of the computing device, along with an indicator of the degree of damage to the imaged tendon,” which renders the claim indefinite. There is insufficient antecedent basis for these limitations in the claim. It is unclear whether the ‘AI output’ is generated by the ‘AI model’ recited in claims 1 and 2 (upon which claim 5 depends) or whether another AI algorithm generates the ‘output’; under the first interpretation, it is also unclear when this output occurs relative to the ‘AI model’ processing steps. Furthermore, the use of ‘imaged tendon’ is inconsistent, resulting in multiple interpretations: it is unclear whether the ‘segmented imaged tendon’ refers to the ‘imaged tendon’ of claims 1, 2 and the instant claim 5; to the tendon with ‘segmented boundaries’ of claim 2; or to a new ‘segmented imaged tendon’. For the purposes of examination, the broadest reasonable interpretation of the claim language is applied, i.e., any ‘AI output’ and any ‘segmentation’.
Claim 6 recites the limitation “wherein a workflow application on the computing device, which is communicatively coupled with the ultrasound scanner, receives the AI model output and automatically places a caliper set on the points on the boundaries in order to measure the thickness of the imaged tendon,” which is unclear and renders the claim indefinite. Additionally, there is insufficient antecedent basis for the limitations in the claim. First, it is not clear which structure (e.g., the workflow application, the computing device) is ‘communicatively coupled’ with the ‘ultrasound scanner’, or how it is coupled. Second, “AI model output” lacks sufficient antecedent basis because it is unclear whether the ‘output’ refers to the “AI output” of claim 5, to the processing output of the ‘AI model’ of claim 1, or to a new, distinct “AI model output”. Furthermore, it is unclear what the ‘caliper set’ achieves or measures (e.g., is it a visual indicator, a user interaction, etc.), because the thickness of the tendon is already measured based on boundary points per claim 2. For the purposes of examination, the broadest reasonable interpretation of the claim language, including the interpretations discussed above, is applied to the limitations. It is suggested to amend the claim language to clarify the AI model functions and output, to clearly indicate the relationship between the computing device and the ultrasound scanner, and to clearly describe the function(s) performed by the ‘caliper set’.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 4, 10 and 20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Park (US2021/0100530 A1; 2021-04-08) (hereinafter “Park”), as provided by Applicant.
Regarding claim 1, Park teaches a method for assessing damage of a tendon in ultrasound imaging data (“A method, comprising: training a neural network to determine a degree of damage of a tendon depicted by an ultrasound image” [clm 8]; The method initiates an ultrasound scan and uses a neural network trained to determine a degree of damage to the tendon [0033-0067], [fig. 2-4]), the method comprising:
- deploying an artificial intelligence (AI) model to execute on a computing device, wherein the AI model is trained to identify a tendon imaged in ultrasound imaging data (“training a neural network to determine a degree of damage of a tendon depicted by an ultrasound image,” [clm 8]; “The controller may include algorithms and one or more neural networks (e.g., a system of neural networks) stored within a memory of the controller for automatically identifying and diagnosing one or more anatomical features depicted by a generated ultrasound image” [0027]; “The trained neural network may include an object detection algorithm, used for pairing the generated ultrasound image with one of a plurality of sample images. The plurality of sample images may each be a sample ultrasound image slice depicting a sample anatomical feature,” [0038]; The controller (i.e., computing device) employs a neural network trained to identify a tendon in an ultrasound image [0020-0034, 0073-0079], [fig. 1-2, 7]);
- acquiring, at the computing device, new ultrasound imaging data from an ultrasound scanner (“receiving a particular ultrasound image depicting a particular tendon; and” [clm 8]; The controller receives ultrasound imaging data from the ultrasound probe [0022-0038], [fig. 1-5, 7]);
- processing, using the AI model, the new ultrasound imaging data to identify an imaged tendon (“determining a degree of damage of the particular tendon depicted by the particular ultrasound image using the trained neural network.” [clm 8]; “The controller may include algorithms and one or more neural networks […] for automatically identifying and diagnosing one or more anatomical features depicted by a generated ultrasound image,” [0027]; “based on the ultrasound imaging data, using a trained neural network to automatically identify and diagnose an anatomical feature (e.g., a tendon).” [0031]; The neural network extracts anatomical features (e.g., tendons) from input ultrasound images [fig. 2-4, 7-9]);
- automatically measuring a thickness of the imaged tendon (“the graphics module may be configured to display designated graphics along with the displayed image, such as selectable icons (e.g., image rotation icons) and measurement parameters (e.g., data) relating to the image.” [0027]; “The trained neural network may identify one or more image aspects of the anatomical feature depicted by the generated ultrasound image based on the most similar sample image. […] the one or more image aspects may include one or more tendon features (e.g., individual fiber bundles, partial ruptures, complete ruptures, etc.)” [0039]; The trained neural network may identify image aspects, including partial ruptures or complete ruptures (i.e., thickness) of the tendon [0033-0067, 0073-0079], [fig. 2-4, 7-9]); and
- assessing a degree of damage to the imaged tendon using the automatically measured thickness (“determining a degree of damage of the particular tendon depicted by the particular ultrasound image using the trained neural network.” [clm 8]; “the anatomical feature may be a tendon in a shoulder, such that a degree of damage of 0% may indicate a non-damaged tendon, a degree of damage of 50% (or any value greater than 0% and less than 100%) may indicate a tendon having a partial rupture, and a degree of damage of 100% may indicate a tendon having a complete rupture. In some examples, the degree of damage may be measured with any sort of numerical scale” [0042]; The degree of damage of the tendon corresponds to the presence of a rupture [0033-0067, 0073-0079], [fig. 2-4, 7-9]).
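Although neither the claims nor Park recites numerical thresholds, the relationship between an automatically measured thickness and a 0-100% degree-of-damage scale can be sketched, purely for illustration, as follows (the thresholds normal_mm and rupture_mm are hypothetical placeholders, not values from the record):

```python
# Illustrative sketch only; thresholds are hypothetical, not from the record.
def degree_of_damage(thickness_mm, normal_mm=4.0, rupture_mm=8.0):
    """Map a measured tendon thickness onto a 0-100% damage scale
    (0% = non-damaged, 100% = complete rupture), interpolating
    linearly for partial damage."""
    if thickness_mm <= normal_mm:
        return 0.0
    if thickness_mm >= rupture_mm:
        return 100.0
    return 100.0 * (thickness_mm - normal_mm) / (rupture_mm - normal_mm)

print(degree_of_damage(6.0))  # 50.0, corresponding to a partial rupture
```

On Park's scale, any value strictly between 0% and 100% would indicate a partial rupture.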
Regarding claim 4, Park teaches the method of claim 1,
Park further teaching wherein one or more of a length, a height and a width of the imaged tendon is automatically measured (“method 200 may proceed to 230 to determine a degree of damage of the anatomical feature depicted by the generated ultrasound image based on the most similar sample image. […] the anatomical feature may be a tendon in a shoulder, such that a degree of damage of 0% may indicate a non-damaged tendon, a degree of damage of 50% (or any value greater than 0% and less than 100%) may indicate a tendon having a partial rupture, and a degree of damage of 100% may indicate a tendon having a complete rupture.” [0042]; A tendon rupture measured against a scale (e.g., partial rupture, complete rupture) is a measurement of the height and/or width of the tendon [0033-0067, 0073-0079], [fig. 2-4, 7-9], [see claim 1 rejection]).
Regarding claim 10, Park teaches the method of claim 1,
Park further teaching wherein the tendon is selected from the group consisting of: Patellar, Plantar fascia, Achilles, Rotator cuff, Extensor, Peroneus, Quadricept, Peroneal, Tibialis, Adductor, Supraspinatus, and Intraspinatus (“generating an ultrasound image depicting the anatomical feature (e.g., tendon) from the ultrasound imaging data. In some examples, the generated ultrasound image may be a 2D image slice of a volume (e.g., from the volumetric ultrasound data) corresponding to a targeted slice of the volume (e.g., a sagittal, frontal, or transverse plane of a shoulder of the patient)” [0037]; “The degree of damage may be a percentage value, corresponding to a relative amount of damage to a given anatomical feature. As an example, the anatomical feature may be a tendon in a shoulder, such that a degree of damage of 0% may indicate a non-damaged tendon, a degree of damage of 50% […] may indicate a tendon having a partial rupture, and a degree of damage of 100% may indicate a tendon having a complete rupture.” [0042]).
Regarding claim 20, Park teaches a computer-readable media storing computer-readable instructions (“A method, comprising: training a neural network to determine a degree of damage of a tendon depicted by an ultrasound image” [clm 8]; “method 200 may be implemented in non-transitory memory of a computing device, such as the controller (e.g., processor) of the imaging system 100” [0034]; [0033-0067], [fig. 2-4], [see claim 1 rejection]), which, when executed by a processor, cause the processor to:
- deploy an artificial intelligence (AI) model to execute on a computing device, wherein the AI model is trained to identify a tendon imaged in ultrasound imaging data (“training a neural network to determine a degree of damage of a tendon depicted by an ultrasound image,” [clm 8]; “The controller may include algorithms and one or more neural networks (e.g., a system of neural networks) stored within a memory of the controller for automatically identifying and diagnosing one or more anatomical features depicted by a generated ultrasound image” [0027]; “The trained neural network may include an object detection algorithm, used for pairing the generated ultrasound image with one of a plurality of sample images. The plurality of sample images may each be a sample ultrasound image slice depicting a sample anatomical feature,” [0038]; [0020-0034, 0073-0079], [fig. 1-2, 7], [see claim 1 rejection]);
- acquire new ultrasound imaging data from the ultrasound scanner (“receiving a particular ultrasound image depicting a particular tendon; and” [clm 8]; [0022-0038], [fig. 1-5, 7], [see claim 1 rejection]);
- process, using the AI model, the new ultrasound imaging data to identify an imaged tendon (“determining a degree of damage of the particular tendon depicted by the particular ultrasound image using the trained neural network.” [clm 8]; “The controller may include algorithms and one or more neural networks […] for automatically identifying and diagnosing one or more anatomical features depicted by a generated ultrasound image,” [0027]; “based on the ultrasound imaging data, using a trained neural network to automatically identify and diagnose an anatomical feature (e.g., a tendon).” [0031]; [fig. 2-4, 7-9], [see claim 1 rejection]);
- automatically measure a thickness of the imaged tendon (“the graphics module may be configured to display designated graphics along with the displayed image, such as selectable icons (e.g., image rotation icons) and measurement parameters (e.g., data) relating to the image.” [0027]; “The trained neural network may identify one or more image aspects of the anatomical feature depicted by the generated ultrasound image based on the most similar sample image. […] the one or more image aspects may include one or more tendon features (e.g., individual fiber bundles, partial ruptures, complete ruptures, etc.)” [0039]; [0033-0067, 0073-0079], [fig. 2-4, 7-9], [see claim 1 rejection]); and
- assess a degree of damage to the imaged tendon using the automatically measured thickness (“determining a degree of damage of the particular tendon depicted by the particular ultrasound image using the trained neural network.” [clm 8]; “the anatomical feature may be a tendon in a shoulder, such that a degree of damage of 0% may indicate a non-damaged tendon, a degree of damage of 50% (or any value greater than 0% and less than 100%) may indicate a tendon having a partial rupture, and a degree of damage of 100% may indicate a tendon having a complete rupture. In some examples, the degree of damage may be measured with any sort of numerical scale” [0042]; [0033-0067, 0073-0079], [fig. 2-4, 7-9], [see claim 1 rejection]).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 2 and 5-6 are rejected under 35 U.S.C. 103 as being obvious over Park as applied to claim 1 above, in view of Boussouar et al. (Plantar fascia segmentation and thickness estimation in ultrasound images, Computerized Medical Imaging and Graphics, Volume 56, 2017, Pages 60-73, ISSN 0895-6111, 2017-02-24; hereinafter “Boussouar”), as provided by Applicant.
Regarding claim 2, Park teaches the method of claim 1,
Park further teaching wherein the AI model identifies and segments boundaries of the imaged tendon in the ultrasound imaging data (“identifying, via the neural network, one or more image aspects in the sample images. The one or more image aspects may be any image feature indicating and/or characterizing the anatomical feature of interest. In examples wherein the anatomical feature is a tendon, the one or more image aspects may include one or more tendon features (e.g., individual fiber bundles, partial ruptures, complete ruptures, etc.)” [0063]; “Upon classification, the neural network may label the one or more image aspects on the sample images by generating corresponding visual indicators (e.g., arrows, boxes, circles, shading, etc.).” [0065]; The neural network extracts tendon features (i.e., segmentation) and applies arrows indicating the boundary of the tendon in the ultrasound image [0033-0067, 0073-0079], [fig. 2-4, 7-9; see fig. 9 reproduced below], [see claim 1 rejection]).
[Image: media_image1.png (greyscale)]
The arrows indicating tendon boundaries (Park [fig. 9])
but Park may fail to explicitly teach measuring the thickness using points on the segmented boundaries.
However, in the same field of endeavor, Boussouar teaches a method for assessing damage of a tendon in ultrasound imaging data (“an automatic segmentation approach which for the first time extracts ultrasound data to estimate size across three sections of the PF (rearfoot, midfoot and forefoot). This segmentation method uses artificial neural network module (ANN) in order to classify small overlapping patches as belonging or not-belonging to the region of interest (ROI) of the PF tissue.” [abst]; “Research has reported thickening and hypoechoic deformities of the PF as part of the diagnostic criteria and PF characteristic features (Park et al., 2014). Increased PF thickness of >4 mm and decreased PF echogenicity are considered symptomatic” [p.60-61, col.2-1]; [fig. 1-5]);
Boussouar further teaching wherein the AI model identifies and segments boundaries of the imaged tendon in the ultrasound imaging data and the thickness is measured using points on the boundaries (“The proposed RBF-NN segmentation method was applied on all PF ultrasound images.” [p.64, col.2]; “(a) Thick 1 method: (1) distance transformation was applied to the segmented PF US image using Euclidean distance metric […] (2) the local maxima pixel set points (spot centers) of the distance transformed segmented PF image were found (i.e. distances from the background). These local maxima points are also known as skeleton centered points (ridges) (Blum, 1967) with respect to the shape boundary (Telea, 2014); and (3) the thickness was computed as the median of the local maxima pixel set points.” [p.65, col.1]; Segmented plantar fascia tendon was measured at the shape boundary to compute thickness of the tendon [fig. 1-5]).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the invention to modify the method taught by Park by segmenting boundaries of the tendon as taught by Boussouar. Ultrasound (US) imaging offers significant potential in diagnosis of plantar fascia (PF) injury and monitoring treatment. Despite the advantages of US imaging, images are difficult to interpret during medical assessment. This is partly due to the size and position of the PF in relation to the adjacent tissues. It is therefore a requirement to devise a system that allows better and easier interpretation of PF ultrasound images during diagnosis (Boussouar [abst]). The method may assist a user in obtaining ultrasound imaging data via an ultrasound probe, using a trained neural network to automatically identify and diagnose an anatomical feature (e.g., a tendon) from ultrasound imaging data (Park [0030-0031]).
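Boussouar's ‘Thick 1’ procedure quoted above (Euclidean distance transform, local maxima / skeleton ridge points, median) can be illustrated with a minimal brute-force numpy sketch. This is a toy illustration of the cited steps, not Boussouar's implementation; note the ridge distances here measure center-to-boundary distance, not the full band width:

```python
import numpy as np

def distance_transform(mask):
    """Brute-force Euclidean distance transform: distance from each
    foreground pixel to the nearest background pixel."""
    dist = np.zeros(mask.shape)
    bg = np.argwhere(~mask)
    for r, c in np.argwhere(mask):
        dist[r, c] = np.sqrt(((bg - (r, c)) ** 2).sum(axis=1)).min()
    return dist

def ridge_distances(dist):
    """Local maxima of the distance map (8-neighbourhood): the
    'skeleton centered points' (ridges) of the segmented region."""
    pts = []
    for r in range(dist.shape[0]):
        for c in range(dist.shape[1]):
            if dist[r, c] > 0 and dist[r, c] >= dist[max(r-1, 0):r+2,
                                                     max(c-1, 0):c+2].max():
                pts.append(dist[r, c])
    return pts

# Toy 'segmented tendon': a horizontal band 5 pixels thick.
mask = np.zeros((9, 12), dtype=bool)
mask[2:7, :] = True
thickness_estimate = float(np.median(ridge_distances(distance_transform(mask))))
print(thickness_estimate)  # 3.0: median ridge-to-boundary distance of the band
```

In practice a library routine (e.g., an exact Euclidean distance transform) would replace the brute-force loop; the sketch only mirrors the three quoted steps.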
Regarding claim 5, Park and Boussouar teach the method of claim 2,
Park further teaching wherein an AI output comprises a segmented imaged tendon which is displayed on a screen of the computing device, along with an indicator of the degree of damage to the imaged tendon (“The example user interface display 900 may include an ultrasound image 908 depicting a tendon in a shoulder of a subject, the ultrasound image 908 generated from ultrasound imaging data received by the ultrasound probe (e.g., 106). Visual indicators, such as a graphical bar 910 of a degree of damage of the tendon (e.g., the graphical bars 822, 824, 826, 828 of FIG. 8) and/or arrows 912 indicating image aspects identified by a trained neural network […] may be superimposed on the ultrasound image 908” [0081]; The display device of the ultrasound imaging system may present image aspects of the tendon and a graphical bar indicating the degree of damage [0033-0067, 0073-0083], [fig. 2-4, 7-9; see fig. 9 reproduced below]).
[Image: media_image2.png (greyscale)]
Visual indicator (graphical bar 910) may be superimposed on ultrasound image of the tendon with arrows 912 pointing to image aspects (e.g., boundary points) of the tendon (Park [fig. 9])
Regarding claim 6, Park and Boussouar teach the method of claim 5,
Park further teaching wherein a workflow application on the computing device, which is communicatively coupled with the ultrasound scanner, receives the AI model output and automatically places a caliper set on the points on the boundaries in order to measure the thickness of the imaged tendon (“The system controller 116 is operably connected to a user interface 122 that enables an operator to control at least some of the operations of the system 100. The user interface 122 may include hardware, firmware, software, or a combination thereof that enables an individual (e.g., an operator) to directly or indirectly control operation of the system 100” [0025]; “The ultrasound imaging data may be received by the controller (e.g., 116) communicably coupled to the ultrasound probe.” [0036]; “The example user interface display 900 may include an ultrasound image 908 depicting a tendon in a shoulder of a subject, the ultrasound image 908 generated from ultrasound imaging data received by the ultrasound probe (e.g., 106). Visual indicators, such as a graphical bar 910 of a degree of damage of the tendon (e.g., the graphical bars 822, 824, 826, 828 of FIG. 8) and/or arrows 912 indicating image aspects identified by a trained neural network […] may be superimposed on the ultrasound image 908” [0081]; The arrows (i.e., caliper set) indicate image aspects identified by the neural network from the input ultrasound image of the tendon [0033-0067, 0073-0083], [fig. 2-4, 7-9; see fig. 9 reproduced below], [see claim 1, 5 rejections]).
[Image: media_image2.png (greyscale)]
Arrows (i.e., caliper set) superimposed on ultrasound image of tendon identified using the trained neural network (Park [fig. 9])
Claim 9 is rejected under 35 U.S.C. 103 as being obvious over Park as applied to claim 1 above, in view of Berkey (US20130184584A1, 2013-07-18; hereinafter “Berkey”).
Regarding claim 9, Park teaches the method of claim 1,
Park further teaching the steps of placing a color box on the imaged tendon and subsequently acquiring, at the computing device, subsequent ultrasound imaging data from the ultrasound scanner in Power Doppler mode (“the image-processing module may process the ultrasound signals to generate […] ultrasound waveforms (e.g., continuous or pulse wave Doppler spectrum or waveforms) for displaying to the operator. […] the ultrasound modalities may include color-flow, acoustic radiation force imaging (ARFI), B-mode, A-mode, M-mode, spectral Doppler, acoustic streaming, tissue Doppler module,” [0022]; “a touchpad may be configured to the system controller 116 and display area 117, such that when a user moves a finger/glove/stylus across the face of the touchpad, a cursor atop the ultrasound image or Doppler spectrum on the display device 118 moves in a corresponding manner.” [0025]; “In some examples, the indication of the degree of damage may be a color. Again considering the example of a tendon in a shoulder, a green color may indicate a non-damaged tendon, a yellow color may indicate a tendon having a partial rupture, and a red color may indicate a tendon having a complete rupture.” [0044]; [0033-0067, 0073-0083], [fig. 2-4, 7-9], [see claim 1 rejection]);
but Park may fail to explicitly teach automatically placing at least one of a Doppler gate and a color box on the thickest part of the imaged tendon.
However, in the same field of endeavor, Berkey teaches a method for assessing damage of a tendon in ultrasound imaging data (“A method of labeling medical ultrasound images in real-time, comprising: […] taking ultrasound images from at least one frame of the ultrasound video; displaying the ultrasound images; […] automatically labeling anatomical structures shown in the ultrasound images.” [clm 19]; “wherein selecting a type of medical ultrasound study includes selecting a medical ultrasound study from a group consisting of: […] evaluation of tears or injuries to muscles and tendons,” [clm 30]; [0034-0059, 0087-0099], [fig. 5-7]);
Berkey further teaching additionally comprising the steps of automatically placing at least one of a Doppler gate and a color box on the thickest part of the imaged tendon (“Shading added to the anatomical structures may over-lie the entire structure or a portion of the structure. The shading may be easily visible, variable in size and shape, and be provided with different colors.” [0088]; “Lines may be used to delineate the edges of the anatomical structures of interest. The lines may be curved or straight, thick or thin, colored or black and white.” [0089]; The method may include automatically labeling by shading and outlining (i.e., color box) at least some of the anatomical structures (e.g., tendon) [0034-0059, 0087-0099], [fig. 5-7]).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the invention to modify the method taught by Park by automatically placing at least one of a Doppler gate and a color box on the thickest part of the imaged tendon as taught by Berkey. Use of ultrasound does have limits and disadvantages. The method is operator-dependent, and requires skill and experience to acquire quality images and to interpret them with accuracy. As more providers with less training and experience (and no mandated demonstration of competency) begin to perform and interpret bedside ultrasound, new technology to overcome limits and barriers to its ease of use would be valuable (Berkey [0004]). The method may assist a user in obtaining ultrasound imaging data via an ultrasound probe, using a trained neural network to automatically identify and diagnose an anatomical feature (e.g., a tendon) from ultrasound imaging data (Park [0030-0031]).
Claim 11 is rejected under 35 U.S.C. 103 as being obvious over Park in view of Boussouar as applied to claim 2 above, further in view of Gupta et al. (‘Curvelet based automatic segmentation of supraspinatus tendon from ultrasound image: a focused assistive diagnostic method.’ BioMed Eng OnLine 13, 157 (2014); 2014-12-04) (hereinafter “Gupta”).
Regarding claim 11, Park and Boussouar teach the method of claim 2,
Park further teaching additionally comprising the steps of:
i) automatically identifying and annotating boundaries of the corroborated tendon, forming a segmentation mask of the corroborated tendon (“Upon classification, the neural network may label the one or more image aspects on the sample images by generating corresponding visual indicators (e.g., arrows, boxes, circles, shading, etc.).” [0065], [fig. 9]; The neural network applies arrows indicating the boundary of the tendon (i.e., forms a segmentation mask) in the ultrasound image [see fig. 9 reproduced below]); and
creating a plurality of lines on the ultrasound image (“the one or more processing operations may include one or more image transforms, such as a Radon transform for identifying linear features in the ultrasound images.” [0022]);
[Image: media_image1.png (greyscale)]
The arrows annotate the tendon within the ultrasound image (Park [fig. 9])
but Park and Boussouar may fail to explicitly teach that the topological skeleton is equidistant from each of the annotated boundaries.
However, in the same field of endeavor, Gupta teaches a method that uses curvelet transform for feature extraction based on energy analysis of features, followed by connected component analysis and morphological operations [Related work p.3];
Gupta further teaching additionally comprising the steps of:
a) automatically identifying and annotating the boundaries of the imaged tendon, forming a segmentation mask of the imaged tendon (“preprocessing performed enhances the features needed for automatic extraction of tendon. The feature extraction from preprocessed image is performed using curvelet transform” [Image enhancement and feature extraction, p.6]; “desired mask for segmentation of SSP tendon is obtained” [Area filtering and connected component analysis, p.10]; Feature extraction defines the edges of the tendon in the ultrasound image before segmenting and generating a mask image of the tendon [fig. 2-4; see fig. 3 reproduced below]);
[Image: media_image3.png (greyscale)]
A segmented tendon image (g) is generated based on derived tendon boundaries and image processing of the extracted tendon (Gupta [fig. 3])
b) using the annotated boundaries to define a topological skeleton, along the imaged tendon, wherein the topological skeleton is equidistant from each of the annotated boundaries (“capability of curvelet transform to extract directional edge features at different orientations is exploited for extracting edge features from ultrasound image.” [Image enhancement and feature extraction, p.6]; A curvelet transform applied to the tendon in the ultrasound image (i.e., defining a topological skeleton) divides the tendon into equidistant sections [fig. 3-4, 9; see fig. 4 reproduced below]);
c) creating a plurality of lines perpendicular to the topological skeleton (“Curvelet transform decomposes the image in different direction and frequencies in the curvelet domain” [Directional edge feature extraction, p.7]; The curvelet transform comprises wedge wrapping and periodic wedge tiling to fit a curvelet to the input image, which is interpreted as creating a plurality of lines perpendicular to the skeleton [fig. 4, 9; see fig. 4 reproduced below]); and
[Image: media_image4.png (greyscale)]
A plurality of lines is mapped to each section of the tendon to perform the fitting (arrow), perpendicular to the boundary of the tendon (Gupta [fig. 4], annotated)
d) identifying a longest line of the plurality of lines, hereinafter the longest line, which represents a greatest thickness of the imaged tendon (“variations such as: 1) The width of the tendon, 2) Topography or location of tendon in images,” [Proposed methodology, p.4]; The width of the tendon (i.e., longest line) is considered during segmentation of the tendon [see fig. 9 reproduced below]).
[Image: media_image5.png (greyscale)]
Progression of segmentation of tendon in ultrasound image, where thickness (e.g., arrows) of the tendon is considered throughout (Gupta [fig. 9], annotated)
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the invention to modify the method taught by Park and Boussouar as outlined above with the skeleton taught by Gupta. Segmentation in ultrasound images can be difficult due to multiple reasons (e.g., contrast and resolution of image, speckle noise which is inherent property of ultrasound imaging modality, operator dependency of the modality, etc.). Applying these methods addresses the issue of contrast enhancement, despeckling and issues occurring due to operator dependency for accurate segmentation of the supraspinatus tendon (Gupta [Intro p.3]).
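Purely for illustration, steps b) through d) recited in claim 11 (an equidistant skeleton, perpendicular measurement lines, and the longest line as the greatest thickness) might be sketched as follows, under the simplifying assumptions, made for this sketch only, that the segmented tendon mask is roughly horizontal and vertically contiguous in each column:

```python
import numpy as np

def skeleton_and_max_thickness(mask):
    """In each column, take the midpoint between the upper and lower
    boundary (a skeleton point equidistant from both annotated
    boundaries) and the boundary-to-boundary span along the vertical
    (perpendicular) line; the longest span is the greatest thickness."""
    midpoints, spans = [], []
    for c in range(mask.shape[1]):
        rows = np.flatnonzero(mask[:, c])
        if rows.size == 0:
            continue
        top, bot = rows[0], rows[-1]
        midpoints.append((top + bot) / 2.0)  # equidistant from both boundaries
        spans.append(int(bot - top + 1))     # thickness along this line
    return midpoints, max(spans)

# Toy segmentation mask: a tendon that is thicker in its middle columns.
mask = np.zeros((10, 6), dtype=bool)
mask[4:6, 0:2] = True   # 2 px thick
mask[3:7, 2:4] = True   # 4 px thick
mask[4:6, 4:6] = True   # 2 px thick
midline, longest = skeleton_and_max_thickness(mask)
print(longest)  # 4: the longest perpendicular line
```

A general-orientation implementation would instead skeletonize the mask and measure along local normals; the column-wise version above only illustrates the claimed relationship between skeleton, perpendicular lines, and greatest thickness.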
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Lin et al. ("Using Deep Learning in Ultrasound Imaging of Bicipital Peritendinous Effusion to Grade Inflammation Severity," in IEEE Journal of Biomedical and Health Informatics, vol. 24, no. 4, pp. 1037-1045, April 2020) teaches an automated bicipital peritendinous effusion (BPE) recognition system for classifying inflammation into the following categories: normal and mild, moderate, and severe. An ultrasound image serves as the input to the proposed system; the system determines whether the ultrasound image contains biceps. If the image depicts biceps, then the system predicts BPE severity.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to James F. McDonald III whose telephone number is (571)272-7296. The examiner can normally be reached M-F; 8AM-6PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chris Koharski, can be reached at 571-272-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
JAMES FRANKLIN MCDONALD III
Examiner
Art Unit 3797
/CHRISTOPHER KOHARSKI/Supervisory Patent Examiner, Art Unit 3797