Detailed Action
1. Claims 1-2, 4-11, and 13-20 are pending in this Application.
Notice of Pre-AIA or AIA Status
2. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
3. Applicant’s response to the last Office Action filed on 08/26/2025 has been entered and made of record.
4. Claims 1, 2, 4, 9-10, 14-16, and 19 have been amended.
Response to Arguments
5. The Applicant’s arguments filed on 11/24/2025 have been fully considered. For the Examiner’s response, see the discussion below.
a. Based on the Applicant’s amendment and argument, the objection to claims 14 and 19 is expressly withdrawn.
b. The Applicant argues: “Zhang is not in the same field of endeavor as the claimed invention. Zhang is in the field of environmental monitoring, and more specifically, to a method for identifying unused aboveground soil pollution risk areas; whereas the claimed invention is in the field of medical imaging to assess the nature of a medical condition. Note, claim 1 has been amended to make abundantly clear that the recited image is a ‘medical image’.”
As to the above argument [a], the Examiner respectfully disagrees with the Applicant’s argument for the following reasons.
First, both Bengtsson and Zhang are directed to image processing. Specifically:
Bengtsson discloses various types of image processing (see [0073], [0079], [0084]-[0089]) and is directed to neural networks for image and lesion metabolism analysis. For example, Bengtsson discloses that “image data received from the imaging system 160 may correspond to the whole body or multiple regions of the body. Metadata, automated alignments and/or image processing may indicate, for each image, to which region the image corresponds” (see [0073]).
Zhang, on the other hand, discloses a method that involves obtaining a texture feature image reflecting the texture features of each remote sensing image according to the monitoring-area gray scale of each pixel point in a two-phase remote sensing image. Monitoring-area gradation and texture characteristic changes are obtained based on the gray difference of the two-phase remote sensing image and the texture difference of the texture feature image.
It is considered possible for a person having ordinary skill in the art (such as an engineer familiar with machine vision, image processing, and sensor integration) to modify a camera system designed for soil analysis to capture and process medical images. While the target subjects differ significantly (inanimate soil vs. living tissue), both applications share foundational principles of digital image processing, optical imaging, and multispectral/color analysis.
Thus, based on the above discussion, Bengtsson and Zhang are combinable.
b. The Applicant argues: “Zhang addresses the problem of identifying unused aboveground soil pollution risk areas. Thus, Zhang is not reasonably pertinent to the problem faced by the inventors of the claimed invention of improving the masking of regions in medical images. Zhang does not even disclose masking.”
As to the above argument [b], the Examiner respectfully disagrees with the Applicant’s argument because, as described in the Office Action, the limitation “generate, based on the first image texture feature values, a mask; and apply the mask to the received input image to generate a masked image” is addressed by Bengtsson, not by Zhang.
c. The Applicant substantially argues: “Bengtsson also does not disclose to generate a mask based on such first image texture feature values comprising one or more of a correlation, energy, homogeneity, gradient, or entropy value.”
As to the above argument [c], the Examiner respectfully disagrees with the Applicant’s argument for the following reasons.
As discussed above in section [b], “generate, based on the first image texture feature values, a mask; and apply the mask to the received input image to generate a masked image” is addressed by Bengtsson. On the other hand, the limitation “texture feature values comprising one or more of a correlation, energy, homogeneity, gradient, or entropy value” is disclosed by Zhang. Thus, the combination teaches the above limitation.
In response to Applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references.
See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). As noted above, the Zhang reference was relied upon to teach elements that the Applicant indicated were not disclosed by the Bengtsson reference.
Further, energy, homogeneity, gradient, and entropy are all recognized properties and features used to analyze images in both soil and medical image processing. These parameters are common texture features (often derived from Gray Level Co-occurrence Matrices, or GLCM) used to quantify structural patterns, disorder, or surface patterns.
How energy, homogeneity, gradient, and entropy are recognized image analysis features that can be applied interchangeably to analyze both soil images and medical images is discussed in more detail below:
i. Regarding energy:
Energy in soil images reflects the uniformity of the soil structure and particle distribution. Energy in a medical image, such as a tumor image, represents the regularity or uniformity of the tumor surface pattern. A high energy value indicates a smooth, orderly surface, whereas a lower energy value might indicate rougher, aged, or uneven tumor surface textures.
ii. Regarding homogeneity:
Homogeneity in soil images is used to evaluate the uniformity of the soil, its pore space, or the distribution of aggregates (clods) in a soil sample. Homogeneity in medical images is used to evaluate the smoothness of a tumor image, specifically the uniformity of color or the texture of the tumor image surface.
iii. Regarding gradient:
The gradient of soil images is used to identify soil particle boundaries, edges, and the structural complexity of pore networks in tomographic images.
The gradient of medical images is used to detect edges and boundaries, such as irregularities, fine lines, or the boundaries of lesions.
iv. Regarding entropy:
Entropy of soil images measures the disorder, complexity, or randomness of the spatial arrangement of soil particles, and is often used in characterizing fabric networks.
Entropy of medical images measures the disorder or “roughness” of the tumor surface. Increased entropy is strongly associated with decreased tumor skin elasticity, indicating the presence of wrinkles or uneven texture.
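The four GLCM statistics discussed above have standard closed-form definitions over the normalized co-occurrence matrix. The following is a minimal illustrative sketch (not taken from any cited reference; the image values and pixel-pair offset are arbitrary examples) of how such texture feature values can be computed:

```python
import math

def glcm_features(img, dx=1, dy=0):
    """Build a normalized gray-level co-occurrence matrix (GLCM) for one
    pixel-pair offset, then derive energy, homogeneity, contrast, and
    entropy from it."""
    counts = {}
    h, w = len(img), len(img[0])
    total = 0
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                pair = (img[y][x], img[ny][nx])
                counts[pair] = counts.get(pair, 0) + 1
                total += 1
    energy = homogeneity = contrast = entropy = 0.0
    for (i, j), c in counts.items():
        p = c / total
        energy += p * p                      # sum of squared probabilities
        homogeneity += p / (1 + abs(i - j))  # inverse difference weighting
        contrast += p * (i - j) ** 2         # emphasizes large gray-level jumps
        entropy -= p * math.log(p)           # randomness of co-occurrences
    return {"energy": energy, "homogeneity": homogeneity,
            "contrast": contrast, "entropy": entropy}

uniform = [[0, 0], [0, 0]]                 # perfectly regular texture
mixed = [[0, 0, 1], [0, 0, 1], [2, 2, 2]]  # texture with several gray levels
f_u = glcm_features(uniform)
f_m = glcm_features(mixed)
```

On the uniform image every co-occurring pair is identical, so energy and homogeneity are maximal (1.0) and entropy is zero; the mixed image scores lower energy and higher entropy, mirroring the uniformity discussion above for both soil and tumor surfaces.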
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
6. Claims 1-2, 4-7, 9-11, and 13-18 are rejected under 35 U.S.C. 103 as being unpatentable over BENGTSSON et al. (hereafter BENGTSSON), US provisional application US 62880898, filed on 07/31/2019, in view of ZHANG et al. (hereafter ZHANG), CN 108805920 A, pub. 11/13/2018. (For examination purposes, US 20210401392 A1, filed on 09/13/2021, is utilized instead of US 62880898.)
As to claim 1, BENGTSSON teaches an apparatus for masking textured regions in a medical image (Fig. 1, [0057], [0082]: automated tumor segmentation system (see Fig. 1). The system includes a two-dimensional segmentation mask 220 that is input into a feature extractor 225 (e.g., the feature extractor 120 as described with respect to FIG. 1), and the feature extractor 225 extracts relevant features from the two-dimensional segmentation mask 220. The relevant features include texture features (e.g., contrast, dissimilarity, cluster shade, cluster prominence, etc.)), the apparatus comprising:
a processor ([0023], the system includes one or more data processors and a non-transitory computer readable storage medium containing instructions) configured to:
receive an input image having a plurality of pixels (Fig. 1, [0081]-[0082], the two-dimensional segmentation mask 220 is input into a feature extractor 225 (e.g., the feature extractor 120 as described with respect to FIG. 1), and the feature extractor 225 extracts relevant features from the two-dimensional segmentation mask 220. The input image corresponds to the two-dimensional segmentation mask);
determine, for each pixel of the plurality of pixels, a first image texture feature value, comprising computing grey level co-occurrence matrix (GLCM) statistics; and determining the first image texture feature value based on one of the GLCM statistics ([0082], texture features may be extracted using a gray level co-occurrence matrix (GLCM) or similar techniques; shape features may be extracted using region property functions or similar techniques; prognostic signatures may be extracted using k-means clustering or a similar technique); generate, based on the first image texture feature values, a mask ([0083], as discussed above, the refined two-dimensional segmentation mask corresponds to said mask); and
apply the mask to the received input image to generate a masked image ( [0083], the classifier 230 transforms the relevant features and combines the two-dimensional segmentation mask(s) 220 into a final masked image 235. The masked image corresponds to the final masked image).
However, it is noted that BENGTSSON does not specifically teach “the determined first image texture feature value comprising one or more of a correlation, energy, homogeneity, gradient, or entropy value.”
On the other hand, in the same field of endeavor, ZHANG teaches the determined first image texture feature value comprising one or more of a correlation, energy, homogeneity, gradient, or entropy value (claims 6 and 7, capturing sub-images, for each captured sub-image using the gray-level co-occurrence matrix method to calculate the texture statistic, and taking the texture statistics as the texture feature value of the corresponding sub-image central pixel, where the texture statistic comprises the following features: energy, contrast, differentiated moment, and entropy).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the gray-level co-occurrence matrix (GLCM) taught by BENGTSSON by incorporating a texture statistic that comprises the energy, contrast, differentiated moment, and entropy taught by ZHANG.
The suggestion/motivation for doing so would have been to allow the user of BENGTSSON to quantify image detail and complexity, including measuring image random noise based on entropy. A higher entropy value indicates greater complexity and a richer set of details within an image, suggesting higher quality.
As to claim 2, BENGTSSON teaches that the processor is configured to provide the masked image for display on a display (claim 9, [0020], [0023]: a system comprising the convolutional neural network architecture; providing the final imaged mask; and receiving, by the user, one or more of the final imaged mask, the TMTV, the MTV, and the number of lesions on a display of a computing device. Further, a system is provided that includes one or more data processors and a non-transitory computer readable storage medium containing instructions which, when executed on the one or more data processors, cause the one or more data processors to perform ...).
As to claim 4, BENGTSSON teaches that the processor is further configured to: determine, for each pixel of the plurality of pixels, a plurality of second image texture feature values ([0082], [0134], the relevant features may include texture features which can help in further classification of pixels within the two-dimensional segmentation mask 220); and
combine the first image texture feature values and the second image texture feature values, wherein the generation of the mask is based on the first image texture feature values and the plurality of second image texture feature values ([0083], the relevant features extracted from the feature extractor 225 and the two-dimensional segmentation mask(s) 220 are input into a classifier 230 (e.g., the classifier 130 as described with respect to FIG. 1), and the classifier 230 transforms the relevant features and combines the two-dimensional segmentation mask(s) 220 into a final masked image 235. The classifier 230 may use the relevant features to refine the classification of pixels in the two-dimensional segmentation mask(s) 220 and combine (e.g., using an average or other statistical function(s)) the two-dimensional segmentation mask(s) 220 to generate a final masked image 235).
As to claim 5, ZHANG teaches that the first image texture feature values are determined based on a first grey level co-occurrence matrix statistic, and at least one of the plurality of second image texture feature values is determined based on a second grey level co-occurrence matrix statistic (claim 6, determining the gray scale of the two time-phase remote sensing images, selecting multiple characteristic quantities as texture statistics, using a sliding window on the remote sensing image to capture sub-images, for each captured sub-image using the gray-level co-occurrence matrix method to calculate the texture statistic, and taking said texture statistic as the texture feature value of the corresponding sub-image central pixel, obtaining the texture feature values of all pixel points; for any one pixel point, the weighted sum of each feature amount of the texture feature value of the pixel point is taken as the new grey value of the pixel point so as to obtain the texture feature image).
As to claim 6, BENGTSSON teaches a method of generating a mask based on image texture feature values (as discussed in claim 1 above) but fails to teach “apply a first weight to the first image texture feature values; and apply a second weight to the second image texture feature values; wherein the generation of the mask is based on the weighted first and second image texture feature values.”
On the other hand, ZHANG teaches apply a first weight to the first image texture feature values; and apply a second weight to the second image texture feature values; wherein the generation of the mask is based on the weighted first and second image texture feature values (claim 6, using a sliding window on the remote sensing image to capture sub-images, for each captured sub-image using the gray-level co-occurrence matrix method to calculate the texture statistic, and taking said texture statistic as the texture feature value of the corresponding sub-image central pixel, obtaining the texture feature values of all pixel points; for any one pixel point, the weighted sum of each feature amount of the texture feature value of the pixel point is taken as the new grey value of the pixel point so as to obtain the texture feature image).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the well-known method of generating a new grey value image by combining two gray value images using a weighted sum, taught by ZHANG, into BENGTSSON.
The suggestion/motivation for doing so would have been to allow the user of BENGTSSON to generate a high-quality texture image by reducing noise, enhancing detail, and improving contrast.
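As a concrete illustration of the weighted-sum combination relied on from ZHANG, the sketch below (the weights and per-pixel feature values are hypothetical examples, not taken from either reference) combines two per-pixel texture feature maps into a new gray value for each pixel:

```python
def weighted_texture_image(feat_a, feat_b, w_a=0.6, w_b=0.4):
    """Weighted sum of two per-pixel texture feature maps; the result
    serves as the new gray value of each pixel (the weights here are
    arbitrary illustrative choices)."""
    return [[w_a * a + w_b * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(feat_a, feat_b)]

# Hypothetical first and second image texture feature values (e.g. an
# energy map and an entropy map, normalized to [0, 1]) for a 2x2 image:
energy_map  = [[0.9, 0.2], [0.4, 0.8]]
entropy_map = [[0.1, 0.7], [0.5, 0.3]]
combined = weighted_texture_image(energy_map, entropy_map)
```

Each output pixel is simply w_a times the first feature value plus w_b times the second, so the combined map preserves the spatial layout of the input feature maps.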
As to claim 7, ZHANG teaches that the input image comprises a phase image (claim 1, according to the monitoring-area gray scale of each pixel point in the two-phase remote sensing image, a texture feature image reflecting the texture feature of each remote sensing image is obtained, along with the grey difference of the two time-phase remote sensing images and the texture difference of the texture feature image).
Claim 9 is rejected the same as claim 1 except that claim 9 is directed to a method claim. All the limitations of claim 9 are addressed in claim 1. Thus, argument analogous to that presented above for claim 1 is applicable to claim 9.
Claims 10 and 14 are rejected the same as claim 2 except that claims 10 and 14 are directed to a method claim and a system claim, respectively. All the limitations of claims 10 and 14 are addressed in claim 2. Thus, argument analogous to that presented above for claim 2 is applicable to claims 10 and 14.
Claim 11 is rejected the same as claim 4 except that claim 11 is directed to a method claim. All the limitations of claim 11 are addressed in claim 4. Thus, argument analogous to that presented above for claim 4 is applicable to claim 11.
Claim 13 is rejected the same as claim 6 except that claim 13 is directed to a method claim. All the limitations of claim 13 are addressed in claim 6. Thus, argument analogous to that presented above for claim 6 is applicable to claim 13.
As to claim 15, BENGTSSON teaches a non-transitory computer readable medium storing instructions that, when executed by one or more processors, cause the one or more processors (claim 9, “A computer-program product tangibly embodied in a non-transitory machine-readable storage medium, including instructions configured to cause one or more data processors to perform actions including: ...”).
Regarding the remaining limitations of claim 15, all the remaining limitations are similar to those of claim 1. Thus, the remaining limitations of claim 15 are set forth and rejected as per the discussion for claim 1.
Regarding claims 16 and 19, all the claim limitations are set forth and rejected as per the discussion for claims 2 and 15.
Regarding claim 17, all the claim limitations are set forth and rejected as per the discussion for claims 4 and 15.
Regarding claim 18, all the claim limitations are set forth and rejected as per discussion for claims 6 and 15.
9. Claims 8 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over BENGTSSON, US provisional application US 62880898, in view of ZHANG, CN 108805920 A, further in view of JP WO2008102898, pub. 05/27/2010.
As to claim 8, BENGTSSON teaches a method of generating the mask and the first image texture feature value (see claim 1 above).
However, BENGTSSON does not specifically teach the underlined section of the limitation “determining that the first image texture feature value for a given pixel of the plurality of pixels meets or exceeds a defined threshold value, generating the mask in respect of the given pixel.”
On the other hand, JP WO2008102898 teaches determining that the first image texture feature value for a given pixel of the plurality of pixels meets or exceeds a defined threshold value, generating the mask in respect of the given pixel (page 13, 5th and last paragraphs: the texture determination process determines a texture for each small region around the target pixel of the reference image, selects pixels whose texture is greater than or equal to a predetermined threshold around the target pixel of the reference image, and uses the selected pixels to generate a value mask. For the texture determination process, various methods can be used, such as requiring the dispersion of pixel values in a small area to be equal to or greater than a threshold value. Note that the mask generated by the texture determination process described above is a binary mask, but the texture determination process of the present invention is not limited thereto, and a multi-value mask may be generated. The multi-value mask is set so that the value of the multi-value mask increases as the texture value increases).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the multi-value mask taught by JP WO2008102898 into the modified BENGTSSON.
The suggestion/motivation for doing so would have been to allow the user of the modified BENGTSSON to generate a high-quality texture mask. It is known that multi-value masks assign each pixel a specific value corresponding to its object class or category. This allows for a more detailed and accurate representation of object boundaries and shapes.
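To illustrate the thresholding relied on from JP WO2008102898, the sketch below (hypothetical per-pixel values; not code from any cited reference) generates a binary mask from texture feature values, with an optional graded multi-value variant whose mask value grows with the feature value:

```python
def texture_mask(feature_map, threshold, levels=None):
    """Binary mask: a pixel is set when its texture feature value meets
    or exceeds the threshold. With `levels`, values at or above the
    threshold are instead quantized into graded mask values (a simple
    multi-value-mask sketch; larger feature -> larger mask value)."""
    mask = []
    for row in feature_map:
        out = []
        for v in row:
            if v < threshold:
                out.append(0)      # below threshold: excluded from the mask
            elif levels is None:
                out.append(1)      # binary mask value
            else:
                # graded mask value, capped at `levels`
                out.append(min(levels, 1 + int((v - threshold) * levels)))
        mask.append(out)
    return mask

features = [[0.2, 0.8], [0.6, 0.4]]             # hypothetical per-pixel values
binary = texture_mask(features, threshold=0.5)  # -> [[0, 1], [1, 0]]
```

Applying the binary mask to the input image (e.g., by pixel-wise multiplication) then yields a masked image in which only pixels whose texture feature value met the threshold survive.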
Regarding claim 20, all the claim limitations are set forth and rejected as per discussion for claims 8 and 15.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Contact Information
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Mekonen Bekele, whose telephone number is (469) 295-9077. The examiner can normally be reached Monday-Friday from 9:00 AM to 6:50 PM Eastern Time.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, George Eng, can be reached at (571) 272-7495. The fax phone number for the organization where the application or proceeding is assigned is 571-237-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR.
Status information for unpublished applications is available through Private PAIR only.
For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
/MEKONEN T BEKELE/Primary Examiner, Art Unit 2699