DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The Amendment filed 03/02/2026 overcomes the following:
Objection to Claims 1, 7 and 10 for minor informalities; and
Interpretation of claims 10-15 under 35 U.S.C. 112(f).
Response to Arguments
Applicant’s arguments filed 03/02/2026 have been fully considered but they are not persuasive.
In particular, regarding independent claims 1 and 10, on pages 7-9 of the Remarks, Applicant asserts that Agaian does not disclose the features of “generating mask information indicating morphology patterns of the tissue” and “providing the mask information to highlight portions of the first image information associated with morphology of the tissue”. Examiner respectfully disagrees.
Regarding Applicant’s assertion, Examiner notes that Applicant’s published Specification at ¶0118 specifies that tissue morphological features include “tumor area, tumor length and tumor percentage”.
Agaian, at ¶0029, states “The outputs of an exemplary computer-aided diagnosis system for prostate cancer provide information about the disease severity and the location and extension of cancerous tissue. In particular, the output of the system may provide a Gleason grade, a localized map of Gleason grades, the percentage of each Gleason grade in each slide, a Gleason score identifying the dominant grade and secondary grade, and/or a Gleason score identifying multiple grades for the data sample” (emphasis added). Additionally, Agaian at ¶0199 states “The results of the computer-aided diagnosis system (i.e. detected cancerous regions, cancer grade, localization, extent, area occupied by cancer, tumor volume, etc.), are treated as a second opinion and are presented to the human pathologist evaluation for approval” (emphasis added). From at least these paragraphs, it is clear that the outputs of detected cancerous regions, localization maps, extent, area occupied by cancer, and tumor volume within the digital whole slide image in Agaian correspond to the elements of mask information and tissue morphology as described in Applicant’s own specification. Examiner additionally notes that the claims do not further specify what features comprise the claimed “tissue morphology”. Therefore, the rejection of independent claims 1 and 10 and their respective dependent claims is maintained below.
On pages 8-9 of the Remarks, Applicant further asserts that Agaian does not disclose the feature “classifying a whole slide image of the pathological slide” as recited in claims 1 and 10.
Examiner respectfully disagrees.
Agaian at ¶0029 states “In particular, the output of the system may provide a Gleason grade, a localized map of Gleason grades, the percentage of each Gleason grade in each slide, a Gleason score identifying the dominant grade and secondary grade, and/or a Gleason score identifying multiple grades for the data sample” (emphasis added). This indicates that the output of the classification system is associated with the entire sample, i.e., the whole slide image. Furthermore, Agaian at ¶0190 states “Furthermore, the computed histogram is used to quantify the area occupied by each Gleason grade in a histopathology image corresponding to a region of interest or a biopsy whole-slide” (emphasis added). It is clear from this disclosure that the classification algorithm of Agaian is applied to the whole slide image to yield an output classification for the whole slide image. Examiner further notes that while the claim recites “classifying a whole slide image”, it does not further specify what the output of that classification is. For example, is the classification of the WSI a single slide score, a vector of values corresponding to all of the image tiles, or some other value? For these reasons, the rejections of claims 1-15 as set forth in the previous Office action are maintained.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-2, 4, 6-11 and 13-15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by US PG PUB. 2016/0253466 A1 (hereinafter “Agaian”).
Regarding claim 1, Agaian discloses a method for classifying morphology based on pathological slides (Agaian, Figs. 4-5, Fig. 8, ¶0029, 0099-0148, 0198-0199) comprising:
a. obtaining first image information associated with a first pathological slide of tissue, wherein the pathological slide is divided into a plurality of tiles and the first image information is associated with a first tile of the plurality of tiles and second image information is associated with a second tile of the plurality of tiles (Agaian, ¶0029, 0099-0148, 0198-0199, Figs. 4-5 and Fig. 8; “When processing whole-slides, a grid of size s×s is placed over the image, and the classification process described below is performed once per resulting block.”);
b. classifying the first image information using a first machine learning algorithm trained using a first training set, where the first image information is an input, and the machine learning algorithm provides first classification information associated with the first image information associated with morphology of the tissue (Agaian, ¶0029, 0099-0148, 0198-0199, Figs. 4-5 and Fig. 8; “feature vectors for histopathology images are based on texture, morphology, color, and graph features” and “The generated features are then subject to classification, resulting in a biopsy diagnosis for cancer presence, classification, grade, or score”);
c. generating mask information indicating morphology patterns of the tissue based on the first classification information provided in the classifying step (Agaian, ¶0029, 0099-0148, 0198-0199, Figs. 4-5 and Fig. 8; “the output of the system may provide a Gleason grade, a localized map of Gleason grades, the percentage of each Gleason grade in each slide” wherein “feature vectors for histopathology images are based on texture, morphology, color, and graph features” and “The generated features are then subject to classification, resulting in a biopsy diagnosis for cancer presence, classification, grade, or score… The results of the computer-aided diagnosis system (i.e. detected cancerous regions, cancer grade, localization, extent, area occupied by cancer, tumor volume, etc.)…are presented”);
d. providing the mask information to a user interface, wherein the user interface is configured to display the first image information and the mask information to highlight portions of the first image information associated with morphology of the tissue (Agaian, ¶0029, 0099-0148, 0198-0199, Figs. 4-5 and Fig. 8; “The results of the computer-aided diagnosis system (i.e. detected cancerous regions, cancer grade, localization, extent, area occupied by cancer, tumor volume, etc.), are treated as a second opinion and are presented to the human pathologist”);
e. repeating steps (a) to (d) for the second image information and respective image information associated with each tile of the plurality of tiles (Agaian, ¶0029, 0099-0148, 0198-0199, Figs. 4-5 and Fig. 8; “the classification process described below is performed once per resulting block”); and
f. classifying a whole slide image of the pathological slide associated with the first image information, second image information and respective image information based on the first image information, the second image information and the respective image information associated with each tile of the plurality of tiles (Agaian, ¶0029, 0148, 0190-0194; “The generated features are then subject to classification, resulting in a biopsy diagnosis for cancer presence, classification, grade, or score” wherein “The outputs of an exemplary computer-aided diagnosis system for prostate cancer provide information about the disease severity and the location and extension of cancerous tissue. In particular, the output of the system may provide a Gleason grade, a localized map of Gleason grades, the percentage of each Gleason grade in each slide, a Gleason score identifying the dominant grade and secondary grade, and/or a Gleason score identifying multiple grades for the data sample”).
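For illustration only, the per-block workflow cited above (a grid of size s×s placed over the whole-slide image, classification performed once per resulting block, and per-tile results aggregated into a slide-level output) can be sketched as follows. All function names are hypothetical, the placeholder tile classifier and majority-vote aggregation are assumptions for demonstration, and nothing here is drawn verbatim from Agaian or the claims:

```python
def classify_tile(tile):
    # Stand-in for a trained classifier: returns a morphology label for one
    # tile. A trivial threshold on mean intensity serves as a placeholder.
    mean = sum(sum(row) for row in tile) / (len(tile) * len(tile[0]))
    return 1 if mean > 0.5 else 0

def classify_whole_slide(image, s):
    """Split `image` (a 2-D list) into s x s tiles, classify each tile,
    collect per-tile labels as mask information, and aggregate the tile
    labels into a single slide-level label."""
    h, w = len(image), len(image[0])
    mask, labels = [], []
    for r in range(0, h, s):
        mask_row = []
        for c in range(0, w, s):
            tile = [row[c:c + s] for row in image[r:r + s]]
            label = classify_tile(tile)   # per-tile classification
            mask_row.append(label)        # mask information for this tile
            labels.append(label)
        mask.append(mask_row)
    # Slide-level classification: here, simply the majority tile label.
    slide_label = 1 if sum(labels) > len(labels) / 2 else 0
    return mask, slide_label
```

The mask is a grid of per-tile labels that could be overlaid on the slide image, and the slide label is one example of the unspecified whole-slide classification output noted above.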
Regarding claim 2, claim 1 is incorporated, and Agaian further discloses wherein the first image information is obtained from a database (Agaian, ¶0144; “Various simulations in accordance with this embodiment of the present invention were run on a database of 71 color images of Hematoxylin and Eosin (H&E)-stained prostate tissue samples.”).
Regarding claim 4, claim 1 is incorporated, and Agaian further discloses wherein the first image information is provided in a format compatible with the machine learning algorithm (Agaian, ¶0027; “The features extracted using the methods mentioned along with other color, texture, and morphology features are employed to train, test and calibrate data classifiers. The results of several machine learning tools trained on diverse pattern data are combined to produce a robust predictor that accurately detect various types and grades or scores of cancerous lesions.”).
Regarding claim 6, claim 1 is incorporated, and Agaian further discloses wherein the mask information includes mask information highlighting at least one of cellular patterns and histology patterns (Agaian, ¶0029; “The outputs of an exemplary computer-aided diagnosis system for prostate cancer provide information about the disease severity and the location and extension of cancerous tissue. In particular, the output of the system may provide a Gleason grade, a localized map of Gleason grades, the percentage of each Gleason grade in each slide, a Gleason score identifying the dominant grade and secondary grade, and/or a Gleason score identifying multiple grades for the data sample.”).
Regarding claim 7, claim 1 is incorporated, and Agaian further discloses wherein the whole slide image classification is based on a whole slide image histogram that provides a vector associated with the whole slide image and is provided as an input to a second machine learning algorithm trained by prior whole slide image histograms to provide a whole slide image classification (Agaian, ¶0184, 0190-0194; “First, a regular grid is placed on histopathology images containing fairly homogeneous Gleason grades and the blocks that best represent the image grade are stored as codeword candidates. Each block is subject to feature extraction for description. Textural features in spatial and wavelet domain and fractal-like features can be used to represent blocks; the most discriminative sets of computed features become members of the codeword dictionary...Next, a histogram containing the rate of recurrence of each codeword within an image is constructed by matching blocks using a correlation or any other distance measurement. The resulting histogram is the feature vector, which is the input to a learning algorithm for image classification. Furthermore, the computed histogram is used to quantify the area occupied by each Gleason grade in a histopathology image corresponding to a region of interest or a biopsy whole-slide. A Gleason score with the two or three most frequent Gleason patterns is an output of the system if the most frequent pattern(s) in the histogram match the most probable classes according to the prediction of the multi-classifier system.”).
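For illustration only, the codeword-histogram construction quoted above (each block matched to its nearest codeword by a distance measurement, with the histogram of codeword recurrence serving as the feature vector for a learning algorithm) can be sketched as follows. Function names are hypothetical, and squared Euclidean distance is assumed as one of the distance measurements Agaian permits:

```python
def nearest_codeword(block, dictionary):
    # Match a block's feature vector to a codeword by squared
    # Euclidean distance (one permissible distance measurement).
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(dictionary)), key=lambda i: dist(block, dictionary[i]))

def codeword_histogram(blocks, dictionary):
    """Count how often each codeword recurs across the image's blocks;
    the normalized counts form the feature vector that would be input
    to the slide-level learning algorithm."""
    counts = [0] * len(dictionary)
    for block in blocks:
        counts[nearest_codeword(block, dictionary)] += 1
    total = sum(counts)
    return [c / total for c in counts]
```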
Regarding claim 8, claim 7 is incorporated, and Agaian further discloses storing the first image information, the second image information, the respective image information, the mask information and the whole slide image classification in memory configured to store objects and text (Agaian, ¶0144; “Various simulations in accordance with this embodiment of the present invention were run on a database of 71 color images of Hematoxylin and Eosin (H&E)-stained prostate tissue samples.”).
Regarding claim 9, claim 1 is incorporated, and Agaian further discloses wherein the first training set is stored in memory configured to store objects and text (Agaian, ¶0144; “Various simulations in accordance with this embodiment of the present invention were run on a database of 71 color images of Hematoxylin and Eosin (H&E)-stained prostate tissue samples.”).
Claim 10 recites a system having features corresponding to the elements recited in method claim 1, the rejection of which is applicable here.
Claim 11 recites a system having features corresponding to the elements recited in method claim 2, the rejection of which is applicable here.
Claim 13 recites a system having features corresponding to the elements recited in method claim 6, the rejection of which is applicable here.
Claim 14 recites a system having features corresponding to the elements recited in method claim 8, the rejection of which is applicable here.
Claim 15 recites a system having features corresponding to the elements recited in method claim 9, the rejection of which is applicable here.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Agaian, as applied to claim 1 above, in view of US PG PUB. 2020/0258223 A1 (hereinafter “Yip”; applicant-submitted prior art).
Regarding claim 3, claim 1 is incorporated, and Agaian does not expressly teach wherein the first image information is obtained from a cloud storage system, but, in an analogous field of endeavor, Yip does as follows.
Yip teaches wherein the first image information is obtained from a cloud storage system (Yip, ¶0121; “the system 102 may be integrated with a histopathology imaging system, such as a digital H&E stain imaging system, e.g. to allow for expedited biomarker analysis and reporting at the imaging station. Indeed, any of the functions described in the techniques herein may be distributed across one or more network accessible devices, including cloud-based devices”; and ¶0406: “In some examples, the system 3016 is cloud based, and stores generated images from (or instead of) the database 3014.”).
Yip is considered analogous art because it pertains to histopathology image analysis. Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the method taught by Agaian to obtain the tile images from a cloud storage system, as taught by Yip, in order to allow pathologists to more efficiently access, view and manipulate histopathology images with various classification overlays (Yip, ¶0406).
Claims 5 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Agaian, as applied to claims 1 and 10 above, in view of “Scalable Storage of Whole Slide Images and Fast Retrieval of Tiles Using Apache Spark” (hereinafter “Barron”; published 2018; applicant-submitted prior art).
Regarding claim 5, claim 1 is incorporated, and Agaian does not expressly teach the limitations as further claimed, but, in an analogous field of endeavor, Barron does as follows.
Barron teaches wherein the first image information includes slide ID information associated with a respective slide associated with the first image information and tile location information associated with a position of the tile in the respective slide (Barron, Section 2.2; “A tile is extracted from a WSI using OpenSlide.7 Each tile is denoted by a variable-length record [fileID, imageLevel, tileIndex, tileWidth, tileHeight, tileBytes], where fileID is the name of the WSI and tileBytes denotes the actual tile content. Once all the records are created, our system can store the records in Spark SQL”).
Barron is considered analogous art because it pertains to retrieving whole slide image tiles for image analytics purposes. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method taught by Agaian to include obtaining the image tiles with associated information including the whole slide image file ID and tile index indicating a respective tile’s position within the WSI, as taught by Barron, in order to increase computational efficiency and reduce time required for image retrieval (Barron, Introduction).
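For illustration only, Barron's variable-length tile record [fileID, imageLevel, tileIndex, tileWidth, tileHeight, tileBytes] and its use for fast keyed retrieval can be sketched as follows. The in-memory dictionary index stands in for the Spark SQL storage Barron actually uses, and all names are hypothetical:

```python
from typing import NamedTuple

class TileRecord(NamedTuple):
    file_id: str       # name of the whole-slide image (slide ID)
    image_level: int   # resolution level the tile was extracted from
    tile_index: int    # position of the tile within the slide
    tile_width: int
    tile_height: int
    tile_bytes: bytes  # the actual tile content

def index_tiles(records):
    """Build a lookup from (file_id, image_level, tile_index) to the
    record, mimicking keyed retrieval of a tile by slide ID and tile
    position without scanning the full slide."""
    return {(r.file_id, r.image_level, r.tile_index): r for r in records}
```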
Claim 12 recites a system having features corresponding to the steps recited in method claim 5. Therefore, the recited elements of Claim 12 are mapped to the proposed combination in the same manner as the corresponding elements in Claim 5. Additionally, the rationale and motivation to combine the Agaian and Barron references presented in the rejection of Claim 5 apply to this claim.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. The additionally cited Barker reference pertains to coarse-to-fine whole slide image classification based on initial tile-level classifications.
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Contact Information
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SAMAH A BEG whose telephone number is (571)270-7912. The examiner can normally be reached M-F 9 AM - 5 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, HENOK SHIFERAW can be reached at 571-272-4637. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SAMAH A BEG/Primary Examiner, Art Unit 2676