DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of claims: claims 1-20 are examined below.
Response to Arguments
Applicant's arguments filed 10/21/2025 have been fully considered but they are not persuasive.
Applicant's remarks (pages 2-3) – Applicant argues that the cited art lacks teaching of the claim elements directed to the combination of normal and abnormal patches of H&E stain, as well as training using these tiles. Please read the Remarks for more detail.
Examiner response – Examiner respectfully disagrees. Naik et al (US 2021/0280311) teaches the function of the attention model used in figure 2 and paragraph 0026, figure 3 and paragraphs 0043-0044, and figure 4 and paragraph 0050 for training as well, where the attention model combines the tile feature vectors (normal and abnormal) into a weighted sum to produce the aggregated feature vector. Please see the Office Action below for further detail.
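Purely as an illustrative sketch (not part of the record; the array shapes and the softmax weighting are assumptions, not disclosures of Naik et al), the attention-based aggregation discussed above, i.e., a weighted sum of tile feature vectors producing an aggregated feature vector, can be expressed as:

```python
import numpy as np

def attention_aggregate(tile_features: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Aggregate per-tile feature vectors (normal and abnormal alike) into a
    single slide-level vector via an attention-weighted sum.

    tile_features: (num_tiles, feature_dim) array of tile feature vectors.
    w: (feature_dim,) attention parameter vector (learned in practice;
       here simply an input).
    """
    # Unnormalized attention score per tile, then softmax so weights sum to 1.
    scores = tile_features @ w                       # (num_tiles,)
    scores = scores - scores.max()                   # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # (num_tiles,)
    # Weighted sum of tile feature vectors -> aggregated feature vector.
    return weights @ tile_features                   # (feature_dim,)

# Example: 4 tiles with 3-dimensional features.
tiles = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0],
                  [1.0, 1.0, 1.0]])
agg = attention_aggregate(tiles, w=np.array([0.5, 0.5, 0.5]))
print(agg.shape)  # (3,)
```

The aggregated vector has the same dimension as each tile feature vector, which is why a single slide-level classifier can consume it regardless of the number of tiles.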
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Naik et al (US 2021/0280311).
Claim 1:
Naik et al (US 2021/0280311) anticipates the following subject matter:
A method for building a model for determining hormone receptor status via whole slide images (WSIs) of hematoxylin and eosin (H&E) stain of a biopsy of a subject, comprising:
(a) obtaining a plurality of WSIs of H&E stain of the biopsy, in which each WSIs comprises a hormone receptor information (figure 3, step 310, and paragraphs 0039-0043 teach H&E images of positive and negative receptor status, where paragraph 0016 teaches an H&E-stained whole slide image);
(b) dividing each of the WSIs of step (a) into a plurality of patches (figure 3, parts 320-330, teaches partitioning (320) the H&E stain image into a plurality of image tiles);
(c) classifying the normal and abnormal H&E stain in each of the patches of step (b) by performing tiles extraction (figure 5 and paragraphs 0054-0056 teach analytics on the tiles for positive and false positive (negative) with cross-validation; paragraph 0026 teaches that feature vectors for the H&E stain image are extracted);
(d) selecting and combining the classified patches of step (c) that exhibit the abnormal H&E stain to produce a combined image of each of the WSIs of H&E stain (paragraphs 0004-0005 teach training of a prediction model from a second set of H&E stain images (combined images) from tissue samples having negative receptor status; paragraph 0018 teaches training ML (machine learning) with a negative label; figure 2 and paragraph 0026 detail attention model 240, which combines the tile features into a weighted sum of the tile feature vectors (normal and abnormal) to produce the aggregated feature vector; paragraphs 0043-0044 teach figure 3 with the attention model generating an aggregated feature vector for the stain tiles in the sample subset; and figure 4 and paragraph 0050 detail the use of the attention model for training the attention weights of the tile feature vectors); and
(e) training a plurality of combined images independently produced from step (d) with the aid of the hormone receptor information of step (a) thereby establishing the model (in addition to paragraph 0004, paragraph 0044 teaches training the featurization model with batches collated on hard negatives with regard to hormone receptor status for the H&E stain image),
wherein the hormone receptor information of step (a) comprises a positive or negative expression of a hormone receptor selected from the group consisting of an estrogen receptor (ER), a progesterone receptor (PR), and/or a combination thereof (paragraphs 0004-0005 and 0014 teach consideration of tumor cells that grow in the presence of estrogen (ER) and/or progesterone (PR), such as breast cancer).
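For illustration only (not part of the record), steps (b)-(d) above, namely partitioning a WSI into patches and then keeping and combining the abnormal ones, can be sketched as follows; the tile size, the classifier flags, and the vertical-stacking choice for the combined image are assumptions:

```python
import numpy as np

def tile_image(img: np.ndarray, tile: int) -> list:
    """Partition an (H, W, C) image into non-overlapping tile x tile patches,
    discarding any partial edge patches."""
    h, w = img.shape[:2]
    return [img[r:r + tile, c:c + tile]
            for r in range(0, h - tile + 1, tile)
            for c in range(0, w - tile + 1, tile)]

def combine_abnormal(patches, is_abnormal):
    """Keep only patches a classifier flagged abnormal and stack them into a
    single 'combined image' (here: a simple vertical concatenation)."""
    kept = [p for p, flag in zip(patches, is_abnormal) if flag]
    return np.concatenate(kept, axis=0) if kept else None

# Example: a 4x4 single-channel "slide" cut into 2x2 patches.
img = np.arange(16, dtype=float).reshape(4, 4, 1)
patches = tile_image(img, tile=2)
print(len(patches))  # 4
combined = combine_abnormal(patches, is_abnormal=[False, True, False, True])
print(combined.shape)  # (4, 2, 1): two 2x2 patches stacked vertically
```

In practice the abnormality flags would come from the patch classifier of step (c); they are hard-coded here only to keep the sketch self-contained.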
Claim 2:
The method of claim 1, wherein in step (e), the plurality of combined images is trained by performing a vector-regularized complex matrix factorization (CMF) method, which comprises: (e-1) obtaining a complex matrix from the complex values of each combined images (paragraph 0026 teaches producing an aggregate (combine or add together) feature vector of the tiles); (e-2) converting the complex matrix into a complex column vector for each combined images (paragraph 0026 teaches generating a feature matrix from the aggregate vector of tiles); and (e-3) classifying each combined images into the positive or negative expression of the hormone receptor based on the similarities among the complex column vector obtained in step (e-2) (paragraph 0026 teaches use of MIL attention model 240 (classifying) for discrimination (similarities) in predicting based on positive and potential positive instances in a test H&E stain image).
Claim 3:
The method of claim 2, wherein step (e-3) is carried out by performing k-nearest neighbors (k-NN) algorithm (paragraph 0032 teaches the use of regression and supervised learning, which are components of k-NN).
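As an illustrative sketch only (the distance metric, the label names, and the use of real vectors in place of the complex column vectors of claim 2 are assumptions), a k-NN majority vote of the kind recited here could look like:

```python
import numpy as np
from collections import Counter

def knn_classify(train_vecs, train_labels, test_vec, k=3):
    """Classify test_vec by majority vote among its k nearest training
    vectors, using the norm of the elementwise absolute difference
    (np.abs also handles complex magnitudes)."""
    dists = [np.linalg.norm(np.abs(v - test_vec)) for v in train_vecs]
    nearest = np.argsort(dists)[:k]
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Toy example: vectors near (0,0) are "negative", near (1,1) are "positive".
train = [np.array([0.0, 0.1]), np.array([0.1, 0.0]),
         np.array([1.0, 0.9]), np.array([0.9, 1.0])]
labels = ["negative", "negative", "positive", "positive"]
print(knn_classify(train, labels, np.array([0.95, 0.95]), k=3))  # positive
```

An odd k avoids ties in the two-class vote; the column vectors produced by step (e-2) would stand in for the toy training vectors.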
Claim 4:
The method of claim 1, wherein steps (c), (d), and (e) are carried out by deep learning algorithms (paragraph 0032 teaches machine-learning models such as regression algorithms, classifier algorithms, decision tree algorithms, mixture models, a convolutional neural network, or another supervised learning (deep learning) approach, where one of ordinary skill in the art understands the different models and would recognize that deep learning models, being general models well known in the art, fall within the machine-learning approaches mentioned above).
Claim 5:
Naik et al teach:
The method of claim 1, wherein the subject has or is suspected of having a breast cancer (0004-0005 and 0014 teaches consideration regarding the tumor cells grow in the presence of estrogen (ER) and/or progesterone (PR), such as breast cancer).
Claim 6:
Naik et al (US 2021/0280311) anticipates the following subject matter:
A method for determining a hormone receptor status based on a whole slide image (WSI) of hematoxylin and eosin (H&E) stain of a biopsy of a subject, comprising:
(a) dividing the WSI of H&E stain into a plurality of patches (figure 3, step 310, and paragraphs 0039-0043 teach H&E images of positive and negative receptor status, where paragraph 0016 teaches an H&E-stained whole slide image);
(b) selecting and combining the patches that exhibit an abnormal H&E stain to produce a test image by performing tiles extraction (paragraphs 0004-0005 teach training of a prediction model from a second set of H&E stain images (combined images) from tissue samples having negative receptor status; paragraph 0018 teaches training ML (machine learning) with a negative label; figure 2 and paragraph 0026 detail attention model 240, which combines the tile features into a weighted sum of the tile feature vectors (normal and abnormal) to produce the aggregated feature vector; paragraphs 0043-0044 teach figure 3 with the attention model generating an aggregated feature vector for the stain tiles in the sample subset; and figure 4 and paragraph 0050 detail the use of the attention model for training the attention weights of the tile feature vectors); and
(c) determining the hormone receptor status by processing the test image produced in step (b) within the model established by the method of claim 1, wherein the hormone receptor status comprises a positive or negative expression of a hormone receptor selected from the group consisting of an estrogen receptor (ER), a progesterone receptor (PR), and/or a combination thereof (paragraphs 0004-0005 and 0014 teach consideration of tumor cells that grow in the presence of estrogen (ER) and/or progesterone (PR), such as breast cancer).
Claim 7:
The method of claim 6, wherein in step (c), the test image is processed by performing a vector-regularized complex matrix factorization (CMF) method, comprising: (c-1) obtaining a complex matrix from the complex values of the test image (paragraph 0026 teaches producing an aggregate (combine or add together) feature vector of the tiles); (c-2) converting the complex matrix into a complex column vector for the test image (paragraph 0026 teaches generating a feature matrix from the aggregate vector of tiles, where the anatomy of a matrix includes columns); and (c-3) classifying the test image into the positive or negative expression of the hormone receptor based on the absolute distance between the complex column vector of the test image obtained in step (c-2) and those of the combined images in the model (paragraph 0026 teaches use of MIL attention model 240 (classifying) for discrimination (similarities) in predicting based on positive and potential positive instances in a test H&E stain image).
Claim 8:
The method of claim 7, wherein step (c-3) is carried out by performing k-nearest neighbors (k-NN) algorithm (paragraph 0032 teaches the use of regression and supervised learning, which are components of k-NN).
Claim 9:
The method of claim 8, wherein the hormone receptor status further comprises an expression intensity of the hormone receptor (paragraph 0015 teaches detecting intensity by expressed color, stain, percentage of cells, or presence/absence of stain; paragraph 0025 teaches intensity by an optimal pixel threshold; paragraph 0035 teaches mean pixel intensity of stain images; paragraph 0042 teaches intensity between the first and second sets (positive and negative)).
Claim 10:
The method of claim 9, wherein the vector-regularized CMF method further comprises (c-4) determining the expression intensity of the hormone receptor in the test image based on the ratio between numbers of complex column vectors that are respectively corresponding to the positive and negative expression in the combined images of the model (paragraph 0015 teaches intensity by percentage and paragraph 0042 teaches intensity between the first and second sets (positive and negative), where both are ways to view a ratio).
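As an illustrative sketch (the label names and the positive-over-total convention are assumptions), the ratio-based intensity determination recited in step (c-4) could be:

```python
def expression_intensity(neighbor_labels):
    """Estimate expression intensity from the ratio of column vectors
    matched to the positive expression versus all matched column vectors."""
    if not neighbor_labels:
        return 0.0
    positives = sum(1 for lab in neighbor_labels if lab == "positive")
    return positives / len(neighbor_labels)

print(expression_intensity(["positive", "positive", "negative", "positive"]))  # 0.75
```

The input list would be the labels of the model's combined-image column vectors matched to the test image; the fraction then serves as a proxy for expression intensity.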
Claim 11:
The method of claim 6, wherein steps (b) and (c) are carried out by deep learning algorithms (paragraph 0032 teaches machine-learning models such as regression algorithms, classifier algorithms, decision tree algorithms, mixture models, a convolutional neural network, or another supervised learning approach, where one of ordinary skill in the art understands the different models and would recognize that deep learning models, being general models well known in the art, fall within this list).
Claim 12:
The method of claim 6, wherein the subject has or is suspected of having a breast cancer (0004-0005 and 0014 teaches consideration regarding the tumor cells grow in the presence of estrogen (ER) and/or progesterone (PR), such as breast cancer).
Claim 13:
Naik et al (US 2021/0280311) anticipates the following subject matter:
A system (0005 teaches system perform method with a processor) for identifying a hormone receptor status of a subject, comprising:
an image collecting unit configured to collect one or more candidate whole slide images (WSIs) of hematoxylin and eosin (H&E) stain of a biopsy from the subject (figure 3, step 310, and paragraphs 0039-0043 teach H&E images of positive and negative receptor status, where paragraph 0016 teaches an H&E-stained whole slide image);
a server configured to store a model established by the method of claim 1, and to receive the one or more candidate WSIs of H&E stain transmitted from the image collecting unit (0023 teaches communication pathways between the client 110 devices, the sensor 120, the application server 130, and the database server 140); and
a processor (paragraphs 0004-0005 teach a system whose method comprises a processor) programmed with instructions to execute a method for determining the hormone receptor status of the one or more candidate WSIs of H&E stain (figure 3, step 310, and paragraphs 0039-0043 teach H&E images of positive and negative receptor status, where paragraph 0016 teaches an H&E-stained whole slide image) transmitted from the server, wherein the method comprises,
(a) dividing each of the candidate WSIs of H&E stain into a plurality of patches (figure 3, parts 320-330, teaches partitioning (320) the H&E stain image into a plurality of image tiles);
(b) selecting and combining the patches respectively expressing abnormal H&E stains to produce a test image by performing tiles extraction (paragraphs 0004-0005 teach training of a prediction model from a second set of H&E stain images (combined images) from tissue samples having negative receptor status; paragraph 0018 teaches training ML (machine learning) with a negative label; figure 2 and paragraph 0026 detail attention model 240, which combines the tile features into a weighted sum of the tile feature vectors (normal and abnormal) to produce the aggregated feature vector; paragraphs 0043-0044 teach figure 3 with the attention model generating an aggregated feature vector for the stain tiles in the sample subset; and figure 4 and paragraph 0050 detail the use of the attention model for training the attention weights of the tile feature vectors); and
(c) determining the hormone receptor status by processing the test image produced in step (b) with the aid of the model stored in the server (paragraph 0044 teaches training the featurization model with batches collated on hard negatives with regard to hormone receptor status for the H&E stain image; paragraph 0021 teaches training using labeled images (aid of a model) from the server), wherein the hormone receptor status comprises a positive or negative expression of a hormone receptor selected from the group consisting of an estrogen receptor (ER), a progesterone receptor (PR), and/or a combination thereof (paragraphs 0004-0005 and 0014 teach consideration of tumor cells that grow in the presence of estrogen (ER) and/or progesterone (PR)).
Claim 14:
The system of claim 13, wherein in step (c) of the method, the test image is processed by performing a vector-regularized complex matrix factorization (CMF) method, comprising: (c-1) obtaining a complex matrix from the complex values of the test image (paragraph 0026 teaches producing an aggregate (combine or add together) feature vector of the tiles); (c-2) converting the complex matrix into a complex column vector for the test image (paragraph 0026 teaches generating a feature matrix from the aggregate vector of tiles); and (c-3) classifying the test image into the positive or negative expression of the hormone receptor based on the absolute distance between the complex column vector of the test image obtained in step (c-2) (paragraph 0026 teaches use of MIL attention model 240 (classifying) for discrimination (similarities) in predicting based on positive and potential positive instances in a test H&E stain image) and those of the combined images in the model stored in the server (paragraph 0044 teaches training the featurization model with batches collated on hard negatives with regard to hormone receptor status for the H&E stain image; paragraph 0021 teaches training using labeled images (stored model) from the server).
Claim 15:
The system of claim 14, wherein step (c-3) is carried out by performing k-nearest neighbors (k-NN) algorithm (paragraph 0032 teaches the use of regression and supervised learning, which are components of k-NN).
Claim 16:
The system of claim 14, wherein the hormone receptor status further comprises an expression intensity of the hormone receptor (paragraph 0015 teaches detecting intensity by expressed color, stain, percentage of cells, or presence/absence of stain; paragraph 0025 teaches intensity by an optimal pixel threshold; paragraph 0035 teaches mean pixel intensity of stain images; paragraph 0042 teaches intensity between the first and second sets (positive and negative)).
Claim 17:
The system of claim 16, wherein the vector-regularized CMF method further comprises (c-4) determining the expression intensity of the hormone receptor in the test image based on the ratio between numbers of complex column vectors that are respectively corresponding to the positive and negative expression in the combined images within the model stored in the server (paragraph 0015 teaches intensity by percentage and paragraph 0042 teaches intensity between the first and second sets (positive and negative), where both are ways to view a ratio).
Claim 18:
The system of claim 13, wherein steps (b) and (c) are carried out by deep learning algorithms (paragraph 0032 teaches machine-learning models such as regression algorithms, classifier algorithms, decision tree algorithms, mixture models, a convolutional neural network, or another supervised learning (deep learning) approach, where one of ordinary skill in the art understands the different models and would recognize that deep learning models, being general models well known in the art, fall within this list).
Claim 19:
Naik et al (US 2021/0280311) anticipates the following subject matter:
A method for determining and treating a breast cancer in a subject in need thereof, comprising:
(a) obtaining a whole slide image (WSI) of hematoxylin and eosin (H&E) stain from a biopsy of the subject (figure 3, step 310, and paragraphs 0039-0043 teach H&E images of positive and negative receptor status, where paragraph 0016 teaches an H&E-stained whole slide image);
(b) determining a hormone receptor status of the subject (paragraph 0004 and paragraph 0044 teach training the featurization model with batches collated on hard negatives with regard to hormone receptor status for the H&E stain image) by using the method of claim 8; and
(c) administering an anti-cancer treatment to the subject based on the hormone receptor status of step (b), wherein the hormone receptor status comprises a positive or negative expression of a hormone receptor selected from the group consisting of an estrogen receptor (ER), a progesterone receptor (PR), and/or a combination thereof (paragraphs 0004-0005 teach training of a prediction model from a second set of H&E stain images (combined images) from tissue samples having negative receptor status; paragraph 0018 teaches training ML (machine learning) with a negative label; figure 2 and paragraph 0026 detail attention model 240, which combines the tile features into a weighted sum of the tile feature vectors (normal and abnormal) to produce the aggregated feature vector; paragraphs 0043-0044 teach figure 3 with the attention model generating an aggregated feature vector for the stain tiles in the sample subset; and figure 4 and paragraph 0050 detail the use of the attention model for training the attention weights of the tile feature vectors), and an expression intensity thereof (paragraph 0015 teaches detecting intensity by expressed color, stain, percentage of cells, or presence/absence of stain; paragraph 0025 teaches intensity by an optimal pixel threshold; paragraph 0035 teaches mean pixel intensity of stain images; paragraph 0042 teaches intensity between the first and second sets (positive and negative)); and
the anti-cancer treatment is selected from the group consisting of a surgery, a radiofrequency ablation, a systemic chemotherapy, a transarterial chemoembolization (TACE), an immunotherapy, a targeted drug therapy, a hormone therapy, and a combination thereof (paragraph 0020 teaches treatments such as surgery, chemotherapy, biological therapy, radiation therapy, or some combination thereof).
Claim 20:
The method of claim 19, wherein the subject is a human (figure 6 and paragraphs 0057-0065 teach the ability to predict for human epidermal growth factor).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
FENG et al (US 2022/0319704), METHODS FOR CHARACTERIZING AND TREATING A CANCER TYPE USING CANCER IMAGES – paragraph 0123 teaches that the cancer sample on the slide is stained (step 920), preferably (but not necessarily) with hematoxylin and eosin (H&E), again following standard clinical practice, and that an image (typically digital) of the slide (e.g., a whole slide image (WSI)) is then taken; paragraph 0289 teaches that signature analysis is performed using indel features with a Bayesian variant of non-negative matrix factorization.
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TSUNG-YIN TSAI whose telephone number is (571)270-1671. The examiner can normally be reached 7am-4pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Bhavesh Mehta can be reached at (571) 272-7453. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TSUNG YIN TSAI/Primary Examiner, Art Unit 2656