DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
Applicant’s amendment was received on 10/16/25 and has been entered and made of record. Currently, claims 1-19 are pending.
Information Disclosure Statement
The listing of references in the PCT international search report is not considered to be an information disclosure statement (IDS) complying with 37 CFR 1.98. 37 CFR 1.98(a)(2) requires a legible copy of: (1) each foreign patent; (2) each publication or that portion which caused it to be listed; (3) for each cited pending U.S. application, the application specification including claims, and any drawing of the application, or that portion of the application which caused it to be listed including any claims directed to that portion, unless the cited pending U.S. application is stored in the Image File Wrapper (IFW) system; and (4) all other information, or that portion which caused it to be listed. In addition, each IDS must include a list of all patents, publications, applications, or other information submitted for consideration by the Office (see 37 CFR 1.98(a)(1) and (b)), and MPEP § 609.04(a), subsection I. states, “the list ... must be submitted on a separate paper.” Therefore, the references cited in the international search report have not been considered. Applicant is advised that the date of submission of any item of information in the international search report will be the date of submission of the IDS for purposes of determining compliance with the requirements of 37 CFR 1.97, including all “statement” requirements of 37 CFR 1.97(e). See MPEP § 609.05(a).
Drawings
The drawings were received on 10/16/25. These drawings are accepted.
Applicant’s amendment to Fig. 1 has overcome the objection set forth in the previous Office Action; the objection is therefore withdrawn.
Claim Rejections - 35 USC § 112
Applicant’s amendment to claim 6 has overcome the rejection set forth in the previous Office Action; the rejection is therefore withdrawn.
Response to Arguments
Applicant's arguments filed 10/16/25 have been fully considered but they are not persuasive.
The applicant asserts that Otsuka (US 2013/0071002) does not teach a pre-trained algorithm that automatically determines a disease from a virtually stained tissue sample, arguing that Otsuka determines whether a specific chemical stain is required based on the number of pixels impacted by the stain in a digitally stained image, not based on a disease indication, whether generated by an AI algorithm or otherwise. The Examiner respectfully disagrees, as the combination of Otsuka and Yip (WO 2020/198380) discloses the above-mentioned features. Particularly, Otsuka discloses that, following the creation of a digitally-stained image, the controller 218 causes the staining-necessity determining device 230 to process the image data corresponding to the digitally-stained image and to calculate, at step 708, a figure of merit or index representing a specific-staining rating R corresponding to the digitally-stained image. The figure of merit is based on image data representing such change of optical characteristics of a sample in response to digital staining as is indicative of the presence of the pathological disease (para 38). The value of the specific-staining rating R is compared with a predetermined specific-staining reference (threshold rating value). The threshold rating value is defined for a given type of tissue and a given specific dye based on empirically-collected training data (stored, for example, at the server 250 of FIG. 2) that represents changes in optical characteristics of numerous types of biological specimens having various diseases to different specific dyes. When the rating R equals or exceeds the threshold rating value, the computer-implemented staining-necessity determining device 230 produces an output, such as, for example, a visual indicator delivered to the display device 246, that notifies the user that a specific staining of the specimen with a corresponding specific dye is required (para 39).
The computer-assisted evaluation of the digitally-stained image(s) produces a specific-staining rating or score value associated with the evaluated digitally-stained image. In one embodiment, the score or rating is a number representing a ratio of the area of the imaged tissue affected by the malady specified at the step of preliminary clinical diagnosis. For example, the staining-necessity determining device 230 calculates the number of pixels of the tissue-portion of the digitally-stained image that exhibit the same spectral characteristics as those that are affected by the disease specified at the initial clinical diagnosis (para 45).
Yip discloses that an imaging-based biomarker prediction system is formed of a deep learning framework configured and trained to directly learn from histopathology slides and predict the presence of biomarkers in medical images. The deep learning frameworks may be configured and trained to analyze medical images and identify biomarkers that indicate the presence of a tumor, a tumor state/condition, or information about a tumor of the tissue sample (para 84). The present techniques provide for machine learning assisted histopathology image review that includes automatically identifying and contouring a tumor region, and/or characteristics of regions or cell types within a region (for example, lymphocytes, PD-L1 positive cells, tumors having a high degree of tumor budding, etc.), counting cells within that tumor region, and generating a decision score to improve the efficiency and the objectivity of pathology slide review (para 94). As used herein, "histopathology images" refers to digital (including digitized) images of microscopic histopathology developed tissue. Examples include images of histologically stained specimen tissue, where histological staining is a process undertaken in the preparation of sample tissues to aid in microscopic study. In some examples, the histopathology images are digital images of hematoxylin and eosin stain (H&E) stained histopathology slides (para 108). FIG. 1 illustrates a prediction system 100 capable of analyzing digital images of histopathology slides of a tissue sample and determining the likelihood of biomarker presence in that tissue, where biomarker presence indicates a predictive tumor presence, a predicted tumor state/condition, or other information about a tumor of the tissue sample, such as a possibility of clinical response through the use of a treatment associated with the biomarker (para 111).
The applicant appears to argue that the preliminary diagnosis described by Otsuka somehow precludes a diagnosis by an artificial intelligence algorithm. First, the preliminary diagnosis is performed automatically (para 26), and it is used to determine the type of virtual staining to be performed on the tissue sample. The specification describes obtaining a tissue sample that is suspected of having some type of disease or abnormality through a biopsy or other similar means. The specification is silent as to how a virtual or digital type of staining is determined. The specification does state that a stainer may stain the slice of the tissue block using any staining protocol (para 62). The specification also states that, in some implementations, the machine learning algorithm may use other inputs in addition to the virtual stained images in generating the first diagnosis; example additional source(s) of data which can be used as input(s) include patient history, clinical notes, and/or other testing data (para 95). The entire point of Otsuka is to confirm an initial diagnosis, obtained through the use of trained algorithms, by automatically ordering physical chemical staining of the tissue sample that was virtually stained, so that the diagnosis can be verified by a histologist. Yip discloses an artificial intelligence algorithm used to determine a disease based on stained tissue samples. Thus, the combination of Otsuka and Yip discloses the invention set forth in claims 1, 11, and 16.
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine the artificial intelligence algorithm, as described by Yip, with the system of Otsuka. Otsuka discloses a pre-trained automatic decision algorithm without specifically mentioning artificial intelligence and/or machine learning. Based on the teachings of Yip, it would have been obvious to one of ordinary skill in the art to utilize an artificial intelligence algorithm as the pre-trained automatic decision algorithm.
The suggestion/motivation for doing so would have been to provide a robust algorithm and to increase the proficiency and efficiency of diagnosis, thereby saving the time of a histopathologist and aiding in proper diagnosis.
Claim Rejections - 35 USC § 103
The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.
Claims 1-19 are rejected under 35 U.S.C. 103 as being unpatentable over Otsuka et al. (US 2013/0071002) in view of Yip et al. (WO 2020/198380).
Regarding claim 1, Otsuka discloses an image analysis apparatus, comprising:
a memory coupled to an imaging device (see Fig. 2, memory 214); and
a hardware processor (see Fig. 2, controller 218) coupled to the memory and configured to:
receive image data from the imaging device, the image data representative of a tissue sample in a first state (see Fig. 2 and paras 27, 34, and 37, images of sample tissue are obtained via image acquisition subsystem 210),
perform virtual staining of the tissue sample based on the image data to generate one or more virtual stained images of the tissue sample (see Fig. 2 and paras 28-29 and 37, image virtual-staining device 226 performs virtual staining on the image of the tissue sample),
execute an algorithm using the one or more virtual stained images of the tissue sample as an input, the algorithm configured to generate a first diagnosis comprising an indication of a disease based on the tissue sample (see paras 37-41 and 45, a pre-trained algorithm automatically determines a disease from the virtually stained tissue sample),
automatically identify one or more types of chemical stains based on the indication of the disease generated by the algorithm (see paras 38-41, 45, 47, 50, and 53, based on a rating/score of the virtually stained image an automatic determination is made as to whether or not a chemical staining is required), and
automatically order chemical staining of the tissue sample in the first state based on the identified one or more types of chemical stains (see paras 38-41, 45, 47, 50, and 53, based on a rating/score of the virtually stained image an automatic determination is made as to whether or not a chemical staining is required, automatic ordering of the chemical staining is also performed).
Otsuka does not disclose expressly an artificial intelligence algorithm.
Yip discloses executing an artificial intelligence algorithm using the one or more virtual stained images of the tissue sample as an input, the artificial intelligence algorithm configured to generate a first diagnosis comprising an indication of a disease based on the tissue sample (see paras 84, 93-94, 108-111, and 176, an artificial intelligence algorithm is used to determine a disease of tissue samples).
Regarding claim 11, Otsuka discloses a method of diagnosing a disease based on a tissue sample, comprising:
performing virtual staining of the tissue sample in a first state to generate one or more virtual stained images of the tissue sample (see Fig. 2 and paras 28-29 and 37, image virtual-staining device 226 performs virtual staining on the image of the tissue sample);
executing an algorithm using the virtual stained images of the tissue sample as an input (see paras 37-41 and 45, a pre-trained algorithm automatically determines a disease from the virtually stained tissue sample);
automatically generating a first diagnosis comprising an indication of a disease based on an output of the algorithm, the first diagnosis comprising the one or more virtual stained images of the tissue sample (see paras 37-41 and 45, a pre-trained algorithm automatically determines a disease from the virtually stained tissue sample);
automatically determining, based on the first diagnosis, at least one assay for chemical staining of the tissue sample in the first state (see paras 47, 50, and 52-53, based on a rating/score of the virtually stained image an automatic determination is made as to whether or not a chemical staining is required, an assay is chosen for the chemical staining); and
generating a set of the one or more virtual stained images of the tissue sample from the virtual staining and one or more chemical stained images of the tissue sample from the chemical staining (see Fig. 8C and paras 50-53, virtual stained and chemical stained images are displayed to a user).
Otsuka does not disclose expressly an artificial intelligence algorithm.
Yip discloses executing an artificial intelligence algorithm using the one or more virtual stained images of the tissue sample as an input, the artificial intelligence algorithm configured to generate a first diagnosis comprising an indication of a disease based on the tissue sample (see paras 84, 93-94, 108-111, and 176, an artificial intelligence algorithm is used to determine a disease of tissue samples).
Regarding claim 16, Otsuka discloses an image analysis apparatus, comprising:
a memory coupled to an imaging device (see Fig. 2, memory 214); and
a hardware processor (see Fig. 2, controller 218) coupled to the memory and configured to:
obtain image data from the imaging device, the image data representative of a tissue sample (see Fig. 2 and paras 27, 34, and 37, images of sample tissue are obtained via image acquisition subsystem 210),
perform virtual staining of the tissue sample based on the image data to generate one or more virtual stained images of the tissue sample (see Fig. 2 and paras 28-29 and 37, image virtual-staining device 226 performs virtual staining on the image of the tissue sample),
execute an algorithm using the one or more virtual stained images of the tissue sample as an input, the algorithm configured to generate a first diagnosis comprising an indication of a disease based on the tissue sample (see paras 37-41 and 45, a pre-trained algorithm automatically determines a disease from the virtually stained tissue sample), and
obtain, based on indication of the disease, one or more images of the same tissue sample having a chemical stain (see paras 28, 38-41, 45, 47, 50, and 53, based on a rating/score of the virtually stained image an automatic determination is made as to whether or not a chemical staining is required, the chemical staining is performed on the same tissue sample that was virtually stained).
Otsuka does not disclose expressly an artificial intelligence algorithm.
Yip discloses executing an artificial intelligence algorithm using the one or more virtual stained images of the tissue sample as an input, the artificial intelligence algorithm configured to generate a first diagnosis comprising an indication of a disease based on the tissue sample (see paras 84, 93-94, 108-111, and 176, an artificial intelligence algorithm is used to determine a disease of tissue samples).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine the artificial intelligence algorithm, as described by Yip, with the system of Otsuka. Otsuka discloses a pre-trained automatic decision algorithm without specifically mentioning artificial intelligence and/or machine learning. Based on the teachings of Yip, it would have been obvious to one of ordinary skill in the art to utilize an artificial intelligence algorithm as the pre-trained automatic decision algorithm.
The suggestion/motivation for doing so would have been to provide a robust algorithm and to increase the proficiency and efficiency of diagnosis, thereby saving the time of a histopathologist and aiding in proper diagnosis.
Therefore, it would have been obvious to combine Yip with Otsuka to obtain the invention as specified in claims 1, 11, and 16.
Regarding claim 2, Otsuka further discloses wherein the hardware processor is further configured to:
receive one or more chemically stained images, and generate a set of the one or more virtual stained images of the tissue sample from the virtual staining and the one or more chemically stained images of the tissue sample from the chemical staining (see Fig. 8C and paras 47, 50, and 52-53, virtual stained and chemical stained images are displayed to a user).
Regarding claim 3, Otsuka further discloses wherein the hardware processor is further configured to: generate a first diagnosis based on the virtual staining of the tissue sample in the first state, the first diagnosis comprising the one or more virtual stained images of the tissue sample (see paras 37-41 and 45, a pre-trained algorithm automatically determines a disease from the virtually stained tissue sample).
Regarding claim 4, Otsuka further discloses wherein the hardware processor is further configured to: determine at least one assay for the chemical staining of the tissue sample in the first state, wherein the order of the chemical staining includes an indication of the at least one assay to be used in the chemical staining of the tissue sample (see paras 47, 50, and 52-53, based on a rating/score of the virtually stained image an automatic determination is made as to whether or not a chemical staining is required, an assay is chosen for the chemical staining).
Regarding claim 5, Otsuka further discloses wherein the hardware processor is further configured to: generate a first diagnosis based on the virtual staining of the tissue sample in the first state, wherein the chemical staining of the tissue sample using the at least one assay is configured to differentiate between different types of the disease indicated by the first diagnosis (see paras 37-41, 45, 47, 50, and 52-53, based on a rating/score of the virtually stained image an automatic determination is made as to whether or not a chemical staining is required, an assay is chosen for the chemical staining).
Regarding claim 6, Otsuka further discloses wherein: the memory stores a machine learning algorithm configured to follow a decision tree that selects the at least one assay based on the disease indicated by the first diagnosis (see paras 47, 50, and 52-53, based on a rating/score of the virtually stained image an automatic determination is made as to whether or not a chemical staining is required, a decision tree is used to determine what chemical staining should be performed based on the suspected disease present in the virtual stained image).
Regarding claims 7 and 12, Otsuka further discloses wherein the chemical staining is performed on the same tissue sample used in the virtual staining (see para 28, the chemical staining is performed on the same tissue sample that was virtually stained).
Regarding claims 8 and 13, Otsuka further discloses wherein the hardware processor is further configured to perform the virtual staining and the generating of the set of the one or more images without storing the tissue sample (see Fig. 8B and para 51, after a virtual staining is performed it may be determined that no disease is present and therefore no chemical staining is required, thus no storing of the tissue sample is necessary).
Regarding claim 9, Otsuka further discloses wherein: the imaging device is configured to generate the image data using coverslipless imaging, and the chemical staining is imaged using coverslipless imaging (see Fig. 4 and paras 34-35, no coverslip is used during imaging).
Regarding claim 10, Otsuka further discloses wherein automatically identifying the one or more types of chemical stains and automatically ordering the chemical staining of the tissue sample are performed without receiving user input (see para 48, automatic identifying and ordering of chemical stains is performed).
Regarding claim 14, Otsuka further discloses generating image data of the tissue sample using an image device, wherein the image data generated by the image device is used as an input for the virtual staining (see Fig. 2 and paras 27-29, 34, and 37, images of sample tissue are obtained via image acquisition subsystem 210, image virtual-staining device 226 performs virtual staining on the image of the tissue sample).
Regarding claim 15, Otsuka further discloses wherein the generating of the image data and the chemical staining are performed using coverslipless imaging (see Fig. 4 and paras 34-35, no coverslip is used during imaging).
Regarding claim 17, Otsuka further discloses wherein the tissue sample is directed to undergo the chemical stain after the hardware processor performs virtual staining of the tissue sample (see paras 38-41, 45, 47, 50, and 53, based on a rating/score of the virtually stained image an automatic determination is made as to whether or not a chemical staining is required).
Regarding claim 18, Otsuka further discloses wherein the hardware processor is further configured to cause the ordering of the chemical staining of the tissue based on the one or more virtual stained images of the tissue sample (see paras 38-41, 45, 47, 50, and 53, based on a rating/score of the virtually stained image an automatic determination is made as to whether or not a chemical staining is required, automatic ordering of the chemical staining is also performed).
Regarding claim 19, Otsuka further discloses wherein the hardware processor is further configured to generate a first diagnosis based on the virtual staining of the tissue sample (see paras 37-41 and 45, a pre-trained algorithm automatically determines a disease from the virtually stained tissue sample).
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MARK R MILIA whose telephone number is (571)272-7408. The examiner can normally be reached Monday-Friday, 8am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Akwasi Sarpong can be reached at 571-270-3438. The fax number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MARK R MILIA/ Primary Examiner, Art Unit 2681