DETAILED ACTION
Notice of AIA Status
The present application is being examined under the first inventor to file provisions of the AIA.
Examiner's Remarks
The Office did not give claim 6 a claim interpretation under 35 U.S.C. 112(f), since "classifier" is not a generic placeholder and is further described as artificial intelligence, which is a computer algorithm.
Claim Objections
Claims 1, 3, 6, and 8 are objected to because of the following informalities:
In claim 1, line 21, the term "repeating the step of analyzing the whole microscopic image" should be amended to correct typographical/grammatical issues and to avoid a rejection under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph.
In claims 1 and 3, please remove the dashes ("–") between lines to avoid clarity issues.
In claim 3, lines 1-2, the term "the step of analyzing the quality of the microscopic image" should be amended to correct typographical/grammatical issues and to avoid a rejection under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph.
In claim 6, lines 1-2, the term "performing the step of classifying the cells by" should be amended to correct typographical/grammatical issues and to avoid a rejection under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph.
In claim 8, line 4, the term "configured to perform the steps of the method according to claim 1" should be amended to correct typographical/grammatical issues and to avoid a rejection under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph. Appropriate correction is required.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2 and 6-8 are rejected under 35 U.S.C. 103 as being unpatentable over Chung et al. (US 20230326018 A1), hereafter referenced as Chung, in view of Cosatto et al. (US 20220028068 A1), hereafter referenced as Cosatto, Reunanen et al. (US 20220284580 A1), hereafter referenced as Reunanen, and Goede et al. (US 20230011031 A1), hereafter referenced as Goede.
Regarding claim 1, Chung explicitly teaches a computer-implemented method for identifying cervical cancer cells (Fig. 1, Paragraph [0009]- Chung discloses there is provided a method for examining cells performed by a computer, the method comprising: acquiring image data of cells acquired from a region suspected to be a cancer lesion.),
the method comprising: - receiving a microscopic image of a specimen imaging a plurality of cells (Fig. 1, Paragraph [0027]- Chung discloses smearing means observing cells in a tissue using a microscope. Further in Fig. 2, Paragraph [0060-61]- Chung discloses the tissue may be provided to the tissue analysis device 200 without treatment, or may be provided after smearing and H&E (hematoxylin & eosin) staining. the tissue extracted by fine needle aspiration may be captured by an imaging device or the like and provided to the tissue analysis device 200 in the form of image data.);
- analyzing a quality of the microscopic image (Fig. 4, Paragraph [0086]- Chung discloses the controller 26 may calculate the number of cells with nuclei, excluding red blood cells, based on the data generated in step S420. When the number is greater than or equal to a predetermined reference number, it may be determined that the amount of cells is sufficient.);
- detecting that the quality of the microscopic image is satisfactory (Fig. 4, Paragraph [0086]- Chung discloses the controller 26 may calculate the number of cells with nuclei, excluding red blood cells, based on the data generated in step S420. When the number is greater than or equal to a predetermined reference number, it may be determined that the amount of cells is sufficient.);
- in response to detecting that the quality of the image is satisfactory, classifying the cells imaged on the microscopic image to indicate potentially cancerous cells (Fig. 4, Paragraph [0089]- Chung discloses in a case in which it is determined that the amount of the cells is sufficient, the controller 26 may determine that the cells contain the cancer-probable cells, if there is at least one cancer-probable cell in the cells.),
Chung fails to explicitly teach wherein each potentially cancerous cell is assigned with a type descriptor indicating a cell type and a type probability indicating a probability that the cell is of a particular type; - analyzing the whole microscopic image to determine an overall probability that the microscopic image comes from a potentially cancerogenous patient, based on distribution of the potentially cancerogenous cells; and - generating a final report that identifies an overall probability that the microscopic image comes from a potentially cancerogenous patient.
However, Cosatto explicitly teaches wherein each potentially cancerous cell is assigned with a type descriptor indicating a cell type and a type probability indicating a probability that the cell is of a particular type (Fig. 4, Paragraph [0033]- Cosatto discloses classifier model 408 may use cell location information output by the detection model 406 to generate probabilities that each cell is cancerous.);
- analyzing the whole microscopic image to determine an overall probability that the microscopic image comes from a potentially cancerogenous patient, based on distribution of the potentially cancerogenous cells (Fig. 2, Paragraph [0026]- Cosatto discloses a final score for each cell may be determined at block 214, for example as a weighted sum of the probabilities of the cells and the structures. The weights may be hyperparameters of the model. In the case of combining 2 scores, a single hyperparameter a may be used, and the final score S may be determined as follow: S=α*s.sub.1+(1−α)*s.sub.2, where s.sub.1 and s.sub.2 are outputs of the low-resolution and high-resolution model for a particular cell. The final score is used to determine if a cell is a tumor cell or a non-tumor cell, based on a threshold T as follow: if S<T, the cell is non-tumor, otherwise it is a tumor cell.);
and - generating a final report that identifies an overall probability that the microscopic image comes from a potentially cancerogenous patient (Fig. 1, Paragraph [0021]- Cosatto discloses the slide analysis 108 may generate a TCR report 110 that characterizes the information gleaned from the slide 102, for example including TCR, locations of cancerous cells, etc. This TCR report 110 may be used by medical professionals to help diagnose a patient, to identify a type and extent of a cancer, to identify a course of treatment, etc. Further in Fig. 2, Paragraph [0026]- Cosatto discloses a final report may be generated to collect cell information and to calculate the TCR for each processing tile, and for the entire slide or user-selected regions.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Chung of having a computer-implemented method for identifying cervical cancer cells, the method comprising: - receiving a microscopic image of a specimen imaging a plurality of cells with the teachings of Cosatto wherein each potentially cancerous cell is assigned with a type descriptor indicating a cell type and a type probability indicating a probability that the cell is of a particular type; - analyzing the whole microscopic image to determine an overall probability that the microscopic image comes from a potentially cancerogenous patient, based on distribution of the potentially cancerogenous cells; and - generating a final report that identifies an overall probability that the microscopic image comes from a potentially cancerogenous patient.
Such a combination would result in Chung’s system for examining cells wherein each potentially cancerous cell is assigned with a type descriptor indicating a cell type and a type probability indicating a probability that the cell is of a particular type; - analyzing the whole microscopic image to determine an overall probability that the microscopic image comes from a potentially cancerogenous patient, based on distribution of the potentially cancerogenous cells; and - generating a final report that identifies an overall probability that the microscopic image comes from a potentially cancerogenous patient.
The motivation behind the modification would have been to allow for a more accurate and faster system, since both Chung and Cosatto are systems that perform examination of cells, wherein Chung’s system provides improved prediction performance, while Cosatto’s system reduces inaccuracies and increases the speed of the system. Please see Chung et al. (US 20230326018 A1), Paragraph [0039] and Cosatto et al. (US 20220028068 A1), Paragraphs [0003 and 0023].
Chung in view of Cosatto fails to explicitly teach - presenting, on a single screen of a graphical user interface of a computer system: - an overall image representing box configured to display the microscopic image with a zoom in and zoom out functionality; and - a plurality of cell identification boxes configured to display enlarged images of the potentially cancerogenous cells; - receiving on said single screen of the graphical user interface an expert input, wherein the expert input indicates correction of cell type descriptors and type probabilities for at least some of the potentially cancerogenous cells.
However, Reunanen explicitly teaches - presenting, on a single screen of a graphical user interface of a computer system (Fig. 20, Paragraph [0135]- Reunanen discloses block 564 may direct the analyzer processor 100 to send signals to the display 16 for causing the display 16 to display a review image or window 600 as shown in FIG. 20 to the user, the review image 600 including the pixel square 240, which was identified at block 562 and contextual pixels and/or pixel squares surrounding the pixel square 240.):
- an overall image representing box configured to display the microscopic image with a zoom in and zoom out functionality (Fig. 19, Paragraph [0143]- Reunanen discloses block 564 may include code for directing the analyzer processor 100 to allow a user to zoom in and out and thereby adjust the image context shown, such that the default image context width and height is something that is not fixed.);
and - a plurality of cell identification boxes configured to display enlarged images of the potentially cancerogenous cells (Fig. 20, Paragraph [0140]- Reunanen discloses the size of the AIA A area 602 may be chosen based on the size of the subimage used. For example, in various embodiments, block 564 may direct the analyzer processor 100 to cause the AIA A area 602 to have a width about 1 to 10 times the width of the subimage 242 and a height about 1 to 10 times the height of the subimage 242.);
- receiving on said single screen of the graphical user interface an expert input, wherein the expert input indicates correction of cell type descriptors and type probabilities for at least some of the potentially cancerogenous cells (Fig. 4, Paragraph [0131]- Reunanen discloses for each of the one or more sample image elements displayed, receive user input. In some embodiments, the user input may include a user-provided indication that a displayed sample image element represents one of the one or more sample properties.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Chung in view of Cosatto of having a computer-implemented method for identifying cervical cancer cells, the method comprising: - receiving a microscopic image of a specimen imaging a plurality of cells with the teachings of Reunanen of - presenting, on a single screen of a graphical user interface of a computer system: - an overall image representing box configured to display the microscopic image with a zoom in and zoom out functionality; and - a plurality of cell identification boxes configured to display enlarged images of the potentially cancerogenous cells; - receiving on said single screen of the graphical user interface an expert input, wherein the expert input indicates correction of cell type descriptors and type probabilities for at least some of the potentially cancerogenous cells.
Such a combination would result in Chung’s system for examining cells presenting, on a single screen of a graphical user interface of a computer system: - an overall image representing box configured to display the microscopic image with a zoom in and zoom out functionality; and - a plurality of cell identification boxes configured to display enlarged images of the potentially cancerogenous cells; - receiving on said single screen of the graphical user interface an expert input, wherein the expert input indicates correction of cell type descriptors and type probabilities for at least some of the potentially cancerogenous cells.
The motivation behind the modification would have been to allow for faster and more accurate training and review, since both Chung and Reunanen are systems that perform examination of cells, wherein Chung’s system provides improved prediction performance, while Reunanen’s system increases the accuracy and speed of training and reviewing. Please see Chung et al. (US 20230326018 A1), Paragraph [0039] and Reunanen et al. (US 20220284580 A1), Paragraphs [0190-191].
Chung in view of Cosatto and Reunanen fails to explicitly teach - repeating the step of analyzing the whole microscopic image to determine an overall probability that the microscopic image comes from a potentially cancerogenous patient, based on distribution of the potentially cancerogenous cells, including the cell descriptors and probabilities corrected via the expert input.
However, Goede explicitly teaches - repeating the step of analyzing the whole microscopic image to determine an overall probability that the microscopic image comes from a potentially cancerogenous patient, based on distribution of the potentially cancerogenous cells, including the cell descriptors and probabilities corrected via the expert input (Fig. 9, Paragraph [0074]- Goede discloses generating third medical clinical data related to the third ROI and fourth medical clinical data related to the fourth ROI, entering, into the machine learning classifier, the third and fourth annotation data and the third and fourth medical clinical data, and in response to the entering, automatically determining, by the machine learning classifier, an updated medical recommendation for the patient related to the medical condition of the patient. Further in Fig. 1, Paragraph [0025]- Goede discloses each medical image may be further processed by a medical professional annotating a region of interest (ROI) that relates to the particular medical condition in the medical image, and one or more medical professionals (who may be different from the medical professional who performed the annotating of the medical image) may further generate clinical data related to the ROI.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Chung in view of Cosatto and Reunanen of having a computer-implemented method for identifying cervical cancer cells, the method comprising: - receiving a microscopic image of a specimen imaging a plurality of cells with the teachings of Goede of - repeating the step of analyzing the whole microscopic image to determine an overall probability that the microscopic image comes from a potentially cancerogenous patient, based on distribution of the potentially cancerogenous cells, including the cell descriptors and probabilities corrected via the expert input.
Such a combination would result in Chung’s system for examining cells repeating the step of analyzing the whole microscopic image to determine an overall probability that the microscopic image comes from a potentially cancerogenous patient, based on distribution of the potentially cancerogenous cells, including the cell descriptors and probabilities corrected via the expert input.
The motivation behind the modification would have been to allow for a more accurate system, since both Chung and Goede are systems that perform medical diagnosis, wherein Chung’s system provides improved prediction performance, while Goede’s system increases accuracy and may improve treatment outcome. Please see Chung et al. (US 20230326018 A1), Paragraph [0039] and Goede et al. (US 20230011031 A1), Paragraphs [0049 and 0073].
Regarding claim 2, Chung in view of Cosatto, Reunanen, and Goede teaches the method according to claim 1. Chung further teaches, after analyzing a quality of the microscopic image, detecting that the quality of the microscopic image is not satisfactory and, in response to detecting that the quality of the image is not satisfactory, outputting an indication that the image is not diagnostic (Fig. 3, Paragraph [0093]- Chung discloses the output unit 28 may output a message that the amount of the acquired cells is not sufficient and a message about re-acquisition of cells. Accordingly, cells may be reacquired in the region suspected to be a cancer lesion.).
Regarding claim 6, Chung in view of Cosatto, Reunanen, and Goede teaches the method according to claim 1. Chung fails to explicitly teach performing the step of classifying the cells by means of a cells classifier that is an artificial intelligence module and training the cells classifier based on received expert input.
However, Cosatto explicitly teaches performing the step of classifying the cells by means of a cells classifier that is an artificial intelligence module (Fig. 1, Paragraph [0017]- Cosatto discloses to detect cancerous cells, a machine learning model may be used that includes two deep neural networks.)
and training the cells classifier based on received expert input (Fig. 3, Paragraph [0027]- Cosatto discloses this training data may include a set of regions of interest from a set of patients' scanned tissue sample slides, representative of a particular condition as encountered in clinical practice. The regions of interest may be annotated by domain experts, such as pathologists, to identify the location of all cells, including identification of regions that include a tumor.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Chung in view of Cosatto, Reunanen, and Goede of having a computer-implemented method for identifying cervical cancer cells, the method comprising: - receiving a microscopic image of a specimen imaging a plurality of cells with the teachings of Cosatto of performing the step of classifying the cells by means of a cells classifier that is an artificial intelligence module and training the cells classifier based on received expert input.
Such a combination would result in Chung’s system for examining cells performing the step of classifying the cells by means of a cells classifier that is an artificial intelligence module and training the cells classifier based on received expert input.
The motivation behind the modification would have been to allow for a more accurate and faster system, since both Chung and Cosatto are systems that perform examination of cells, wherein Chung’s system provides improved prediction performance, while Cosatto’s system reduces inaccuracies and increases the speed of the system. Please see Chung et al. (US 20230326018 A1), Paragraph [0039] and Cosatto et al. (US 20220028068 A1), Paragraphs [0003 and 0023].
Regarding claim 7, Chung in view of Cosatto, Reunanen, and Goede teaches the method according to claim 1. Chung in view of Reunanen and Goede fails to explicitly teach wherein the final report further identifies a list of potentially cancerogenous cells along with the type descriptor and the type probability.
However, Cosatto explicitly teaches wherein the final report further identifies a list of potentially cancerogenous cells along with the type descriptor and the type probability (Fig. 2, Paragraph [0026]- Cosatto discloses a final report may be generated to collect cell information and to calculate the TCR for each processing tile, and for the entire slide or user-selected regions. The report may include any appropriate level of detail, and may include any information derived from the cell information, such as local and global TCR, as well as a statement of the likelihood that the slide shows cancerous tissue.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Chung in view of Cosatto, Reunanen, and Goede of having a computer-implemented method for identifying cervical cancer cells, the method comprising: - receiving a microscopic image of a specimen imaging a plurality of cells with the teachings of Cosatto wherein the final report further identifies a list of potentially cancerogenous cells along with the type descriptor and the type probability.
Such a combination would result in Chung’s system for examining cells wherein the final report further identifies a list of potentially cancerogenous cells along with the type descriptor and the type probability.
The motivation behind the modification would have been to allow for a more accurate and faster system, since both Chung and Cosatto are systems that perform examination of cells, wherein Chung’s system provides improved prediction performance, while Cosatto’s system reduces inaccuracies and increases the speed of the system. Please see Chung et al. (US 20230326018 A1), Paragraph [0039] and Cosatto et al. (US 20220028068 A1), Paragraphs [0003 and 0023].
Regarding claim 8, Chung in view of Cosatto, Reunanen, and Goede teaches the method according to claim 1. Chung further teaches a computer-implemented system configured to perform the steps of the method according to claim 1, comprising at least one non-transitory processor-readable storage medium (Fig. 1, Paragraph [0012]- Chung discloses a computer readable recording medium to record computer programs for executing the method may be additionally provided.) that stores at least one of processor-executable instructions or data (Fig. 1, Paragraph [0099]- Chung discloses the medium to be stored refers not to a medium storing data for a short time but to a medium that stores data semi-permanently, like a register, cache, memory, and the like, and means a medium readable by a device.) and at least one processor communicably coupled to the at least one non-transitory processor-readable storage medium (Fig. 1, Paragraph [0098]- Chung discloses the code may further include additional information necessary for the processor of the computer to execute the functions or memory reference-related code for whether the media should be referenced in which location (address) of the internal or external memory of the computer.).
Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Chung et al. (US 20230326018 A1), hereafter referenced as Chung, in view of Cosatto et al. (US 20220028068 A1), hereafter referenced as Cosatto, Reunanen et al. (US 20220284580 A1), hereafter referenced as Reunanen, Goede et al. (US 20230011031 A1), hereafter referenced as Goede, Zahniser et al. (US 20100208961 A1), hereafter referenced as Zahniser, and Chen et al. (US 20220318979 A1), hereafter referenced as Chen.
Regarding claim 3, Chung in view of Cosatto, Reunanen, and Goede teaches the method according to claim 1, Chung further teaches wherein the step of analyzing the quality of the microscopic image comprises (Fig. 1, Paragraph [0032]- Chung discloses the first deep learning model 110 may include various criteria (i.e., parameters), and may add new criteria (i.e., parameters) through an input image analysis. The parameters may include, for example, cellularity, adequacy of specimen, amount of blood, and diagnosis.):
- counting cells within the whole microscopic image to determine a number of cells imaged within the microscopic image (Fig. 1, Paragraph [0033]- Chung discloses the cytoplasm may be determined by the number of cells per slide. For example, if the number of cells per slide is less than 100, it may be determined to be unsuitable, if the number of cells per slide is greater than or equal to 100 and less than 1000, it may be determined to be good, and if the number of cells per slide is greater than 1000, it may be determined to be acceptable.);
- classifying the microscopic image as satisfactory if (Fig. 4, Paragraph [0086]- Chung discloses the controller 26 may calculate the number of cells with nuclei, excluding red blood cells, based on the data generated in step S420. When the number is greater than or equal to a predetermined reference number, it may be determined that the amount of cells is sufficient.):
- the number of cells imaged within the microscopic image is higher than a cells number threshold (Fig. 1, Paragraph [0033]- Chung discloses the cytoplasm may be determined by the number of cells per slide. For example, if the number of cells per slide is less than 100, it may be determined to be unsuitable, if the number of cells per slide is greater than or equal to 100 and less than 1000, it may be determined to be good, and if the number of cells per slide is greater than 1000, it may be determined to be acceptable.);
Chung fails to explicitly teach - dividing the microscopic image into fragments and performing at least one of the following tests per at least some fragments.
However, Cosatto explicitly teaches - dividing the microscopic image into fragments and performing at least one of the following tests per at least some fragments (Fig. 6, Paragraph [0049]- Cosatto discloses the new slide image is processed by patch generator 610, which may identify processing tiles and may divide the new slide image into pixel patches at appropriate resolutions for the models.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Chung in view of Cosatto, Reunanen, and Goede of having a computer-implemented method for identifying cervical cancer cells, the method comprising: - receiving a microscopic image of a specimen imaging a plurality of cells with the teachings of Cosatto of - dividing the microscopic image into fragments and performing at least one of the following tests per at least some fragments.
Such a combination would result in Chung’s system for examining cells dividing the microscopic image into fragments and performing at least one of the following tests per at least some fragments.
The motivation behind the modification would have been to allow for a more accurate and faster system, since both Chung and Cosatto are systems that perform examination of cells, wherein Chung’s system provides improved prediction performance, while Cosatto’s system reduces inaccuracies and increases the speed of the system. Please see Chung et al. (US 20230326018 A1), Paragraph [0039] and Cosatto et al. (US 20220028068 A1), Paragraphs [0003 and 0023].
Chung in view of Cosatto, Reunanen, and Goede fails to explicitly teach - determining whether the particular fragment is in focus; and - an area of the image containing cells that is in focus is higher than a focus threshold.
However, Zahniser explicitly teaches - determining whether the particular fragment is in focus (Fig. 1, Paragraph [0048]- Zahniser discloses the focus score or reading obtained in the above manner may be used in a number of ways. First, the method may be used to determine whether a single digital image is accepted or rejected. For example, if the displacement falls outside a pre-determined threshold value, then the image may be rejected.);
- an area of image containing cells that is in focus is higher than a focus threshold (Fig. 1, Paragraph [0048]- Zahniser discloses the focus score or reading obtained in the above manner may be used in a number of ways. First, the method may be used to determine whether a single digital image is accepted or rejected. For example, if the displacement falls outside a pre-determined threshold value, then the image may be rejected.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Chung in view of Cosatto, Reunanen, and Goede of having a computer-implemented method for identifying cervical cancer cells, the method comprising: - receiving a microscopic image of a specimen imaging a plurality of cells with the teachings of Zahniser of - determining whether the particular fragment is in focus; and - an area of the image containing cells that is in focus is higher than a focus threshold.
Such a combination would result in Chung’s system for examining cells determining whether the particular fragment is in focus and determining that an area of the image containing cells that is in focus is higher than a focus threshold.
The motivation behind the modification would have been to allow for a system that is both efficient and accurate, since both Chung and Zahniser are systems that perform image analysis on cells, wherein Chung’s system provides improved prediction performance, while Zahniser’s system maximizes efficiency without introducing a risk of reduced accuracy. Please see Chung et al. (US 20230326018 A1), Paragraph [0039] and Zahniser et al. (US 20100208961 A1), Paragraph [0011].
Chung in view of Cosatto, Reunanen, Goede, and Zahniser fails to explicitly teach - determining whether the particular fragment contains an object that is non-diagnostic; and - an area of the image containing non-diagnostic objects is lower than a non-diagnostic area threshold.
However, Chen explicitly teaches - determining whether the particular fragment contains an object that is non-diagnostic (Fig. 1, Paragraph [0043]- Chen discloses the level of image quality may be a percentage, with 100% representing the highest image quality and 0% representing the lowest image quality. An artifact prediction metric above a predefined limit for the whole-slide image may be interpreted as representing that the whole-slide image includes an artifact (e.g., blurry region, air bubble, tissue fold, pen mark, cracked slide).);
and - an area of the image containing non-diagnostic objects is lower than a non-diagnostic area threshold (Fig. 1, Paragraph [0025]- Chen discloses evaluating the condition may include determining a quantity or portions of regions or tiles in an image are associated with an artifact prediction metric that is above a metric threshold and then determining whether the quantity exceeds a predefined limit. When it is determined that the condition is satisfied (i.e., the quantity exceeds the predefined limit), the image may be discarded.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Chung in view of Cosatto, Reunanen, Goede, and Zahniser of having a computer-implemented method for identifying cervical cancer cells, the method comprising receiving a microscopic image of a specimen imaging a plurality of cells, with the teachings of Chen of determining whether the particular fragment contains an object that is non-diagnostic, wherein an area of the image containing non-diagnostic objects is lower than a non-diagnostic area threshold.
Wherein Chung’s system for examining cells, as modified, would include determining whether the particular fragment contains an object that is non-diagnostic, with an area of the image containing non-diagnostic objects being lower than a non-diagnostic area threshold.
The motivation behind the modification would have been to allow for a system that is faster and more accurate, since both Chung and Chen perform image analysis on cells. Chung’s system provides improved prediction performance, while Chen’s system increases speed and reduces inaccuracies of the system. Please see Chung et al. (US 20230326018 A1), Paragraph [0039] and Chen et al. (US 20220318979 A1), Paragraph [0084].
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Chung et al. (US 20230326018 A1), hereafter referenced as Chung, in view of Cosatto et al. (US 20220028068 A1), hereafter referenced as Cosatto, Reunanen et al. (US 20220284580 A1), hereafter referenced as Reunanen, Goede et al. (US 20230011031 A1), hereafter referenced as Goede, and Yoshihara et al. (US 20130188857 A1), hereafter referenced as Yoshihara.
Regarding claim 4, Chung in view of Cosatto, Reunanen, and Goede teaches the method according to claim 1. Chung in view of Cosatto, Reunanen, and Goede fails to explicitly teach wherein the graphical user interface further comprises a cell counter box configured to indicate the number of potentially cancerogenous cells corresponding to a particular type descriptor.
However, Yoshihara explicitly teaches wherein the graphical user interface further comprises a cell counter box configured to indicate the number of potentially cancerogenous cells corresponding to a particular type descriptor (Fig. 2, Paragraph [0034] - Yoshihara discloses that, in a display data generation section (display data generation unit) 208, send data is generated based on a count value obtained by counting the number of cancer cells with each staining intensity by the cancer cell counter 207, the tissue sample image, or the mark.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Chung in view of Cosatto, Reunanen, and Goede of having a computer-implemented method for identifying cervical cancer cells, the method comprising receiving a microscopic image of a specimen imaging a plurality of cells, with the teachings of Yoshihara wherein the graphical user interface further comprises a cell counter box configured to indicate the number of potentially cancerogenous cells corresponding to a particular type descriptor.
Wherein Chung’s system for examining cells, as modified, would include a graphical user interface further comprising a cell counter box configured to indicate the number of potentially cancerogenous cells corresponding to a particular type descriptor.
The motivation behind the modification would have been to allow the system to more easily display information to a user, since both Chung and Yoshihara perform image analysis on cells. Chung’s system provides improved prediction performance, while Yoshihara’s system allows for ease of displaying information to the user. Please see Chung et al. (US 20230326018 A1), Paragraph [0039] and Yoshihara et al. (US 20130188857 A1), Paragraph [0031].
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Chung et al. (US 20230326018 A1), hereafter referenced as Chung, in view of Cosatto et al. (US 20220028068 A1), hereafter referenced as Cosatto, Reunanen et al. (US 20220284580 A1), hereafter referenced as Reunanen, Goede et al. (US 20230011031 A1), hereafter referenced as Goede, and Madabhushi et al. (US 20180253841 A1), hereafter referenced as Madabhushi.
Regarding claim 5, Chung in view of Cosatto, Reunanen, and Goede teaches the method according to claim 1. Chung in view of Cosatto, Reunanen, and Goede fails to explicitly teach wherein the graphical user interface further comprises a summary diagnosis box to indicate the overall probability that the microscopic image comes from a potentially cancerogenous patient.
However, Madabhushi explicitly teaches wherein the graphical user interface further comprises a summary diagnosis box to indicate the overall probability that the microscopic image comes from a potentially cancerogenous patient (Fig. 5, Paragraph [0058]- Madabhushi discloses the cancer treatment plan circuit 551 may control personalized medicine device 560 to display the classification, the probability, the image, a feature map, or the cancer treatment plan on a computer monitor, a smartphone display, a tablet display, or other displays.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Chung in view of Cosatto, Reunanen, and Goede of having a computer-implemented method for identifying cervical cancer cells, the method comprising receiving a microscopic image of a specimen imaging a plurality of cells, with the teachings of Madabhushi wherein the graphical user interface further comprises a summary diagnosis box to indicate the overall probability that the microscopic image comes from a potentially cancerogenous patient.
Wherein Chung’s system for examining cells, as modified, would include a graphical user interface further comprising a summary diagnosis box to indicate the overall probability that the microscopic image comes from a potentially cancerogenous patient.
The motivation behind the modification would have been to allow for more accurate display of the results, since both Chung and Madabhushi perform image analysis on cells. Chung’s system provides improved prediction performance, while Madabhushi’s system allows for greater accuracy. Please see Chung et al. (US 20230326018 A1), Paragraph [0039] and Madabhushi et al. (US 20180253841 A1), Paragraph [0059].
Conclusion
The prior art made of record and not relied upon, but considered pertinent to applicant's disclosure, is listed below.
Kawagishi et al. (US 20120054652 A1) - A diagnosis support apparatus includes a display control unit configured to display an input GUI which receives input of a plurality of findings regarding a subject, a deducing unit configured to deduce a diagnosis of the subject on the basis of the findings input through the input GUI, and a determining unit configured to determine whether one of the plurality of input findings supports the deduction or not. In this case the display control unit changes the display form of the input GUI which receives input of the determined finding in accordance with the determination result. Please see Fig. 1 and Abstract.
CHE et al. (CN 115100647 A) - The invention claims a cervical cancer cell identification method and device, wherein the method comprises: obtaining a cell microscopic image of a pathological sample; inputting the cell microscopic image into an outline identification model and outputting a mark of the outline of each target cell; cutting the cell microscopic image based on the mark to obtain an image of each target cell; extracting the cell nucleus feature and cytoplasm feature in the image of each target cell to obtain a feature vector corresponding to each target cell; and inputting the feature vectors sequentially into a cervical cancer cell identification model to obtain a confidence for the target cell corresponding to each feature vector, wherein the confidence represents the probability that the target cell is a cervical cancer cell. Through the outline identification model and the cervical cancer cell identification model, two-stage processing is performed on the cell microscopic image: first the outline of each single suspected cervical cancer cell is obtained, and then cervical cancer cell identification is performed on the single cells, realizing a balance of accuracy and efficiency in cervical cancer cell identification. Please see Fig. 1 and Abstract.
CHOI et al. (US 20210090248 A1) - Provided is a method of diagnosing cervical cancer using an artificial intelligence-based medical image analysis, which is performed by a computer, the method including obtaining an image of cervical cells of an object; pre-processing the image; identifying one or more cells in the pre-processed image; determining whether the identified one or more cells are normal; and diagnosing whether the object has cervical cancer on the basis of a result of determining whether the identified one or more cells are normal. Please see Fig. 1 and Abstract.
Heckenbach et al. (US 20240354649 A1) - Techniques for predicting health conditions of patients using machine learning are described. Nuclei of cells from an image of a biological sample of the patient are identified. Cell type regions are identified that include one or more cells that are shown in the image. Cell types of the identified nuclei are identified based in part on locations of the nuclei relative to some or all of the identified cell type regions. Sets of scores for the identified nuclei are predicted using prediction models associated with different cell types. A set of aggregate scores for each of the cell types is generated using the sets of scores. A cancer risk or health outcome is estimated by applying at least some of the set of aggregate scores to a risk model. Please see Fig. 1 and Abstract.
Reunanen et al. (US 20220284580 A1) - A method of facilitating image analysis in pathology involves receiving a sample image representing a sample for analysis, the sample image including sample image elements, causing one or more functions to be applied to the sample image to determine a plurality of property specific confidence related scores, each associated with a sample image element and a respective sample property and representing a level of confidence that the associated element represents the associated sample property, sorting a set of elements based at least in part on the confidence related scores, producing signals for causing one or more of the set of elements to be displayed to a user in an order based on the sorting, for each of the one or more elements displayed, receiving user input, and causing the user input to be used to update the one or more functions. Other methods, systems, and computer-readable media are disclosed. Please see Fig. 1 and Abstract.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LUCIUS C.G. ALLEN whose telephone number is (703)756-5987. The examiner can normally be reached Mon - Fri 8-5pm (EST).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chineyere Wills-Burns can be reached at (571)272-9752. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/LUCIUS CAMERON GREEN ALLEN/Examiner, Art Unit 2673
/CHINEYERE WILLS-BURNS/Supervisory Patent Examiner, Art Unit 2673