Detailed Action
1. Claims 1-14 are pending in this Application.
Notice of Pre-AIA or AIA Status
2. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless -
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
3. Claims 1-4 and 9-14 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Tada et al. (hereafter TADA), US 20200279368 A1, pub. 05/11/2018.
As to claim 1, TADA teaches an image processing device comprising: at least one memory configured to store instructions; and at least one processor configured to execute the instructions (Abstract; claim 36; Fig. 7; [0003], [0063]: a diagnosis support system, a diagnosis support program, and a computer-readable recording medium having the diagnosis support program stored therein) to:
acquire a set value of a first index indicating an accuracy relating to a lesion analysis, the set value being selected by a user from a plurality of options displayed by a display device (Table 1 and [0139]: TADA specifically teaches a true positive rate expressed as a/(a+c) and a specificity, i.e., a true negative rate, expressed as d/(b+d), where the specificity is the probability that the test result of a patient having no disease is negative, together with the corresponding false-negative and false-positive rates. These rates are displayed in Table 1. The first index corresponds to the specificity rates);
acquire, for each of plural models which infer a lesion, a predicted value of a second index, which is an index of the accuracy other than the first index, on an assumption that the set value of the first index is satisfied ([0139]-[0140]: the sensitivity and the specificity are calculated each time the cut-off value is changed, and when the calculated values are plotted with the abscissa representing the sensitivity and the ordinate representing the false-positive rate (= 1 - specificity), an ROC (Receiver Operating Characteristic) curve is provided (refer to FIG. 4 and FIG. 5). The predicted value of the second index corresponds to the sensitivity values that correspond to the specificity values as shown in Figs. 4 and 5; see also [0237]), and infer the lesion included in an endoscopic image of an examination target, based on the predicted value of the second index and the plural models (as discussed above, based on the positive predictive value (PPV), the negative predictive value (NPV), and the specificity value of the CNN, medical personnel can determine malignant and non-malignant lesions).
As to claim 2, TADA teaches wherein, for each of the plural models, the at least one processor is further configured to execute the instructions to: acquire correspondence information indicative of a correspondence relation between the first index and the second index ([0139]-[0140]: the sensitivity and the specificity are calculated each time the cut-off value is changed, and when the calculated values are plotted with the abscissa representing the sensitivity and the ordinate representing the false-positive rate (= 1 - specificity), an ROC (Receiver Operating Characteristic) curve is provided (refer to FIG. 4 and FIG. 5). The correspondence information corresponds to the ROC curve, which describes sensitivity as a function of specificity as shown in Figs. 4-5), and acquire, as the predicted value, a value of the second index corresponding to the set value according to the correspondence information (as discussed above, the ROC curve provides the predictive (statistical) values of sensitivity as a function of the predictive values of the specificity).
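For illustration only (not part of the record), the ROC-based correspondence relied on above, in which each cut-off value yields one (specificity, sensitivity) pair and a sensitivity can then be read off for a chosen specificity, can be sketched as follows; the scores, labels, and helper names are hypothetical:

```python
# Sketch of the ROC correspondence between specificity (first index)
# and sensitivity (second index); illustrative only.

def roc_points(scores, labels):
    """Return (specificity, sensitivity) pairs for every cut-off value.

    scores: hypothetical model outputs; labels: 1 = diseased, 0 = healthy.
    A case is called positive when its score is at or above the cut-off.
    """
    points = []
    for cutoff in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= cutoff and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < cutoff and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < cutoff and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= cutoff and y == 0)
        points.append((tn / (tn + fp), tp / (tp + fn)))  # (specificity, sensitivity)
    return points

def sensitivity_at(points, target_specificity):
    """Best sensitivity among operating points meeting the set specificity."""
    feasible = [se for sp, se in points if sp >= target_specificity]
    return max(feasible) if feasible else None
```

In this reading, `sensitivity_at` plays the role of acquiring the predicted value of the second index once the user has fixed the set value of the first index.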
As to claim 3, TADA teaches that the correspondence information indicates an ROC curve, an LROC curve, an FROC curve, or a PR curve (as discussed for claims 1 and 2 above, the ROC (Receiver Operating Characteristic) curve is provided; see FIG. 4 and FIG. 5).
As to claim 4, TADA teaches that curves of the plural models on two-dimensional coordinates with the first index and the second index as coordinate axes include intersecting points at which the curves intersect each other ([0246]: the AUC calculated from the ROC curve plotted from all the images (FIG. 20A) was 0.85, the AUC calculated for the ROC curve plotted from high-magnification images (FIG. 20B) was 0.90, and the AUC for the ROC curve plotted from low-magnification images (FIG. 20C) was 0.72. As shown in the graphs, the three curves have at least one intersecting point when drawn on a single x-y (i.e., specificity-sensitivity) plane).
As to claim 11, TADA teaches that each of the plural models is a model obtained through machine learning in which sets of the endoscopic image and correct answer data are used as training data, the correct answer data indicating an inference result to be outputted by the model when the endoscopic image is inputted to the model ([0237]: Sensitivity = Number of images that the CNN correctly diagnosed as malignant / Number of histologically proven esophageal cancer lesions; PPV = Number of images that the CNN correctly diagnosed as malignant / Number of lesions diagnosed by the CNN as esophageal cancer; NPV = Number of images that the CNN correctly diagnosed as non-malignant lesion (no more than one ECS image was diagnosed as malignant) / Number of lesions diagnosed by the CNN as non-malignant lesion; Specificity = Number of images that the CNN correctly diagnosed as non-malignant lesion / Number of histologically proven non-malignant lesions).
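The four accuracy metrics quoted from TADA [0237] can be restated as a short sketch (illustrative only; the count variables and sample numbers below are hypothetical, not TADA's data):

```python
# Sketch of the metric definitions quoted from TADA [0237]; the
# tp/fp/tn/fn names and example counts are ours, for illustration.

def diagnostic_metrics(tp, fp, tn, fn):
    """tp: correctly diagnosed malignant; fp: wrongly diagnosed malignant;
    tn: correctly diagnosed non-malignant; fn: missed malignant lesions."""
    return {
        # Sensitivity = correct malignant calls / histologically proven cancers
        "sensitivity": tp / (tp + fn),
        # PPV = correct malignant calls / all lesions the CNN called malignant
        "ppv": tp / (tp + fp),
        # NPV = correct non-malignant calls / all lesions called non-malignant
        "npv": tn / (tn + fn),
        # Specificity = correct non-malignant calls / proven non-malignant lesions
        "specificity": tn / (tn + fp),
    }
```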
As to claim 12, TADA teaches that the at least one processor is further configured to execute the instructions to: output, to the display device, a coping method to support the examiner's decision making, the coping method determined based on an inference result of a subject and a model ([0141]: a total of 33 endoscopic examiners performed esophagogastroduodenoscopy (hereinafter referred to as "EGD") using an endoscope. The indications for EGD were referral from a primary care physician for evaluation of various epigastric symptoms, positive results from gastric disease screening by barium meal, and abnormal serum preparation).
Claim 13 is rejected on the same grounds as claim 1, except that claim 13 is a method claim. All the limitations of claim 13 are addressed in claim 1. Thus, the argument analogous to that presented above for claim 1 is applicable to claim 13.
Claim 14 is rejected on the same grounds as claim 1, except that claim 14 is directed to a computer program. All the limitations of claim 14 are addressed in claim 1. Thus, the argument analogous to that presented above for claim 1 is applicable to claim 14.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
4. Claims 5-10 are rejected under 35 U.S.C. 103 as being unpatentable over TADA, US 20200279368 A1, in view of M Shahbaz Ayyaz et al. (hereafter M Shahbaz), "Hybrid Deep Learning Model for Endoscopic Lesion Detection and Classification Using Endoscopy Videos," Diagnostics, pub. 25 December 2021.
Regarding claim 5, TADA teaches the limitations of claim 1 but fails to teach the limitation of claim 5.
On the other hand, in the same field of endeavor, M Shahbaz teaches select a model with a best accuracy indicated by the predicted value from the plural models, and infer the lesion based on the selected model and the endoscopic image (Abstract, sections 4.3-4.4: M Shahbaz specifically teaches a method of comparing different machine learning classifiers, namely cubic SVM, quadratic SVM, linear SVM, fine KNN, cosine KNN, fine tree, bagged tree, coarse tree, and naïve Bayes, for detection and classification of endoscopic lesions. These classifiers are weighted based on the accuracy of feature extraction, feature selection, and classification as explained in sections 4.2, 4.3, and 4.4, respectively. The authors observed that cubic SVM outclassed all other classifiers, with 99.8% accuracy, followed by bagged tree and linear SVM, with 98.8% and 98.6%, respectively).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate into TADA the method taught by M Shahbaz of comparing machine learning classifiers, among cubic SVM, quadratic SVM, linear SVM, fine KNN, cosine KNN, fine tree, bagged tree, coarse tree, and naïve Bayes, and selecting the best machine learning model for detecting and classifying endoscopic lesions.
The suggestion/motivation for doing so would have been to allow a user of TADA to select the best machine learning model on a case-by-case basis, since no single machine learning algorithm performs best on every problem or dataset.
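The best-model selection rationale above can be sketched in a few lines (illustrative only; the `select_best_model` helper is hypothetical, while the classifier names and accuracies are the ones M Shahbaz reports):

```python
# Sketch of selecting, from several candidate classifiers, the one with
# the best reported accuracy; illustrative only.

def select_best_model(accuracies):
    """Return the classifier name with the highest accuracy."""
    return max(accuracies, key=accuracies.get)

# Top three accuracies reported by M Shahbaz
reported = {"cubic SVM": 0.998, "bagged tree": 0.988, "linear SVM": 0.986}
best = select_best_model(reported)  # cubic SVM, consistent with M Shahbaz
```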
As to claim 6, M Shahbaz teaches generate an inference result obtained by weighting, based on the predicted value, each inference result based on the endoscopic image by the plural models (sections 4.2-4.4: as discussed for claim 5 above, M Shahbaz teaches weighting different machine learning classifiers, namely cubic SVM, quadratic SVM, linear SVM, fine KNN, cosine KNN, fine tree, bagged tree, coarse tree, and naïve Bayes, for detection and classification of endoscopic lesions. These classifiers are weighted based on the accuracy of feature extraction, feature selection, and classification as explained in sections 4.2, 4.3, and 4.4, respectively).
As to claim 7, M Shahbaz teaches execute a first mode or a second mode while switching between the first mode and the second mode based on a predetermined condition (section 6: M Shahbaz teaches a deep-learning-based model following a feature fusion framework to classify four different stomach diseases; two different CNN models (VGG19 and Alexnet) were selected to extract features, transfer learning was applied to the feature vectors before using them as feature extractors, and in feature selection a heuristic GA selected rich information from the extracted feature vectors), wherein, in the first mode, the at least one processor is configured to execute the instructions to select a model with a best accuracy indicated by the predicted value from the plural models, and infer the lesion based on the selected model and the endoscopic image (section 6: the hybridization helped to improve the accuracy of the proposed model, due to strong predictor values; 99.8% accuracy was obtained using the cubic SVM classifier on the given dataset), and wherein, in the second mode, the at least one processor is configured to execute the instructions to generate an inference result obtained by weighting, based on the predicted value, each inference result based on the endoscopic image by the plural models (sections 4.3-4.4: M Shahbaz teaches comparing different machine learning classifiers, namely cubic SVM, quadratic SVM, linear SVM, fine KNN, cosine KNN, fine tree, bagged tree, coarse tree, and naïve Bayes, for detection and classification of endoscopic lesions; these classifiers are weighted based on the accuracy of feature extraction, feature selection, and classification as explained in sections 4.2, 4.3, and 4.4, respectively, and cubic SVM outclassed all other classifiers, with 99.8% accuracy, followed by bagged tree and linear SVM, with 98.8% and 98.6%, respectively).
As to claim 8, M Shahbaz teaches execute the first mode or the second mode while switching between the first mode and the second mode based on a presence or absence of detection of a disorder other than a target disorder of the lesion analysis (M Shahbaz teaches a method of calculating classification results using machine learning classifiers after the feature extraction and feature selection processes. To obtain the final accuracy of the proposed methodology, the authors provided the hybridized selected features to different machine learning classifiers. The analysis of these classifiers was conducted through well-known performance metrics, including sensitivity, precision, accuracy, f1 score, true-negative rate (TNR), true-positive rate (TPR), false-negative rate (FNR), and false-positive rate (FPR). They observed that the proposed cubic SVM outclassed the other classifiers, with an accuracy of 99.8%).
As to claim 9, M Shahbaz teaches the at least one processor is further configured to execute the instructions to: execute the first mode or the second mode while switching between the first mode and the second mode based on an external input (as discussed for claim 8 above, M Shahbaz teaches a method of comparing different machine learning classifiers; to obtain the final accuracy of the proposed methodology, the authors provided the hybridized selected features to different machine learning classifiers, and thus the hybridized structure allows switching from one machine learning model to another in order to select features).
As to claim 10, M Shahbaz teaches the at least one processor is further configured to execute the instructions to: select a predetermined number of models with top accuracies indicated by the predicted value from the plural models, and generate an inference result obtained by weighting, based on the predicted value, each inference result by the predetermined number of models (Abstract, sections 4.3-4.4: M Shahbaz specifically teaches a method of comparing different machine learning classifiers, namely cubic SVM, quadratic SVM, linear SVM, fine KNN, cosine KNN, fine tree, bagged tree, coarse tree, and naïve Bayes, for detection and classification of endoscopic lesions; these classifiers are weighted based on the accuracy of feature extraction, feature selection, and classification as explained in sections 4.2, 4.3, and 4.4, respectively, and cubic SVM outclassed all other classifiers, with 99.8% accuracy, followed by bagged tree and linear SVM, with 98.8% and 98.6%, respectively).
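The claim 10 reading above, selecting the top-accuracy models and weighting each model's inference by its predicted accuracy, can be sketched as follows (illustrative only; the per-model lesion scores and the `weighted_inference` helper are hypothetical, while the accuracies are M Shahbaz's reported figures):

```python
# Sketch of a top-N accuracy-weighted ensemble of inference results;
# illustrative only, not taken from either reference's code.

def weighted_inference(results, accuracies, top_n=2):
    """Average the lesion scores of the top_n most accurate models,
    weighting each score by that model's predicted accuracy."""
    top = sorted(accuracies, key=accuracies.get, reverse=True)[:top_n]
    total_weight = sum(accuracies[m] for m in top)
    return sum(results[m] * accuracies[m] for m in top) / total_weight

# Hypothetical per-model lesion scores; accuracies per M Shahbaz
results = {"cubic SVM": 0.9, "bagged tree": 0.7, "linear SVM": 0.2}
accuracies = {"cubic SVM": 0.998, "bagged tree": 0.988, "linear SVM": 0.986}
score = weighted_inference(results, accuracies, top_n=2)
```

Here the least accurate model is excluded and the remaining scores are blended in proportion to predicted accuracy, matching the claim's "predetermined number of models with top accuracies" language.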
Contact Information
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Mekonen Bekele, whose telephone number is (469) 295-9077. The examiner can normally be reached Monday-Friday from 9:00 AM to 6:50 PM Eastern Time.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Eng, George, can be reached at (571) 272-7495. The fax phone number for the organization where the application or proceeding is assigned is 571-237-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only.
For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866.
/MEKONEN T BEKELE/Primary Examiner, Art Unit 2699