DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The Information Disclosure Statement filed 02/01/2024 has been considered by the examiner.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55. All claims have been examined using the effective filing date of 08/02/2021.
Claim Interpretation
Claim 20 has been rejected under 35 U.S.C. 112(b) (see below); for purposes of examination, it has been examined as if depending from claim 19.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 20 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 20 recites the limitation "The non-transitory computer readable medium of claim 18…". There is insufficient antecedent basis for this limitation in the claim. Claim 18 depends from claim 12, a system claim, which recites a “non-transitory memory” rather than a non-transitory computer readable medium. Claim 19 is an independent claim directed to a “non-transitory computer readable medium.”
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-5, 8-15, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Golden (WO 2019103912 A2) in view of Barnes (CN 108028077 A).
With respect to claim 1, Golden teaches a method of predicting histopathology of lesions of a patient, the method comprising: detecting at least one lesion in a medical image of the patient (“trains a fully convolutional neural network (CNN) model to: classify if the entire input anatomical structure contains a lesion candidate; or segment lesion candidates utilizing the received learning data” page 5 paragraph 6 – page 6 paragraph 1); performing radiomics-derived quantitative analysis on the retrieved medical images to train an artificial neural network (ANN) classification model (“Finally, a CNN model could be used to directly predict the similarity of a query lesion to other lesions in the database. For example, if a training data set was created that consisted of a set of query lesions and their quantitative similarity to some or all lesions within a database of lesions, a model could be trained on that data set. That model would then be able to predict similarity for a new query lesion to lesions from the database.” Page 31 paragraph 3 and “Training is cyclical process and includes repeated loading of batches of training data from the database at 2704, followed by a standard CNN training iteration 2706. The standard CNN training iteration 2706 includes a forward pass of image data through the network, calculation of a loss function, and updating the weights of the CNN model using backpropagation [LeCun 1998] For implementations in which the model is supervised, loss is calculated with respect to the network’s output and the ground truth label. 
For implementations in which the model is unsupervised, loss is calculated with respect to some other metric, such as the inter cluster distance of predicted results.” Page 32 paragraph 7); and applying the at least one lesion to the trained ANN classification model to predict a histological nature of the at least one lesion (“In at least one implementation, a pre-trained CNN model for classification of lesions at 5414 is used to classify lesions at 5416. This CNN model evaluates image patches centered on the proposed location at 5406 and infers metadata about the lesion in question. This metadata can include, but is not limited to, the features of the lesion, including one or more of size, shape, margin, opacity, or heterogeneity, the location of the lesion within the body, the relationship to surrounding lesions and tissue properties surrounding the lesion, the malignancy, or the cancerous subtype of the lesion. The CNN model optionally uses the segmentation generated by the CNN model at 5410 and stored at 5412 to help the classifications.” Page 45 last paragraph and first line of page 46).
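For illustration only, the cyclical training procedure quoted from Golden (repeatedly load a batch, run a forward pass, compute a loss, update the weights by backpropagation) can be sketched in miniature. The logistic model below is a hypothetical stand-in for the cited CNN, not a reproduction of it:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(batches, epochs=50, lr=0.5):
    """Toy version of the quoted training cycle: load each batch,
    forward pass, loss gradient, backpropagation-style weight update."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for xs, ys in batches:                        # repeated batch loading
            preds = [sigmoid(w * x + b) for x in xs]  # forward pass
            n = len(xs)
            # gradients of the cross-entropy loss w.r.t. w and b
            dw = sum((p - y) * x for p, y, x in zip(preds, ys, xs)) / n
            db = sum(p - y for p, y in zip(preds, ys)) / n
            w -= lr * dw                              # weight update
            b -= lr * db
    return w, b

# one toy batch: a single "lesion feature" with ground-truth labels
w, b = train([([-2.0, -1.0, 1.0, 2.0], [0, 0, 1, 1])])
assert sigmoid(w * 2.0 + b) > 0.5  # positive feature scores as positive
```

A production system would iterate optimizer steps of a deep-learning framework over image batches; the sketch only traces the load → forward → loss → update cycle the quotation names.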
Golden does not teach extracting image findings from a radiology report describing the medical image, including the at least one lesion, using a natural language processing (NLP) algorithm; retrieving demographic and clinical data of the patient from at least one of a picture archiving and communication system (PACS) or a radiology information system (RIS); identifying a plurality of similar patients based on the extracted image findings and the demographic and clinical data of the patient; creating a similar patient cohort by aggregating data from the identified plurality of similar patients, wherein the aggregated data includes demographic data, clinical data and medical images of the similar patients, respectively; retrieving the medical images from the similar patient cohort.
Barnes teaches extracting image findings from a radiology report describing the medical image, including the at least one lesion, using a natural language processing (NLP) algorithm (“The machine algorithm may have the following input, such as data from all patient EMR, PAC, AP-LIS, CP-LIS loading, a PDF, medical notes, documents, image metadata, mainly in oncology and normal medical body covered in the system (e.g., Snored, NCI Thesaurus, ICD-10, Radlex, etc.). by the NLP technology, capable of extracting an attribute from the patient data, such as demographic information, functional status, family history, cancer staging and feature or tissue, biologic marker, genetic variation, diagnostic testing and result mapping to the additional search query of oncology.” Page 35 paragraphs 3 and 4); retrieving demographic and clinical data of the patient from at least one of a picture archiving and communication system (PACS) (“The machine algorithm may have the following input, such as data from all patient EMR, PAC, AP-LIS, CP-LIS loading, a PDF, medical notes, documents, image metadata, mainly in oncology and normal medical body covered in the system (e.g., Snored, NCI Thesaurus, ICD-10, Radlex, etc.). by the NLP technology, capable of extracting an attribute from the patient data, such as demographic information, functional status, family history, cancer staging and feature or tissue, biologic marker, genetic variation, diagnostic testing and result mapping to the additional search query of oncology.” Page 35 paragraphs 3 and 4); identifying a plurality of similar patients based on the extracted image findings and the demographic and clinical data of the patient (“search engine can be used for executing automatic similar patient search. similar patient search for automation based on search query mode for the specific diagnosis of medical personnel, such as a type of cancer. 
search engine can but can be edited based on specific clinical features to make diagnosis (with the specific cancer type) automatic search query of a patient, the clinical features can include, but are not limited to, age, gender, biological labeling, BIRADS (breast imaging reporting and data system) classifying, staging information, previous treatment, result, and family history.” Page 10 paragraph 5); creating a similar patient cohort by aggregating data from the identified plurality of similar patients, wherein the aggregated data includes demographic data, clinical data and medical images of the similar patients, respectively (“search engine can be used for executing automatic similar patient search. similar patient search for automation based on search query mode for the specific diagnosis of medical personnel, such as a type of cancer. search engine can but can be edited based on specific clinical features to make diagnosis (with the specific cancer type) automatic search query of a patient, the clinical features can include, but are not limited to, age, gender, biological labeling, BIRADS (breast imaging reporting and data system) classifying, staging information, previous treatment, result, and family history.” Page 10 paragraph 5 and figs 16-20 for GUI screenshots); retrieving the medical images from the similar patient cohort (“FIG. 20 shows another search results page displayed by the search engine 62. search results page 220 can be similar to in FIG. 5, the search result page 80, and may have one or more portions providing different information and function to the user. a result portion 220 of search results page 82, attribute selecting part 84, a display part 222 and range selection section 88. display section 222 provided for the medical personnel to compare capability of other attributes of the patient, the other attributes including but not limited to radiological imaging, pathological imaging and molecular imaging. 
the user can select image button from the display section 80 or the display section 222 " to access the display section 222. Similarly, the display section 80 can be accessed from the display section 80 or the display section 222 " chart button selected by the user.” Page 23 paragraphs 3 and 4).
Barnes is analogous art in the same field of endeavor as the current invention. Barnes is directed towards a system that uses artificial intelligence to extract and analyze patient data (“The machine algorithm may have the following input, such as data from all patient EMR, PAC, AP-LIS, CP-LIS loading, a PDF, medical notes, documents, image metadata, mainly in oncology and normal medical body covered in the system (e.g., Snored, NCI Thesaurus, ICD-10, Radlex, etc.). by the NLP technology, capable of extracting an attribute from the patient data, such as demographic information, functional status, family history, cancer staging and feature or tissue, biologic marker, genetic variation, diagnostic testing and result mapping to the additional search query of oncology.” Page 35 paragraphs 3 and 4). A person of ordinary skill in the art before the effective filing date of the claimed invention would have found it obvious to combine the lesion analysis process of Golden with the aggregated patient data search of Barnes by utilizing Barnes’ lesion-based search mechanism to allow for the comparison of patient lesion data, following the segmentation, identification, and determination processes of Golden, with the expectation that doing so would lead to improved patient care (“One advantage of the present application is to make the medical staff capable of visualization, relevant patient information, cooperation and action of insight to improve patient care.” Page 9 paragraph 2 and “Another advantage of this application is that a plurality of subject care team can on the clinical patient data associated with the current best evidence for integration to inform clinical decision and improve nursing quality of patients.” Page 9 paragraph 7).
With respect to claim 2, Golden and Barnes teach the method of claim 1. Golden further teaches the method further comprising: determining medical diagnosis and medical treatment of the patient for the at least one lesion based on the predicted histological nature of the at least one lesion (“In at least one implementation, a pre-trained CNN model for classification of lesions at 5414 is used to classify lesions at 5416. This CNN model evaluates image patches centered on the proposed location at 5406 and infers metadata about the lesion in question. This metadata can include, but is not limited to, the features of the lesion, including one or more of size, shape, margin, opacity, or heterogeneity, the location of the lesion within the body, the relationship to surrounding lesions and tissue properties surrounding the lesion, the malignancy, or the cancerous subtype of the lesion. The CNN model optionally uses the segmentation generated by the CNN model at 5410 and stored at 5412 to help the classifications.” Page 45 last paragraph and first line of page 46).
With respect to claim 3, Golden and Barnes teach the method of claim 1. Golden further teaches wherein the at least one lesion is detected by image segmentation (“trains a fully convolutional neural network (CNN) model to: classify if the entire input anatomical structure contains a lesion candidate; or segment lesion candidates utilizing the received learning data” page 5 paragraph 6 – page 6 paragraph 1).
With respect to claim 4, Golden and Barnes teach the method of claim 1. Barnes further teaches wherein identifying the plurality of similar patients comprises: searching a clinical database of patients using a query having search terms indicative of the demographic and clinical data of the patient (“search engine can be used for executing automatic similar patient search. similar patient search for automation based on search query mode for the specific diagnosis of medical personnel, such as a type of cancer. search engine can but can be edited based on specific clinical features to make diagnosis (with the specific cancer type) automatic search query of a patient, the clinical features can include, but are not limited to, age, gender, biological labeling, BIRADS (breast imaging reporting and data system) classifying, staging information, previous treatment, result, and family history.” Page 10 paragraph 5 and figs 16-20 for GUI screenshots); and identifying patients in the clinical database matching a predetermined number or percentage of the search terms as similar patients (see figure 16 and “In other embodiments, the results section 196 can classify the patient list based on specific attributes of the match, or results portion 196 can display a patient list of those patients which matches the search query of all or a predefined number of attributes.” Page 22 paragraph 1 lines 14-16).
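As a rough illustration of the matching rule mapped above (patients matching all, or a predefined number, of the query's clinical search terms are returned as similar patients), the following sketch may be helpful; the attribute names and data are hypothetical, not drawn from Barnes:

```python
def similar_patients(query, database, min_matches=3):
    """Return (patient_id, match_count) pairs for every patient matching
    at least min_matches of the query's attributes, most similar first.
    A toy stand-in for an automatic similar-patient search."""
    results = []
    for patient in database:
        matches = sum(1 for attr, value in query.items()
                      if patient.get(attr) == value)
        if matches >= min_matches:          # predefined number of attributes
            results.append((patient["id"], matches))
    return sorted(results, key=lambda r: -r[1])

# hypothetical query and cohort
query = {"age_band": "50-59", "gender": "F", "birads": 4, "stage": "II"}
cohort = [
    {"id": "P1", "age_band": "50-59", "gender": "F", "birads": 4, "stage": "II"},
    {"id": "P2", "age_band": "50-59", "gender": "F", "birads": 3, "stage": "II"},
    {"id": "P3", "age_band": "30-39", "gender": "M", "birads": 2, "stage": "I"},
]
# P1 matches all four attributes, P2 three, P3 none
assert similar_patients(query, cohort) == [("P1", 4), ("P2", 3)]
```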
With respect to claim 5, Golden and Barnes teach the method of claim 3. Barnes further teaches wherein the clinical database comprises at least one of electronic medical records (EMR) database (“integrated search tool to search engine 62, which enables medical personnel can use the aggregation information of the EMR system 20 and information system 22 to search database 35. search engine 62 can be used for automated similar patient search and manual interaction similar patient search.” Page 20 last line – page 21 lines 1-2).
With respect to claim 8, Golden and Barnes teach the method of claim 1. Golden teaches wherein the clinical database is updated to reference the medical images in the separate imaging database (“The method described herein auto-triages disparate data streams (e.g., EMR data, imaging data, genotype data, phenotype data, etc.) and sends the data to the right algorithms and/or endpoints for processing and/or analysis.” Page 46 last paragraph and “The system can automatically report findings and their characterizations based on standard reporting templates and inputs created by both automated systems or users. The automatic report can be edited and supplemented by the user. In one case, the report is created as a simple paragraph with text describing the findings. This can be done by populating fields in a paragraph with the findings, or via natural language processing (NLP) methods of creating text. The automatic report can be structured so that findings are presented based on urgency and severity. The automatic report can also be a graphical report containing tables and images that describe the evolution of the findings over time.” Page 42 paragraphs 7-8 and “Report sent to people (e.g., clinicians, patients, etc.) and/or archiving (e.g., EMR, PACS, RIS, etc.).” page 48 line 6).
Barnes further teaches wherein the demographic and clinical data are stored in a clinical database (“In one embodiment, the information system accessed by the workflow tool can include, but are not limited to EMR (electronic medical record), PACS (picture archiving and communication system) and RIS (radioactive information system), digital pathology (DP) CL-LIS (clinical laboratory-laboratory information system), AP-LIS (anatomical pathology laboratory information system) and next generation sequencing (NGS) system. medical staff preparing committee meeting workflow tool may be used to summarize related information (such as tumor information) and patient demographic data from heterogeneous information system.” Page 9 last paragraph), and the medical images of the similar patients are stored in an imaging database separate from the clinical database ( “In one embodiment, the information system accessed by the workflow tool can include, but are not limited to EMR (electronic medical record), PACS (picture archiving and communication system) and RIS (radioactive information system), digital pathology (DP) CL-LIS (clinical laboratory-laboratory information system), AP-LIS (anatomical pathology laboratory information system) and next generation sequencing (NGS) system. medical staff preparing committee meeting workflow tool may be used to summarize related information (such as tumor information) and patient demographic data from heterogeneous information system.” Page 9 last paragraph).
With respect to claim 9, Golden and Barnes teach the method of claim 1. Golden teaches the medical images of the similar patients are stored in the clinical database in association with the demographic and clinical data (“The method described herein auto-triages disparate data streams (e.g., EMR data, imaging data, genotype data, phenotype data, etc.) and sends the data to the right algorithms and/or endpoints for processing and/or analysis.” Page 46 last paragraph And “The system can automatically report findings and their characterizations based on standard reporting templates and inputs created by both automated systems or users. The automatic report can be edited and supplemented by the user. In one case, the report is created as a simple paragraph with text describing the findings. This can be done by populating fields in a paragraph with the findings, or via natural language processing (NLP) methods of creating text. The automatic report can be structured so that findings are presented based on urgency and severity. The automatic report can also be a graphical report containing tables and images that describe the evolution of the findings over time.” Page 42 paragraphs 7-8 And “Report sent to people (e.g., clinicians, patients, etc.) and/or archiving (e.g., EMR, PACS, RIS, etc.).” page 48 line 6). Barnes teaches wherein the demographic and clinical data are stored in a clinical database (“In one embodiment, the information system accessed by the workflow tool can include, but are not limited to EMR (electronic medical record), PACS (picture archiving and communication system) and RIS (radioactive information system), digital pathology (DP) CL-LIS (clinical laboratory-laboratory information system), AP-LIS (anatomical pathology laboratory information system) and next generation sequencing (NGS) system. 
medical staff preparing committee meeting workflow tool may be used to summarize related information (such as tumor information) and patient demographic data from heterogeneous information system.” Page 9 last paragraph).
With respect to claim 10, Golden and Barnes teach the method of claim 1. Golden further teaches wherein performing the radiomics-derived quantitative analysis on the retrieved medical images to train the ANN classification model comprises: performing segmentation of the medical images of the similar patients (“The segmentation methodology of the present disclosure utilizes customized fully convolutional neural networks for end-to-end 3D training and segmentation. This deep learning approach is able to learn a huge number of features representative of the training data presented to it, resulting in superior generalization performance. Furthermore, the network is able to consider full spatial context for all lesion candidates that need to be segmented at the intrinsic resolution of the scan” page 21 paragraph 4 and “Once the image data of the query lesion is loaded, the trained CNN model 2906 is used to extract image features from the image data at 2908. The image features and clinical features are then used to calculate the similarity 2914 between the query lesion and lesions from the CBIR database 2912.” Page 32 paragraph 2); homogenizing the medical images with respect to one or more of pixel spacing (“To learn features at a variety of spatial scales, the input data are resampled to different real-world spacing per pixel and combine the learned latent features.” Page 20 paragraph 5); performing radiomic feature extraction on the homogenized medical images (“Once the image data of the query lesion is loaded, the trained CNN model 2906 is used to extract image features from the image data at 2908.
The image features and clinical features are then used to calculate the similarity 2914 between the query lesion and lesions from the CBIR database 2912.” Page 32 paragraph 2); and performing feature selection and dimension reduction to reduce features to be used for training the ANN classification model for the similar patient cohort (“The general idea behind fully convolutional networks (FCNs) is to use a downsampling path to learn relevant features at a variety of spatial scales followed by an upsampling path to combine the features for pixelwise prediction. The downsampling path generally includes convolution and pooling layers” page 23 last paragraph).
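The mapped pipeline (segment the lesion, homogenize the images, extract radiomic features, then select/reduce features before training) can be illustrated with a deliberately tiny sketch; real radiomics toolkits compute hundreds of descriptors, and every name below is hypothetical rather than taken from either reference:

```python
def extract_features(image, mask):
    """Toy radiomic-style features from a 2D image (list of rows) and a
    binary lesion mask of the same shape: area, mean and max intensity."""
    vals = [image[r][c]
            for r in range(len(image))
            for c in range(len(image[0])) if mask[r][c]]
    area = len(vals)
    mean = sum(vals) / area if area else 0.0
    return {"area": area, "mean": mean, "max": max(vals) if vals else 0.0}

def select_features(cohort_features, k=2):
    """Crude feature selection: keep the k features with the highest
    variance across the cohort, a stand-in for the dimension-reduction
    step performed before training a classifier."""
    names = cohort_features[0].keys()
    def var(name):
        xs = [f[name] for f in cohort_features]
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    return sorted(names, key=var, reverse=True)[:k]

# hypothetical 2x2 image with a two-pixel lesion mask
img = [[1, 2], [3, 4]]
msk = [[1, 0], [0, 1]]
feats = extract_features(img, msk)
assert feats["area"] == 2 and feats["mean"] == 2.5
```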
With respect to claim 11, Golden and Barnes teach the method of claim 1. Golden further teaches wherein applying the at least one lesion to the trained ANN classification model to predict the histological nature of the at least one lesion comprises predicting malignancy of the at least one lesion (“In at least one implementation, a pre-trained CNN model for classification of lesions at 5414 is used to classify lesions at 5416. This CNN model evaluates image patches centered on the proposed location at 5406 and infers metadata about the lesion in question. This metadata can include, but is not limited to, the features of the lesion, including one or more of size, shape, margin, opacity, or heterogeneity, the location of the lesion within the body, the relationship to surrounding lesions and tissue properties surrounding the lesion, the malignancy, or the cancerous subtype of the lesion. The CNN model optionally uses the segmentation generated by the CNN model at 5410 and stored at 5412 to help the classifications.” page 45 last paragraph – page 46 line 1).
With respect to claim 12, Golden teaches a system for predicting histological nature for lesions of a patient (“In at least one implementation, a pre-trained CNN model for classification of lesions at 5414 is used to classify lesions at 5416. This CNN model evaluates image patches centered on the proposed location at 5406 and infers metadata about the lesion in question. This metadata can include, but is not limited to, the features of the lesion, including one or more of size, shape, margin, opacity, or heterogeneity, the location of the lesion within the body, the relationship to surrounding lesions and tissue properties surrounding the lesion, the malignancy, or the cancerous subtype of the lesion. The CNN model optionally uses the segmentation generated by the CNN model at 5410 and stored at 5412 to help the classifications. “ page 45 last paragraph – page 46 line 1), the system comprising: at least one processor (“A machine learning system may be summarized as including at least one nontransitory processor-readable storage medium that stores at least one of processor-executable instructions or data” page 5 last paragraph); and a non-transitory memory storing instructions (“A machine learning system may be summarized as including at least one nontransitory processor-readable storage medium that stores at least one of processor-executable instructions or data” page 5 last paragraph) that, when executed by the processor, cause the at least one processor to: detect at least one lesion in a medical image of the patient (“trains a fully convolutional neural network (CNN) model to: classify if the entire input anatomical structure contains a lesion candidate; or segment lesion candidates utilizing the received learning data” page 5 paragraph 6 – page 6 paragraph 1); perform radiomics-derived quantitative analysis on the retrieved medical images to train an artificial neural network (ANN) classification model (“Finally, a CNN model could be used to directly predict 
the similarity of a query lesion to other lesions in the database. For example, if a training data set was created that consisted of a set of query lesions and their quantitative similarity to some or all lesions within a database of lesions, a model could be trained on that data set. That model would then be able to predict similarity for a new query lesion to lesions from the database.” Page 31 paragraph 3 and “Training is cyclical process and includes repeated loading of batches of training data from the database at 2704, followed by a standard CNN training iteration 2706. The standard CNN training iteration 2706 includes a forward pass of image data through the network, calculation of a loss function, and updating the weights of the CNN model using backpropagation [LeCun 1998] For implementations in which the model is supervised, loss is calculated with respect to the network’s output and the ground truth label. For implementations in which the model is unsupervised, loss is calculated with respect to some other metric, such as the inter cluster distance of predicted results.” Page 32 paragraph 7); apply the at least one lesion to the trained ANN classification model to predict a histological nature of the at least one lesion (“In at least one implementation, a pre-trained CNN model for classification of lesions at 5414 is used to classify lesions at 5416. This CNN model evaluates image patches centered on the proposed location at 5406 and infers metadata about the lesion in question. This metadata can include, but is not limited to, the features of the lesion, including one or more of size, shape, margin, opacity, or heterogeneity, the location of the lesion within the body, the relationship to surrounding lesions and tissue properties surrounding the lesion, the malignancy, or the cancerous subtype of the lesion. 
The CNN model optionally uses the segmentation generated by the CNN model at 5410 and stored at 5412 to help the classifications.” Page 45 last paragraph and first line of page 46); medical diagnosis and/or medical treatment of the patient for the at least one lesion is determined based on the predicted histological nature of the at least one lesion (“In at least one implementation, a pre-trained CNN model for classification of lesions at 5414 is used to classify lesions at 5416. This CNN model evaluates image patches centered on the proposed location at 5406 and infers metadata about the lesion in question. This metadata can include, but is not limited to, the features of the lesion, including one or more of size, shape, margin, opacity, or heterogeneity, the location of the lesion within the body, the relationship to surrounding lesions and tissue properties surrounding the lesion, the malignancy, or the cancerous subtype of the lesion. The CNN model optionally uses the segmentation generated by the CNN model at 5410 and stored at 5412 to help the classifications. “ page 45 last paragraph – page 46 line 1).
Golden does not teach a graphical user interface (GUI) enabling a user to interface with the processor; at least one database storing demographic and clinical data and medical images of a plurality of patients; extract image findings from a radiology report describing the medical image, including the at least one lesion, using a natural language processing (NLP) algorithm; retrieve demographic and clinical data of the patient from the at least one database; identify similar patients from among the plurality of patients by searching the at least one database based on the extracted image findings and the demographic and clinical data of the patient; create a similar patient cohort by aggregating data from the identified similar patients, wherein the aggregated data includes demographic and clinical data and medical images of the similar patients, respectively; retrieve the medical images from the similar patient cohort; and display the predicted histological nature of the at least one lesion on the GUI.
Barnes teaches a graphical user interface (GUI) enabling a user to interface with the processor (“FIG. 16-18 shows an exemplary screen shot of the display automatic similar patient search engine 62 search page of the GUI. As shown in FIG. 16, can be selected by the user a tool application interface 132 of application or from similar patient search icon 192 starts automatic similar patient search. the corresponding current patient clinical attributes then selected viewing of search engine 62 to initiate a similar patient search query. clinical attribute search engine 62 selected can be displayed in a query portion 194. query portion 194 defined by the search engine 62 for a search of attributes and allows the user to edit an attribute search.” Page 21 last line and page 22 first paragraph); at least one database storing demographic and clinical data and medical images of a plurality of patients (“The machine algorithm may have the following input, such as data from all patient EMR, PAC, AP-LIS, CP-LIS loading, a PDF, medical notes, documents, image metadata, mainly in oncology and normal medical body covered in the system (e.g., Snored, NCI Thesaurus, ICD-10, Radlex, etc.). by the NLP technology, capable of extracting an attribute from the patient data, such as demographic information, functional status, family history, cancer staging and feature or tissue, biologic marker, genetic variation, diagnostic testing and result mapping to the additional search query of oncology.” Page 35 paragraphs 3 and 4); extract image findings from a radiology report describing the medical image, including the at least one lesion, using a natural language processing (NLP) algorithm (“The machine algorithm may have the following input, such as data from all patient EMR, PAC, AP-LIS, CP-LIS loading, a PDF, medical notes, documents, image metadata, mainly in oncology and normal medical body covered in the system (e.g., Snored, NCI Thesaurus, ICD-10, Radlex, etc.).
by the NLP technology, capable of extracting an attribute from the patient data, such as demographic information, functional status, family history, cancer staging and feature or tissue, biologic marker, genetic variation, diagnostic testing and result mapping to the additional search query of oncology.” Page 35 paragraphs 3 and 4); retrieve demographic and clinical data of the patient from the at least one database (“The machine algorithm may have the following input, such as data from all patient EMR, PAC, AP-LIS, CP-LIS loading, a PDF, medical notes, documents, image metadata, mainly in oncology and normal medical body covered in the system (e.g., Snored, NCI Thesaurus, ICD-10, Radlex, etc.). by the NLP technology, capable of extracting an attribute from the patient data, such as demographic information, functional status, family history, cancer staging and feature or tissue, biologic marker, genetic variation, diagnostic testing and result mapping to the additional search query of oncology.” Page 35 paragraphs 3 and 4); identify similar patients from among the plurality of patients by searching the at least one database based on the extracted image findings and the demographic and clinical data of the patient (“search engine can be used for executing automatic similar patient search. similar patient search for automation based on search query mode for the specific diagnosis of medical personnel, such as a type of cancer. 
search engine can but can be edited based on specific clinical features to make diagnosis (with the specific cancer type) automatic search query of a patient, the clinical features can include, but are not limited to, age, gender, biological labeling, BIRADS (breast imaging reporting and data system) classifying, staging information, previous treatment, result, and family history.” Page 10 paragraph 5); create a similar patient cohort by aggregating data from the identified similar patients, wherein the aggregated data includes demographic and clinical data and medical images of the similar patients, respectively (“search engine can be used for executing automatic similar patient search. similar patient search for automation based on search query mode for the specific diagnosis of medical personnel, such as a type of cancer. search engine can but can be edited based on specific clinical features to make diagnosis (with the specific cancer type) automatic search query of a patient, the clinical features can include, but are not limited to, age, gender, biological labeling, BIRADS (breast imaging reporting and data system) classifying, staging information, previous treatment, result, and family history.” Page 10 paragraph 5 and figs 16-20 for GUI screenshots); retrieve the medical images from the similar patient cohort (“FIG. 20 shows another search results page displayed by the search engine 62. search results page 220 can be similar to in FIG. 5, the search result page 80, and may have one or more portions providing different information and function to the user. a result portion 220 of search results page 82, attribute selecting part 84, a display part 222 and range selection section 88. display section 222 provided for the medical personnel to compare capability of other attributes of the patient, the other attributes including but not limited to radiological imaging, pathological imaging and molecular imaging. 
the user can select image button from the display section 80 or the display section 222 " to access the display section 222. Similarly, the display section 80 can be accessed from the display section 80 or the display section 222 " chart button selected by the user.” Page 23 paragraphs 3 and 4) and displaying the predicted histological nature of the at least one lesion on the GUI (“search engine can be used for executing automatic similar patient search. similar patient search for automation based on search query mode for the specific diagnosis of medical personnel, such as a type of cancer. search engine can but can be edited based on specific clinical features to make diagnosis (with the specific cancer type) automatic search query of a patient, the clinical features can include, but are not limited to, age, gender, biological labeling, BIRADS (breast imaging reporting and data system) classifying, staging information, previous treatment, result, and family history.” Page 10 paragraph 5 and figs 16-20 for GUI screenshots);
Barnes is analogous art in the same field of endeavor as the current invention. Barnes is directed towards a system that uses artificial intelligence to extract and analyze patient data (“The machine algorithm may have the following input, such as data from all patient EMR, PAC, AP-LIS, CP-LIS loading, a PDF, medical notes, documents, image metadata, mainly in oncology and normal medical body covered in the system (e.g., Snored, NCI Thesaurus, ICD-10, Radlex, etc.). by the NLP technology, capable of extracting an attribute from the patient data, such as demographic information, functional status, family history, cancer staging and feature or tissue, biologic marker, genetic variation, diagnostic testing and result mapping to the additional search query of oncology.” Page 35 paragraphs 3 and 4). A person of ordinary skill in the art before the effective filing date of the claimed invention would have found it obvious to combine the lesion analysis process of Golden with the aggregated patient data search of Barnes by utilizing Barnes’ lesion-based search mechanics to allow for the comparison of patient lesion data, following the segmentation, identification, and determination processes of Golden, with the expectation that doing so would lead to improved patient care (“One advantage of the present application is to make the medical staff capable of visualization, relevant patient information, cooperation and action of insight to improve patient care.” Page 9 paragraph 2, and “Another advantage of this application is that a plurality of subject care team can on the clinical patient data associated with the current best evidence for integration to inform clinical decision and improve nursing quality of patients.” Page 9 paragraph 7).
With respect to claim 13, Golden and Barnes teach the system of claim 12, and render obvious all further claim limitations in consideration of claim 3 due to the substantial similarities between claim 13 and claim 3, with claim 12 being a system that performs the method of claim 1.
With respect to claim 14, Golden and Barnes teach the system of claim 12. Barnes further teaches wherein the instructions cause the at least one processor to identify the similar patients by: searching the at least one database using a query having search terms indicative of the demographic and clinical data of the patient (“search engine can be used for executing automatic similar patient search. similar patient search for automation based on search query mode for the specific diagnosis of medical personnel, such as a type of cancer. search engine can but can be edited based on specific clinical features to make diagnosis (with the specific cancer type) automatic search query of a patient, the clinical features can include, but are not limited to, age, gender, biological labeling, BIRADS (breast imaging reporting and data system) classifying, staging information, previous treatment, result, and family history.” Page 10 paragraph 5 and figs 16-20 for GUI screenshots); and identifying patients in the at least one database matching a predetermined number or percentage of the search terms as similar patients (see figure 16 and “In other embodiments, the results section 196 can classify the patient list based on specific attributes of the match, or results portion 196 can display a patient list of those patients which matches the search query of all or a predefined number of attributes.” Page 22 paragraph 1 lines 14-16).
With respect to claim 15, Golden and Barnes teach the system of claim 14 and render obvious all additional limitations in consideration of claim 5 because claim 15 is substantially similar to claim 5, with claim 12 being a system that performs the method of claim 1.
With respect to claim 17, Golden and Barnes teach the system of claim 12 and render obvious all additional limitations in consideration of claim 10 because claim 17 is substantially similar to claim 10, with claim 12 being a system that performs the method of claim 1.
With respect to claim 18, Golden and Barnes teach the system of claim 12 and render obvious all additional limitations in consideration of claim 11 because claim 18 is substantially similar to claim 11, with claim 12 being a system that performs the method of claim 1.
With respect to claim 19, Golden and Barnes render obvious all limitations in consideration of claim 12 because claim 19 is substantially similar to claim 12, with claim 19 being directed towards a non-transitory computer readable medium substantially similar to the non-transitory memory in claim 12. Additionally, Golden teaches a non-transitory computer readable medium storing instructions (“A machine learning system may be summarized as including at least one nontransitory processor-readable storage medium that stores at least one of processor-executable instructions or data” page 5 last paragraph) for predicting the histological nature of lesions of a patient (“In at least one implementation, a pre-trained CNN model for classification of lesions at 5414 is used to classify lesions at 5416. This CNN model evaluates image patches centered on the proposed location at 5406 and infers metadata about the lesion in question. This metadata can include, but is not limited to, the features of the lesion, including one or more of size, shape, margin, opacity, or heterogeneity, the location of the lesion within the body, the relationship to surrounding lesions and tissue properties surrounding the lesion, the malignancy, or the cancerous subtype of the lesion. The CNN model optionally uses the segmentation generated by the CNN model at 5410 and stored at 5412 to help the classifications.” page 45 last paragraph – page 46 line 1) that are executed by one or more processors (“A machine learning system may be summarized as including at least one nontransitory processor-readable storage medium that stores at least one of processor-executable instructions or data” page 5 last paragraph).
With respect to claim 20, Golden and Barnes teach the non-transitory computer readable medium of claim 19 (and the non-transitory memory of claim 18), and render obvious all claim limitations in consideration of claim 14, due to the substantial similarities between claim 20 and claim 14.
Claims 6-7 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Golden and Barnes as applied to claims 1 and 12 above, and further in view of Colley (US 20210090694 A1).
With respect to claim 6, Golden and Barnes teach the method of claim 1, but do not teach the further limitations of claim 6. Colley teaches wherein extracting the image findings from the radiology report using the NLP algorithm comprises applying domain-specific contextual embeddings (“Similarly, language models trained over clinical documents that are specific to a particular disease state, such as cancer, may perform better in medical-based OCR text cleaning tasks upon disease state-related clinical documents than language models trained over clinical documents that are not specific to a particular disease state. By providing a training set having millions of clinical documents that are similar to the documents submitted for OCR, an exemplary language model may be trained over in-domain text that many traditional NLP sources do not have access to, resulting in a more robust language model.” Paragraph 1564).
Colley is analogous art in the same field of endeavor as the claimed invention. Colley is directed towards systems that acquire and utilize patient data (“The present invention relates to systems and methods for obtaining and employing data related to patient characteristics, such as physical, clinical, or genomic characteristics, as well as diagnosis, treatments, and treatment efficacy to provide a suite of tools to healthcare providers, researchers, and other interested parties enabling those entities to develop new insights utilizing disease states, treatments, results, genomic information and other clinical information to improve overall patient healthcare.” Paragraph 0081). A person of ordinary skill in the art before the effective filing date of the claimed invention would have found it obvious to combine the system of Golden and Barnes by utilizing the NLP teachings of Colley in place of the NLP engine of Barnes, with the expectation that doing so would lead to better patient care (“The present invention relates to systems and methods for obtaining and employing data related to patient characteristics, such as physical, clinical, or genomic characteristics, as well as diagnosis, treatments, and treatment efficacy to provide a suite of tools to healthcare providers, researchers, and other interested parties enabling those entities to develop new insights utilizing disease states, treatments, results, genomic information and other clinical information to improve overall patient healthcare.” Paragraph 0081).
With respect to claim 7, Golden and Barnes teach the method of claim 1. Barnes further teaches wherein the demographic and clinical data comprise at least age and gender of the patient, and past and current medical diagnoses and treatments (“search engine can be used for executing automatic similar patient search. similar patient search for automation based on search query mode for the specific diagnosis of medical personnel, such as a type of cancer. search engine can but can be edited based on specific clinical features to make diagnosis (with the specific cancer type) automatic search query of a patient, the clinical features can include, but are not limited to, age, gender, biological labeling, BIRADS (breast imaging reporting and data system) classifying, staging information, previous treatment, result, and family history.” Page 10 paragraph 5 and figs 16-20 for GUI screenshots).
Barnes does not teach wherein the demographic and clinical data comprises race of the patient.
Colley teaches wherein the race of the patient is important clinical and demographic data (“Sixth, it is known that cancer state factors (e.g., diagnosed cancer, location of cancer, cancer stage, other cancer characteristics, other user conditions (e.g., age, gender, weight, race, genetics, habits (e.g., smoking, drinking, diet)), other pertinent medical conditions (e.g., high blood pressure, other diseases, etc.), medications, other pertinent medical history, current side effects of cancer treatments and other medications, etc.) and combinations of those factors render some treatments more efficacious for one patient than other treatments or for one patient as opposed to other patients. Awareness of those factors and their effects is extremely important and difficult to master and apply, especially under the pressure of time constraints when delay can appreciably affect treatment efficacy and even treatment options and when there are new insights into treatment efficacy all the time.” Paragraph 0064).
Colley is analogous art in the same field of endeavor as the claimed invention. Colley is directed towards systems that acquire and utilize patient data (“The present invention relates to systems and methods for obtaining and employing data related to patient characteristics, such as physical, clinical, or genomic characteristics, as well as diagnosis, treatments, and treatment efficacy to provide a suite of tools to healthcare providers, researchers, and other interested parties enabling those entities to develop new insights utilizing disease states, treatments, results, genomic information and other clinical information to improve overall patient healthcare.” Paragraph 0081). A person of ordinary skill in the art before the effective filing date of the claimed invention would have found it obvious to combine the system of Golden and Barnes by utilizing the cancer state factors of Colley in combination with the clinical search query features of Barnes, with the expectation that doing so would lead to better patient care (“The present invention relates to systems and methods for obtaining and employing data related to patient characteristics, such as physical, clinical, or genomic characteristics, as well as diagnosis, treatments, and treatment efficacy to provide a suite of tools to healthcare providers, researchers, and other interested parties enabling those entities to develop new insights utilizing disease states, treatments, results, genomic information and other clinical information to improve overall patient healthcare.” Paragraph 0081).
With respect to claim 16, Golden and Barnes teach the system of claim 12 and in view of Colley render obvious all claim limitations in consideration of claim 7, due to the substantial similarity of claim 16 to claim 7, with claim 12 being a system that performs the method of claim 1.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to REBECCA C WILLIAMS whose telephone number is (571)272-7074. The examiner can normally be reached M-F 7:30am - 4:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Andrew W Bee can be reached at (571)270-5183. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/REBECCA COLETTE WILLIAMS/Examiner, Art Unit 2677
/ANDREW W BEE/Supervisory Patent Examiner, Art Unit 2677