DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims: Claims 1-15 are examined below.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 8/8/2023 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement has been considered by the examiner.
Response to Arguments
Applicant's arguments filed 11/4/2025 have been fully considered but they are not persuasive.
Applicant's remarks (pages 13-18): Applicant argues that TAKEBE et al. (US 2018/0268263) fails to teach the claim limitation: “…a processing of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value, and a processing of determining presence or absence of the object and a likelihood of the object for the target image using the classification value and the feature similarity classification value.” Please see the Remarks for further detail.
Examiner's response: Upon an updated search, Galperin (US 2009/0082637) was found to teach this claim limitation in figure 12 and paragraphs 0157-0159, with further detail in paragraphs 0146 and 0153. The combined teaching of TAKEBE et al. (US 2018/0268263) in view of Galperin (US 2009/0082637) addresses these claim limitations. This addresses the Remarks regarding the alleged lack of this teaching in independent claims 1, 3, 5, 7, 9, 11, and 13-15. The same reasoning addresses the corresponding arguments directed to the dependent claims (Remarks, pages 18-22). Please see the Office Action below for further detail.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-15 are rejected under 35 U.S.C. 103 as being unpatentable over TAKEBE et al. (US 2018/0268263) in view of Galperin (US 2009/0082637).
Claim 1:
TAKEBE et al. (US 2018/0268263) teaches the following subject matter:
An image diagnosis support device (figure 1 part 112-1 teaches a diagnosis support unit) comprising:
a processor configured to execute a program for performing an image processing on a target image (0003 teaches scanned image data for diagnosis of diseases, with target diagnosis for tumor regions); and
a memory configured to store a result of the image processing, wherein the processor is configured to execute (figure 1 and 0040-0043 teach the use of storage apparatus 114 with support programs for diagnosis functions)
a processing of inputting an image (figure 1 and 0040-0041 teach CT apparatus 111 coupled with image processing apparatus 112, i.e., scanned images are input to the image processing apparatus 112),
a processing of extracting a feature of an object from the target image (0004 teaches extracting case data including tumor regions; 0045-0049 teach extracting region data including tumor regions and image data not including tumor regions, with a plurality of feature quantities),
a processing of extracting a feature of a training image and creating a feature dictionary (0004 teaches creating a database (dictionary) storing, in respective classes, case data extracted from image data of a given region including tumor regions and case data not including tumor regions; 0042 teaches that image processing apparatus 112 stores the CT images taken by the CT apparatus in a CT image data storage database (DB) of the coupled storage apparatus 114).
TAKEBE et al. teaches all of the subject matter above, but not the following subject matter:
a processing of classifying the target image based on the feature and calculating a classification value, a processing of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value, and a processing of determining presence or absence of the object and a likelihood of the object for the target image using the classification value and the feature similarity classification value.
Galperin teaches the following subject matter: a processing of classifying the target image based on the feature and calculating a classification value, a processing of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value (figure 12 and 0157-0159 teach a numerical classification score (classification value) indicating disease likelihood or assessment, where 0146 details that the object/target is classified by a score based on numerical similarity and feature computation relative to previously classified or assessed entries in a database (dictionary)), and a processing of determining presence or absence of the object and a likelihood of the object for the target image using the classification value and the feature similarity classification value (0153 teaches that the object/target (lesion) is, based on the values, ranked or labeled 1 for presence of the disease and 0 for its absence).
TAKEBE et al. and Galperin are both in the field of image analysis, particularly the use of machine learning to assess and classify medical images of a target area, such that the outcome of the combination is predictable.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify TAKEBE et al. with Galperin's classification score, because assessing the likelihood of potential disease states is facilitated by computing a numerical score that is an assessment of, or is indicative of, the likelihood that a particular diagnostic condition is correct, thereby advancing the condition of the assessment, as disclosed by Galperin in 0146.
Claim 2:
TAKEBE et al. further teaches:
The image diagnosis support device according to claim 1, wherein in the processing of creating the feature dictionary, the processor creates the feature dictionary using a feature of any layer in a machine learning network (0004 teaches creating a database, which requires extracting, as case data, image data of a given region; figure 1 part 112-1 and paragraphs 0049-0050 teach a learning model, such as a support vector machine, to determine whether region data includes tumor regions; 0034 teaches the use of a learning model (machine learning network); figure 5 and 0069-0070 teach the learning model; 0075-0078 further detail the learning model for class determination of extracted region data 611).
Claim 3:
TAKEBE et al. (US 2018/0268263) teaches the following subject matter:
An image diagnosis support device (figure 1 part 112-1 teaches a diagnosis support unit) comprising:
a processor configured to execute a program for performing an image processing on a target image (0003 teaches scanned image data for diagnosis of diseases, with target diagnosis for tumor regions); and
a memory configured to store a result of the image processing, wherein the processor is configured to execute (figure 1 and 0040-0043 teach the use of storage apparatus 114 with support programs for diagnosis functions)
a processing of inputting an image (figure 1 and 0040-0041 teach CT apparatus 111 coupled with image processing apparatus 112, i.e., scanned images are input to the image processing apparatus 112),
a processing of extracting a feature of an object from the target image (0004 teaches extracting case data including tumor regions; 0045-0049 teach extracting region data including tumor regions and image data not including tumor regions, with a plurality of feature quantities),
a processing of extracting a feature of a training image and creating a feature dictionary (0004 teaches creating a database (dictionary) storing, in respective classes, case data extracted from image data of a given region including tumor regions and case data not including tumor regions; 0042 teaches that image processing apparatus 112 stores the CT images taken by the CT apparatus in a CT image data storage database (DB) of the coupled storage apparatus 114),
a processing of determining, using the feature dictionary, whether the target image is underfitted (0004 teaches extracting case data including tumor regions; 0045-0049 teach extracting region data including tumor regions and image data not including tumor regions (underfitted), with a plurality of feature quantities).
TAKEBE et al. teaches all of the subject matter above, but not the following subject matter:
a processing of classifying the target image based on the feature and calculating a classification value, a processing of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value, and a determination processing of determining presence or absence of the object and a likelihood of the object for the target image using a determination result as to whether the target image is underfitted, the classification value, and the feature similarity classification value.
Galperin teaches the following subject matter: a processing of classifying the target image based on the feature and calculating a classification value, a processing of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value (figure 12 and 0157-0159 teach a numerical classification score (classification value) indicating disease likelihood or assessment, where 0146 details that the object/target is classified by a score based on numerical similarity and feature computation relative to previously classified or assessed entries in a database (dictionary)), and a determination processing of determining presence or absence of the object and a likelihood of the object for the target image using a determination result as to whether the target image is underfitted, the classification value, and the feature similarity classification value (0153 teaches that the object/target (lesion) is, based on the values, ranked or labeled 1 for presence of the disease and 0 for its absence).
TAKEBE et al. and Galperin are both in the field of image analysis, particularly the use of machine learning to assess and classify medical images of a target area, such that the outcome of the combination is predictable.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify TAKEBE et al. with Galperin's classification score, because assessing the likelihood of potential disease states is facilitated by computing a numerical score that is an assessment of, or is indicative of, the likelihood that a particular diagnostic condition is correct, thereby advancing the condition of the assessment, as disclosed by Galperin in 0146.
Claim 4:
TAKEBE et al. further teaches:
The image diagnosis support device according to claim 3, wherein in the processing of creating the feature dictionary, the processor creates the feature dictionary using a feature of any layer in a machine learning network (0004 teaches creating a database, which requires extracting, as case data, image data of a given region; figure 1 part 112-1 and paragraphs 0049-0050 teach a learning model, such as a support vector machine, to determine whether region data includes tumor regions; 0034 teaches the use of a learning model (machine learning network); figure 5 and 0069-0070 teach the learning model; 0075-0078 further detail the learning model for class determination of extracted region data 611).
Claim 5:
TAKEBE et al. (US 2018/0268263) teaches the following subject matter:
An image diagnosis support device (figure 1 part 112-1 teaches a diagnosis support unit) comprising:
a processor configured to execute a program for performing an image processing on a target image (0003 teaches scanned image data for diagnosis of diseases, with target diagnosis for tumor regions); and
a memory configured to store a result of the image processing, wherein the processor is configured to execute (figure 1 and 0040-0043 teach the use of storage apparatus 114 with support programs for diagnosis functions)
a processing of inputting an image (figure 1 and 0040-0041 teach CT apparatus 111 coupled with image processing apparatus 112, i.e., scanned images are input to the image processing apparatus 112),
a processing of extracting a feature of an object from the target image (0004 teaches extracting case data including tumor regions; 0045-0049 teach extracting region data including tumor regions and image data not including tumor regions, with a plurality of feature quantities),
a processing of extracting a feature of a training image and creating a feature dictionary (0004 teaches creating a database (dictionary) storing, in respective classes, case data extracted from image data of a given region including tumor regions and case data not including tumor regions; 0042 teaches that image processing apparatus 112 stores the CT images taken by the CT apparatus in a CT image data storage database (DB) of the coupled storage apparatus 114).
TAKEBE et al. teaches all of the subject matter above, but not the following subject matter:
a processing of classifying the target image based on the feature and calculating a classification value, a processing of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value, a processing of calculating a similarity between the target image and the training image using the feature dictionary, and presenting a classification reason for the target image using the calculated similarity, and
a determination processing of determining presence or absence of the object and a likelihood of the object for the target image using the classification value and the feature similarity classification value.
Galperin teaches the following subject matter: a processing of classifying the target image based on the feature and calculating a classification value, a processing of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value, a processing of calculating a similarity between the target image and the training image using the feature dictionary, and presenting a classification reason for the target image using the calculated similarity (figure 12 and 0157-0159 teach a numerical classification score (classification value) indicating disease likelihood or assessment, where 0146 details that the object/target is classified by a score based on numerical similarity and feature computation relative to previously classified or assessed entries in a database (dictionary)), and a determination processing of determining presence or absence of the object and a likelihood of the object for the target image using the classification value and the feature similarity classification value (0153 teaches that the object/target (lesion) is, based on the values, ranked or labeled 1 for presence of the disease and 0 for its absence).
TAKEBE et al. and Galperin are both in the field of image analysis, particularly the use of machine learning to assess and classify medical images of a target area, such that the outcome of the combination is predictable.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify TAKEBE et al. with Galperin's classification score, because assessing the likelihood of potential disease states is facilitated by computing a numerical score that is an assessment of, or is indicative of, the likelihood that a particular diagnostic condition is correct, thereby advancing the condition of the assessment, as disclosed by Galperin in 0146.
Claim 6:
TAKEBE et al. further teaches:
The image diagnosis support device according to claim 5, wherein in the processing of creating the feature dictionary, the processor creates the feature dictionary using a feature of any layer in a machine learning network (0004 teaches creating a database, which requires extracting, as case data, image data of a given region; figure 1 part 112-1 and paragraphs 0049-0050 teach a learning model, such as a support vector machine, to determine whether region data includes tumor regions; 0034 teaches the use of a learning model (machine learning network); figure 5 and 0069-0070 teach the learning model; 0075-0078 further detail the learning model for class determination of extracted region data 611).
Claim 7:
TAKEBE et al. (US 2018/0268263) teaches the following subject matter:
An image diagnosis support method (figure 1 part 112-1 teaches a diagnosis support unit) for classifying a desired object in a target image, the image diagnosis support method comprising:
a processor, configured to execute a program for performing an image processing on the target image, executing a step of inputting an image obtained by imaging an object (0003 teaches scanned image data for diagnosis of diseases, with target diagnosis for tumor regions; figure 1 and 0040-0041 teach CT apparatus 111 coupled with image processing apparatus 112, i.e., scanned images are input to the image processing apparatus 112);
the processor executing a step of extracting a feature of the object in the target image (0004 teaches extracting case data including tumor regions; 0045-0049 teach extracting region data including tumor regions and image data not including tumor regions, with a plurality of feature quantities);
the processor executing a step of extracting a feature of a training image and creating a feature dictionary (0004 teaches creating a database (dictionary) storing, in respective classes, case data extracted from image data of a given region including tumor regions and case data not including tumor regions; 0042 teaches that image processing apparatus 112 stores the CT images taken by the CT apparatus in a CT image data storage database (DB) of the coupled storage apparatus 114).
TAKEBE et al. teaches all of the subject matter above, but not the following subject matter:
the processor executing a step of classifying the target image based on the feature and calculating a classification value; the processor executing a step of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value; and the processor executing a step of determining presence or absence of the object and a likelihood of the object for the target image using the classification value and the feature similarity classification value.
Galperin teaches the following subject matter: the processor executing a step of classifying the target image based on the feature and calculating a classification value; the processor executing a step of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value (figure 12 and 0157-0159 teach a numerical classification score (classification value) indicating disease likelihood or assessment, where 0146 details that the object/target is classified by a score based on numerical similarity and feature computation relative to previously classified or assessed entries in a database (dictionary)); and the processor executing a step of determining presence or absence of the object and a likelihood of the object for the target image using the classification value and the feature similarity classification value (0153 teaches that the object/target (lesion) is, based on the values, ranked or labeled 1 for presence of the disease and 0 for its absence).
TAKEBE et al. and Galperin are both in the field of image analysis, particularly the use of machine learning to assess and classify medical images of a target area, such that the outcome of the combination is predictable.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify TAKEBE et al. with Galperin's classification score, because assessing the likelihood of potential disease states is facilitated by computing a numerical score that is an assessment of, or is indicative of, the likelihood that a particular diagnostic condition is correct, thereby advancing the condition of the assessment, as disclosed by Galperin in 0146.
Claim 8:
TAKEBE et al. further teaches:
The image diagnosis support method according to claim 7, wherein in the step of creating the feature dictionary, the feature dictionary is created using a feature of any layer in a machine learning network (0004 teaches creating a database, which requires extracting, as case data, image data of a given region; figure 1 part 112-1 and paragraphs 0049-0050 teach a learning model, such as a support vector machine, to determine whether region data includes tumor regions; 0034 teaches the use of a learning model (machine learning network); figure 5 and 0069-0070 teach the learning model; 0075-0078 further detail the learning model for class determination of extracted region data 611).
Claim 9:
TAKEBE et al. (US 2018/0268263) teaches the following subject matter:
An image diagnosis support method (figure 1 part 112-1 teaches a diagnosis support unit) for classifying a desired object in a target image, the image diagnosis support method comprising:
a processor, configured to execute a program for performing an image processing on the target image, executing a step of inputting an image obtained by imaging an object (0003 teaches scanned image data for diagnosis of diseases, with target diagnosis for tumor regions; figure 1 and 0040-0041 teach CT apparatus 111 coupled with image processing apparatus 112, i.e., scanned images are input to the image processing apparatus 112);
the processor executing a step of extracting a feature of the object in the target image (0004 teaches extracting case data including tumor regions; 0045-0049 teach extracting region data including tumor regions and image data not including tumor regions, with a plurality of feature quantities);
the processor executing a step of extracting a feature of a training image and creating a feature dictionary (0004 teaches creating a database (dictionary) storing, in respective classes, case data extracted from image data of a given region including tumor regions and case data not including tumor regions; 0042 teaches that image processing apparatus 112 stores the CT images taken by the CT apparatus in a CT image data storage database (DB) of the coupled storage apparatus 114);
the processor executing a step of determining, using the feature dictionary, whether the target image is underfitted (0004 teaches extracting case data including tumor regions; 0045-0049 teach extracting region data including tumor regions and image data not including tumor regions (underfitted), with a plurality of feature quantities).
TAKEBE et al. teaches all of the subject matter above, but not the following subject matter:
the processor executing a step of classifying the target image based on the feature and calculating a classification value; the processor executing a step of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value; and the processor executing a step of determining presence or absence of the object and a likelihood of the object for the target image using a determination result as to whether the target image is underfitted, the classification value, and the feature similarity classification value.
Galperin teaches the following subject matter: the processor executing a step of classifying the target image based on the feature and calculating a classification value; the processor executing a step of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value (figure 12 and 0157-0159 teach a numerical classification score (classification value) indicating disease likelihood or assessment, where 0146 details that the object/target is classified by a score based on numerical similarity and feature computation relative to previously classified or assessed entries in a database (dictionary)); and the processor executing a step of determining presence or absence of the object and a likelihood of the object for the target image using a determination result as to whether the target image is underfitted, the classification value, and the feature similarity classification value (0153 teaches that the object/target (lesion) is, based on the values, ranked or labeled 1 for presence of the disease and 0 for its absence).
TAKEBE et al. and Galperin are both in the field of image analysis, particularly the use of machine learning to assess and classify medical images of a target area, such that the outcome of the combination is predictable.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify TAKEBE et al. with Galperin's classification score, because assessing the likelihood of potential disease states is facilitated by computing a numerical score that is an assessment of, or is indicative of, the likelihood that a particular diagnostic condition is correct, thereby advancing the condition of the assessment, as disclosed by Galperin in 0146.
Claim 10:
TAKEBE et al. further teaches:
The image diagnosis support method according to claim 9, wherein in the step of creating the feature dictionary, the feature dictionary is created using a feature of any layer in a machine learning network (0004 teaches creating a database, which requires extracting, as case data, image data of a given region; figure 1 part 112-1 and paragraphs 0049-0050 teach a learning model, such as a support vector machine, to determine whether region data includes tumor regions; 0034 teaches the use of a learning model (machine learning network); figure 5 and 0069-0070 teach the learning model; 0075-0078 further detail the learning model for class determination of extracted region data 611).
Claim 11:
TAKEBE et al. (US 2018/0268263) teaches the following subject matter:
An image diagnosis support method (figure 1 part 112-1 teaches a diagnosis support unit) for classifying a desired object in a target image, the image diagnosis support method comprising:
a processor, configured to execute a program for performing an image processing on the target image, executing a step of inputting an image obtained by imaging an object (0003 teaches scanned image data for diagnosis of diseases, with target diagnosis for tumor regions; figure 1 and 0040-0041 teach CT apparatus 111 coupled with image processing apparatus 112, i.e., scanned images are input to the image processing apparatus 112);
the processor executing a step of extracting a feature of the object in the target image (0004 teaches extracting case data including tumor regions; 0045-0049 teach extracting region data including tumor regions and image data not including tumor regions, with a plurality of feature quantities);
the processor executing a step of extracting a feature of a training image and creating a feature dictionary (0004 teaches creating a database (dictionary) storing, in respective classes, case data extracted from image data of a given region including tumor regions and case data not including tumor regions; 0042 teaches that image processing apparatus 112 stores the CT images taken by the CT apparatus in a CT image data storage database (DB) of the coupled storage apparatus 114);
the processor executing a step of classifying the target image based on the feature and calculating a classification value (0004-0005 teach boundary surfaces that classify case data into data including tumor regions and data not including tumor regions; 0008 teaches classifying a group of image data, from a plurality of types of features, into classes with a degree of separation relative to a reference value; 0100-0101 teach a value for the position of the extracted region data 611 in the feature space).
TAKEBE et al. teaches all of the subject matter above, but not the following subject matter:
the processor executing a step of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value; the processor executing a step of calculating a similarity between the target image and the training image using the feature dictionary, and presenting a classification reason for the target image using the calculated similarity; and the processor executing a step of determining presence or absence of the object and a likelihood of the object for the target image using the classification value and the feature similarity classification value.
Galperin teaches the following subject matter: the processor executing a step of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value; the processor executing a step of calculating a similarity between the target image and the training image using the feature dictionary, and presenting a classification reason for the target image using the calculated similarity (figure 12 and 0157-0159 teach a numerical classification score (classification value) indicating disease likelihood or assessment, where 0146 details that the object/target is classified by a score based on numerical similarity and feature computation relative to previously classified or assessed entries in a database (dictionary)); and the processor executing a step of determining presence or absence of the object and a likelihood of the object for the target image using the classification value and the feature similarity classification value (0153 teaches that the object/target (lesion) is, based on the values, ranked or labeled 1 for presence of the disease and 0 for its absence).
TAKEBE et al. and Galperin are both in the field of image analysis, particularly the use of machine learning to assess and classify medical images of a target area, such that the outcome of the combination is predictable.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify TAKEBE et al. with Galperin's classification score, because assessing the likelihood of potential disease states is facilitated by computing a numerical score that is an assessment of, or is indicative of, the likelihood that a particular diagnostic condition is correct, thereby advancing the condition of the assessment, as disclosed by Galperin in 0146.
Claim 12:
TAKEBE et al. further teaches:
The image diagnosis support method according to claim 11, wherein in the step of creating the feature dictionary, the feature dictionary is created using a feature of any layer in a machine learning network (0004 teaches creating a database, which requires extracting, as case data, image data of a given region; figure 1 part 112-1 and paragraphs 0049-0050 teach a learning model, such as a support vector machine, to determine whether region data includes tumor regions; 0034 teaches the use of a learning model (machine learning network); figure 5 and 0069-0070 teach the learning model; 0075-0078 further detail the learning model for class determination of extracted region data 611).
Claim 13:
TAKEBE et al. (US 2018/0268263) teaches the following subject matter:
An image diagnosis support method (figure 1 part 112-1 teaches a diagnosis support unit) for classifying a desired object in a target image, the image diagnosis support method comprising:
a processor, configured to execute a program for performing an image processing on the target image (0003 teaches scanned image data for diagnosis of diseases, with target diagnosis for tumor regions), executing a step of inputting an image obtained by imaging an object (figure 1 and 0040-0041 teach CT apparatus 111 coupled with image processing apparatus 112, i.e., scanned images are input to the image processing apparatus 112);
the processor executing a step of extracting a feature of the object in the target image (0004 teaches extracting case data including tumor regions; 0045-0049 teach extracting region data including tumor regions and image data not including tumor regions, with a plurality of feature quantities);
the processor executing a step of extracting a feature of a training image and creating a feature dictionary (0004 teaches creating a database (dictionary) storing, in respective classes, case data extracted from image data of a given region including tumor regions and case data not including tumor regions; 0042 teaches that image processing apparatus 112 stores the CT images taken by the CT apparatus in a CT image data storage database (DB) of the coupled storage apparatus 114);
the processor executing a step of determining, using the feature dictionary, whether the target image is underfitted (0004 teaches extracting case data including tumor regions; 0045-0049 teach extracting region data including tumor regions and image data not including tumor regions (underfitted), with a plurality of feature quantities);
the processor executing a step of classifying the target image based on the feature and calculating a classification value (0004-0005 teach boundary surfaces that classify case data into data including tumor regions and data not including tumor regions; 0008 teaches classifying a group of image data, from a plurality of types of features, into classes with a degree of separation relative to a reference value; 0100-0101 teach a value for the position of the extracted region data 611 in the feature space).
TAKEBE et al teaches all of the subject matter above, but does not teach the following subject matter:
the processor executing a step of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value; the processor executing a step of calculating a similarity between the target image and the training image using the feature dictionary, and presenting a classification reason for the target image using the calculated similarity; and the processor executing a step of determining presence or absence of the object and a likelihood of the object for the target image using a determination result as to whether the target image is underfitted, the classification value, and the feature similarity classification value.
Galperin teaches the following subject matter: the processor executing a step of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value; the processor executing a step of calculating a similarity between the target image and the training image using the feature dictionary, and presenting a classification reason for the target image using the calculated similarity (figure 12 and 0157-0159 teach a numerical classification score (classification value) indicating disease likelihood or assessment, where 0146 details that the object/target is classified by a score based on numerical similarity and feature computation relative to data classified or assessed in a database (dictionary)); and the processor executing a step of determining presence or absence of the object and a likelihood of the object for the target image using a determination result as to whether the target image is underfitted, the classification value, and the feature similarity classification value (0153 teaches ranking or labeling the object/target (lesion) based on values, with 1 for presence of the disease and 0 for its absence).
TAKEBE et al and Galperin are both in the field of image analysis, particularly the use of machine learning to assess and classify medical images of a target area, such that the combined outcome would have been predictable.
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify TAKEBE et al with the classification score of Galperin, because assessing the likelihood of potential disease states is facilitated by computing a numerical score that is an assessment of, or is indicative of, the likelihood that a particular diagnosed condition is correct, thereby advancing the condition of the assessment, as disclosed by Galperin at 0146.
Claim 14:
TAKEBE et al (US 2018/0268263) teaches the following subject matter:
A remote diagnosis support (figure 1 part 112-1 teaches diagnosis support unit) system comprising:
a server including an image diagnosis support device including a processor configured to execute a program for performing an image processing on a target image (figure 1 and paragraph 0047 teaches network 140 between client and servers); and
a memory configured to store a result of the image processing, the processor executing a processing of inputting an image obtained by imaging an object (figure 1 and 0040-0043 teach use of storage apparatus 114 with support programs for functions of diagnosis),
a processing of extracting a feature of the object in the target image (0004 teaches extracting case data that includes tumor regions; 0045-0049 teach extracting region data including tumor regions and image data not including tumor regions, with a plurality of feature quantities),
a processing of extracting a feature of a training image and creating a feature dictionary (0004 teaches creating a database (dictionary) that stores, in respective classes, case data extracted from image data of the given region including tumor regions and case data not including tumor regions; 0042 teaches that image processing apparatus 112 stores the CT images taken by the CT apparatus in a CT image data storage database (DB) of the coupled storage apparatus 114),
an image acquisition device including an imaging device that images image data, wherein the image acquisition device transmits the image data to the server (figure 1 and 0040-0041 teach the components of the CT imaging system electrically coupled to each other),
the server processes the received image data by the image diagnosis support device, stores an image of the determined object and a determination result in the memory, and transmits the image of the determined object and the determination result to the image acquisition device (figure 1 teaches diagnosis support unit 112-1, which stores in databases 114-1 and 114-2 and transmits to display 122), and
the image acquisition device displays the received image of the determined object and the determination result on a display device (figure 1 teaches network 140 to display 122).
TAKEBE et al teaches all of the subject matter above, but does not teach the following subject matter:
a processing of classifying the target image based on the feature and calculating a classification value, a processing of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value, and a determination processing of determining presence or absence of the object and a likelihood of the object for the target image using the classification value and the feature similarity classification value.
Galperin teaches the following subject matter: a processing of classifying the target image based on the feature and calculating a classification value, a processing of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value (figure 12 and 0157-0159 teach a numerical classification score (classification value) indicating disease likelihood or assessment, where 0146 details that the object/target is classified by a score based on numerical similarity and feature computation relative to data classified or assessed in a database (dictionary)), and a determination processing of determining presence or absence of the object and a likelihood of the object for the target image using the classification value and the feature similarity classification value (0153 teaches ranking or labeling the object/target (lesion) based on values, with 1 for presence of the disease and 0 for its absence).
TAKEBE et al and Galperin are both in the field of image analysis, particularly the use of machine learning to assess and classify medical images of a target area, such that the combined outcome would have been predictable.
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify TAKEBE et al with the classification score of Galperin, because assessing the likelihood of potential disease states is facilitated by computing a numerical score that is an assessment of, or is indicative of, the likelihood that a particular diagnosed condition is correct, thereby advancing the condition of the assessment, as disclosed by Galperin at 0146.
Claim 15:
TAKEBE et al (US 2018/0268263) teaches the following subject matter:
A network contract service providing system (figure 1 teaches network 140) comprising:
a server including an image diagnosis support (figure 1 part 112-1 teaches diagnosis support unit) device including a processor configured to execute a program for performing an image processing on a target image (figure 1 and paragraph 0047 teaches network 140 between client and servers); and
a memory configured to store a result of the image processing, the processor executing a processing of inputting an image obtained by imaging an object (figure 1 and 0040-0043 teaches use of storage apparatus 114 with support programs for functions of diagnosis),
a processing of extracting a feature of the object in the target image (0004 teaches extracting case data that includes tumor regions; 0045-0049 teach extracting region data including tumor regions and image data not including tumor regions, with a plurality of feature quantities),
a processing of extracting a feature of a training image and creating a feature dictionary (0004 teaches creating a database (dictionary) that stores, in respective classes, case data extracted from image data of the given region including tumor regions and case data not including tumor regions; 0042 teaches that image processing apparatus 112 stores the CT images taken by the CT apparatus in a CT image data storage database (DB) of the coupled storage apparatus 114),
an image acquisition device including an imaging device that images image data and the image diagnosis support device (figure 1 teaches diagnosis support unit 112-1, which stores in databases 114-1 and 114-2 and transmits to display 122), wherein
the image acquisition device transmits the image data to the server (figure 1 and 0040-0041 teach the components of the CT imaging system electrically coupled to each other),
the server processes the received image data by the image diagnosis support device, stores an image of the determined object, a classifier, and the feature dictionary in the memory, and transmits the image of the determined object, the classifier, and the feature dictionary to the image acquisition device (figure 1 teaches diagnosis support unit 112-1, which stores in databases 114-1 and 114-2 and transmits to display 122),
the image acquisition device stores the received image of the determined object, classifier, and feature dictionary (figure 1 teaches storage apparatus 114, the classifier in 112-1, and the dictionary in 114-1 and 114-2), and
the image diagnosis support device in the image acquisition device determines an object in an image newly imaged by the imaging device using the classifier and the feature dictionary, and displays a result of the determination on a display device (figure 1 teaches diagnosis support unit 112-1, which stores in databases 114-1 and 114-2 and transmits to display 122).
TAKEBE et al teaches all of the subject matter above, but does not teach the following subject matter:
a processing of classifying the target image based on the feature and calculating a classification value, a processing of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value, and a determination processing of determining presence or absence of the object and a likelihood of the object for the target image using the classification value and the feature similarity classification value.
Galperin teaches the following subject matter: a processing of classifying the target image based on the feature and calculating a classification value, a processing of classifying a feature similarity of the target image using the feature dictionary and calculating a feature similarity classification value (figure 12 and 0157-0159 teach a numerical classification score (classification value) indicating disease likelihood or assessment, where 0146 details that the object/target is classified by a score based on numerical similarity and feature computation relative to data classified or assessed in a database (dictionary)), and a determination processing of determining presence or absence of the object and a likelihood of the object for the target image using the classification value and the feature similarity classification value (0153 teaches ranking or labeling the object/target (lesion) based on values, with 1 for presence of the disease and 0 for its absence).
TAKEBE et al and Galperin are both in the field of image analysis, particularly the use of machine learning to assess and classify medical images of a target area, such that the combined outcome would have been predictable.
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify TAKEBE et al with the classification score of Galperin, because assessing the likelihood of potential disease states is facilitated by computing a numerical score that is an assessment of, or is indicative of, the likelihood that a particular diagnosed condition is correct, thereby advancing the condition of the assessment, as disclosed by Galperin at 0146.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Bredno et al (US 2016/0335478) teaches ADAPTIVE CLASSIFICATION FOR WHOLE SLIDE TISSUE SEGMENTATION – 0034 teaches image region classifier module may determine and/or output a confidence score, e.g. a confidence score indicative of the confidence of the classifying of a respective image region. As such, any classifying of an individual image region may have a respective confidence score, and any confidence score may relate to the classifying of a respective, individual image region. The confidence score may be representative of a probability that the classifying of the respective image region is correct.
WODLINGER et al (US 2018/0333140) teaches SYSTEM COMPRISING INDICATOR FEATURES IN HIGH-RESOLUTION MICRO-ULTRASOUND IMAGES – paragraph 0272 teaches Features in the images by: a) obtaining a high resolution micro-ultrasound image of prostate tissue that corresponds to tissue, which has also been biopsied and has been graded for a stage of cancer, ranging from benign to the highest grade; b) segmenting the region of the image corresponding to the biopsied tissue on the basis of contrasting areas or groups of pixels in the image, wherein each area or group of pixels constitutes a feature; and c) characterizing and providing a unique label to all detectable features and/or combination of features only in the region of the image that corresponds to biopsied tissue, ii) repeat steps a) through d) until all recurring patterns have been included in the list of Possible Features; iii) generate a Table of Candidate Features by: d) reading the micro-ultrasound images in the area corresponding to tissue, which has been graded for a stage of cancer
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TSUNG-YIN TSAI whose telephone number is (571)270-1671. The examiner can normally be reached 7am-4pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Bhavesh Mehta can be reached at (571) 272-7453. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TSUNG YIN TSAI/Primary Examiner, Art Unit 2656