DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments, see Remarks pages 8-12, filed 12/03/2025, with respect to the rejections of claims 1-20 under 35 U.S.C. 101 have been fully considered and are persuasive. The rejections of claims 1-20 under 35 U.S.C. 101 have been withdrawn.
Applicant's arguments, see Remarks pages 12-15, filed 12/03/2025, with respect to the rejections of amended claims 1, 16, and 20 under 35 U.S.C. 103 have been fully considered but they are not persuasive.
On page 14 of Remarks, Applicant argues:
[Applicant's argument reproduced as a greyscale image (media_image1.png) in the Remarks.]
Examiner respectfully disagrees.
In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).
Paragraph 0082 of Matsubara discloses “the ratio R/G is compared with the first threshold value Th1 and the second threshold value Th2, and if the ratio R/G is equal to or less than the first threshold value Th1, it is determined that mucus or the like is being displayed, and if the ratio R/G is equal to or greater than the second threshold value Th2, it is determined that blood is being displayed,” wherein the first and second threshold values, used to evaluate whether blood is present within the image, constitute an extraction range defined by at least two reference values.
In addition, paragraphs 0148 and 0152 of Yaguchi disclose “The threshold value determination section 1012 compares the calculated lesion coverage with a given threshold value. The lesion area included in the determination target image is sufficiently covered by the lesion area included in the reference image when the lesion coverage is equal to or larger than the threshold value. In this case, it is determined that the determination target image can be deleted. The degree by which the lesion area cannot be observed due to deletion of the determination target image is high when the lesion coverage is less than the threshold value. In this case, it is determined that the determination target image cannot be deleted…When the step S104 is performed for the second or subsequent time (i.e., when the step S104 is performed after the step S106), the determination target image that has been determined to be allowed to remain by the deletion determination process in the step S106 is selected as the next reference image.” Thus, once a target image satisfies the lesion coverage threshold value, the reference image is replaced by that target image, thereby updating the threshold value.
Therefore, as is further disclosed below in the rejection of Claim 1 under 35 U.S.C. 103, it would have been obvious to one of ordinary skill in the art, prior to the effective filing date of the claimed invention, to incorporate the known technique of updating the thresholding/reference values based on an in-range image taught by Yaguchi by updating the second reference value associated with blood detection disclosed by Matsubara with the R/G ratio value of an extracted lesion image.
On page 14 of Remarks, Applicant argues:
[Applicant's argument reproduced as a greyscale image (media_image2.png) in the Remarks.]
Examiner respectfully disagrees.
Paragraph 0119 of Kitamura discloses “After the loop-A process is performed by taking each organ type as the processing organ type, and a criterion for each organ type is created, the series of in-vivo images acquired at Step h1 and recorded in the recording unit 14b are sequentially read out one by one. Then, the lesion-area detecting unit 19 performs the lesion-area detection process by taking the read in-vivo image as the processing target (Step h19). In the lesion-area detection process, values calculated for the organ type of the processing target image through the criterion creation process at Step h15 are used as the criterion in the hue direction and the criterion in the saturation direction,” wherein the captured in-vivo images are processed to determine, based on stored criteria, the organ type the endoscope has reached. Once the organ has been determined, the criterion for detecting a lesion area is modified based on the determined organ. Paragraph 0116 of Kitamura identifies the organ types as the esophagus, the stomach, the small intestine, and the large intestine, all of which are organs.
Therefore, Kitamura discloses the limitation “determine whether or not the endoscope has reached an organ in the subject.”
As per claims 16 and 20, the arguments made in rejecting claim 1 are analogous.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-6, 8, and 15-20 are rejected under 35 U.S.C. 103 as being unpatentable over Kimoto (JP2006293237A) in view of Matsubara (JP2012147928A), Tanaka et al. (US2012076374A1), hereinafter referenced as Tanaka, Yaguchi (US2015/0199590A1), and Kitamura (US2010/0208047A1).
Regarding claim 1, Kimoto discloses: An image processing device, comprising: a processor comprising hardware (Kimoto: 0029: “Information input via the operation unit 15 is transmitted to the control unit 16, and the control unit 16 controls each component to perform predetermined processing based on the information input via the operation unit 15.”), the processor being configured to: capture an image that is captured by an endoscope during an operation on a subject, wherein image data of the captured image is recorded in a memory as an image of interest (Kimoto: 0021: “The primary function of the receiving device 4 is to receive a wireless signal from the capsule endoscope 3 via a receiving antenna selected from among the receiving antennas 5a to 5h, and to generate predetermined image data by performing predetermined processing, and to record the generated image data.”).
Kimoto does not disclose expressly: determine an evaluation value of a captured image that is captured by an endoscope during an operation on a subject, determine whether or not the evaluation value is included in an extraction range recorded in the memory, wherein the extraction range is defined by at least two reference values,
determine the evaluation value is included in the extraction range when the evaluation value satisfies a condition associated with the at least two reference values,
extract, from the memory, the captured image as an image of interest including a lesion when the processor has determined that the evaluation value is included in the extraction range.
Matsubara discloses: determining an evaluation value of a captured image, wherein determining the evaluation value comprises comparing every pixel in the image data (Matsubara: 0077: “the ratio R/G can be calculated as the ratio of the R signal of a red pixel to the G signal of a green pixel adjacent to each other in a CCD…Furthermore, the ratio R/G calculated for each pixel in this way may be averaged over the entire image and used as the ratio R/G referred to in the first and second embodiments. The ratio R/G may be the ratio of the integrated value of the entire image data of the R signal to the integrated value of the G signal.”), determining whether or not the evaluation value is included in an extraction range recorded in the memory, wherein the extraction range is defined by at least two reference values (Matsubara: 0082: “the ratio R/G is compared with the first threshold value Th1 and the second threshold value Th2, and if the ratio R/G is equal to or less than the first threshold value Th1, it is determined that mucus or the like is being displayed, and if the ratio R/G is equal to or greater than the second threshold value Th2, it is determined that blood is being displayed.”),
determining the evaluation value is included in the extraction range when the evaluation value satisfies a condition associated with the at least two reference values, and extracting, from the memory, the captured image as an image of interest including a lesion when the processor has determined that the evaluation value is included in the extraction range (Matsubara: 0058: “if the ratio R/G is equal to or greater than the second threshold value Th2, the DSP 52 determines that bleeding is occurring in the subject.”; Wherein the image containing bleeding constitutes an image including a lesion).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to identify the images containing lesions in the receiving device disclosed by Kimoto by implementing the R/G calculation algorithms taught by Matsubara. The suggestion/motivation for doing so would have been “The present invention has been made in consideration of the above-mentioned problems, and aims to provide an electronic endoscope system that can distinguish between mucus and blood” (Matsubara: 0009; Wherein the identification of issues, like bleeding, may notify others of any potential health issues). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results.
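For purposes of illustration only, the two-threshold R/G evaluation described in paragraphs 0077 and 0082 of Matsubara may be sketched as follows, assuming a per-pixel R/G ratio averaged over the frame; the function name and the numeric threshold values are hypothetical assumptions and are not taken from the reference:

    import numpy as np

    def classify_by_rg_ratio(image_rgb, th1=0.6, th2=1.2):
        # Illustrative sketch of Matsubara 0077/0082: compute the per-pixel
        # R/G ratio, average it over the image, and compare the result with
        # two reference values. th1/th2 are placeholder values.
        r = image_rgb[..., 0].astype(np.float64)
        g = image_rgb[..., 1].astype(np.float64)
        ratio = float(np.mean(r / np.maximum(g, 1e-6)))  # evaluation value
        if ratio <= th1:
            return "mucus"          # at or below the first threshold value Th1
        if ratio >= th2:
            return "blood"          # at or above the second threshold value Th2
        return "indeterminate"      # the gray zone between Th1 and Th2

Under this reading, an image whose evaluation value is at or above the second threshold value falls within the extraction range associated with blood and is extracted as an image of interest.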
Kimoto in view of Matsubara does not disclose expressly: determining an evaluation value of a captured image, wherein determining the evaluation value comprises comparing every pixel in the image data with a reference pixel.
Tanaka discloses: a method for analyzing an image and extracting a candidate region of interest estimated to contain a linear or massive structure (Tanaka: Abstract), wherein the linear or massive structure may correspond to MCE (Marginal Crypt Epithelium), a pit pattern, or blood vessels (Tanaka: 0054). The method comprises determining an evaluation value of a captured image, wherein determining the evaluation value comprises comparing every pixel in the image data with a reference pixel (Tanaka: 0052: “The computing unit 41b functioning as a region extraction unit extracts a candidate region estimated to contain a linear mucosal microstructure from the image data using the evaluation value D calculated for each pixel (Step S7 in FIG. 5). Specifically, for example, the computing unit 41b extracts a region containing a pixel whose evaluation value D is equal to or larger than a predetermined threshold as the candidate region described above.”;
0060: “by calculating the evaluation value D by setting the weighting factors W1 and W2 in mathematical expression (2) described above to W1 > W2, the present embodiment can extract a candidate region estimated to contain the mucosal microstructure to be detected even if the shape of the mucosal microstructure to be detected is indistinct. Specifically, if blood vessels are taken as an example, by calculating the evaluation value D by setting the weighting factors W1 and W2 in mathematical expression (2) described above to W1 > W2, it is possible to extract a region with a less distinct linear structure but a more reddish tone than surroundings as a candidate region estimated to contain blood vessels.”; Wherein the extraction of pixel regions based on the per-pixel evaluation value constitutes comparing every pixel in the image data with a reference pixel).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to incorporate the known technique of extracting a candidate region of interest for diagnosis based on an evaluation value taught by Tanaka prior to performing the lesion detection method disclosed by Kimoto in view of Matsubara. The suggestion/motivation for doing so would have been “to an image processing apparatus and an image processing method used for diagnosis and the like of living tissue…a region extraction unit adapted to extract a candidate region estimated to contain the linear or massive structure from the image based on a calculation result of the evaluation value.” (Tanaka: 0003 & 0006). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results.
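For illustration only, the per-pixel evaluation value D of Tanaka (paragraphs 0052 and 0060) may be sketched as a weighted combination of two feature maps with W1 > W2, followed by thresholding; the feature maps, weights, and threshold below are hypothetical assumptions, not values given in the reference:

    import numpy as np

    def extract_candidate_region(shape_feature, color_feature,
                                 w1=0.7, w2=0.3, d_threshold=0.5):
        # Illustrative sketch: evaluation value D per pixel as a weighted
        # sum of a shape feature and a color (reddish-tone) feature, with
        # w1 > w2 mirroring the W1 > W2 setting quoted above.
        d = w1 * shape_feature + w2 * color_feature
        # Pixels whose evaluation value meets the threshold form the
        # candidate region estimated to contain the structure of interest.
        return d >= d_threshold  # boolean mask of the candidate region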
Kimoto in view of Matsubara and Tanaka does not disclose expressly: update the extraction range recorded in the memory by updating one of the at least two reference values based on the evaluation value.
Yaguchi discloses: a method of removing images of a captured lesion by determining a reference image and deleting subsequent images based on the values of the reference image (Yaguchi: 0148: “The lesion area included in the determination target image is sufficiently covered by the lesion area included in the reference image when the lesion coverage is equal to or larger than the threshold value. In this case, it is determined that the determination target image can be deleted.”). If it is determined that a target image is to be saved, the target image is saved as the reference image (Yaguchi: 0152: “When the step S104 is performed for the second or subsequent time (i.e., when the step S104 is performed after the step S106), the determination target image that has been determined to be allowed to remain by the deletion determination process in the step S106 is selected as the next reference image.”).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to incorporate the known technique of updating the thresholding/reference values based on an in-range image taught by Yaguchi by updating the second reference value associated with blood detection disclosed by Kimoto in view of Matsubara and Tanaka with the R/G ratio value of an extracted lesion. The suggestion/motivation for doing so would have been “it is likely that the images that are closely situated in the image sequence (i.e., images that are close to each other temporally or spatially) are similar images, and it is not likely that it is necessary to check all of a large number of images in order to determine the captured information. Since the number of images may typically be tens of thousands or more, it takes time for the user to check all of the images.” (Yaguchi: 0003). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results.
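A minimal sketch of the asserted combination, in which an in-range evaluation value replaces the second reference value in the manner that Yaguchi's retained target image becomes the next reference image, might read as follows; the structure and names are illustrative and appear in neither reference:

    def update_extraction_range(th1, th2, evaluation_value):
        # Illustrative sketch: if the evaluation value falls within the
        # extraction range (here, at or above the second reference value),
        # it becomes the new second reference value, analogous to Yaguchi's
        # retained determination target image becoming the next reference.
        if evaluation_value >= th2:
            th2 = evaluation_value
        return th1, th2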
Kimoto in view of Matsubara, Tanaka, and Yaguchi does not disclose expressly: determine whether or not the endoscope has reached an organ in the subject, and reset one of the at least two reference values recorded in the memory to an initial value when the endoscope has reached the organ.
Kitamura discloses: determining whether or not the endoscope has reached a specific organ (Kitamura: Figure 1; 0116: “As a specific method of identifying the organ types, a known technique may be used appropriately. For example, a technique disclosed in Japanese Laid-open Patent Publication No. 2006-288612 may be used for identifying the organ types based on an average R value, an average G value, and an average B value of the in-vivo image. More specifically, respective value ranges for the average R value, the average G value, and the average B value are set for each organ type in advance.”), and resetting one of the at least two reference values recorded in the memory to an initial value when the endoscope has reached the organ (Kitamura: 0119: “After the loop-A process is performed by taking each organ type as the processing organ type, and a criterion for each organ type is created, the series of in-vivo images acquired at Step h1 and recorded in the recording unit 14b are sequentially read out one by one. Then, the lesion-area detecting unit 19 performs the lesion-area detection process by taking the read in-vivo image as the processing target (Step h19). In the lesion-area detection process, values calculated for the organ type of the processing target image through the criterion creation process at Step h15 are used as the criterion in the hue direction and the criterion in the saturation direction.”; Wherein the lesion criterion/thresholds are calculated based on the organ detected.).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to implement the organ detection techniques disclosed by Kitamura into Kimoto in view of Matsubara, Tanaka, and Yaguchi for the adjustment of the threshold values based on a detected organ. The suggestion/motivation for doing so would have been “because compositions of actual mucosal membrane differ between organs, colors of the mucous-membrane area vary depending on the organ types.” (Kitamura: 0124; Wherein the color variations across different organs may affect detection). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Kimoto in view of Matsubara, Tanaka, and Yaguchi with Kitamura to obtain the invention as specified in claim 1.
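For illustration, the asserted organ-based reset may be sketched as follows; organ identification itself (e.g., from average R, G, and B value ranges per Kitamura 0116) is abstracted into a detected label, and the initial value shown is a hypothetical placeholder:

    INITIAL_TH2 = 1.2  # hypothetical initial value of the second reference value

    def on_organ_change(detected_organ, current_organ, th2):
        # Illustrative sketch: when the endoscope is determined to have
        # reached a new organ, the adapted reference value is reset to its
        # initial value, since mucous-membrane colors vary between organs.
        if detected_organ != current_organ:
            return detected_organ, INITIAL_TH2
        return current_organ, th2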
Regarding claim 2, Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura discloses: The image processing device according to claim 1, further comprising: an alarm configured to generate a notification that the image of interest has been extracted, and a monitor configured to display the notification, wherein the processor is configured to: control the alarm to generate the notification, control the monitor to display the notification (Kimoto: 0005: “the simple image display device can input a signal that has been received and processed by the receiving device, and after performing predetermined processing based on the input signal, displays the image captured by the capsule endoscope on a small display screen, allowing doctors and others to view the image in real time.”; Wherein the lesion image, extracted based on the R/G ratio thresholding disclosed by Matsubara, is displayed on the display screen, which constitutes an alarm generating a notification that an image of interest has been extracted, and the displaying of the images constitutes the displaying of a notification.).
Regarding claim 3, Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura discloses: The image processing device according to claim 1, wherein determination of the evaluation value further comprises extraction of feature data on the captured image (Matsubara: 0039: “The R/G calculation unit 60 operates prior to the generation of image data, and calculates the ratio R/G of the R signal to the G signal out of the three color signals of BGR input from the CCD 21, and compares it with a predetermined first threshold value Th1 and a second threshold value Th2.”; Wherein a ratio computed from the RGB pixel values of an image’s extracted candidate region constitutes feature data.).
Regarding claim 4, Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura discloses: The image processing device according to claim 3, wherein the extraction of the feature data is based on a pixel value of each pixel in the image data of the captured image (Tanaka: 0052: “The computing unit 41b functioning as a region extraction unit extracts a candidate region estimated to contain a linear mucosal microstructure from the image data using the evaluation value D calculated for each pixel (Step S7 in FIG. 5). Specifically, for example, the computing unit 41b extracts a region containing a pixel whose evaluation value D is equal to or larger than a predetermined threshold as the candidate region described above.”)
(Matsubara: 0075: “When adjusting the color tone of special light image data in parts, the parts to be adjusted for color tone can be identified based on the ratio R/G calculated for each pixel, with areas where the ratio R/G is below a first threshold being parts containing mucus, etc., and areas where the ratio R/G is above a second threshold being parts containing blood.”).
Regarding claim 5, Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura discloses: The image processing device according to claim 1, wherein: the at least two reference values includes a first reference value and a second reference value larger than the first reference value (Matsubara: 0039: “calculates the ratio R/G of the R signal to the G signal out of the three color signals of BGR input from the CCD 21, and compares it with a predetermined first threshold value Th1 and a second threshold value Th2. However, the second threshold value Th2 is greater than the first threshold value Th1.”), and the processor is configured to determine that the evaluation value is included in the extraction range when the evaluation value exceeds the first reference value and the evaluation value exceeds the second reference value (Matsubara: 0058: “if the ratio R/G is equal to or greater than the second threshold value Th2, the DSP 52 determines that bleeding is occurring in the subject.”).
Regarding claim 6, Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura discloses: The image processing device according to claim 5, wherein the processor is configured to update the extraction range to set the evaluation value as the second reference value when the evaluation value is included in the extraction range (Matsubara: 0058: “if the ratio R/G is equal to or greater than the second threshold value Th2, the DSP 52 determines that bleeding is occurring in the subject.”)
(Yaguchi: 0152: “When the step S104 is performed for the second or subsequent time (i.e., when the step S104 is performed after the step S106), the determination target image that has been determined to be allowed to remain by the deletion determination process in the step S106 is selected as the next reference image.”; Wherein the second threshold value, associated with bleeding detection, disclosed by Matsubara is replaced by the R/G value of the next determined lesion).
Regarding claim 8, Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura discloses: The image processing device according to claim 5, wherein when the endoscope has reached the organ, the processor is configured to reset the second reference value to the initial value (Kitamura: 0119: “After the loop-A process is performed by taking each organ type as the processing organ type, and a criterion for each organ type is created, the series of in-vivo images acquired at Step h1 and recorded in the recording unit 14b are sequentially read out one by one. Then, the lesion-area detecting unit 19 performs the lesion-area detection process by taking the read in-vivo image as the processing target (Step h19). In the lesion-area detection process, values calculated for the organ type of the processing target image through the criterion creation process at Step h15 are used as the criterion in the hue direction and the criterion in the saturation direction.”; Wherein the lesion criterion/thresholds are calculated, or “reset”, based on the organ detected.).
Regarding claim 15, Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura discloses: The image processing device according to claim 1, wherein the evaluation value indicates a degree of importance of the captured image (Tanaka: 0052: “The computing unit 41b functioning as a region extraction unit extracts a candidate region estimated to contain a linear mucosal microstructure from the image data using the evaluation value D calculated for each pixel (Step S7 in FIG. 5). Specifically, for example, the computing unit 41b extracts a region containing a pixel whose evaluation value D is equal to or larger than a predetermined threshold as the candidate region described above.”)
(Matsubara: 0057-0059: “Therefore, when the ratio R/G is equal to or less than the first threshold value Th1, the DSP 52 determines that mucus or the like is secreted in the subject… On the other hand, if the ratio R/G is equal to or greater than the second threshold value Th2, the DSP 52 determines that bleeding is occurring in the subject…When the ratio R/G is greater than the first threshold value Th1 and smaller than the second threshold value Th2, it is in a gray zone where it is difficult to distinguish between blood and mucus, etc., based on the ratio R/G.”; Wherein a higher ratio indicates a greater likelihood that the subject is bleeding, which constitutes a degree of importance.).
As per claim 16, the arguments made in rejecting claim 1 are analogous.
As per claim 17, the arguments made in rejecting claim 2 are analogous.
As per claim 18, the arguments made in rejecting claim 3 are analogous.
As per claim 19, the arguments made in rejecting claim 4 are analogous.
As per claim 20, the arguments made in rejecting claim 1 are analogous. In addition, paragraphs 0029-0030 of Kimoto disclose a control unit performing predetermined operations based on input data, implying the use of a non-transitory computer-readable recording medium with an executable program stored thereon and a processor executing the program.
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura, and further in view of Ito (JP4686406B2).
Regarding claim 7, Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura discloses: The image processing device according to claim 5.
Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura does not disclose expressly: wherein the processor is configured to determine whether or not a predetermined time period has elapsed, and when the predetermined time period has elapsed, the processor is configured to reset the second reference value to the initial value.
Ito discloses: the detection of a face image based on an adaptive threshold value, wherein the threshold value is reset to its initial value at regular intervals based on the number of iterations of the face detection process performed (Ito: 0058-0059: “First, a reset variable N is set to 0 (step 81). In this embodiment, when the reset variable N is divided by M (M is an integer of 2 or more and excluding N), if the remainder is 0, the threshold value is returned to the initial value…When the face image detection process is completed, the reset variable N is incremented (step 84).”).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to implement the known technique of resetting the threshold value at regular intervals as taught by Ito to reset the threshold values of Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura based on the number of frames processed. The suggestion/motivation for doing so would have been “Since the threshold value is reset to the initial value at regular intervals, even if a non-face image portion is erroneously determined to be a face image portion due to a lowered threshold value, the erroneous determination can be stopped midway” (Ito: 0060; Wherein updated threshold values causing detection errors may be regularly corrected.). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura with Ito to obtain the invention as specified in claim 7.
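For illustration, Ito's interval-based reset (paragraphs 0058-0059), as applied to the threshold values of the combination, may be sketched as follows; detect() is a hypothetical per-frame routine that may adapt the threshold, and m=100 is an arbitrary interval:

    def process_frames(frames, detect, th_initial, m=100):
        # Illustrative sketch of Ito 0058-0059: a reset variable counts
        # iterations, and whenever it is divisible by M the adaptive
        # threshold is returned to its initial value.
        threshold = th_initial
        for n, frame in enumerate(frames):
            if n % m == 0:
                threshold = th_initial  # periodic reset to the initial value
            threshold = detect(frame, threshold)  # detection may adapt it
        return threshold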
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura, and further in view of Wen et al. (US2020/0029787A1), hereinafter referenced as Wen.
Regarding claim 9, Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura discloses: The image processing device according to claim 5.
Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura does not disclose expressly: an operating portion configured to receive a user operation, wherein the processor is configured to reset the second reference value to an initial value when the operating portion has received the user operation.
Wen discloses: an operating portion configured to receive a user operation, wherein the processor is configured to reset a threshold value to the initial value when the operating portion has received the user operation (Wen: 0155: “the monitoring apparatus 2 further comprises a switch button, a threshold value setting button, a warning cancellation/delay switch, a zero setting button, a menu setting button, a record review button, a wireless connection button, a display awaking button, etc…The threshold value setting button allows setting the warning threshold value.”).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to implement the threshold value setting button disclosed by Wen to allow a user to set/reset the threshold values disclosed by Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura. The suggestion/motivation for doing so would have been “When the threshold value needs to be adjusted, a new threshold value can be set by the monitoring apparatus 2 and stored by the second storage module.” (Wen: 0178; Wherein the modification of the threshold values is possible, when needed.). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura with Wen to obtain the invention as specified in claim 9.
Claims 10-11 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura, and further in view of Ghosh et al. (Automatic Computer Aided Bleeding Detection Scheme for Wireless Capsule Endoscopy (WCE) Video Based on Higher and Lower Order Statistical Features in a Composite Color), hereinafter referenced as Ghosh.
Regarding claim 10, Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura discloses: The image processing device according to claim 1, wherein: the at least two reference values includes a third reference value and a fourth reference value smaller than the third reference value (Matsubara: 0039: “calculates the ratio R/G of the R signal to the G signal out of the three color signals of BGR input from the CCD 21, and compares it with a predetermined first threshold value Th1 and a second threshold value Th2. However, the second threshold value Th2 is greater than the first threshold value Th1.”; Wherein the first and second threshold values constitute the fourth and third reference values, respectively).
Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura does not disclose expressly: the processor is configured to determine that the evaluation value is included in the extraction range when the evaluation value is less than the third reference value and the evaluation value is less than the fourth reference value.
Ghosh discloses the classification of pixels based on a G/R pixel ratio value, wherein a pixel is classified as a bleeding class pixel if the pixel ratio is less than a threshold value (Ghosh: 2.3.4 Count Based Feature from Pixel Intensity Ratio: “it is observed that the distributions of bleeding and non-bleeding pixels are clearly separable. Thus, a threshold value of G/R pixel ratio (TG/R) can be chosen to differentiate pixels into two classes. It is expected that bleeding zone pixels will have G/R pixel intensity ratio less than TG/R and nonbleeding zone pixels possess ratio value greater than TG/R.”).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to incorporate the known technique of determining bleeding in an image based on minimizing G/R ratios disclosed by Ghosh by extracting lesion images based on thresholding the inverse of the R/G ratio disclosed by Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura. The suggestion/motivation for doing so would have been “It is to be noted that intensity level red equal to zero is very rarely observed” (Ghosh: 2.2 Transformation from RGB to a Composite Plane; Wherein the rarity of a 0 red intensity level helps in preventing a division by 0 in the ratio.). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Kimoto in view of Matsubara, Tanaka, Yaguchi, and Kitamura with Ghosh to obtain the invention as specified in claim 10.
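For illustration, the G/R classification of Ghosh (section 2.3.4) may be sketched as follows; the threshold value t_gr is a hypothetical placeholder, not a value reported in the paper:

    import numpy as np

    def bleeding_pixel_mask(image_rgb, t_gr=0.5):
        # Illustrative sketch of Ghosh 2.3.4: pixels whose G/R intensity
        # ratio is less than the threshold T_G/R are bleeding-class pixels.
        r = image_rgb[..., 0].astype(np.float64)
        g = image_rgb[..., 1].astype(np.float64)
        gr = g / np.maximum(r, 1e-6)  # red is rarely zero (Ghosh sec. 2.2)
        return gr < t_gr  # True where a pixel falls in the bleeding class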
Regarding claim 11, Kimoto in view of Matsubara, Tanaka, Yaguchi, Kitamura, and Ghosh discloses: The image processing device according to claim 10, wherein the processor is configured to update the extraction range to set the evaluation value as the fourth reference value when the evaluation value is included in the extraction range (Matsubara: 0058: “if the ratio R/G is equal to or greater than the second threshold value Th2, the DSP 52 determines that bleeding is occurring in the subject.”)
(Ghosh: 2.3.4 Count Based Feature from Pixel Intensity Ratio: “it is observed that the distributions of bleeding and non-bleeding pixels are clearly separable. Thus, a threshold value of G/R pixel ratio (TG/R) can be chosen to differentiate pixels into two classes. It is expected that bleeding zone pixels will have G/R pixel intensity ratio less than TG/R and nonbleeding zone pixels possess ratio value greater than TG/R.”)
(Yaguchi: 0152: “When the step S104 is performed for the second or subsequent time (i.e., when the step S104 is performed after the step S106), the determination target image that has been determined to be allowed to remain by the deletion determination process in the step S106 is selected as the next reference image.”; Wherein the fourth reference value, used to extract lesion images containing lower G/R ratio values, is replaced by the G/R value of the next determined lesion image).
Regarding claim 13, Kimoto in view of Matsubara, Tanaka, Yaguchi, Kitamura, and Ghosh discloses: The image processing device according to claim 10, wherein when the endoscope has reached the organ, the processor is configured to reset the fourth reference value to the initial value (Kitamura: 0119: “After the loop-A process is performed by taking each organ type as the processing organ type, and a criterion for each organ type is created, the series of in-vivo images acquired at Step h1 and recorded in the recording unit 14b are sequentially read out one by one. Then, the lesion-area detecting unit 19 performs the lesion-area detection process by taking the read in-vivo image as the processing target (Step h19). In the lesion-area detection process, values calculated for the organ type of the processing target image through the criterion creation process at Step h15 are used as the criterion in the hue direction and the criterion in the saturation direction.”; Wherein the lesion criterion/thresholds are calculated, or “reset”, based on the organ detected.).
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Kimoto in view of Matsubara, Tanaka, Yaguchi, Kitamura, and Ghosh, and further in view of Ito.
Regarding claim 12, Kimoto in view of Matsubara, Tanaka, Yaguchi, Kitamura, and Ghosh discloses: The image processing device according to claim 10.
Kimoto in view of Matsubara, Tanaka, Yaguchi, Kitamura, and Ghosh does not disclose expressly: wherein the processor is configured to determine whether or not a predetermined time period has elapsed, and when the processor has determined that the predetermined time period has elapsed, the processor is configured to reset the fourth reference value to an initial value.
Ito discloses: the detection of a face image based on an adaptive threshold value, wherein the threshold value is reset to its initial value at regular intervals based on the number of iterations of the face detection process performed (Ito: 0058-0059: “First, a reset variable N is set to 0 (step 81). In this embodiment, when the reset variable N is divided by M (M is an integer of 2 or more and excluding N), if the remainder is 0, the threshold value is returned to the initial value…When the face image detection process is completed, the reset variable N is incremented (step 84).”).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to implement the known technique of resetting the threshold value at regular intervals as taught by Ito to reset the threshold values of Kimoto in view of Matsubara, Tanaka, Yaguchi, Kitamura, and Ghosh based on the number of frames processed. The suggestion/motivation for doing so would have been “Since the threshold value is reset to the initial value at regular intervals, even if a non-face image portion is erroneously determined to be a face image portion due to a lowered threshold value, the erroneous determination can be stopped midway” (Ito: 0060; Wherein updated threshold values causing detection errors may be regularly corrected.). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Kimoto in view of Matsubara, Tanaka, Yaguchi, Kitamura, and Ghosh with Ito to obtain the invention as specified in claim 12.
Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Kimoto in view of Matsubara, Tanaka, Yaguchi, Kitamura, and Ghosh, and further in view of Wen.
Regarding claim 14, Kimoto in view of Matsubara, Tanaka, Yaguchi, Kitamura, and Ghosh discloses: The image processing device according to claim 10.
Kimoto in view of Matsubara, Tanaka, Yaguchi, Kitamura, and Ghosh does not disclose expressly: an operating portion configured to receive a user operation, wherein the processor is configured to reset the fourth reference value to the initial value when the operating portion has received the user operation.
Wen discloses: an operating portion configured to receive a user operation, wherein the processor is configured to reset a threshold value to an initial value when the operating portion has received the user operation (Wen: 0155: “the monitoring apparatus 2 further comprises a switch button, a threshold value setting button, a warning cancellation/delay switch, a zero setting button, a menu setting button, a record review button, a wireless connection button, a display awaking button, etc…The threshold value setting button allows setting the warning threshold value.”).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to implement the threshold value setting button disclosed by Wen to allow a user to set/reset the threshold values disclosed by Kimoto in view of Matsubara, Tanaka, Yaguchi, Kitamura, and Ghosh. The suggestion/motivation for doing so would have been “When the threshold value needs to be adjusted, a new threshold value can be set by the monitoring apparatus 2 and stored by the second storage module.” (Wen: 0178; Wherein the modification of the threshold values is possible, when needed.). Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Kimoto in view of Matsubara, Tanaka, Yaguchi, Kitamura, and Ghosh with Wen to obtain the invention as specified in claim 14.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANTHONY J RODRIGUEZ whose telephone number is (703)756-5821. The examiner can normally be reached Monday-Friday 10am-7pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Sumati Lefkowitz can be reached at (571) 272-3638. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ANTHONY J RODRIGUEZ/Examiner, Art Unit 2672
/SUMATI LEFKOWITZ/Supervisory Patent Examiner, Art Unit 2672