DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 07/04/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-6, 8-12, 14-16 and 18-24 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Usuda (U.S. Publication No. 2020/0294227 A1).
As per claim 1, Usuda discloses an endoscopic image processing device (figure 1) that processes an endoscopic image, the endoscopic image processing device comprising: a processor (figure 1, processor 12), wherein the processor acquires the endoscopic image (figure 2, video images 38), recognizes a state of an organ (paragraph [0071]: the state of an organ may be inflammation) as an examination target from the acquired endoscopic image, sets a detection criterion for a region of interest according to a recognition result of the state of the organ (paragraphs [0072]-[0073]: feature quantities are analyzed for detecting a region of interest, and the detection criterion may be feature quantities in colors, gradients of pixel values, a shape, or a size), and detects the region of interest from the endoscopic image on the basis of the set detection criterion (paragraph [0074]: the combination of detection unit 41 and acquisition unit 42 detects a region of interest based on the feature quantities and region-of-interest information, such as coordinates and presence of the ROI).
As per claim 2, Usuda discloses wherein the processor recognizes the state of the organ from the endoscopic image in which a specific region of the organ is imaged (a region of interest is imaged to recognize inflammation of an organ).
As per claim 3, Usuda discloses wherein the processor recognizes the state of the organ from a plurality of endoscopic images in which different regions of the organ are imaged (paragraphs [0010]-[0011] & [0057], a plurality of endoscopic images are captured to detect a region of interest during the movement of the endoscope).
As per claim 4, Usuda discloses wherein the endoscopic image used for recognizing the state of the organ is the endoscopic image in which a relatively wider range than the endoscopic image used for detecting the region of interest is imaged (see figures 4-5).
As per claim 5, Usuda discloses wherein the processor recognizes the state of the organ by recognizing a state regarding histological abnormalities in a mucous membrane from the endoscopic image (paragraph [0071]: “an endoscopic mucosal resection (EMR) scar, an endoscopic submucosal dissection (ESD) scar”).
As per claim 6, Usuda discloses wherein the processor acquires a plurality of endoscopic images captured in chronological order, recognizes the state of the organ from a first endoscopic image among the plurality of endoscopic images, and detects the region of interest from a second endoscopic image different from the first endoscopic image, among the plurality of endoscopic images (the image acquisition unit 40 is a time-series image acquisition unit).
As per claim 8, Usuda discloses wherein the second endoscopic image is the endoscopic image captured temporally later than the first endoscopic image (as explained above, the endoscopic images are captured in a time-series).
As per claim 9, Usuda discloses wherein the processor displays information regarding the endoscopic image and the state of the organ recognized from the endoscopic image on a display device (see figures 1-2 for display section 16 and figures 4-5 for displayed endoscopic images).
As per claim 10, Usuda discloses wherein the processor notifies of a detection result of the region of interest in a different mode according to setting of the detection criterion (the detection result is notified on a display screen).
As per claim 11, Usuda discloses wherein the processor notifies of the endoscopic image to be displayed on a display device by surrounding the detected region of interest with a frame, and displays the frame in a display aspect according to the setting of the detection criterion (see figures 4-5).
As per claim 12, Usuda discloses wherein the processor detects the region of interest from the endoscopic image using a trained model, and sets the trained model to be used for detecting the region of interest, according to the recognition result of the state of the organ (the convolutional neural network in paragraph [0070] is the claimed “trained model”).
As per claim 14, Usuda discloses wherein the processor recognizes the state of the organ by recognizing a state regarding inflammation and/or atrophy of the mucous membrane as the state regarding the histological abnormalities in the mucous membrane (paragraph [0071]).
As per claim 15, Usuda discloses wherein the processor recognizes a state regarding pylori infection of a stomach (Usuda teaches the state may be an inflammation of an organ; although Usuda does not explicitly teach that the organ is a stomach, it is understood that the inflammation in Usuda can occur in any organ, including a pylori infection of a stomach).
As per claim 16, Usuda discloses wherein the processor recognizes uninfected, currently infected, and eradicated states as the state regarding the pylori infection of the stomach (Usuda teaches the endoscopic image is capable of detecting healthy tissue, inflamed tissue, and scars).
As per claim 18, Usuda discloses wherein the processor recognizes a state regarding Barrett's esophagus of an esophagus (as explained above, Usuda in paragraph [0071] teaches “Examples of a region of interest includes a polyp, a cancer, the colonic diverticula, an inflammation, an endoscopic mucosal resection (EMR) scar, an endoscopic submucosal dissection (ESD) scar, a clipped portion, a bleeding point, a perforation, blood vessel heteromorphism, a treatment tool, and the like”. The examiner notes the detection of a region of interest can be intended for different illnesses, such as “Barrett’s esophagus”).
As per claim 19, Usuda discloses wherein the processor recognizes a state regarding an inflammatory bowel disease of a large intestine (paragraph [0071]: “colonic diverticula”).
As per claim 20, Usuda discloses wherein the processor recognizes the state of the organ by dividing the state into three or more states, and sets the detection criterion according to the recognized state of the organ (paragraph [0072]: “tumor”, “non-tumor” and “others”).
As per claim 21, Usuda discloses wherein the processor acquires information on a light source type, and sets the detection criterion according to the recognition result of the state of the organ and the light source type (paragraph [0061]: "white light" or "special light"; the examiner notes the feature quantities of the images, i.e., normal light image and special light image, are based on the light source types).
As per claim 22, Usuda discloses an endoscopic image processing method of performing processing of detecting a region of interest from an endoscopic image using a trained model, the endoscopic image processing method comprising: acquiring information on a state of an organ as an examination target; and setting the trained model to be used according to the state of the organ (see explanation in claim 1, the examiner notes a CNN model in paragraph [0070] is the claimed “trained model” for setting the feature quantities to detect a state of the organ, such as an inflammation in paragraph [0071]).
As per claim 23, Usuda discloses an endoscopic image processing method of performing processing of detecting region-of-interest candidates from an endoscopic image by calculating a confidence level indicating probability, and detecting the region-of-interest candidate of which the confidence level is equal to or greater than a threshold value, from among the detected region-of-interest candidates, as a region of interest (the preamble of the claim is not considered a limitation and is of no significance to claim construction. See Pitney Bowes, Inc. v. Hewlett-Packard Co., 182 F.3d 1298, 1305, 51 USPQ2d 1161, 1165 (Fed. Cir. 1999). See MPEP § 2111.02), the endoscopic image processing method comprising: acquiring information on a state of an organ as an examination target; and setting the threshold value according to the state of the organ (see explanation in claim 1, and the examiner notes the “size” or “shape” is the claimed “threshold value” for feature quantities).
As per claim 24, Usuda discloses the claimed endoscope system (see figure 1).
Allowable Subject Matter
Claims 7, 13 and 17 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TOM Y LU whose telephone number is (571)272-7393. The examiner can normally be reached Monday - Friday, 9AM - 5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Matthew Bella, can be reached at (571) 272-7778. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TOM Y LU/Primary Examiner, Art Unit 2667