DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) filed on July 21, 2024 is being considered by the examiner.
Claim Objections
Claims 1 and 21 are objected to because of the following informalities:
The limitation in lines 8-11 of claims 1 and 21 beginning with “and a display position…” is not grammatically correct. It appears the applicant intended to write “wherein a display position…”. For the purpose of examination, the examiner is interpreting “and a display position” in lines 8-9 in claims 1 and 21 as being “wherein a display position”.
Appropriate correction is required.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-10, 13, 15, 16, 18, 19, and 21-24 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Hayami et al. (US2021/0327067).
Regarding claim 1, Hayami discloses a medical support device comprising: a processor (main body apparatus 12 comprises a processor, CPU or the like [0031]; image processing apparatus 13), wherein the processor is configured to: acquire an image obtained by imaging an inside of a body with a camera (image generating unit 122 within main body apparatus 12 generates an endoscopic image [0030]); and output screen generation information used for generation of a screen on which the image and presence position information for specifying a presence position of an in-body feature region recognized by executing an object recognition process using the image on an outside of the image are displayed (display control unit 133 configured to perform processing for setting an endoscopic image display region on a display screen of the display apparatus 14 [0039]), and a display position of the presence position information with respect to the image in the screen is changed according to a change in a positional relationship between the camera and the in-body feature region (Fig. 5: presence of a mark in a mark display region, such as mark MA in region AM2, indicates presence of a lesion in the corresponding lower left quadrant; should the lesion move to a different quadrant within the image due to a change in the positional relationship between the camera and the lesion, a mark would appear in the mark display region associated with that quadrant).
Regarding claim 2, Hayami discloses the medical support device according to claim 1, further disclosing wherein the change is caused by an operation of the camera and/or a body movement in the inside of the body (although not explicitly stated, it would be obvious that the change in position of a lesion within a body would occur because of relative movement between the camera and body, resulting in the lesion moving to a different area of the frame, or the lesion being out of frame or obscured behind a blocking object, such as in Fig. 14).
Regarding claim 3, Hayami discloses the medical support device according to claim 2, further disclosing wherein the display position is changed according to the change to follow the operation and/or the body movement (Fig. 5: marker MA, shown in region AM2, indicating presence of lesion LA in the lower left quadrant AR2, would change to other regions AM1, AM3 or AM4 should lesion LA move to the respective quadrant [0060-0061]; lesion LA would move across the displayed image if the camera and body moved relative to one another).
Regarding claim 4, Hayami discloses the medical support device according to claim 1, further disclosing wherein a display aspect of the presence position information is changed according to a feature of the change (if a positional relationship between the camera and the in-body feature region is changed, the direction of the change would be indicated by marks, for example, in Fig. 8, the orientation of marks MD2, MD3, and MD4 is changed according to the direction of the movement of the lesion to the upper left corner [0076-0077]).
Regarding claim 5, Hayami discloses the medical support device according to claim 4, further disclosing wherein the feature includes a direction of the change (if a positional relationship between the camera and the in-body feature region is changed, the direction of the change would be indicated by marks, for example, in Fig. 8, the orientation of marks MD2, MD3, and MD4 is changed according to the direction of the movement of the lesion to the upper left corner [0076-0077]).
Regarding claim 6, Hayami discloses the medical support device according to claim 1, further disclosing wherein the presence position information includes within-angle-of-view position information for specifying, on the outside, the presence position in a case where the in-body feature region is within an angle of view of the camera, and the within-angle-of-view position information is displayed on the screen in a case where the in-body feature region is within the angle of view (Fig. 5: marker MA is displayed in region AM2 to indicate presence of lesion LA in the lower left quadrant AR2 [0060-0061]), and out-of-angle-of-view position information for specifying, on the outside, the presence position in a case where the in-body feature region is out of the angle of view, and the out-of-angle-of-view position information is displayed on the screen in a case where the in-body feature region is out of the angle of view (Figs. 10-11: if lesion LA in Fig. 10 is displaced outside the endoscopic image EGE, a display image shown in any one of Figs. 11-13 may be displayed to indicate the quadrant in which the lesion was last within the angle of view; in Fig. 11, the absence of marker ME2 in region AM2, in Fig. 12, the marker MG2 in region AM2, and in Fig. 13, the marker MH2 in region AM2, is analogous to applicant’s out-of-angle-of-view position information).
Regarding claim 7, Hayami discloses the medical support device according to claim 1, further disclosing wherein the presence position information includes out-of-angle-of-view position information for specifying, on the outside, the presence position in a case where the in-body feature region is out of an angle of view of the camera, the out-of-angle-of-view position information is displayed on the screen in a case where the in-body feature region is out of the angle of view (Figs. 10-11: if lesion LA in Fig. 10 is displaced outside the endoscopic image EGE, a display image shown in any one of Figs. 11-13 may be displayed to indicate the quadrant in which the lesion was last within the angle of view; in Fig. 11, the absence of marker ME2 in region AM2, in Fig. 12, the marker MG2 in region AM2, and in Fig. 13, the marker MH2 in region AM2, is analogous to applicant’s out-of-angle-of-view position information), and a display aspect of the out-of-angle-of-view position information is changed according to a feature of the change (Figs. 11-13: orientation, absence or presence, and luminance are display aspects of the marks MG2 and MH2 that may be changed according to a direction of the change in positional relationship between the camera and the in-body feature region).
Regarding claim 8, Hayami discloses the medical support device according to claim 7, further disclosing wherein the display aspect includes presence or absence of display or a display intensity (Fig. 11 illustrates a display aspect including the absence of a mark in region AM2 [0102, 0104]; Fig. 13 illustrates a display aspect wherein mark MH2 has a higher luminance than the other marks [0111-0112]).
Regarding claim 9, Hayami discloses the medical support device according to claim 1, further disclosing wherein the presence position information includes out-of-angle-of-view position information for specifying, on the outside, the presence position in a case where the in-body feature region is out of an angle of view of the camera, the out-of-angle-of-view position information is displayed on the screen in a case where the in-body feature region is out of the angle of view (Figs. 10-11: if lesion LA in Fig. 10 is displaced outside the endoscopic image EGE, a display image shown in any one of Figs. 11-13 may be displayed to indicate the quadrant in which the lesion was last within the angle of view; in Fig. 11, the absence of marker ME2 in region AM2, in Fig. 12, the marker MG2 in region AM2, and in Fig. 13, the marker MH2 in region AM2, is analogous to applicant’s out-of-angle-of-view position information), the in-body feature region is a lesion (Fig. 10: lesion LE), and a display aspect of the out-of-angle-of-view position information is changed according to a site where the lesion is present (Figs. 11-13: orientation, absence or presence, and luminance are display aspects of the marks MG2 and MH2 that may be changed according to the quadrant in which the lesion was last imaged, indicating the corner in which the lesion left the field of view).
Regarding claim 10, Hayami discloses the medical support device according to claim 1, further disclosing wherein the presence position information includes out-of-angle-of-view position information for specifying, on the outside, the presence position in a case where the in-body feature region is out of an angle of view of the camera, and the out-of-angle-of-view position information is displayed on the screen on a condition that the in-body feature region is out of the angle of view (Figs. 11-13 demonstrate instances wherein the presence of markers MG2 or MH2, or the absence of a marker, in region AM2 indicates the lesion is out of view and was last seen in the lower left quadrant region AR2 [0105]).
Regarding claim 13, Hayami discloses the medical support device according to claim 1, further disclosing wherein the presence position information includes out-of-angle-of-view position information for specifying, on the outside, the presence position in a case where the in-body feature region is out of an angle of view of the camera, and the out-of-angle-of-view position information is displayed on the screen on a condition that the in-body feature region is out of the angle of view (Figs. 10-11: if lesion LA in Fig. 10 is displaced outside the endoscopic image EGE, a display image shown in any one of Figs. 11-13 may be displayed to indicate the quadrant in which the lesion was last within the angle of view; a mark in region AM2 would be displayed) and that a degree of the change exceeds a predetermined degree (the lesion appearing outside the bounds of the angle of view of the camera is a degree of change that exceeds the predetermined degree of being within the boundaries of the angle of view).
Regarding claim 15, Hayami discloses the medical support device according to claim 1, further disclosing wherein the screen generation information includes the image (endoscopic image EG) and position indication information for indicating a position of the presence position information in the screen, and the position indication information is updated according to the change (Fig. 5-8 indicate instances wherein the lesion L changes location within the endoscopic image EG and triangular markers in regions AM1-AM4 are displayed in a certain manner or absent dependent on location of the lesion [0059]).
Regarding claim 16, Hayami discloses the medical support device according to claim 15, further disclosing wherein the screen generation information includes the image, the presence position information, and the position indication information (the processing performed by the processor, demonstrated in Fig. 2, includes detecting a lesion and displaying a display endoscopic image and marker that is formed based on the endoscopic image EG, the presence or absence of markers in regions AM1-AM4, and the detection of lesions).
Regarding claim 18, Hayami discloses the medical support device according to claim 1, further disclosing wherein the in-body feature region is a lesion (lesions L).
Regarding claim 19, Hayami discloses the medical support device according to claim 1, further disclosing wherein the image is included in a plurality of frames obtained in time series by imaging the inside of the body with the camera, and the processor is configured to: specify the change based on the plurality of frames; and change the display position according to the specified change (determining unit 132 compares lesions found in a first frame and lesions in a second frame, captured just before the first frame, to determine if lesion information in the first frame and lesion information in the second frame have remained the same or changed [0038]).
Regarding claim 21, Hayami discloses a medical support device comprising: a processor (main body apparatus 12 comprises a processor, CPU or the like [0031]; image processing apparatus 13), wherein the processor is configured to: acquire an image obtained by imaging an inside of a body with a camera (image generating unit 122 within main body apparatus 12 generates an endoscopic image [0030]); and output screen generation information used for generation of a screen on which a medical image generated based on the image and presence position information for specifying a presence position of an in-body feature region recognized by executing an object recognition process using the image on an outside of the image are displayed (display control unit 133 configured to perform processing for setting an endoscopic image display region on a display screen of the display apparatus 14 [0039]), and a display position of the presence position information with respect to the medical image in the screen is changed according to a change in a positional relationship between the camera and the in-body feature region (Fig. 5: presence of a mark in a mark display region, such as mark MA in region AM2, indicates presence of a lesion in the corresponding lower left quadrant; should the lesion move to a different quadrant within the image due to a change in the positional relationship between the camera and the lesion, a mark would appear in the mark display region associated with that quadrant).
Regarding claim 22, Hayami discloses an endoscope apparatus comprising: the medical support device according to claim 1; and the camera (endoscope comprising image pickup device such as a CCD image sensor or a CMOS sensor [0027]).
Regarding claim 23, Hayami discloses a medical support method comprising: acquiring an image obtained by imaging an inside of a body with a camera (image generating unit 122 within main body apparatus 12 generates an endoscopic image [0030]; image pickup unit 11, comprising a CCD or CMOS, is configured to pick up an image of return light and generate an image pickup signal [0027]); and outputting screen generation information used for generation of a screen on which the image and presence position information for specifying a presence position of an in-body feature region recognized by executing an object recognition process using the image on an outside of the image are displayed (display control unit 133 configured to perform processing for setting an endoscopic image display region on a display screen of the display apparatus 14 [0039]), wherein a display position of the presence position information with respect to the image in the screen is changed according to a change in a positional relationship between the camera and the in-body feature region (Fig. 5: presence of a mark in a mark display region, such as mark MA in region AM2, indicates presence of a lesion in the corresponding lower left quadrant; should the lesion move to a different quadrant within the image due to a change in the positional relationship between the camera and the lesion, a mark would appear in the mark display region associated with that quadrant).
Regarding claim 24, Hayami discloses a non-transitory computer-readable storage medium (main body apparatus 12 comprising a processor, CPU or the like [0031]; image processing apparatus 13) storing a program executable by a computer to execute a medical support process, the medical support process comprising: acquiring an image obtained by imaging an inside of a body with a camera (image generating unit 122 within main body apparatus 12 generates an endoscopic image [0030]); and outputting screen generation information used for generation of a screen on which the image and presence position information for specifying a presence position of an in-body feature region recognized by executing an object recognition process using the image on an outside of the image are displayed (display control unit 133 configured to perform processing for setting an endoscopic image display region on a display screen of the display apparatus 14 [0039]), wherein a display position of the presence position information with respect to the image in the screen is changed according to a change in a positional relationship between the camera and the in-body feature region (Fig. 5: presence of a mark in a mark display region, such as mark MA in region AM2, indicates presence of a lesion in the corresponding lower left quadrant; should the lesion move to a different quadrant within the image due to a change in the positional relationship between the camera and the lesion, a mark would appear in the mark display region associated with that quadrant).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Hayami in view of Oosake (US2021/0110915).
Regarding claim 12, Hayami discloses the medical support device according to claim 1, wherein the presence position information includes out-of-angle-of-view position information for specifying, on the outside, the presence position in a case where the in-body feature region is out of an angle of view of the camera (Figs. 11-13 illustrate displays wherein the lesion is out of the angle of view of the camera; various marks in mark display regions AM1-AM4 may be shown in different configurations, and the mark in AM2, appearing different from the other marks, indicates the quadrant/area where the lesion was last seen), but fails to disclose the out-of-angle-of-view position information displayed on the screen on a condition that a predetermined time has elapsed after the in-body feature region is out of the angle of view. In the same field of endeavor, Oosake teaches a substantially similar medical support device, comprising a processor (Figs. 1 & 2: processing apparatus 10, processor device 33 [0054]), wherein the processor is configured to acquire an image within a body with a camera (image acquiring unit 11, 35 [0055]; endoscope acquires an image by capturing an image with white light [0054]); and output screen generation information used for generation of a screen on which the image and presence position information for specifying a presence position of an in-body feature region recognized by executing an object recognition process using the image on an outside of the image are displayed (display-manner determining unit 42 determines a display manner of report information for reporting that the region of interest has been detected [0058]; bar 48 is displayed in the second region 46, outside the first region 45 comprising the displayed image 43 [0062]), and a display position of the presence position information with respect to the image in the screen is changed according to a change in a positional relationship between the camera and the in-body feature region (bar 48 is displayed in the second region 46, outside the first region 45 comprising the displayed image 43 [0062]; the appearance of bar 48 in second region 46 depends on a change in lesion detection, which results from a change in the positional relationship between the endoscope imager and the lesion). Oosake further teaches wherein the out-of-angle-of-view position information is displayed on the screen on a condition that a predetermined time has elapsed after the in-body feature region is out of the angle of view (Figs. 7-9: as lesion detection transitions from a detected state to an undetected state, bar 48 is reduced as time elapses while the lesion remains undetected; the amount of time in the undetected state is the degree of change, and if the undetected state persists longer than the time represented by bar 48, the bar disappears [0062]; the display of bar 48 in the second region 46 is conditional on the amount of time that has elapsed since entering the undetected state). In view of Oosake, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have included wherein the out-of-angle-of-view position information is displayed on the screen on a condition that a predetermined time has elapsed after the in-body feature region is out of the angle of view, as this feature allows the user to be alerted that the lesion has left the image view and of the elapsed time after the change of the detection state, which may reduce examination load [0063].
Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Hayami in view of Choi et al. (US2022/0277445).
Regarding claim 17, Hayami discloses the medical support device according to claim 1, but fails to disclose wherein the object recognition process includes a process of recognizing the in-body feature region based on the image by using AI.
In the same field of endeavor, Choi teaches a substantially similar medical support device comprising a processor (processor [0063]), the processor configured to: acquire an image obtained by imaging an inside of a body with a camera (processor receives a gastroscopic image to analyze each video frame of the gastroscopic image using the medical image analysis algorithm and detect a finding suspected of being a lesion in the video frame [0038]), and output screen generation information used for generation of a screen on which the image and presence position information for specifying a presence position of an in-body feature region recognized by executing an object recognition process using the image are displayed (processor generates display information including whether the finding suspected of being a lesion is present and the coordinates of the location of the suspected lesion [0036]; suspected lesions may be marked with a visualization element such as a marker, box, or highlighting, Fig. 4 [0107]), further teaching wherein the object recognition process includes a process of recognizing the in-body feature region based on the image by using AI (artificial intelligence server 110 may detect and determine a lesion in the video frame/image using an AI algorithm [0062]). In view of Choi, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have included wherein the object recognition process uses artificial intelligence, as it is a known technology in the endoscope art that allows a computer or processor to automatically detect lesions and alert a user of a lesion’s presence during a medical procedure.
Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Hayami in view of Kimura et al. (US2021/0251470).
Regarding claim 20, Hayami discloses the medical support device according to claim 1, but fails to disclose wherein the processor is configured to: specify the change based on a detection result by a sensor capable of detecting a behavior of the camera in the inside of the body; and change the display position according to the specified change. In the same field of endeavor, Kimura teaches an image processing device for an endoscope including a processor, the processor configured to: acquire an image within a body with a camera (receives a generated image acquired by performing predetermined processing on an image pickup signal acquired by an endoscope [abstract]); and output screen generation information used for generation of a screen on which the image and presence position information for specifying a presence position of an in-body feature region recognized by executing an object recognition process using the image on an outside of the image are displayed (lesion detection unit 341 configured to detect a lesion part contained in the generated image sequentially outputted from the image input unit 331 [0035]; any shaped marker may be added to the generated image to indicate presence of a lesion [0036]), and a display position of the presence position information with respect to the image in the screen is changed according to a change in a positional relationship between the camera and the in-body feature region (generated image and marker image are displayed [0036]; lesion detection may be halted when withdrawing speed is excessively high, as the diagnosis support function does not act normally [0100]; the high withdrawing speed caused by movement of the endoscope changes whether the lesion detection diagnosis support function occurs, changing the display position of the marker), further teaching wherein the processor is configured to: specify the change based on a detection result by a sensor capable of detecting a behavior of the camera in the inside of the body; and change the display position according to the specified change (technique for calculating speed based on the difference between generated images of a plurality of frames, or a technique where a sensor which can detect speed, such as a gyro sensor, is at a distal end portion of the endoscope [0101]). In view of Kimura, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have included wherein the processor is configured to specify the change of the positional relationship between the camera and the in-body feature region based on a detection result by a sensor capable of detecting a behavior of the camera in the inside of the body, and change the display position according to the specified change, as it allows the lesion detection process to accurately detect lesions and alerts the user whether the endoscope is in an operation state that can support lesion detection [0100-0103].
Allowable Subject Matter
Claims 11 and 14 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter: Claim 11 recites language directed to a specific scenario in which the display of the out-of-angle-of-view position information on the screen is more emphasized in a case where a within-angle-of-view time during which the in-body feature region is within the angle of view is less than a certain amount of time, compared to a case where the within-angle-of-view time is equal to or longer than the certain amount of time.
Claim 14 recites language directed to a specific scenario in which the frequency of the in-body feature region entering and exiting the angle of view is compared to a predetermined frequency within a unit of time to determine whether the display of the out-of-angle-of-view position information on the screen will be more emphasized.
There is no reason, teaching, or suggestion in any prior art of record to modify the processor of Hayami to include the specific scenarios and resulting actions as required by claims 11 and 14. After carefully reviewing the application in light of the prior art of record and searching all possible areas relevant to the present application, a set of prior art references has been found, but those prior art references are not deemed strong enough to render the application unpatentable.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See the references cited in the PTO-892.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LI-TING SONG whose telephone number is (571)272-5771. The examiner can normally be reached 8-5.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anhtuan Nguyen can be reached at 571-272-4963. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/LI-TING SONG/Examiner, Art Unit 3795
/ANH TUAN T NGUYEN/Supervisory Patent Examiner, Art Unit 3795
3/3/26