DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
Claims 1-18 are pending and are currently under consideration for patentability under 37 CFR 1.104.
Claim Objections
Claims 8-10 are objected to because of the following informalities:
Regarding claim 8, the limitation “emphasize display” should read “emphasize the display”.
Regarding claim 9, the limitation “blink display” should read “blink the display”.
Regarding claim 10, the limitation “hide display” should read “hide the display”.
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 17 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Regarding claim 17, the limitation “when display content of a specific piece of support information is updated” is unclear. It is unclear what aspect of the display content is being updated, particularly because “display content” has not been previously recited (i.e., what is being updated, and how does it affect the display?).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-2, 5-7, 11-13, and 15-18 are rejected under 35 U.S.C. 103 as being unpatentable over Shelton (US 2022/0104694) in view of Nishiyama (US 2013/0152020).
Regarding claim 1, Shelton discloses an endoscopic image observation support device (surgical hub [0562]; 106, figure 2) that supports observation of an image (imaging module 138 coupled to an endoscope [0164], figure 3) captured by an endoscope (124, figure 2), the endoscopic image observation support device comprising: a processor (132, figure 3) configured to: cause a display device (135, figure 3) to display the image (surgical hub…on a display [0562]); and set degrees of priority of a plurality of pieces of support information to be displayed on the display device (prioritizing data…prioritize the display data associated with the surgical task [0562]). Shelton is silent regarding causing the display device to display the plurality of pieces of support information based on the degrees of priority.
Nishiyama teaches a display example of an image observation screen (figure 24). Icons (303a-h, figure 24) have priority levels and are displayed from the top of the screen in accordance with the priority levels ([0131]). The form of the priority display of the icons is not limited, where the color, the shape, or the size of the icons having higher priority levels may be made different from that of the other icons ([0131]).
It would have been obvious to one of ordinary skill in the art to modify the device of Shelton to display data based on its priority level, as taught by Nishiyama ([0131]). Doing so would display data/pieces of support information with higher priority levels differently from other data ([0131]). The modified device would cause the display device to display the plurality of pieces of support information based on the degrees of priority (form of the priority display…[0131]; Nishiyama).
Regarding claim 2, Shelton further discloses the plurality of pieces of support information include at least one of information indicating a position of a lesion part, information indicating a discrimination result (contextual information…type of tissue being operated on [0223]), or information indicating an observation progress state (contextual information…particular step of the surgical procedure that the surgeon is performing [0223]).
Regarding claim 5, Nishiyama further teaches the processor is configured to cause the display device to display the plurality of pieces of support information at positions in accordance with the degrees of priority (displayed from the top of the screen…priority [0131]; Nishiyama).
Regarding claim 6, Nishiyama further teaches the processor is configured to cause the display device to display the plurality of pieces of support information in sizes in accordance with the degrees of priority (size of…priority levels [0131]; Nishiyama).
Regarding claim 7, Nishiyama further teaches the processor is configured to cause the display device to display the plurality of pieces of support information at luminances in accordance with the degrees of priority (not limited…color…priority levels [0131]; Nishiyama | luminance or brightness of a color can be a way to indicate a priority level).
Regarding claim 11, Shelton further discloses the processor is configured to change the degrees of priority in accordance with an operation state of the endoscope (contextual information…particular step of the surgical procedure that the surgeon is performing [0223]; Shelton | interpreted the “operation state” to be the current status/step of the instrument).
Regarding claim 12, Shelton further discloses the processor is configured to determine the operation state of the endoscope based on the image (image data…procedure has commenced [0238]; Shelton).
Regarding claim 13, Shelton further discloses the operation state of the endoscope is whether an observed site and/or a lesion part is being observed (image data…determine contextual information regarding type of procedure being done; determine patient’s lung…compare to the step [0238]; Shelton).
Regarding claim 15, Shelton further discloses the operation state of the endoscope is whether there is an unobserved site (determine contextual information regarding type of procedure being done; determine patient’s lung…compare to the step [0238]; Shelton | interpreted the operation state of the endoscope would also determine if something did not happen, i.e., the site is not detected/seen/commenced).
Regarding claim 16, Shelton further discloses the operation state of the endoscope is whether treatment is being performed (contextual information…step of the surgical procedure [0223]; Shelton | the step can indicate if treatment is occurring at that point).
Regarding claim 17, Shelton further discloses the processor is configured to change the degrees of priority when display content of a specific piece of support information is updated (surgical hub…intraoperative data…[0482]; contextual information from the data received [0483]; Shelton).
Regarding claim 18, Shelton discloses an endoscope system comprising: an endoscope (endoscope [0159]); a display device (135, figure 3); and the endoscopic image observation support device according to claim 1 (see the rejection of claim 1 above).
Claims 3-4 are rejected under 35 U.S.C. 103 as being unpatentable over Shelton (US 2022/0104694) and Nishiyama (US 2013/0152020) as applied to claim 2 above, and further in view of Popovic (US 2015/0112126).
Regarding claim 3, Shelton and Nishiyama disclose all of the features of the current invention as shown in claim 2 above. They are silent regarding the information indicating an observation progress state being displayed using a progress bar.
Popovic teaches a live view image (320, figure 3) is overlaid on the composite image ([0040]). A progress bar is implemented showing a progress length (324, figure 3) against a total length (see 326, figure 3). As the composite image (322, figure 3) is traversed, the portion of the total length that is isolated is indicated or calculated as a percentage ([0040]).
It would have been obvious to one of ordinary skill in the art to modify the device of Shelton and Nishiyama to use a composite image (322, figure 3) and live view (320, figure 3) to create/implement a progress bar (324 and 326, figure 3) as taught by Popovic. Doing so would indicate the progress in the procedure ([0040]). The modified device would have the information indicating an observation progress state (contextual information…particular step of the surgical procedure that the surgeon is performing [0223]; Shelton) displayed using a progress bar (see 322 with 324 and 326, figure 3; Popovic).
Regarding claim 4, Shelton and Nishiyama disclose all of the features of the current invention as shown in claim 2 above. They are silent regarding the information indicating an observation progress state being displayed using a schema diagram of an observation target organ.
Popovic teaches a live view image (320, figure 3) is overlaid on the composite image ([0040]). A progress bar is implemented showing a progress length (324, figure 3) against a total length (see 326, figure 3). As the composite image (322, figure 3) is traversed, the portion of the total length that is isolated is indicated or calculated as a percentage ([0040]).
It would have been obvious to one of ordinary skill in the art to modify the device of Shelton and Nishiyama to use a composite image (322, figure 3) and live view (320, figure 3) to create/implement a progress bar (324 and 326, figure 3) as taught by Popovic. Doing so would indicate the progress in the procedure ([0040]). The modified device would have the information indicating an observation progress state (contextual information…particular step of the surgical procedure that the surgeon is performing [0223]; Shelton) displayed using a schema diagram of an observation target organ (see live view 320 overlaid on composite image 322, figure 3; Popovic).
Claims 8-9 are rejected under 35 U.S.C. 103 as being unpatentable over Shelton (US 2022/0104694) and Nishiyama (US 2013/0152020) as applied to claim 1 above, and further in view of Elbaz (US 2019/0269485).
Regarding claim 8, Shelton and Nishiyama disclose all of the features of the current invention as shown in claim 1 above. They are silent regarding the processor being configured to emphasize display of one of the plurality of pieces of support information with a degree of priority higher than a threshold value.
Elbaz teaches a display that emphasizes flagged regions using animation, such as flashing, or color ([0380]). A region is flagged if it is outside a set threshold ([0378]).
It would have been obvious to one of ordinary skill in the art to modify the device of Shelton and Nishiyama to use a set threshold to establish a degree of higher priority as taught by Elbaz ([0378]). Doing so would emphasize one of the plurality of pieces of support information that is above the set threshold ([0378] and [0380]). The modified device would have the processor configured to emphasize (emphasize…animation [0380]; Elbaz) display of one of the plurality of pieces of support information with a degree of priority higher than a threshold value (flag…set threshold [0378]).
Regarding claim 9, Elbaz further teaches the processor is configured to blink display (animation…flashing [0380]; Elbaz) of the one of the plurality of pieces of support information with the degree of priority higher than the threshold value (flag…set threshold [0378]).
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Shelton (US 2022/0104694) and Nishiyama (US 2013/0152020) as applied to claim 1 above, and further in view of Shelton 2019 (US 2019/0104919).
Regarding claim 10, Shelton and Nishiyama disclose all of the features of the current invention as shown in claim 1 above. They are silent regarding the processor being configured to hide display of one of the plurality of pieces of support information with a degree of priority lower than a threshold value.
Shelton 2019 teaches that non-critical aspects may be selectively hidden from view ([0367]), and that only recommendations at a priority level above a predefined threshold can be displayed ([0372]).
It would have been obvious to one of ordinary skill in the art to modify the device of Shelton and Nishiyama to selectively hide one of the plurality of pieces of support information that is a non-critical aspect ([0367]) as taught by Shelton 2019. Doing so would selectively communicate support information to a clinician ([0367]). The modified device would have the processor configured to hide display (selectively hidden [0367]; Shelton 2019) of one of the plurality of pieces of support information with a degree of priority lower than a threshold value (only display…above a predefined threshold [0372]).
Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Shelton (US 2022/0104694) and Nishiyama (US 2013/0152020) as applied to claim 11 above, and further in view of Kimura (US 2021/0251470).
Regarding claim 14, Shelton and Nishiyama disclose all of the features of the current invention as shown in claim 11 above. They are silent regarding the operation state of the endoscope being whether a region in which a lesion part is detectable is being observed.
Kimura teaches an endoscope unit (1, figure 1) with an image processing device (32, figure 1). The image processing device (32, figure 2) has a lesion detection unit (341, figure 2) configured to detect a lesion part contained in the generated image, where the lesion detection unit is within a diagnosis support unit (34, figure 2). A support information control unit (343, figure 2) determines whether to generate support information for the lesion part detected by the lesion detection unit and to add and display the support information ([0037]).
It would have been obvious to one of ordinary skill in the art to modify the device of Shelton and Nishiyama to have a lesion detection unit (341, figure 2) and a support information control unit (343, figure 2) to generate and display this support information as taught by Kimura. Doing so would include the detection of a lesion in the support information to be displayed ([0037]). The modified device would have the operation state of the endoscope be whether a region in which a lesion part is detectable is being observed (use of image data [0229] of Shelton | lesion detection unit 341, figure 2; lesion…display the support information [0037]; Kimura).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Nishide (US 2022/0198742) and Wang (US 2016/0058362).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PAMELA F WU whose telephone number is (571)272-9851. The examiner can normally be reached M-F: 8-4 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Michael Carey can be reached at 571-270-7235. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
PAMELA F. WU
Examiner
Art Unit 3795
January 7, 2026
/RYAN N HENDERSON/Primary Examiner, Art Unit 3795