DETAILED ACTION
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on December 4, 2025, has been entered.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on December 15, 2025, is being considered by the examiner.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 6-11, and 15-22 are rejected under 35 U.S.C. 103 as being unpatentable over Kamon (WO2018179991) in view of Endo (US20230027950) and Yan (US20180132944). For the purposes of this Office action, US20200008653 (the US continuation of Kamon) is being used as an English translation of WO2018179991; therefore, the paragraph numbers will reference the US document.
In regards to claim 1, Kamon teaches an image processing apparatus for processing an image captured with an endoscope having a distal end from which an instrument is protrudable (Kamon Paragraph [0051] “The distal end portion 21 has, at its distal end surface, an illumination window, an observation window, an air/water supply nozzle, and a forceps outlet, which are not illustrated. The illumination window is used to apply illumination light onto a portion to be observed. The observation window is used to capture light from the portion to be observed. The air/water supply nozzle is used to clean the illumination window and the observation window. The forceps outlet is used to perform various types of treatments by using surgical instruments, such as forceps and an electric scalpel.”); the image processing apparatus comprising a processor configured to perform: a process of acquiring an image captured with the endoscope (Kamon Paragraph [0083] “The ROI detecting unit 80 detects a ROI in a lumen by using an endoscopic image acquired from the endoscopic image acquisition unit 54.”); a process of causing a display to display the acquired image (Kamon Paragraph [0089] “The first marked image generating unit 84 generates a first marked image having a marking at the position corresponding to the position information of the ROI in a schematic diagram of the lumen. In this embodiment, the first marked image is displayed on the display unit 18, and thereby the position information of the ROI is displayed.”); and a process of detecting a region of interest from the acquired image (Kamon Paragraph [0083] “The ROI detecting unit 80 detects a ROI in a lumen by using an endoscopic image acquired from the endoscopic image acquisition unit 54.” Examiner note: In this reference, ROI stands for region of interest.).
Kamon does not teach a process of detecting a region of interest from the acquired image wherein the region of interest is a region on which a treatment is to be performed with the instrument; a process of determining whether the region of interest is present at a position where the treatment is performable with the instrument protruded from the distal end of the endoscope; a process of determining whether an obstacle is present between the region of interest and the instrument from the acquired image; and a process of, in a case where it is determined that the region of interest is present at the position where the treatment is performable with the instrument protruded from the distal end of the endoscope; in a case where it is determined that the obstacle is not present between the region of interest and the instrument, providing a notification that the region of interest is present at the position where the treatment is performable with the instrument; and in a case where it is determined that the obstacle is present between the region of interest and the instrument, not providing the notification.
However, Endo teaches a process of determining whether an obstacle is present between the region of interest and the instrument from the acquired image (Endo Figure 12; Paragraph [0087] “As illustrated in FIG. 12(A), in a case where the FIG. 79 as the highlight display is displayed, the FIG. 79 may be assimilated with surroundings or may be less conspicuous with respect to surrounding portions, depending on a color of a subject in the endoscopic image 75, the presence or absence of an object existing in the subject, and the like. As a result, the visibility may be decreased. In such a case, in general, a value of the color difference between the endoscopic image 75 and the FIG. 79 also decreases.”); in a case where it is determined that the obstacle is not present between the region of interest and the instrument, providing a notification (Endo Figure 13(B); Paragraph [0092] “As illustrated in FIG. 13(B), in a case where the color difference exceeds the first threshold value, the visibility determination unit 72 outputs identification information 85B to the display control unit 58. The display control unit 58 notifies the user that the visibility is high by displaying the identification information 85B on the display screen 84. In the example illustrated in FIG. 13(B), text information “high visibility” is displayed as the identification information 85B.” Examiner note: The notification in this reference is the indication of “high visibility”. This notification is also shown in Figure 14(B), in the form of an identification figure “O”); and in a case where it is determined that the obstacle is present between the region of interest and the instrument, not providing the notification (Endo Figure 13(A); Paragraph [0091] “In the example illustrated in FIG. 13(A), text information “low visibility” is displayed as the identification information 85A. Thereby, as in the first embodiment, a doctor as a user can recognize a decrease in the visibility of the highlight display.” Examiner note: The notification provided in the case where it is determined that the obstacle is not present is a text box displaying “high visibility” as seen in figure 13(B) part 85B. This same notification is not provided when it is determined that there is something obscuring the visibility of the target. Therefore, “the notification” is not provided in the case where the obstacle is present.).
Endo is considered to be analogous to the claimed invention because they are in the same field of image-assisted endoscopic systems. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Kamon to include the teachings of Endo, to provide the benefit of increased visibility of the target in the endoscopic imaging (Endo Paragraph [0006] “As a result, depending on a color of a subject in the medical image, the presence or absence of an object existing in the subject, and the like, the highlight display may be assimilated with surroundings or may be less conspicuous with respect to surrounding portions. In a case where the visibility of the highlight display is decreased in this way, a doctor may not notice the region-of-interest.”).
Furthermore, Yan teaches a process of detecting a region of interest from the acquired image wherein the region of interest is a region on which a treatment is to be performed with the instrument (Yan Paragraph [0035] “For image fusion based targeted biopsy, a user picks a target (T.sub.current) from a list of targets for biopsy in block 202.”; Paragraph [0037] “Referring again to FIG. 2, in block 206, for fusion guided targeted procedures, to correctly compute a distance to a target, the target and the detected needle need to be mapped into a common space (e.g., T.sub.current 3D space). This can be achieved by using device tracking and image registration techniques. For example, electromagnetic (EM) tracking may be employed for tracking the ultrasound probe. By registering a reconstructed 3D ultrasound volume with a 3D MR volume, 2D ultrasound images can be mapped to the 3D MR space in real-time. Since the targets are identified from MR images, the transformation chain will bring the needle detected from 2D ultrasound images into the same MR imaging space as the targets.” Examiner note: This reference teaches that targets are selected for biopsy (a treatment), and these targets are identified from magnetic resonance images.); a process of determining whether the region of interest is present at a position where the treatment is performable with the instrument protruded from the distal end of the endoscope (Yan Figure 2 Step 210; Paragraph [0039] “In block 210, once the needle is in the right direction, the system will continue to check whether the core taking part of the needle will cover the biopsy target once being fired.” Examiner note: This reference teaches detecting when an instrument is in a position which would lead the biopsy needle to be fired into the target (which is being considered the “region of interest”); this is analogous to determining that the region of interest is at a position where the treatment can be performed.); and a process of, in a case where it is determined that the region of interest is present at the position where the treatment is performable with the instrument protruded from the distal end of the endoscope, providing a notification that the region of interest is present at the position where the treatment is performable with the instrument (Yan Figure 2 Step 210; Paragraph [0039] “For visual feedback, a marker can be put at the desired location for the needle tip along the needle. The user needs to insert the needle to that marked point for firing. Alternatively, a beeping sound can be played when the needle is getting close to the firing point. The frequency of the beeping may be used for denoting the distance between the needle tip and its desired location.” Examiner note: The case where it is determined that the obstacle is not present between the region of interest and the instrument is shown by Endo as cited above.).
Yan is considered to be analogous to the claimed invention because they are in the same field of endoscopic systems that assist users. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Kamon in view of Endo to include the teachings of Yan, to provide the benefit of allowing users to predict more accurately where their instrument will be after use (Yan Paragraph [0042] “With this feedback, users can know accurately where the needle 344 will end up after firing. This provides more accurate guidance than just virtual estimation based on a users' own knowledge and experience. An estimated or projected core taking region 346 is shown on the projected biopsy guideline 334 at an appropriate distance to account for firing of the needle 344.”).
In regards to claim 6, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 1, wherein the processor is configured to further perform a process of detecting the instrument from the acquired image (Yan Figure 2 Step 204; Paragraph [0035] “In block 204, once a biopsy needle enters a viewing field of a live imaging viewing pane or image, the needle object (O.sub.needle) can be detected by using an image based detection method.”), and the processor is configured to perform the process of providing the notification in a case where the instrument is detected from the acquired image, it is determined that the region of interest is present at the position where the treatment is performable with the instrument protruded from the distal end of the endoscope (Yan Figure 2 Step 210; Paragraph [0039] “In block 210, once the needle is in the right direction, the system will continue to check whether the core taking part of the needle will cover the biopsy target once being fired. The 3D position of the core taking part, which is usually in the shape of a cylinder, is computed based on the anatomy of the needle, the needle tip location, and also the needle pointing direction. This is illustratively depicted in FIG. 5. For visual feedback, a marker can be put at the desired location for the needle tip along the needle. The user needs to insert the needle to that marked point for firing. Alternatively, a beeping sound can be played when the needle is getting close to the firing point.”), and it is determined that the obstacle is not present between the region of interest and the instrument (Endo Figure 12; Figure 13(B); Paragraph [0087] “As illustrated in FIG. 12(A), in a case where the FIG. 79 as the highlight display is displayed, the FIG. 79 may be assimilated with surroundings or may be less conspicuous with respect to surrounding portions, depending on a color of a subject in the endoscopic image 75, the presence or absence of an object existing in the subject, and the like. As a result, the visibility may be decreased. In such a case, in general, a value of the color difference between the endoscopic image 75 and the FIG. 79 also decreases.”; Paragraph [0092] “As illustrated in FIG. 13(B), in a case where the color difference exceeds the first threshold value, the visibility determination unit 72 outputs identification information 85B to the display control unit 58. The display control unit 58 notifies the user that the visibility is high by displaying the identification information 85B on the display screen 84. In the example illustrated in FIG. 13(B), text information “high visibility” is displayed as the identification information 85B.”).
In regards to claim 7, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 1, wherein the processor is configured to change a display image to be displayed on the display to provide the notification. (Endo Figure 13(B) “High visibility” or Figure 14(B) “O”)
In regards to claim 8, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 7, wherein the processor is configured to change display of a display region for the image to provide the notification, the display region for the image being set in the display image. (Endo Figure 13(B) “High visibility” or Figure 14(B) “O”)
In regards to claim 9, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 8, wherein the processor is configured to display a geometric figure indicating the region of interest such that the geometric figure is superimposed on the image to be displayed in the display region to provide the notification. (Endo Figure 14(B) “O”; Paragraph [0096] “ As illustrated in FIG. 14(B), in a case where the color difference exceeds the first threshold value, the visibility determination unit 72 outputs information of the icon 86B to the display control unit 58. The display control unit 58 notifies the user that the visibility is high by displaying the icon 86B as the identification figure on the display screen 84. In the example illustrated in FIG. 14(B), as the icon 86B, a double circle mark is displayed.”)
In regards to claim 10, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 7, wherein the processor is configured to further perform a process of displaying a geometric figure indicating the region of interest such that the geometric figure is superimposed on the image to be displayed in a display region for the image in a case where the region of interest is detected, the display region for the image being set in the display image, and the processor is configured to change display of the geometric figure to provide the notification. (Kamon Figure 11; Paragraph [0090] “As illustrated in FIG. 11, in a first marked image 90, a mark M1 indicating that the ROI R1 has been detected is displayed in a superimposed manner at the position corresponding to the detection position P1 of the distal end portion 21 as the position information of the ROI R1, in a schematic diagram 92. Also, in the first marked image 90, a mark M2 indicating that the ROI R2 has been detected is displayed in a superimposed manner at the position corresponding to the detection position P2 of the distal end portion 21 as the position information of the ROI R2, in the schematic diagram 92. Accordingly, the user is able to recognize the positions of the ROIs. The marks M1 and M2 indicating that the ROIs have been detected are circular in FIG. 11. Alternatively, the marks M1 and M2 may be polygonal or the like.”; Paragraph [0092] “The first marked image generating unit 84 changes the display mode of the marking in accordance with a discrimination result acquired by the discrimination unit 82. A method of changing the color, size, thickness, shape, density, or the like of the marking may be used as a method for changing the display mode of the marking. In this example, the line thicknesses of the mark M1 representing an adenoma and the mark M2 representing a lesion portion are made different from each other so as to represent a difference in the discrimination result.”)
In regards to claim 11, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 10, wherein the processor is configured to change at least one of a color, a shape, a brightness, or a line type of the geometric figure to change the display of the geometric figure. (Kamon Figure 11; Paragraph [0092] “The first marked image generating unit 84 changes the display mode of the marking in accordance with a discrimination result acquired by the discrimination unit 82. A method of changing the color, size, thickness, shape, density, or the like of the marking may be used as a method for changing the display mode of the marking. In this example, the line thicknesses of the mark M1 representing an adenoma and the mark M2 representing a lesion portion are made different from each other so as to represent a difference in the discrimination result.”)
In regards to claim 15, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 7, wherein the processor is configured to change display of a region other than a display region for the image to provide the notification, the display region for the image being set in the display image. (Kamon Figure 13/14; Paragraph [0094] “The warning unit 83 changes a warning mode in accordance with a discrimination result acquired by the discrimination unit 82. In this example, the display mode of a warning is changed. For example, the display unit 18 is caused to display “!” when a discrimination result indicating an adenoma is acquired, whereas the display unit 18 is caused to display “!!” when a discrimination result indicating a lesion portion is acquired.” Examiner note: As can be seen from Figures 13 and 14, the warning symbol (“!” or “!!”) is displayed outside of the display region of the image.)
In regards to claim 16, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 15, wherein the processor is configured to display information in the region other than the display region for the image to provide the notification. (Kamon Figure 13/14; Paragraph [0094] “The warning unit 83 changes a warning mode in accordance with a discrimination result acquired by the discrimination unit 82. In this example, the display mode of a warning is changed. For example, the display unit 18 is caused to display “!” when a discrimination result indicating an adenoma is acquired, whereas the display unit 18 is caused to display “!!” when a discrimination result indicating a lesion portion is acquired.”)
In regards to claim 17, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 16, wherein the processor is configured to display a message or a geometric figure as the information. (Kamon Figure 13/14; Paragraph [0094] “The warning unit 83 changes a warning mode in accordance with a discrimination result acquired by the discrimination unit 82. In this example, the display mode of a warning is changed. For example, the display unit 18 is caused to display “!” when a discrimination result indicating an adenoma is acquired, whereas the display unit 18 is caused to display “!!” when a discrimination result indicating a lesion portion is acquired.”)
In regards to claim 18, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 1, wherein the processor is configured to cause audio to be output to provide the notification. (Kamon Paragraph [0094] “Also in the case of outputting a warning sound as a warning, the warning mode may be changed in accordance with a discrimination result. Examples of a method for changing the warning mode of a warning sound include a method for changing the volume of the warning sound and a method for changing the pitch of the warning sound.”)
In regards to claim 19, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 1, wherein the region of interest is a lesion portion. (Kamon Paragraph [0107] “Because the ROI R2 is a region for which a discrimination result indicating a lesion portion is acquired by the discrimination unit 82, it is preferable to display the warning indication 95 that is further emphasized.”)
In regards to claim 20, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 1, wherein the region of interest is an organ. (Kamon Paragraph [0110] “The method for displaying the position information of a ROI is not limited to the method for displaying the first marked image 90 as in this embodiment. For example, lumen portion information of a plurality of lumen portions, which are acquired by dividing a lumen, may be displayed as the position information of a ROI. For example, the large intestine is divided into the cecum, the ascending colon, the transverse colon, the descending colon, the sigmoid colon, and the rectum as lumen portions.” Examiner note: This shows that a ROI can be a section of an organ, such as the ascending or descending colon.)
In regards to claim 21, Kamon in view of Endo and Yan renders obvious the claim limitations for the same reasons as set forth above in the consideration of claim 1.
In regards to claim 22, Kamon in view of Endo and Yan teaches a non-transitory, computer-readable tangible recording medium which records thereon an image processing program for causing, when read by a computer, the computer to implement the method of claim 1 (Kamon Paragraph [0068] “The controller 52 has a central processing unit (CPU), a read only memory (ROM) that stores a control program and setting data that is necessary for control, a random access memory (RAM) serving as a work memory to which the control program is loaded, and the like. With the CPU executing the control program, the controller 52 controls the individual units of the processor device 16 and also controls the light source control unit 32 and the image sensor 46.”), and renders obvious the remaining claim limitations as in the consideration of claim 1.
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Kamon in view of Endo and Yan as applied to the claims above, and further in view of Genova (US20240099794).
In regards to claim 4, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 1, wherein the processor is configured to perform: determining whether the region of interest is present at the position where the treatment is performable with the instrument protruded from the distal end of the endoscope (Yan Figure 2 Step 210; Paragraph [0039] “In block 210, once the needle is in the right direction, the system will continue to check whether the core taking part of the needle will cover the biopsy target once being fired.” Examiner note: This reference teaches detecting when an instrument is in a position which would lead the biopsy needle to be fired into the target (which is being considered the “region of interest”); this is analogous to determining that the region of interest is at a position where the treatment can be performed.).
Kamon in view of Endo and Yan fails to teach a process of determining whether a distance from an extension line of the instrument protruding from the distal end of the endoscope to the region of interest is less than or equal to a second threshold value.
However, Genova teaches a process of determining whether a distance from an extension line of the instrument protruding from the distal end of the endoscope to the region of interest is less than or equal to a second threshold value (Genova Paragraph [0083] “Block 606 then directs the microprocessor 500 to read the instrument parameters from the memory location 522 for each instrument and to determine an instrument envelope based on the instrument parameters. The instrument envelope corresponding to the instrument 110 is depicted in broken lines at 416 in FIGS. 4 and 5 and identifies a region through which the instrument 110 is capable of physically moving when inserted into the body cavity 404.”; Paragraph [0107] “In some embodiments, the master processor circuit 114 may determine whether there are any regions of potential encroachment of between the instrument envelope 710 and an identified anatomical feature, in which case an alert signal may be generated. The alert signal may take the form of a further warning overlay image 1302 for display as part of the composite image.” Examiner note: In this reference, the instrument envelope is analogous to the extension line because they are both representations of an instrument’s potential path. Encroachment is defined as “intrusions on a person’s territory or rights.” It can be understood, in this context, to mean “intrusion on an identified anatomical feature”. Thus, when the instrument is “intruding” on the organ, the distance between them is zero or close to zero; therefore, the threshold distance in this reference would be zero or close to zero.).
Genova is considered to be analogous to the claimed invention because they are in the same field of endoscopic systems that assist users. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Kamon in view of Endo and Yan to include the teachings of Genova, to provide the benefit of reducing damage caused by tools to unintended organs in the surrounding area (Genova Paragraph [0002] “When performing surgery using a robotic surgical system, instruments are usually inserted into a body cavity of a patient. The insertion process has some risk since instruments may inadvertently damage organs and/or tissue while being inserted. Incorrect positioning of the instruments in the body cavity may also result in a limited range of motion within the body cavity.”).
Claims 12-14 are rejected under 35 U.S.C. 103 as being unpatentable over Kamon in view of Endo and Yan as applied to the claims above, and further in view of Uchida (WO2021019851).
In regards to claim 12, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 7, but fails to teach wherein the processor is configured to further perform a process of causing information indicating a protruding direction of the instrument to be displayed superimposed on the image to be displayed in a display region for the image, the display region for the image being set in the display image, and the processor is configured to change display of the information indicating the protruding direction of the instrument to provide the notification.
However, Uchida teaches wherein the processor is configured to further perform a process of causing information indicating a protruding direction of the instrument to be displayed superimposed on the image to be displayed in a display region for the image, the display region for the image being set in the display image, and the processor is configured to change display of the information indicating the protruding direction of the instrument to provide the notification. (Uchida Page 9 Paragraph 2 “The display control unit 158A shows the ultrasonic image G a straight line L1 indicating the movable path of the puncture needle when the puncture needle is inserted into the living tissue in a state where the puncture needle is derived at the minimum angle, and the puncture needle. The straight line L2 indicating the movable path of the puncture needle when the puncture needle is inserted into the living tissue in the state where the derivation angle is maximum is superimposed as information indicating the reachable range of the puncture needle.”)
Uchida is considered to be analogous to the claimed invention because they are in the same field of endoscopic systems that can assist users. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Kamon in view of Endo and Yan to include the teachings of Uchida, to provide the benefit of a system that can ultrasonically examine body parts and/or organs that are difficult to inspect from outside the body (Uchida Page 2 Paragraph 2 “Here, the observation target site is a site that is difficult to inspect from the body surface side (outside) of the patient, for example, the gallbladder or the pancreas. By using the endoscopic ultrasonography system 10, the state of the observation target site and the presence or absence of abnormalities are ultrasonically diagnosed via the gastrointestinal tract such as the esophagus, stomach, duodenum, small intestine, and large intestine, which are the body cavities of the patient. It is possible.”).
In regards to claim 13, Kamon in view of Endo, Yan, and Uchida teaches the image processing apparatus according to claim 12, wherein the processor is configured to display a straight line along the protruding direction of the instrument as the information indicating the protruding direction of the instrument. (Uchida Page 9 Paragraph 2 “The display control unit 158A shows the ultrasonic image G a straight line L1 indicating the movable path of the puncture needle when the puncture needle is inserted into the living tissue in a state where the puncture needle is derived at the minimum angle, and the puncture needle. The straight line L2 indicating the movable path of the puncture needle when the puncture needle is inserted into the living tissue in the state where the derivation angle is maximum is superimposed as information indicating the reachable range of the puncture needle.”)
In regards to claim 14, Kamon in view of Endo, Yan, and Uchida teaches the image processing apparatus according to claim 13, wherein the processor is configured to change at least one of a color, a brightness, or a line type of the straight line to change display of the straight line. (Uchida Page 13 Paragraph 7 “The measurement unit 158B uses, for example, a monitor 20 or a speaker (not shown) to output a message such as "Be careful because there are blood vessels on the movement path of the puncture needle", or the straight line shown in FIG. Call the surgeon's attention by changing the color of M to another color, etc.”)
Claims 3 and 23 are rejected under 35 U.S.C. 103 as being unpatentable over Kamon in view of Endo and Yan as applied to the claims above, and further in view of Prendergast (“Autonomous Localization, Navigation and Haustral Fold Detection for Robotic Endoscopy”).
In regards to claim 23, Kamon in view of Endo and Yan teaches the image processing apparatus according to claim 1, wherein the processor is configured to determine whether a distance from a point set in the acquired image to the region of interest permits the treatment to be performed (Yan Figure 2 Step 210; Paragraph [0039] “In block 210, once the needle is in the right direction, the system will continue to check whether the core taking part of the needle will cover the biopsy target once being fired.” Examiner note: This reference teaches detecting when an instrument (which is being considered the “point set in the acquired image”) is in a position which would lead the biopsy needle to be fired into the target (which is being considered the “region of interest”); this is analogous to determining that the region of interest is within a distance from the point set in the acquired image where the treatment can be performed.).
Kamon in view of Endo and Yan fails to teach wherein the processor is configured to determine whether a distance from a reference point set in the acquired image to the region of interest is less than or equal to a first threshold value.
However, Prendergast teaches a process of determining whether a distance from a reference point set in the acquired image to the region of interest is less than or equal to a first threshold value (Prendergast Page 787-788 Equations 1-5 “To navigate the MESA, center estimates from each image frame are calculated using the vision algorithm presented previously in [2]. A pixel error E is calculated by taking the difference between the true center pixel position 𝐹𝑐𝑒𝑛𝑡𝑒𝑟 and the X pixel position of the center estimate 𝐸𝑠𝑡𝐶𝑒𝑛𝑡𝑒𝑟.” Examiner note: In this reference, the estimated center of the lumen (which is being considered the “region of interest”) is compared to the true center of the image (which is being considered the “reference point”), and based on their difference, either equations 2-3 or 4-5 are used to navigate the endoscope. When equations 2-3 are used, the error is greater than 30 pixels and the region of interest is not considered reachable so more steering is needed. When using equations 4-5 the pixel error is less than 30 so less steering is needed and the region of interest is considered reachable.).
Prendergast is considered to be analogous to the claimed invention because they are in the same field of endoscopic systems that assist users. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Kamon in view of Endo and Yan to include the teachings of Prendergast, to provide the benefit of being able to correct for centering errors by adjusting the wheel speed based on error magnitude (Prendergast Page 788 “This controller consists of a proportional, integral, derivative (PID) control mode for errors greater than 30 pixels, and allows the REP-2 to use tank steering (one wheel forward, one driving in reverse) to correct for center tracking errors”).
In regards to claim 3, Kamon in view of Endo, Yan, and Prendergast teaches the image processing apparatus according to claim 23, wherein the reference point is a center of the acquired image. (Prendergast Page 787-788 Equations 1-5 “To navigate the MESA, center estimates from each image frame are calculated using the vision algorithm presented previously in [2]. A pixel error E is calculated by taking the difference between the true center pixel position 𝐹𝑐𝑒𝑛𝑡𝑒𝑟 and the X pixel position of the center estimate 𝐸𝑠𝑡𝐶𝑒𝑛𝑡𝑒𝑟.” Examiner note: Here the reference point is 𝐹𝑐𝑒𝑛𝑡𝑒𝑟, the true center pixel position of the acquired image, consistent with the mapping applied to claim 23 above.)
Claim 24 is rejected under 35 U.S.C. 103 as being unpatentable over Kamon in view of Endo, Yan, and Prendergast as applied to the claims above, and further in view of Tabandeh (US20220211460).
In regards to claim 24, Kamon in view of Endo, Yan, and Prendergast teaches the image processing apparatus according to claim 23, but fails to teach wherein the first threshold value is set individually in accordance with at least one of a type of the instrument or a type of the treatment performed with the instrument.
However, Tabandeh teaches wherein the first threshold value is set individually in accordance with at least one of a type of the instrument or a type of the treatment performed with the instrument (Tabandeh Paragraph [0060] “ In some examples, the instrument is considered looked at when the operator moves the imaging device so that the center of the viewing region is within a threshold distance of a point representative of the distal portion of the instrument as projected onto a viewing plane of the imaging device… In some examples, the threshold distance may be based on a type of procedure being formed, a type of the computer-assisted device, operator preference, and/or the like.”).
Tabandeh is considered to be analogous to the claimed invention because they are in the same field of endoscopic systems that assist users. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Kamon in view of Endo, Yan, and Prendergast to include the teachings of Tabandeh, to provide the benefit of an imaging system which can automatically follow the instrument and keep the instrument in the same position relative to the imaging device (Tabandeh Paragraph [0005] “For example, it is possible to have an instrument move along with the imaging device so that a relative position and/or orientation of the instrument is held fixed relative to the imaging device and, from the perspective of the operator, does not move or shows very little movement in the images captured by the imaging device.”).
Response to Arguments
Applicant’s arguments with respect to independent claims 1, 21, and 22 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
“Virtual Instrumentation System With Real-Time Visual Feedback and Needle Position Warning Suitable for Ophthalmic Anesthesia Training” teaches a method of training for ocular surgery, where warnings are created based on the angle and position of the needle relative to important ocular structures.
“Photoacoustic imaging for surgical guidance: Principles, applications, and outlook” gives an overview of using imaging for surgical guidance. This reference discloses that a warning could be created when a target is detected: “One underlying goal of this approach is to visualize both the tool tip and a structure that needs to be either targeted or avoided in the same photoacoustic image, as illustrated in Fig. 1.”
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CALEB LOGAN ESQUINO whose telephone number is (703)756-1462. The examiner can normally be reached M-Fr 8:00AM-4:00PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Andrew Bee, can be reached at (571) 270-5183. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CALEB L ESQUINO/ Examiner, Art Unit 2677
/ANDREW W BEE/ Supervisory Patent Examiner, Art Unit 2677