DETAILED CORRESPONDENCE
This final office action is in response to the Amendments filed on 14 January 2026, regarding application number 18/759,742.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Response to Amendment
Claims 1, 5-6 and 9-25 remain pending in the application, while claims 2-4 and 7-8 have been cancelled. Claims 21-25 are new.
Response to Arguments
Applicant’s arguments, see Pages 9-26, filed 14 January 2026, with respect to the rejections of the claims under 35 U.S.C. § 102 and 35 U.S.C. § 103 have been fully considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. See full rejection details below.
Claim Objections
Claim 25 is objected to because of the following informality: the claim recites "the target object", but there is no prior recitation of a target object; therefore, the claim element lacks antecedent basis. For the purpose of compact prosecution, "the target object" will be read as "a target object".
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 11, 21-22 and 25 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding Claim 11
Claim 10 recites “controlling a movement state of the imaging device and/or the surgical robot in a next process”, but claim 11 then recites “controlling a movement state of the imaging device and the surgical robot in a next process”. It is unclear whether claim 11 refers to the same movement state as recited in claim 10. It is also unclear whether the movement state requires both the imaging device and the surgical robot, or the imaging device and/or the surgical robot. As such, the metes and bounds of the claims are unclear. Claims 21-22 are rejected by virtue of their dependency on claim 11. For the purpose of compact prosecution, claim 11 will be read as stating “wherein the controlling [[a]] the movement state of the imaging device and/or the surgical robot in a next process”.
Regarding Claim 25
Claim 25 recites "…at this time the target object suddenly raises a hand so that the hand is close to the surgical robot…". The terms "suddenly" and "close to" render the claim indefinite because it is unclear what is meant by "suddenly" and "close to". For example, is 1 mm or 1 cm considered as being "close to"? As such, the metes and bounds of the claim are unclear. Claim 25 further recites "and the distance between the imaging device and the surgical robot is still decreasing." The term "is still decreasing" renders the claim indefinite because there is no prior recitation of a decreasing distance; therefore, it is unclear what is meant by "still decreasing". As such, the metes and bounds of the claim are unclear. For the purpose of compact prosecution, "and the distance between the imaging device and the surgical robot is still decreasing" will be read as "and the distance between the imaging device and the surgical robot decreases".
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 16-17 and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kan et al. (US 6094590 A, hereinafter "Kan").
Regarding Claim 1
Kan teaches a method for controlling an image-guided interventional puncture device (see all Figs.; Col. 2, lines 35-50), comprising:
obtaining an initial movement state of an imaging device and a surgical robot, the surgical robot being a robotic arm body for driving a robotic arm end in movement to control and/or adjust an operation and/or attitude of a functional component carried by the robotic arm end (see MRI apparatus 1 and operating manipulators 4 and/or "surgical operating equipment" in most Figs.; Fig. 9, steps S904-905 and 909-910; Claim 1 "...wherein at least one of said magnetic resonance imaging apparatus and said operation table is relatively movable to a position at which said magnetic resonance imaging apparatus can measure an area of said operation table, and said operating manipulator is movable with respect to said operation table."; Col. 4, line 66 - Col. 5, line 8; Col. 6, lines 35-61; Col 8, lines 48-64; Col. 13, lines 10-63, especially "When the measurement by the MRI apparatus is selected, the equipment management/control device makes, on the basis of status information, the judgement of whether or not the MRI apparatus is under measurement (step S904). In the case where the MRI apparatus is under measurement, the measurement is continued, as it is (step S906). In the case where the MRI apparatus is not under measurement, the judgement of whether or not the surgical operating equipment is under use, is made on the basis of status information of the surgical operating equipment (step S905) ... On the other hand, when the use of the surgical operating equipment is selected, the judgement is made of whether or not the surgical operating equipment is under use (step S909). In the case where the surgical operating equipment is under use, the use is continued (step 911). In the case where the surgical operating equipment is not under use, the judgement is made of whether or not the MRI apparatus is under measurement (step S910)."); and
controlling a target movement state of the surgical robot and the imaging device based on the initial movement state (see Fig. 9, all; Col. 8, lines 48-64; Col. 13, lines 10-63, see the limitations below), including:
controlling a second movement state of the surgical robot based on a first movement state of the imaging device (see Fig. 9, steps S904-S907; Col. 8, lines 48-64; Col. 13, lines 10-63, see the limitation below), including:
in response to a determination that the imaging device is in movement, locking a moving part of the surgical robot to control the surgical robot to remain stationary based on the first movement state of the imaging device (see Fig. 9, steps S904-S907; Col. 8, lines 48-64 "During MRI measurement, the operating manipulators 4, the surgical operating equipments, the endoscope and the ultrasonic scanner are controlled such that they are placed under electrically stopped conditions and mechanically locked non-operative conditions, in order to avoid influence which may be exerted on the result of detection by the MRI apparatus. On the other hand, even if an MRI measurement request is indicated by the doctor under the operation of the operating manipulators 4, the surgical operating equipments, the endoscope and the ultrasonic scanner, the indication of "UNDER OPERATION" and a warning are displayed to advise the doctor to stop the devices under operation. The equipment management/control device 8 makes a control of permitting the measurement by the MRI apparatus after the stopped conditions of the abovementioned devices are detected."; Col. 13, lines 10-63, especially "In the case where the MRI apparatus is under measurement, the measurement is continued, as it is (step S906). In the case where the MRI apparatus is not under measurement, the judgement of whether or not the surgical operating equipment is under use, is made on the basis of status information of the surgical operating equipment (step S905). In the case where the surgical operating equipment is under use, the equipment management/control device displays the state thereof to call the doctor's attention and performs an equipment use stopping process (step S907). After the confirmation of the stoppage of the surgical operating, the start of measurement by the MRI apparatus is displayed and a measuring operation sets in (step S908). 
In the case where the surgical operating equipment is not under use, the use of the surgical operating equipment is immediately started ... According to this flow chart, it is possible to control the measurement by the MRI apparatus and the use of the surgical operating equipment exclusively from each other."); and/or
controlling the first movement state of the imaging device based on the second movement state of the surgical robot (see Fig. 9, steps S909-S912; Col. 13, lines 10-63, see the limitation below), including:
in response to a determination that the surgical robot is in movement, locking a moving part of the imaging device to control the imaging device to remain stationary based on the second movement state of the surgical robot (see Fig. 9, steps S909-S912; Col. 13, lines 10-63, especially "On the other hand, when the use of the surgical operating equipment is selected, the judgement is made of whether or not the surgical operating equipment is under use (step S909). In the case where the surgical operating equipment is under use, the use is continued (step 911). In the case where the surgical operating equipment is not under use, the judgement is made of whether or not the MRI apparatus is under measurement (step S910). If the MRI apparatus is under measurement, a measurement stopping operation sets in (step S912). Thereafter, the use of the surgical operating equipment is started (step S913) ... According to this flow chart, it is possible to control the measurement by the MRI apparatus and the use of the surgical operating equipment exclusively from each other.").
Regarding Claim 16
Kan teaches a system for controlling an image-guided interventional puncture device (see all Figs.; Col. 2, lines 35-50), comprising:
an imaging device, configured to obtain image data of a target object (see MRI apparatus 1 in most Figs.; Col. 4, lines 20-30);
a surgical robot, configured to perform a puncture operation (see operating manipulators 4 and/or "surgical operating equipment" in most Figs.; Col. 4, line 67 - Col. 5, line 8); and
a control module, configured to control a target movement state of the surgical robot and the imaging device based on an initial movement state of the imaging device and the surgical robot, the surgical robot being a robotic arm body for driving a robotic arm end in movement to control and/or adjust an operation and/or attitude of a functional component carried by the robotic arm end (see Fig. 9, steps S904-905 and 909-910; Claim 1 "...wherein at least one of said magnetic resonance imaging apparatus and said operation table is relatively movable to a position at which said magnetic resonance imaging apparatus can measure an area of said operation table, and said operating manipulator is movable with respect to said operation table."; Col. 4, line 66 - Col. 5, line 8; Col. 6, lines 35-61; Col. 8, lines 48-64; Col. 13, lines 10-63, especially "When the measurement by the MRI apparatus is selected, the equipment management/control device makes, on the basis of status information, the judgement of whether or not the MRI apparatus is under measurement (step S904). In the case where the MRI apparatus is under measurement, the measurement is continued, as it is (step S906). In the case where the MRI apparatus is not under measurement, the judgement of whether or not the surgical operating equipment is under use, is made on the basis of status information of the surgical operating equipment (step S905) ... On the other hand, when the use of the surgical operating equipment is selected, the judgement is made of whether or not the surgical operating equipment is under use (step S909). In the case where the surgical operating equipment is under use, the use is continued (step 911). In the case where the surgical operating equipment is not under use, the judgement is made of whether or not the MRI apparatus is under measurement (step S910).");
to control the target movement state of the surgical robot and the imaging device based on the initial movement state of the imaging device and/or the surgical robot (see Fig. 9, all; Col. 8, lines 48-64; Col. 13, lines 10-63, see the limitations below), the control module is configured to:
control a second movement state of the surgical robot based on a first movement state of the imaging device (see Fig. 9, steps S904-S907; Col. 8, lines 48-64; Col. 13, lines 10-63, see the limitation below), including:
in response to a determination that the imaging device is in movement, locking a moving part of the surgical robot to control the surgical robot to remain stationary based on the first movement state of the imaging device (see Fig. 9, steps S904-S907; Col. 8, lines 48-64 "During MRI measurement, the operating manipulators 4, the surgical operating equipments, the endoscope and the ultrasonic scanner are controlled such that they are placed under electrically stopped conditions and mechanically locked non-operative conditions, in order to avoid influence which may be exerted on the result of detection by the MRI apparatus. On the other hand, even if an MRI measurement request is indicated by the doctor under the operation of the operating manipulators 4, the surgical operating equipments, the endoscope and the ultrasonic scanner, the indication of "UNDER OPERATION" and a warning are displayed to advise the doctor to stop the devices under operation. The equipment management/control device 8 makes a control of permitting the measurement by the MRI apparatus after the stopped conditions of the abovementioned devices are detected."; Col. 13, lines 10-63, especially "In the case where the MRI apparatus is under measurement, the measurement is continued, as it is (step S906). In the case where the MRI apparatus is not under measurement, the judgement of whether or not the surgical operating equipment is under use, is made on the basis of status information of the surgical operating equipment (step S905). In the case where the surgical operating equipment is under use, the equipment management/control device displays the state thereof to call the doctor's attention and performs an equipment use stopping process (step S907). After the confirmation of the stoppage of the surgical operating, the start of measurement by the MRI apparatus is displayed and a measuring operation sets in (step S908). 
In the case where the surgical operating equipment is not under use, the use of the surgical operating equipment is immediately started ... According to this flow chart, it is possible to control the measurement by the MRI apparatus and the use of the surgical operating equipment exclusively from each other."); and/or
control the first movement state of the imaging device based on the second movement state of the surgical robot (see Fig. 9, steps S909-S912; Col. 13, lines 10-63, see the limitation below), including:
in response to a determination that the surgical robot is in movement, locking a moving part of the imaging device to control the imaging device to remain stationary based on the second movement state of the surgical robot (see Fig. 9, steps S909-S912; Col. 13, lines 10-63, especially "On the other hand, when the use of the surgical operating equipment is selected, the judgement is made of whether or not the surgical operating equipment is under use (step S909). In the case where the surgical operating equipment is under use, the use is continued (step 911). In the case where the surgical operating equipment is not under use, the judgement is made of whether or not the MRI apparatus is under measurement (step S910). If the MRI apparatus is under measurement, a measurement stopping operation sets in (step S912). Thereafter, the use of the surgical operating equipment is started (step S913) ... According to this flow chart, it is possible to control the measurement by the MRI apparatus and the use of the surgical operating equipment exclusively from each other.").
Regarding Claim 17
Kan teaches the system of claim 16 (as discussed above in claim 16),
further comprising: a display module, configured to receive control command information and movement status information output by the imaging device and/or the surgical robot and display the information in a display interface (see Fig. 1, monitors 51 and 61; Col. 5, lines 41-55 "The main console 5 is used by a main operator 100. The main console 5 is provided with an operation input device for operating the operating manipulators 4 and a display device for displaying a tomographic (or cross-sectional) image from the MRI apparatus 1, an image from the endoscope 2, an ultrasonic image and the other medical information. Further, an MRI image before a surgical operation and guidance information are displayed. A stereoscopic monitor 51 can display information from the MRI apparatus and a three-dimensional version of the ultrasonic image stereoscopically. Also, a sub-monitor 52 is a monitor for displaying the cross-sectional image or the ultrasonic image. A change-over between the images can be conducted using a pedal or the like without using a hand.").
Regarding Claim 20
Kan teaches a non-transitory computer readable storage medium, comprising at least one set of instructions, wherein when executed by at least one processor of a computer device (see all Figs.; Col. 2, lines 35-50), the at least one set of instructions directs the at least one processor to perform a method including:
obtaining an initial movement state of an imaging device and a surgical robot, the surgical robot being a robotic arm body for driving a robotic arm end in movement to control and/or adjust an operation and/or attitude of a functional component carried by the robotic arm end (see MRI apparatus 1 and operating manipulators 4 and/or "surgical operating equipment" in most Figs.; Fig. 9, steps S904-905 and 909-910; Claim 1 "...wherein at least one of said magnetic resonance imaging apparatus and said operation table is relatively movable to a position at which said magnetic resonance imaging apparatus can measure an area of said operation table, and said operating manipulator is movable with respect to said operation table."; Col. 4, line 66 - Col. 5, line 8; Col. 6, lines 35-61; Col 8, lines 48-64; Col. 13, lines 10-63, especially "When the measurement by the MRI apparatus is selected, the equipment management/control device makes, on the basis of status information, the judgement of whether or not the MRI apparatus is under measurement (step S904). In the case where the MRI apparatus is under measurement, the measurement is continued, as it is (step S906). In the case where the MRI apparatus is not under measurement, the judgement of whether or not the surgical operating equipment is under use, is made on the basis of status information of the surgical operating equipment (step S905) ... On the other hand, when the use of the surgical operating equipment is selected, the judgement is made of whether or not the surgical operating equipment is under use (step S909). In the case where the surgical operating equipment is under use, the use is continued (step 911). In the case where the surgical operating equipment is not under use, the judgement is made of whether or not the MRI apparatus is under measurement (step S910)."); and
controlling a target movement state of the surgical robot and the imaging device based on the initial movement state (see Fig. 9, all; Col. 8, lines 48-64; Col. 13, lines 10-63, see the limitations below), including:
controlling a second movement state of the surgical robot based on a first movement state of the imaging device (see Fig. 9, steps S904-S907; Col. 8, lines 48-64; Col. 13, lines 10-63, see the limitation below), including:
in response to a determination that the imaging device is in movement, locking a moving part of the surgical robot to control the surgical robot to remain stationary based on the first movement state of the imaging device (see Fig. 9, steps S904-S907; Col. 8, lines 48-64 "During MRI measurement, the operating manipulators 4, the surgical operating equipments, the endoscope and the ultrasonic scanner are controlled such that they are placed under electrically stopped conditions and mechanically locked non-operative conditions, in order to avoid influence which may be exerted on the result of detection by the MRI apparatus. On the other hand, even if an MRI measurement request is indicated by the doctor under the operation of the operating manipulators 4, the surgical operating equipments, the endoscope and the ultrasonic scanner, the indication of "UNDER OPERATION" and a warning are displayed to advise the doctor to stop the devices under operation. The equipment management/control device 8 makes a control of permitting the measurement by the MRI apparatus after the stopped conditions of the abovementioned devices are detected."; Col. 13, lines 10-63, especially "In the case where the MRI apparatus is under measurement, the measurement is continued, as it is (step S906). In the case where the MRI apparatus is not under measurement, the judgement of whether or not the surgical operating equipment is under use, is made on the basis of status information of the surgical operating equipment (step S905). In the case where the surgical operating equipment is under use, the equipment management/control device displays the state thereof to call the doctor's attention and performs an equipment use stopping process (step S907). After the confirmation of the stoppage of the surgical operating, the start of measurement by the MRI apparatus is displayed and a measuring operation sets in (step S908). 
In the case where the surgical operating equipment is not under use, the use of the surgical operating equipment is immediately started ... According to this flow chart, it is possible to control the measurement by the MRI apparatus and the use of the surgical operating equipment exclusively from each other."); and/or
controlling the first movement state of the imaging device based on the second movement state of the surgical robot (see Fig. 9, steps S909-S912; Col. 13, lines 10-63, see the limitation below), including:
in response to a determination that the surgical robot is in movement, locking a moving part of the imaging device to control the imaging device to remain stationary based on the second movement state of the surgical robot (see Fig. 9, steps S909-S912; Col. 13, lines 10-63, especially "On the other hand, when the use of the surgical operating equipment is selected, the judgement is made of whether or not the surgical operating equipment is under use (step S909). In the case where the surgical operating equipment is under use, the use is continued (step 911). In the case where the surgical operating equipment is not under use, the judgement is made of whether or not the MRI apparatus is under measurement (step S910). If the MRI apparatus is under measurement, a measurement stopping operation sets in (step S912). Thereafter, the use of the surgical operating equipment is started (step S913) ... According to this flow chart, it is possible to control the measurement by the MRI apparatus and the use of the surgical operating equipment exclusively from each other.").
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 5 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Kan as applied to claims 1 and 16 above, and further in view of Gregerson et al. (US 20160302871 A1, hereinafter "Gregerson").
Regarding Claim 5
Kan teaches the method of claim 1 (as discussed above in claim 1),
Kan is silent regarding wherein the controlling a second movement state of the surgical robot based on a first movement state of the imaging device includes:
determining a first movement trajectory of the imaging device based on the first movement state of the imaging device; and
controlling, based on the first movement trajectory, the surgical robot to move.
Gregerson teaches a method for controlling an image-guided interventional puncture device (see all Figs., especially Fig. 5; [0003]-[0004]), comprising:
obtaining an initial movement state of an imaging device and a surgical robot, the surgical robot being a robotic arm body for driving a robotic arm end in movement to control and/or adjust an operation and/or attitude of a functional component carried by the robotic arm end (see Figs. 1A-2C, imaging device 125 and robotic arms 101a-101b; Fig. 5, step 503; [0003]-[0004 "Embodiments may further include determining that the imaging device is moving with respect to the patient and moving the robotic arm..."], [0030]-[0031], [0035], [0050 "In block 503 of method 500, the controller 105 may determine that at least a portion of the imaging device 125 is moving with respect to the patient 103. For example, after obtaining imaging data of the patient 103, the gantry 40 of the imaging device 125 may be translated away from the surgical area as shown in FIGS. 1A-1C to provide easier access to the surgical area for performing a surgical procedure."]-[0055 "In embodiments, the controller 105 may determine the position of each of the robotic arms 101 a, 101 b in relation to the gantry 40 based on position data received from the imaging device 125 (e.g., indicating the translation and/or tilt position of the gantry 40 with respect to the base 20)."] and [0058 "In embodiments, when the motion tracking apparatus 129 detects a change in position or orientation of the end effector 121 with respect to the patient 103 by more than a threshold amount..."]); and
controlling a target movement state of the surgical robot and the imaging device based on the initial movement state (see Fig. 5, step 505; [0003]-[0004 "Embodiments may further include determining that the imaging device is moving with respect to the patient and moving the robotic arm to maintain the end effector in the pre-determined position and orientation with respect to the patient while preventing the robotic arm from colliding with the imaging device or with the patient while the imaging device moves with respect to the patient."], [0053 "In block 505 of method 500, the controller 105 may control the at least one robotic arm 101 a, 101 b to move a first portion of the at least one robotic arm 101 a, 101 b while the imaging device 125 moves with respect to the patient 103 while maintaining the end effector 121 of the arm in the pre-determined position and orientation (e.g., vector 409) with respect to the patient 103."]-[0055] and [0058 "In embodiments, when the motion tracking apparatus 129 detects a change in position or orientation of the end effector 121 with respect to the patient 103 by more than a threshold amount, the motion tracking apparatus 129 may send a message to the imaging system 125 and the controller 105 of the robotic arm(s) to stop all motion of the system 100."]), including:
controlling a second movement state of the surgical robot based on a first movement state of the imaging device (see Fig. 5, step 505; [0003]-[0004 "Embodiments may further include determining that the imaging device is moving with respect to the patient and moving the robotic arm to maintain the end effector in the pre-determined position and orientation with respect to the patient while preventing the robotic arm from colliding with the imaging device or with the patient while the imaging device moves with respect to the patient."] and [0053 "In block 505 of method 500, the controller 105 may control the at least one robotic arm 101 a, 101 b to move a first portion of the at least one robotic arm 101 a, 101 b while the imaging device 125 moves with respect to the patient 103 while maintaining the end effector 121 of the arm in the pre-determined position and orientation (e.g., vector 409) with respect to the patient 103."]-[0055]), including:
controlling the first movement state of the imaging device based on the second movement state of the surgical robot (see claim 26; [0058 "In embodiments, when the motion tracking apparatus 129 detects a change in position or orientation of the end effector 121 with respect to the patient 103 by more than a threshold amount, the motion tracking apparatus 129 may send a message to the imaging system 125 and the controller 105 of the robotic arm(s) to stop all motion of the system 100."]), including:
in response to a determination that the surgical robot is in movement, controlling the imaging device to remain stationary based on the second movement state of the surgical robot (see claim 26; [0058 "In embodiments, when the motion tracking apparatus 129 detects a change in position or orientation of the end effector 121 with respect to the patient 103 by more than a threshold amount, the motion tracking apparatus 129 may send a message to the imaging system 125 and the controller 105 of the robotic arm(s) to stop all motion of the system 100."]);
wherein the controlling a second movement state of the surgical robot based on a first movement state of the imaging device includes:
determining a first movement trajectory of the imaging device based on the first movement state of the imaging device (see Fig. 5, step 503; [0004 "Embodiments may further include determining that the imaging device is moving with respect to the patient..."], [0035], [0050 "In block 503 of method 500, the controller 105 may determine that at least a portion of the imaging device 125 is moving with respect to the patient 103. For example, after obtaining imaging data of the patient 103, the gantry 40 of the imaging device 125 may be translated away from the surgical area as shown in FIGS. 1A-1C to provide easier access to the surgical area for performing a surgical procedure."]-[0052] and [0055 "In embodiments, the controller 105 may determine the position of each of the robotic arms 101 a, 101 b in relation to the gantry 40 based on position data received from the imaging device 125 (e.g., indicating the translation and/or tilt position of the gantry 40 with respect to the base 20)."]); and
controlling, based on the first movement trajectory, the surgical robot to move (see Fig. 5, step 505; [0003]-[0004 "Embodiments may further include determining that the imaging device is moving with respect to the patient and moving the robotic arm to maintain the end effector in the pre-determined position and orientation with respect to the patient while preventing the robotic arm from colliding with the imaging device or with the patient while the imaging device moves with respect to the patient."] and [0053 "In block 505 of method 500, the controller 105 may control the at least one robotic arm 101 a, 101 b to move a first portion of the at least one robotic arm 101 a, 101 b while the imaging device 125 moves with respect to the patient 103 while maintaining the end effector 121 of the arm in the pre-determined position and orientation (e.g., vector 409) with respect to the patient 103."]-[0055]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the process of Kan to further determine a first movement trajectory of the imaging device based on the first movement state of the imaging device and control the surgical robot to move based on the first movement trajectory, as taught by Gregerson, in order to maintain an end effector of the surgical robot at a predetermined pose with respect to a patient.
Regarding Claim 19
Kan teaches the system of claim 16 (as discussed above in claim 16),
further comprising:
a second transmission channel, configured to transmit first movement state information of the imaging device to the surgical robot, and/or transmit second movement state information of the surgical robot to the imaging device (see Fig. 9, all; Col. 8, lines 48-64; Col. 13, lines 10-63).
Kan is silent regarding a first transmission channel, configured to transmit the image data obtained by the imaging device to the surgical robot to cause the surgical robot to perform the puncture operation based on the image data.
Gregerson teaches a system for controlling an image-guided interventional puncture device (see all Figs., especially Fig. 5; [0003]-[0004]), comprising:
an imaging device, configured to obtain image data of a target object (see Figs. 1A-2C, imaging device 125; [0003] and [0030]); and
a surgical robot, configured to perform a puncture operation (see Figs. 1A-2C, robotic arms 101a-101b; Fig. 4, all; [0003], [0029]-[0030] and [0045 "The surgeon/clinician may identify and select at least one target point 403 in the displayed image 401. The target point 403 may represent an end point for the insertion of a particular surgical tool 122 into the patient's body during a surgical procedure. The surgeon/clinician may also identify and select at least one entrance point 405 on the displayed image 401. The entrance point 405 may represent a point on the exterior of the patient's body (e.g., the skin) through which the surgeon will insert the particular surgical tool 122. "]); and
a control module, configured to control a target movement state of the surgical robot and the imaging device based on an initial movement state of the imaging device and the surgical robot, wherein the surgical robot being a robotic arm body for driving a robotic arm end in movement to control and/or adjust an operation and/or attitude of a functional component carried by the robotic arm end (see Fig. 5, steps 503-505; [0003]-[0004 "Embodiments may further include determining that the imaging device is moving with respect to the patient and moving the robotic arm to maintain the end effector in the pre-determined position and orientation with respect to the patient while preventing the robotic arm from colliding with the imaging device or with the patient while the imaging device moves with respect to the patient."], [0053 "In block 505 of method 500, the controller 105 may control the at least one robotic arm 101 a, 101 b to move a first portion of the at least one robotic arm 101 a, 101 b while the imaging device 125 moves with respect to the patient 103 while maintaining the end effector 121 of the arm in the pre-determined position and orientation (e.g., vector 409) with respect to the patient 103."]-[0055] and [0058 "In embodiments, when the motion tracking apparatus 129 detects a change in position or orientation of the end effector 121 with respect to the patient 103 by more than a threshold amount, the motion tracking apparatus 129 may send a message to the imaging system 125 and the controller 105 of the robotic arm(s) to stop all motion of the system 100."]);
to control the target movement state of the surgical robot and the imaging device based on the initial movement state of the imaging device and/or the surgical robot (see Fig. 5, step 505; [0003]-[0004 "Embodiments may further include determining that the imaging device is moving with respect to the patient and moving the robotic arm to maintain the end effector in the pre-determined position and orientation with respect to the patient while preventing the robotic arm from colliding with the imaging device or with the patient while the imaging device moves with respect to the patient."], [0053 "In block 505 of method 500, the controller 105 may control the at least one robotic arm 101 a, 101 b to move a first portion of the at least one robotic arm 101 a, 101 b while the imaging device 125 moves with respect to the patient 103 while maintaining the end effector 121of the arm in the pre-determined position and orientation (e.g., vector 409) with respect to the patient 103."]-[0055] and [0058 "In embodiments, when the motion tracking apparatus 129 detects a change in position or orientation of the end effector 121 with respect to the patient 103 by more than a threshold amount, the motion tracking apparatus 129 may send a message to the imaging system 125 and the controller 105 of the robotic arm(s) to stop all motion of the system 100."]), the control module is configured to:
control a second movement state of the surgical robot based on a first movement state of the imaging device (see Fig. 5, step 505; [0003]-[0004 "Embodiments may further include determining that the imaging device is moving with respect to the patient and moving the robotic arm to maintain the end effector in the pre-determined position and orientation with respect to the patient while preventing the robotic arm from colliding with the imaging device or with the patient while the imaging device moves with respect to the patient."] and [0053 "In block 505 of method 500, the controller 105 may control the at least one robotic arm 101 a, 101 b to move a first portion of the at least one robotic arm 101 a, 101 b while the imaging device 125 moves with respect to the patient 103 while maintaining the end effector 121 of the arm in the pre-determined position and orientation (e.g., vector 409) with respect to the patient 103."]-[0055]), including:
control the first movement state of the imaging device based on the second movement state of the surgical robot (see claim 26; [0058 "In embodiments, when the motion tracking apparatus 129 detects a change in position or orientation of the end effector 121 with respect to the patient 103 by more than a threshold amount, the motion tracking apparatus 129 may send a message to the imaging system 125 and the controller 105 of the robotic arm(s) to stop all motion of the system 100."]), including:
in response to a determination that the surgical robot is in movement, control the imaging device to remain stationary based on the second movement state of the surgical robot (see claim 26; [0058 "In embodiments, when the motion tracking apparatus 129 detects a change in position or orientation of the end effector 121 with respect to the patient 103 by more than a threshold amount, the motion tracking apparatus 129 may send a message to the imaging system 125 and the controller 105 of the robotic arm(s) to stop all motion of the system 100."]);
further comprising:
a first transmission channel, configured to transmit the image data obtained by the imaging device to the surgical robot to cause the surgical robot to perform the puncture operation based on the image data (see Fig. 4, all; [0029]-[0030] and [0045 "The surgeon/clinician may identify and select at least one target point 403 in the displayed image 401. The target point 403 may represent an end point for the insertion of a particular surgical tool 122 into the patient's body during a surgical procedure. The surgeon/clinician may also identify and select at least one entrance point 405 on the displayed image 401. The entrance point 405 may represent a point on the exterior of the patient's body (e.g., the skin) through which the surgeon will insert the particular surgical tool 122. "]); and
a second transmission channel, configured to transmit first movement state information of the imaging device to the surgical robot, and/or transmit second movement state information of the surgical robot to the imaging device (see [0045 "In embodiments, the surgeon may select the entrance point 405 and the trajectory 407 within the patient's body in order to facilitate the insertion of the surgical tool 122 to the target point 403 while minimizing damage to other tissue or organs of the patient 103. As also shown in FIG. 4, the trajectory 407 may also be extended outside of the patient's body to define a unique vector 409 in three-dimensional space extending from the selected entrance point 405, as indicated by the dashed-dotted line in FIG. 4."]-[0049]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the system of Kan to further include a first transmission channel, configured to transmit the image data obtained by the imaging device to the surgical robot to cause the surgical robot to perform the puncture operation based on the image data, as taught by Gregerson, in order to automatically perform an operation based on the image data.
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Kan (as modified by Gregerson) as applied to claim 5 above, and further in view of Weir (US 20170189126 A1, hereinafter "Weir") and Mantri et al. (US 20210401521 A1, hereinafter "Mantri").
Regarding Claim 6
Modified Kan teaches the method of claim 5 (as discussed above in claim 5).
Kan is silent regarding wherein the controlling, based on the first movement trajectory, the surgical robot to move includes:
predicting a distance between the imaging device and the surgical robot based on the first movement trajectory of the imaging device; and
in response to the distance being less than a distance threshold, controlling both the imaging device and the surgical robot to remain stationary.
Weir teaches a method for controlling an image-guided interventional puncture device (see all Figs.; [0005]), comprising:
obtaining an initial movement state of an imaging device and a surgical robot, the surgical robot being a robotic arm body for driving a robotic arm end in movement to control and/or adjust an operation and/or attitude of a functional component carried by the robotic arm end (see "first movable arm" and "second movable arm" in most Figs. and the following paragraphs, especially Fig. 11; [0005 "...can have a second mode in which the robotic surgical system is configured to control movement of the first arm in coordination with the second arm such that the first arm and the first surgical instrument removably coupled and the second arm and the second surgical instrument removably coupled thereto to simultaneously move in coordination."]-[0007 "In another embodiment, a surgical system is provided that includes an electromechanical arm, a camera, and a controller. The electromechanical arm can be configured to removably couple to a surgical instrument. The electromechanical arm can be configured to move so as to move the surgical instrument removably coupled thereto relative to a patient on which a surgical procedure is being performed. The camera can have a field of view."], [0069 " In an exemplary embodiment, first and second surgical instruments can be coupled to the robotic surgical system, with the first surgical instrument including a camera..."] and [0111 "In an exemplary embodiment, at least one of the surgical instruments 608 a, 608 b, 608N can include a camera having a field of view and configured to visualize a surgical area such that the robotic surgical system 600 can be coupled to at least one camera."]-[0113]); and
controlling a target movement state of the surgical robot and the imaging device based on the initial movement state (see [0005 "...can have a first mode in which the robotic surgical system is configured to control movement of the first arm such that the first arm and the first surgical instrument removably coupled thereto move relative to the second arm and the second surgical instrument removably coupled thereto, can have a second mode in which the robotic surgical system is configured to control movement of the first arm in coordination with the second arm such that the first arm and the first surgical instrument removably coupled and the second arm and the second surgical instrument removably coupled thereto to simultaneously move in coordination. The controller can be configured to cause the coordinated movement."] and [0069 "In an exemplary embodiment, first and second surgical instruments can be coupled to the robotic surgical system, with the first surgical instrument including a camera. The robotic surgical system can thus be configured to allow the camera to follow movement of the second surgical instrument, e.g., the camera is selected as the follower instrument and the second surgical instrument is selected as the master instrument, which can help maintain visualization of the second surgical instrument during movement thereof. The robotic surgical system can also thus be configured to allow the second surgical instrument to follow movement of the camera, e.g., the second surgical instrument is selected as the follower instrument and the camera is selected as the master instrument, which can allow the second surgical instrument, when visualized by the camera, to remain within the camera's vision during the camera's movement."]), including:
controlling the first movement state of the imaging device based on the second movement state of the surgical robot, including:
in response to a determination that the surgical robot is in movement, locking a moving part of the imaging device to control the imaging device to remain stationary based on the second movement state of the surgical robot (see [0009 "The first sensor can be configured to detect an impending collision between the first and second arms by determining when the second arm is within a threshold minimum distance of the first arm. The controller can be configured to trigger performance of a remedial action in response to the detected impending collision."]-[0010 "For still another example, triggering performance of the remedial action can include stopping movement of the first arm."], [0018] and [0021]-[0022]);
wherein the controlling, based on the first movement trajectory, the surgical robot to move includes:
predicting a distance between the imaging device and the surgical robot based on the first movement trajectory of the imaging device (see [0009 "The first sensor can be configured to detect an impending collision between the first and second arms by determining when the second arm is within a threshold minimum distance of the first arm. The controller can be configured to trigger performance of a remedial action in response to the detected impending collision."]-[0010], [0014], [0021]-[0022] and [0123]-[0124]); and
in response to the distance being less than a distance threshold, controlling the imaging device or the surgical robot to remain stationary (see [0009 "The first sensor can be configured to detect an impending collision between the first and second arms by determining when the second arm is within a threshold minimum distance of the first arm. The controller can be configured to trigger performance of a remedial action in response to the detected impending collision."]-[0010 "For still another example, triggering performance of the remedial action can include stopping movement of the first arm."], [0018] and [0021]-[0022]).
Although it may be implied, Weir does not explicitly teach controlling both the imaging device and the surgical robot to remain stationary.
Mantri teaches a method for controlling an image-guided interventional puncture device (see all Figs.; [0011]), comprising:
obtaining an initial movement state of an imaging device and a surgical robot, the surgical robot being a robotic arm body for driving a robotic arm end in movement to control and/or adjust an operation and/or attitude of a functional component carried by the robotic arm end (see all Figs.; [0011]-[0012 "The treatment probe may be coupled to a first robotic arm configured to provide computer-controlled movement of the treatment probe during tissue resection with the treatment probe. The imaging probe may be coupled to a second robotic arm configured to provide computer-controlled movement of the imaging probe during scanning of the target site with the imaging probe, before and/or during the tissue resection procedure with the treatment probe"], [0013] and [0070 "In some embodiments, the imaging probe and the treatment probe are aligned so that the treatment probe is within the field of view of the imaging probe. In some embodiments, the alignment is configured to maintain the treatment probe within a field of view of the imaging probe. In some embodiments, the treatment probe is configured to move to a position and the imaging probe is configured to maintain the treatment probe within the field of view."]-[0073]); and
controlling a target movement state of the surgical robot and the imaging device based on the initial movement state (see [0011]-[0013] and [0070 "In some embodiments, the imaging probe and the treatment probe are aligned so that the treatment probe is within the field of view of the imaging probe. In some embodiments, the alignment is configured to maintain the treatment probe within a field of view of the imaging probe. In some embodiments, the treatment probe is configured to move to a position and the imaging probe is configured to maintain the treatment probe within the field of view."]-[0071 "In some embodiments, the monitoring of the position and orientation of the treatment probe by the imaging probe can generate a signal to a computing device to stop movement of one or both probes or to alter the position, location or orientation of one or both probes to prevent a collision between the probes while inside a patient's body and/or to prevent harm to a patient's tissue or organs. In some embodiments, monitoring of the position and orientation of the treatment probe by the imaging probe can cause the system to stop motion of one or both probes and generate an alert to the physician to prevent harm to the patient."]), including:
controlling a second movement state of the surgical robot based on a first movement state of the imaging device (see [0011]-[0013] and [0071 "In some embodiments, the monitoring of the position and orientation of the treatment probe by the imaging probe can generate a signal to a computing device to stop movement of one or both probes or to alter the position, location or orientation of one or both probes to prevent a collision between the probes while inside a patient's body and/or to prevent harm to a patient's tissue or organs. In some embodiments, monitoring of the position and orientation of the treatment probe by the imaging probe can cause the system to stop motion of one or both probes and generate an alert to the physician to prevent harm to the patient."]-[0072 "In some embodiments, one or more of computer vision, image recognition, or a trained machine learning model may be used to assist the system to recognize when one or both probes are too close to each other or to tissue or an organ of a patient."]), including:
in response to a determination that the imaging device is in movement, control the surgical robot to remain stationary based on the first movement state of the imaging device (see [0011]-[0013] and [0071 "In some embodiments, the monitoring of the position and orientation of the treatment probe by the imaging probe can generate a signal to a computing device to stop movement of one or both probes or to alter the position, location or orientation of one or both probes to prevent a collision between the probes while inside a patient's body and/or to prevent harm to a patient's tissue or organs. In some embodiments, monitoring of the position and orientation of the treatment probe by the imaging probe can cause the system to stop motion of one or both probes and generate an alert to the physician to prevent harm to the patient."]-[0072 "In some embodiments, one or more of computer vision, image recognition, or a trained machine learning model may be used to assist the system to recognize when one or both probes are too close to each other or to tissue or an organ of a patient."]); and/or
controlling the first movement state of the imaging device based on the second movement state of the surgical robot (see [0011]-[0013] and [0071 "In some embodiments, the monitoring of the position and orientation of the treatment probe by the imaging probe can generate a signal to a computing device to stop movement of one or both probes or to alter the position, location or orientation of one or both probes to prevent a collision between the probes while inside a patient's body and/or to prevent harm to a patient's tissue or organs. In some embodiments, monitoring of the position and orientation of the treatment probe by the imaging probe can cause the system to stop motion of one or both probes and generate an alert to the physician to prevent harm to the patient."]-[0073]), including:
in response to a determination that the surgical robot is in movement, control the imaging device to remain stationary based on the second movement state of the surgical robot (see [0011]-[0013] and [0071 "In some embodiments, the monitoring of the position and orientation of the treatment probe by the imaging probe can generate a signal to a computing device to stop movement of one or both probes or to alter the position, location or orientation of one or both probes to prevent a collision between the probes while inside a patient's body and/or to prevent harm to a patient's tissue or organs. In some embodiments, monitoring of the position and orientation of the treatment probe by the imaging probe can cause the system to stop motion of one or both probes and generate an alert to the physician to prevent harm to the patient."]-[0073]);
wherein the controlling, based on the first movement trajectory, the surgical robot to move includes:
predicting a distance between the imaging device and the surgical robot based on the first movement trajectory of the imaging device (see [0072 "In some embodiments, one or more of computer vision, image recognition, or a trained machine learning model may be used to assist the system to recognize when one or both probes are too close to each other or to tissue or an organ of a patient."]); and
in response to the distance being less than a distance threshold, controlling both the imaging device and the surgical robot to remain stationary (see [0071]-[0073], especially [0073 "In these and other embodiments, stopping motion of one or both probes or changing the position, location, or orientation of one or both probes may be implemented by controlling one or both robotic arms."]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the process of Kan to further predict a distance between the imaging device and the surgical robot based on the first movement trajectory of the imaging device and in response to the distance being less than a distance threshold, control both the imaging device and the surgical robot to remain stationary, as taught by Weir and Mantri, in order to prevent a collision between the imaging device and surgical robot.
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Kan as applied to claim 1 above, and further in view of Mantri.
Regarding Claim 9
Kan teaches the method of claim 1 (as discussed above in claim 1).
Kan is silent regarding further comprising: controlling a movement speed of the surgical robot and the imaging device based on environmental information.
Mantri teaches further comprising: controlling a movement speed of the surgical robot and the imaging device based on environmental information (see [0011]-[0013] and [0071 "In some embodiments, the monitoring of the position and orientation of the treatment probe by the imaging probe can generate a signal to a computing device to stop movement of one or both probes or to alter the position, location or orientation of one or both probes to prevent a collision between the probes while inside a patient's body and/or to prevent harm to a patient's tissue or organs. In some embodiments, monitoring of the position and orientation of the treatment probe by the imaging probe can cause the system to stop motion of one or both probes and generate an alert to the physician to prevent harm to the patient."]-[0073]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the process of Kan to further control a movement speed of the surgical robot and the imaging device based on environmental information, as taught by Mantri, in order to prevent a collision between the surgical robot and imaging device.
Claims 10-11 and 21-22 are rejected under 35 U.S.C. 103 as being unpatentable over Kan as applied to claims 1 and 11 above, and further in view of Tan et al. (US 20220241975 A1, hereinafter "Tan").
Regarding Claim 10
Kan teaches the method of claim 1 (as discussed above in claim 1).
Kan is silent regarding further comprising:
obtaining a first end signal generated by the imaging device at an end of a current preset process, or a second end signal generated by the surgical robot at the end of the current preset process; and
controlling a movement state of the imaging device and/or the surgical robot in a next process based on the first end signal or the second end signal.
Tan teaches a method (see all Figs.; [0008]), comprising:
obtaining an initial movement state of an imaging device and a robot, the robot being a robotic arm body for driving a robotic arm end in movement to control and/or adjust an operation and/or attitude of a functional component carried by the robotic arm end (see [0008], [0043 "In the illustrated embodiment, the first robotic machine may be different than the second robotic machine, and has at least some different capabilities than the first robotic machine … The first robotic machine also has the capabilities to grasp and manipulate a target object 132 on a designated equipment, such as the first equipment, using a robotic arm 210. The robotic arm may have the capabilities to rotate, tilt, lift, extend, retract, push, and/or pull the target object 132. The first robotic machine may be referred to herein as a grasping robotic machine."]-[0044 "The second robotic machine in FIG. 1 may be referred to as an aerial robotic machine. The aerial robotic machine may include an imaging device 150 that may be configured to generate imaging data ... Using the imaging device, the aerial robotic machine has the capability to visually inspect designated equipment, including a target object thereof, such as to determine a position or status of the target object."] and [0080 "At step 428, the first robotic machine provides a status notification to the second robotic machine. The status notification may be a message communicated wirelessly as electromagnetic RF signals from the communication circuit of the first robotic machine to the communication circuit of the second robotic machine. The second robotic machine receives the status notification at step 434. The status notification may inform the second robotic machine that the first robotic machine has started or completed a specific sub-task in the first sequence."]-[0082]); and
controlling a target movement state of the robot and the imaging device based on the initial movement state (see Fig. 9, all; [0075 "For example, the first sequence may specify an order that the sub-tasks are to be performed relative to each other and relative to the sub-tasks in the second sequence to be performed by the second robotic machine. Thus, the first sequence may specify that after completing a given sub-task, the first robotic machine is to wait until receiving a notification from the second robotic machine that a specific sub-task in the second sequence has been completed before starting a subsequent sub-task in the first sequence."] and [0080]-[0082 "At steps 436 and 438, respectively, the first and second robotic machines complete the performances of the first and second sequences of sub-tasks."]);
further comprising:
obtaining a first end signal generated by the imaging device at an end of a current preset process, or a second end signal generated by the robot at the end of the current preset process (see Fig. 4, steps 428-434; [0080 "At step 428, the first robotic machine provides a status notification to the second robotic machine. The status notification may be a message communicated wirelessly as electromagnetic RF signals from the communication circuit of the first robotic machine to the communication circuit of the second robotic machine. The second robotic machine receives the status notification at step 434. The status notification may inform the second robotic machine that the first robotic machine has started or completed a specific sub-task in the first sequence."]-[0081 "At step 430, the second robotic machine provides a status notification to the first robotic machine. The status notification from the second robotic machine may be similar in form and/or function to the status notification sent from the first robotic machine at step 428. The first robotic machine receives the status notification from the second robotic machine at step 432."]); and
controlling a movement state of the imaging device and/or the robot in a next process based on the first end signal or the second end signal (see Fig. 4, steps 436-438; [0075 "Thus, the first sequence may specify that after completing a given sub-task, the first robotic machine is to wait until receiving a notification from the second robotic machine that a specific sub-task in the second sequence has been completed before starting a subsequent sub-task in the first sequence."] and [0080 "The second robotic machine processes the received status notification and may use the status notification to determine when to start performing certain sub-tasks in the second sequence. For example, at least some of the sub-tasks in the first and second sequences may be sequential, such that the second robotic machine may begin performance of a corresponding sub-task in the second sequence responsive to receiving the notification from the first robotic machine that the first robotic machine has completed a specific sub-task in the first sequence."]-[0082]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the process of Kan to obtain a first end signal generated by the imaging device at an end of a current preset process, or a second end signal generated by the surgical robot at the end of the current preset process and control a movement state of the imaging device and/or the surgical robot in a next process based on the first end signal or the second end signal, as taught by Tan, in order to coordinate sub-tasks between the imaging device and surgical robot to execute a main-task in an optimal manner.
Regarding Claim 11
Modified Kan teaches the method of claim 10 (as discussed above in claim 10),
Kan is silent regarding wherein the controlling a movement state of the imaging device and the surgical robot in a next process based on the first end signal or the second end signal includes:
controlling the imaging device to remain stationary, and/or be released from a stationary state of the surgical robot, based on the first end signal; and
controlling the surgical robot to remain stationary, and/or be released from a stationary state of the imaging device, based on the second end signal.
Tan teaches wherein the controlling a movement state of the imaging device and the robot in a next process based on the first end signal or the second end signal includes:
controlling the imaging device to remain stationary, and/or be released from a stationary state of the robot, based on the first end signal (see Fig. 4, steps 428-438; [0075 "Thus, the first sequence may specify that after completing a given sub-task, the first robotic machine is to wait until receiving a notification from the second robotic machine that a specific sub-task in the second sequence has been completed before starting a subsequent sub-task in the first sequence."] and [0080 "At step 428, the first robotic machine provides a status notification to the second robotic machine. The status notification may be a message communicated wirelessly as electromagnetic RF signals from the communication circuit of the first robotic machine to the communication circuit of the second robotic machine. The second robotic machine receives the status notification at step 434. The status notification may inform the second robotic machine that the first robotic machine has started or completed a specific sub-task in the first sequence. The second robotic machine processes the received status notification and may use the status notification to determine when to start performing certain sub-tasks in the second sequence. For example, at least some of the sub-tasks in the first and second sequences may be sequential, such that the second robotic machine may begin performance of a corresponding sub-task in the second sequence responsive to receiving the notification from the first robotic machine that the first robotic machine has completed a specific sub-task in the first sequence."]-[0082]); and
controlling the robot to remain stationary, and/or be released from a stationary state of the imaging device, based on the second end signal (see Fig. 4, steps 428-438; [0075 "Thus, the first sequence may specify that after completing a given sub-task, the first robotic machine is to wait until receiving a notification from the second robotic machine that a specific sub-task in the second sequence has been completed before starting a subsequent sub-task in the first sequence."] and [0080 "At step 428, the first robotic machine provides a status notification to the second robotic machine. The status notification may be a message communicated wirelessly as electromagnetic RF signals from the communication circuit of the first robotic machine to the communication circuit of the second robotic machine. The second robotic machine receives the status notification at step 434. The status notification may inform the second robotic machine that the first robotic machine has started or completed a specific sub-task in the first sequence. The second robotic machine processes the received status notification and may use the status notification to determine when to start performing certain sub-tasks in the second sequence. For example, at least some of the sub-tasks in the first and second sequences may be sequential, such that the second robotic machine may begin performance of a corresponding sub-task in the second sequence responsive to receiving the notification from the first robotic machine that the first robotic machine has completed a specific sub-task in the first sequence."]-[0081 "At step 430, the second robotic machine provides a status notification to the first robotic machine. The status notification from the second robotic machine may be similar in form and/or function to the status notification sent from the first robotic machine at step 428."]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the process of modified Kan to control the imaging device to remain stationary and/or be released from a stationary state of the surgical robot, based on the first end signal and control the surgical robot to remain stationary and/or be released from a stationary state of the imaging device, based on the second end signal, as taught by Tan, in order to coordinate sub-tasks between the imaging device and surgical robot to execute a main-task in an optimal manner.
Regarding Claim 21
Modified Kan teaches the method of claim 11 (as discussed above in claim 11),
Kan is silent regarding further comprising:
in response to the current preset process and the next process being executed by a same device, maintaining the imaging device or the surgical robot in an unlocked state after receiving the first end signal or the second end signal, so as to continuously execute the next process.
Tan teaches further comprising:
in response to the current preset process and the next process being executed by a same device, maintaining the imaging device or the robot in an unlocked state after receiving the first end signal or the second end signal, so as to continuously execute the next process (see Fig. 4, steps 428-438; [0075 "Thus, the first sequence may specify that after completing a given sub-task, the first robotic machine is to wait until receiving a notification from the second robotic machine that a specific sub-task in the second sequence has been completed before starting a subsequent sub-task in the first sequence."] and [0080 "The status notification may inform the second robotic machine that the first robotic machine has started or completed a specific sub-task in the first sequence. The second robotic machine processes the received status notification and may use the status notification to determine when to start performing certain sub-tasks in the second sequence. For example, at least some of the sub-tasks in the first and second sequences may be sequential, such that the second robotic machine may begin performance of a corresponding sub-task in the second sequence responsive to receiving the notification from the first robotic machine that the first robotic machine has completed a specific sub-task in the first sequence. Other sub-tasks in the first and second sequences may be performed concurrently by the first and second robotic machines, such that the time period that the first robotic machine performs a given sub-task in the first sequence at least partially overlaps the time period that the second robotic machine performs a given sub-task in the second sequence."]-[0082]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the process of modified Kan to maintain the imaging device or the surgical robot in an unlocked state after receiving the first end signal or the second end signal, so as to continuously execute the next process, as taught by Tan, in order to coordinate sub-tasks between the imaging device and surgical robot to execute a main-task in an optimal manner.
Regarding Claim 22
Modified Kan teaches the method of claim 21 (as discussed above in claim 21),
Kan further teaches wherein the current preset process and the next process correspond to work tasks of individual stages in a complete preset process, the complete preset process being determined based on patient information, and the work tasks of the individual stages being executed separately by the imaging device or the surgical robot (see Col. 6, lines 35-60 "Prior to a surgical operation in the present embodiment, the patient is fixed on the operating table 10 at a position taken out of the area made the object of measurement by the MRI apparatus and the supporting device 9 is moved to the optimum position for a surgical operating position of the affected part and is fixed to the operating table (see FIG. 3). Next, a body surface tissue is lifted for ensuring a surgical operating field of the affected part. This is performed in a manner that a plurality of holes are provided through the body surface (mainly, abdominal wall or the like) and abdominal wall lifting members provided in the supporting device 9 are inserted into the holes to lift the abdominal wall above. The abdominal wall is fixed in the lifted state. Further, the endoscope 2, the ultrasonic scanner 3 and the operating manipulators are inserted from small holes which are similarly provided in the body surface."; Col. 9, lines 14-25).
Tan additionally teaches wherein the current preset process and the next process correspond to work tasks of individual stages in a complete preset process, and the work tasks of the individual stages being executed separately by the imaging device or the robot (see Fig. 4, all; [0075]-[0082]).
Claims 12-13 are rejected under 35 U.S.C. 103 as being unpatentable over Kan (as modified by Weir) as applied to claim 1 above, and further in view of Balicki (US 20230339109 A1 and Balicki hereinafter).
Regarding Claim 12
Kan teaches the method of claim 1 (as discussed above in claim 1),
Kan is silent regarding further comprising: controlling, based on an access request from the surgical robot or the imaging device, the imaging device and the surgical robot to get into an integral working mode, wherein movement states of the imaging device and the surgical robot are mutually associated in the integral working mode.
Weir teaches further comprising: controlling, based on an access request, the imaging device and the surgical robot to get into an integral working mode, wherein movement states of the imaging device and the surgical robot are mutually associated in the integral working mode (see [0005 "...can have a second mode in which the robotic surgical system is configured to control movement of the first arm in coordination with the second arm such that the first arm and the first surgical instrument removably coupled and the second arm and the second surgical instrument removably coupled thereto to simultaneously move in coordination."]-[0006], [0023 "The surgical method can include, in response to the received second user input, moving the first electromechanical arm in a second mode of the robotic surgical system in which the other movement of the first electromechanical arm causes corresponding movement of the second electromechanical arm such that the relative position of the first and second electromechanical arms is maintained."] and [0179]-[0180 "The robotic surgical system can be configured to selectively move between the first and second modes in response to a user input to the robotic surgical system, e.g., an input via the user input device such as a pressing of a “mode” button thereon."]).
Although it may be implied, Weir does not explicitly teach the access request from the surgical robot or the imaging device.
Balicki teaches a method for controlling an image-guided interventional puncture device (see all Figs.; [0005]), comprising:
obtaining an initial movement state of a surgical robot, the surgical robot being a robotic arm body for driving a robotic arm end in movement to control and/or adjust an operation and/or attitude of a functional component carried by the robotic arm end (see Fig. 7, all; [0005 "...analyze the temporal force/torque data to determine at least one of a current intention of the user and a state of the collaborative procedure…"] and [0085]-[0087]); and
controlling a target movement state of the surgical robot based on the initial movement state (see Fig. 7, steps 704-706; [0005 "...analyze the temporal force/torque data to determine at least one of a current intention of the user and a state of the collaborative procedure, and cause the robot controller to control the robotic arm in a control mode which is predefined for the determined current intention of the user or state of the collaborative procedure, wherein the control mode determines at least one robot control parameter."] and [0087]-[0096]);
further comprising: controlling, based on an access request from the surgical robot, the surgical robot to get into an integral working mode (see [0081]-[0096], especially [0088] "In some embodiments, in an operation 716, system controller 300 may alert a user to the fact that system controller 300 has a pending control mode switch request. System controller 300 may request that the user (e.g., surgeon 10) confirm or approve the control mode switch request. In some embodiments, operation 716 may only be executed for some specific procedures, but skipped for other procedures.").
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the process of Kan to further control, based on an access request from the surgical robot or the imaging device, the imaging device and the surgical robot to get into an integral working mode, wherein movement states of the imaging device and the surgical robot are mutually associated in the integral working mode, as taught by Weir and Balicki, in order to maintain a relative position between the imaging device and the surgical robot.
Regarding Claim 13
Modified Kan teaches the method of claim 12 (as discussed above in claim 12),
Kan is silent regarding further comprising:
obtaining an interrupt request sent by the imaging device or the surgical robot; and
controlling, based on the interrupt request, the imaging device and the surgical robot to get into an independent working mode, wherein the movement states of the imaging device and the surgical robot are independent of each other in the independent working mode.
Weir teaches further comprising:
obtaining an interrupt request sent by the imaging device or the surgical robot (see [0005]-[0006 "For yet another example, the robotic surgical system can be configured to switch between the first and second modes in response to a user command to switch modes."] and [0179]-[0180 "The robotic surgical system can be configured to selectively move between the first and second modes in response to a user input to the robotic surgical system, e.g., an input via the user input device such as a pressing of a “mode” button thereon ... In at least some embodiments, in addition to the robotic surgical system being configured to selectively move at least from the first mode to second mode in response to the user input, the robotic surgical system can be configured to automatically move from the second mode to the first mode. The automatic movement can occur after the robotic surgical system receives the user input requesting movement of the surgical instrument such that the next user input requesting movement of the surgical instrument will move only that surgical instrument unless the user manually provides another input moving from the first mode to the second mode. The automatic movement from the second mode to the first mode may help prevent inadvertent coordinated movement of surgical instruments, which may be more likely to occur when user inputs requesting movement of the surgical instruments are separated by enough time for the user to possibly forget that the robotic surgical system is in the second mode."]); and
controlling, based on the interrupt request, the imaging device and the surgical robot to get into an independent working mode, wherein the movement states of the imaging device and the surgical robot are independent of each other in the independent working mode (see [0005 "...can have a first mode in which the robotic surgical system is configured to control movement of the first arm such that the first arm and the first surgical instrument removably coupled thereto move relative to the second arm and the second surgical instrument removably coupled thereto..."]-[0006], [0023 "The surgical method can include, in response to the received first user input, moving the first electromechanical arm in a first mode of the robotic surgical system in which the first electromechanical arm moves relative to a second electromechanical arm of the robotic surgical system such that a relative position of the first and second electromechanical arms changes."] and [0179 "The robotic surgical system can have first and second modes of operation. In the first mode of operation, the robotic surgical system can be configured to only move the one of the surgical instruments that the user requests movement of via the input device. The first mode may therefore allow the one of the surgical instruments to move relative to all other surgical instruments coupled to the robotic surgical system via the other movement mechanisms."]-[0180]).
Balicki additionally teaches obtaining an interrupt request sent by the imaging device or the surgical robot (see [0081]-[0096], especially [0088] "In some embodiments, in an operation 716, system controller 300 may alert a user to the fact that system controller 300 has a pending control mode switch request. System controller 300 may request that the user (e.g., surgeon 10) confirm or approve the control mode switch request. In some embodiments, operation 716 may only be executed for some specific procedures, but skipped for other procedures.").
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the process of modified Kan to obtain an interrupt request sent by the imaging device or the surgical robot and control the imaging device and the surgical robot to get into an independent working mode, wherein the movement states of the imaging device and the surgical robot are independent of each other in the independent working mode, as taught by Weir, in order to allow the surgical robot and the imaging device to move relative to each other while preventing inadvertent coordinated movement between them.
Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Kan (as modified by Weir and Balicki) as applied to claim 12 above, and further in view of Onoe et al. (JP H10315170 A and Onoe hereinafter).
Regarding Claim 14
Modified Kan teaches the method of claim 12 (as discussed above in claim 12),
Kan is silent regarding further comprising:
detecting a connection relationship between the imaging device and the surgical robot; and
in response to the connection relationship being abnormal, controlling both the imaging device and the surgical robot to remain stationary.
Weir teaches further comprising:
detecting a connection relationship between the imaging device and the surgical robot (see [0005 "...can have a first mode in which the robotic surgical system is configured to control movement of the first arm such that the first arm and the first surgical instrument removably coupled thereto move relative to the second arm and the second surgical instrument removably coupled thereto, can have a second mode in which the robotic surgical system is configured to control movement of the first arm in coordination with the second arm such that the first arm and the first surgical instrument removably coupled and the second arm and the second surgical instrument removably coupled thereto to simultaneously move in coordination. The controller can be configured to cause the coordinated movement."] and [0069 "In an exemplary embodiment, first and second surgical instruments can be coupled to the robotic surgical system, with the first surgical instrument including a camera. The robotic surgical system can thus be configured to allow the camera to follow movement of the second surgical instrument, e.g., the camera is selected as the follower instrument and the second surgical instrument is selected as the master instrument, which can help maintain visualization of the second surgical instrument during movement thereof. The robotic surgical system can also thus be configured to allow the second surgical instrument to follow movement of the camera, e.g., the second surgical instrument is selected as the follower instrument and the camera is selected as the master instrument, which can allow the second surgical instrument, when visualized by the camera, to remain within the camera's vision during the camera's movement."]).
Onoe teaches a method (see all Figs.; [0007]-[0010]; see the corresponding paragraphs in the attached reference JP_H10315170_A), comprising:
obtaining an initial movement state of a device and a robot, the robot being a robotic arm body for driving a robotic arm end in movement to control and/or adjust an operation and/or attitude of a functional component carried by the robotic arm end (see [0010 "An information signal having a plurality of servo units 200a and 200b for driving control, identification data for identifying each robot body, and command data for driving and controlling each axis of the identified robot body is generated in the order of each robot body."]-[0013]); and
controlling a target movement state of the robot and the device based on the initial movement state (see [0010 "An information signal having a plurality of servo units 200a and 200b for driving control, identification data for identifying each robot body, and command data for driving and controlling each axis of the identified robot body is generated in the order of each robot body."]-[0013]);
further comprising:
detecting a connection relationship between the device and the robot (see [0007 "The abnormality information of the situation is detected only as a communication abnormality such as communication interruption, and therefore it is difficult to identify the abnormality factor ... In contrast, when an abnormality occurs in the communication path including the communication lines 61, 62; 63, the operation of the robot bodies 40A and 40B is stopped only when such an abnormality occurs frequently in a short time, thereby interrupting the work. Can be avoided as much as possible."], [0029] and [0076]); and
in response to the connection relationship being abnormal, controlling both the device and the robot to remain stationary (see [0007 "In contrast, when an abnormality occurs in the communication path including the communication lines 61, 62; 63, the operation of the robot bodies 40A and 40B is stopped only when such an abnormality occurs frequently in a short time, thereby interrupting the work. Can be avoided as much as possible."], [0029] and [0076]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the process of modified Kan to detect a connection relationship between the imaging device and the surgical robot and in response to the connection relationship being abnormal, control both the imaging device and the surgical robot to remain stationary, as taught by Weir and Onoe, in order to ensure safety of the imaging device and surgical robot.
Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Kan (as modified by Weir and Balicki) as applied to claim 12 above, and further in view of Wouhaybi et al. (US 20210107151 A1 and Wouhaybi hereinafter).
Regarding Claim 15
Modified Kan teaches the method of claim 12 (as discussed above in claim 12),
Kan is silent regarding further comprising:
in response to a failure of the imaging device or the surgical robot, controlling the imaging device and the surgical robot to get into an independent working mode.
Wouhaybi teaches a method (see all Figs., especially Fig. 5; [0029]-[0030]), comprising:
obtaining an initial movement state of an imaging device and a robot, the robot being a robotic arm body for driving a robotic arm end in movement to control and/or adjust an operation and/or attitude of a functional component carried by the robotic arm end (see Figs. 3-4, optoelectronic sensors 104; Fig. 5, autonomous machines 500; Fig. 11, all; [0029], [0056]-[0058 "Examples of the one or more sensors 104 include one or more optoelectronic sensors 104 (e.g., providing one or more image acquisition devices), one or more position sensors 106, one or more speed sensors, one or more distance sensors, e.g., one or more radar sensors 108 and/or one or more LIDAR sensors, one or more temperature sensors 110, one or more force sensors 112. "], [0088]-[0090] and [0102]); and
controlling a target movement state of the surgical robot and the imaging device based on the initial movement state (see [0089], [0102 "If the trust score falls below minimum threshold 704 it may indicate a failure 708. If the trust score is indicates a failure, the autonomous machine may no longer be allocated tasks or may be removed from the group. If the trust score is between minimum threshold 704 and a higher threshold 706, this may identify that the autonomous machine requires a repair or an update. If the trust score is above threshold 706, the autonomous machine may be identified as a “trustworthy” machine. Task allocation may be done based on an autonomous machine's trust score and the autonomous machines with trust scores above threshold 706 will be assigned the majority of the tasks."]);
further comprising:
in response to a failure of the imaging device or the robot, controlling the imaging device and the robot to get into an independent working mode (see Fig. 7, "failure"; [0102 "If the trust score falls below minimum threshold 704 it may indicate a failure 708. If the trust score is indicates a failure, the autonomous machine may no longer be allocated tasks or may be removed from the group. If the trust score is between minimum threshold 704 and a higher threshold 706, this may identify that the autonomous machine requires a repair or an update."]-[0110 "Based on the trust score, an autonomous machine may allocate a task to itself or choose another autonomous machine to help with an allocated task. The discovery of other autonomous machines that may take a task or help with a task may improve the accuracy at which tasks are completed and the time in which it takes to complete them."] and [0120]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the process of modified Kan to control the imaging device and the surgical robot to get into an independent working mode in response to a failure of the imaging device or the surgical robot, as taught by Wouhaybi, in order to assign tasks to an appropriate imaging device or surgical robot based on a determined trust score.
Claim 18 is rejected under 35 U.S.C. 103 as being unpatentable over Kan as applied to claim 16 above, and further in view of Weir.
Regarding Claim 18
Kan teaches the system of claim 16 (as discussed above in claim 16),
Kan further teaches further comprising:
a second interlock interface, configured to control a movement state of the imaging device based on a movement state of the surgical robot (see Fig. 9, steps S909-S912; Col. 13, lines 10-63, especially "On the other hand, when the use of the surgical operating equipment is selected, the judgement is made of whether or not the surgical operating equipment is under use (step S909). In the case where the surgical operating equipment is under use, the use is continued (step 911). In the case where the surgical operating equipment is not under use, the judgement is made of whether or not the MRI apparatus is under measurement (step S910). If the MRI apparatus is under measurement, a measurement stopping operation sets in (step S912). Thereafter, the use of the surgical operating equipment is started (step S913) ... According to this flow chart, it is possible to control the measurement by the MRI apparatus and the use of the surgical operating equipment exclusively from each other."); and
a third interlock interface, configured to control the imaging device and/or the surgical robot to remain stationary in a preset emergency situation (see Fig. 9, all; Col. 8, lines 48-64; Col. 13, lines 10-63).
Kan is silent regarding a first interlock interface, configured to control the surgical robot to establish or interrupt a connection relationship with the imaging device, and to detect the connection relationship.
Weir teaches a system for controlling an image-guided interventional puncture device (see all Figs.; [0005]), comprising:
an imaging device, configured to obtain image data of a target object; a surgical robot, configured to perform a puncture operation (see "first movable arm" and "second movable arm" in most Figs. and the following paragraphs, especially Figs. 5 and 11; [0002], [0005 "...can have a second mode in which the robotic surgical system is configured to control movement of the first arm in coordination with the second arm such that the first arm and the first surgical instrument removably coupled and the second arm and the second surgical instrument removably coupled thereto to simultaneously move in coordination."]-[0007 "In another embodiment, a surgical system is provided that includes an electromechanical arm, a camera, and a controller. The electromechanical arm can be configured to removably couple to a surgical instrument. The electromechanical arm can be configured to move so as to move the surgical instrument removably coupled thereto relative to a patient on which a surgical procedure is being performed. The camera can have a field of view."], [0069 " In an exemplary embodiment, first and second surgical instruments can be coupled to the robotic surgical system, with the first surgical instrument including a camera..."] and [0111 "In an exemplary embodiment, at least one of the surgical instruments 608 a, 608 b, 608N can include a camera having a field of view and configured to visualize a surgical area such that the robotic surgical system 600 can be coupled to at least one camera."]-[0113]); and
a control module, configured to control a target movement state of the surgical robot and the imaging device based on an initial movement state of the imaging device and the surgical robot, the surgical robot being a robotic arm body for driving a robotic arm end in movement to control and/or adjust an operation and/or attitude of a functional component carried by the robotic arm end (see [0005 "...can have a first mode in which the robotic surgical system is configured to control movement of the first arm such that the first arm and the first surgical instrument removably coupled thereto move relative to the second arm and the second surgical instrument removably coupled thereto, can have a second mode in which the robotic surgical system is configured to control movement of the first arm in coordination with the second arm such that the first arm and the first surgical instrument removably coupled and the second arm and the second surgical instrument removably coupled thereto to simultaneously move in coordination. The controller can be configured to cause the coordinated movement."] and [0069 "In an exemplary embodiment, first and second surgical instruments can be coupled to the robotic surgical system, with the first surgical instrument including a camera. The robotic surgical system can thus be configured to allow the camera to follow movement of the second surgical instrument, e.g., the camera is selected as the follower instrument and the second surgical instrument is selected as the master instrument, which can help maintain visualization of the second surgical instrument during movement thereof. The robotic surgical system can also thus be configured to allow the second surgical instrument to follow movement of the camera, e.g., the second surgical instrument is selected as the follower instrument and the camera is selected as the master instrument, which can allow the second surgical instrument, when visualized by the camera, to remain within the camera's vision during the camera's movement."]);
control the first movement state of the imaging device based on the second movement state of the surgical robot, including:
in response to a determination that the surgical robot is in movement, to control the imaging device to remain stationary based on the second movement state of the surgical robot (see [0009 "The first sensor can be configured to detect an impending collision between the first and second arms by determining when the second arm is within a threshold minimum distance of the first arm. The controller can be configured to trigger performance of a remedial action in response to the detected impending collision."]-[0010 "For still another example, triggering performance of the remedial action can include stopping movement of the first arm."], [0018] and [0021]-[0022]).
further comprising:
a first interlock interface, configured to control the surgical robot to establish or interrupt a connection relationship with the imaging device, and to detect the connection relationship (see [0005 "...can have a second mode in which the robotic surgical system is configured to control movement of the first arm in coordination with the second arm such that the first arm and the first surgical instrument removably coupled and the second arm and the second surgical instrument removably coupled thereto to simultaneously move in coordination."]-[0006], [0023 "The surgical method can include, in response to the received second user input, moving the first electromechanical arm in a second mode of the robotic surgical system in which the other movement of the first electromechanical arm causes corresponding movement of the second electromechanical arm such that the relative position of the first and second electromechanical arms is maintained."] and [0179]-[0180 "The robotic surgical system can be configured to selectively move between the first and second modes in response to a user input to the robotic surgical system, e.g., an input via the user input device such as a pressing of a “mode” button thereon."]);
a second interlock interface, configured to control a movement state of the imaging device based on a movement state of the surgical robot (see [0005 "...can have a first mode in which the robotic surgical system is configured to control movement of the first arm such that the first arm and the first surgical instrument removably coupled thereto move relative to the second arm and the second surgical instrument removably coupled thereto, can have a second mode in which the robotic surgical system is configured to control movement of the first arm in coordination with the second arm such that the first arm and the first surgical instrument removably coupled and the second arm and the second surgical instrument removably coupled thereto to simultaneously move in coordination. The controller can be configured to cause the coordinated movement."] and [0069 "In an exemplary embodiment, first and second surgical instruments can be coupled to the robotic surgical system, with the first surgical instrument including a camera. The robotic surgical system can thus be configured to allow the camera to follow movement of the second surgical instrument, e.g., the camera is selected as the follower instrument and the second surgical instrument is selected as the master instrument, which can help maintain visualization of the second surgical instrument during movement thereof. The robotic surgical system can also thus be configured to allow the second surgical instrument to follow movement of the camera, e.g., the second surgical instrument is selected as the follower instrument and the camera is selected as the master instrument, which can allow the second surgical instrument, when visualized by the camera, to remain within the camera's vision during the camera's movement."]); and
a third interlock interface, configured to control the imaging device and/or the surgical robot to remain stationary in a preset emergency situation (see [0009 "The first sensor can be configured to detect an impending collision between the first and second arms by determining when the second arm is within a threshold minimum distance of the first arm. The controller can be configured to trigger performance of a remedial action in response to the detected impending collision."]-[0010 "For still another example, triggering performance of the remedial action can include stopping movement of the first arm."], [0018] and [0021]-[0022]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the system of Kan to include a first interlock interface configured to control the surgical robot to establish or interrupt a connection relationship with the imaging device and to detect the connection relationship, as taught by Weir, in order to prevent a collision between the imaging device and the surgical robot, and to maintain or change a relative position between the imaging device and the surgical robot.
Claim 23 is rejected under 35 U.S.C. 103 as being unpatentable over Kan (as modified by Weir, Balicki and Wouhaybi) as applied to claim 15 above, and further in view of Mantri and Kim et al. (US 20160157887 A1, hereinafter Kim).
Regarding Claim 23
Modified Kan teaches the method of claim 15 (as discussed above in claim 15),
Kan further teaches the method further comprising:
capturing an image of a target object in real-time by a camera (see Col. 2, lines 35-50).
Kan is silent regarding controlling the imaging device and the surgical robot to remain stationary at the same time when the presence of unintended movement of the target object is recognized based on the captured image.
Mantri teaches a method further comprising:
capturing an image of a target object in real-time by a camera (see [0012 "The imaging probe may be coupled to a second robotic arm configured to provide computer-controlled movement of the imaging probe during scanning of the target site with the imaging probe, before and/or during the tissue resection procedure with the treatment probe. "]), and
controlling the imaging device and the surgical robot to remain stationary at the same time (see [0011]-[0013] and [0071 "In some embodiments, the monitoring of the position and orientation of the treatment probe by the imaging probe can generate a signal to a computing device to stop movement of one or both probes or to alter the position, location or orientation of one or both probes to prevent a collision between the probes while inside a patient's body and/or to prevent harm to a patient's tissue or organs. In some embodiments, monitoring of the position and orientation of the treatment probe by the imaging probe can cause the system to stop motion of one or both probes and generate an alert to the physician to prevent harm to the patient."]-[0073]).
Kim teaches a method for controlling an image-guided interventional puncture device (see all Figs.; [0011]-[0013]), comprising:
capturing an image of a target object in real-time by a camera (see [0025 "...a position tracker 30 that tracks a robot-side optical tool and a patient-side optical tool with a stereo infrared camera…"] and [0049]), and
when the presence of unintended movement of the target object is recognized based on the captured image, controlling the surgical robot to remain stationary (see [0044] and [0054 "Moreover, when a movement of the patient is outside an allowable value, the monitoring part 50 may output a warning sound, display a warning message on a screen, or generate a signal for stopping driving of the interventional robot."]-[0056]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the process of modified Kan to control the imaging device and the surgical robot to remain stationary when the presence of unintended movement of the target object is recognized based on the captured image, as taught by Mantri and Kim, in order to improve safety by preventing collision between the surgical robot and a patient.
Claim 25 is rejected under 35 U.S.C. 103 as being unpatentable over Kan (as modified by Mantri) as applied to claim 9 above, and further in view of Vannuffelen et al. (US 20230001568 A1, hereinafter Vannuffelen).
Regarding Claim 25
Modified Kan teaches the method of claim 9 (as discussed above in claim 9),
Kan is silent regarding the method further comprising:
controlling the surgical robot to reduce the speed when the surgical robot is moving toward the target object, and at this time the target object suddenly raises a hand so that the hand is close to the surgical robot;
controlling the surgical robot to return to an original speed when the hand of the target object is put back to an original position; and
controlling the imaging device to stop moving when the surgical robot stops moving and the distance between the imaging device and the surgical robot is still decreasing.
Mantri teaches a method further comprising:
controlling the surgical robot to reduce the speed when the surgical robot is moving toward the target object (see [0011]-[0013] and [0071 "In some embodiments, the monitoring of the position and orientation of the treatment probe by the imaging probe can generate a signal to a computing device to stop movement of one or both probes or to alter the position, location or orientation of one or both probes to prevent a collision between the probes while inside a patient's body and/or to prevent harm to a patient's tissue or organs. In some embodiments, monitoring of the position and orientation of the treatment probe by the imaging probe can cause the system to stop motion of one or both probes and generate an alert to the physician to prevent harm to the patient."]-[0073]); and
controlling the imaging device to stop moving when the surgical robot stops moving and the distance between the imaging device and the surgical robot is still decreasing (see [0071 "In some embodiments, the monitoring of the position and orientation of the treatment probe by the imaging probe can generate a signal to a computing device to stop movement of one or both probes or to alter the position, location or orientation of one or both probes to prevent a collision between the probes while inside a patient's body and/or to prevent harm to a patient's tissue or organs. In some embodiments, monitoring of the position and orientation of the treatment probe by the imaging probe can cause the system to stop motion of one or both probes and generate an alert to the physician to prevent harm to the patient."]-[0072 "In some embodiments, one or more of computer vision, image recognition, or a trained machine learning model may be used to assist the system to recognize when one or both probes are too close to each other or to tissue or an organ of a patient."]).
Vannuffelen teaches a method (see all Figs.; [0007]), comprising:
controlling a surgical robot to reduce the speed when the surgical robot is moving toward a target object, and at this time the target object suddenly raises a hand so that the hand is close to the surgical robot (see Figs. 7A-7B, all; [0060 "Then, following the increasing of the compression, the user's hand may finally reach a status shown in the third portion of FIG. 7A: the full compression of the resilient member 32, where the force that sensor 31 senses reaches a threshold 53 in FIG. 7B. This is a steady supported status for the hand of the user 20, wherein motion control module 42 may maintain robot 1 in a hand guidance motion. Then, if there is a larger force than threshold 53, for example, a set threshold 52 in FIG. 7B (e.g., a force that falls outside the range between thresholds 53 and 52), sensor module 3 may send out a safety signal to safety module 41, and the robot 1 may be stopped safely by safety module 41."]); and
controlling the surgical robot to return to an original speed when the hand of the target object is put back to an original position (it would be obvious to additionally restart the surgical robot (and therefore to return to an original speed) after the hand is moved away from the robot and out of harm's way).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the process of Kan to control the surgical robot to reduce the speed when the surgical robot is moving toward the target object and the target object raises a hand, control the surgical robot to return to an original speed when the hand of the target object is put back to an original position, and control the imaging device to stop moving when the surgical robot stops moving and the distance between the imaging device and the surgical robot is still decreasing, as taught by Mantri and Vannuffelen, in order to provide a safety system to prevent collision between the surgical robot and a person in a collaborative environment.
Allowable Subject Matter
Claim 24 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TANNER LUKE CULLEN whose telephone number is (303)297-4384. The examiner can normally be reached Monday-Friday 9:00-5:00 MT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi Tran can be reached at (571) 272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TANNER L CULLEN/
Examiner, Art Unit 3656

/KHOI H TRAN/
Supervisory Patent Examiner, Art Unit 3656