The present application is being examined under the pre-AIA first to invent provisions.
DETAILED ACTION
Specification
The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant's cooperation is requested in correcting any errors of which applicant may become aware in the specification.
Claim Objections
Claim 1 is objected to because of the following informalities: “image ;” in Line 10 should recite --image;--. Appropriate correction is required.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103(a) which forms the basis for all obviousness rejections set forth in this Office action:
(a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102 of this title, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negatived by the manner in which the invention was made.
Claims 1-7 are rejected under 35 U.S.C. 103(a) as being unpatentable over Popovic et al. (U.S. Publication 2015/0094856, hereinafter “Popovic”) in view of Diolaiti et al. (U.S. Publication 2015/0065793, hereinafter “Diolaiti”; now U.S. Patent 9,516,996).
As to Claim 1, Popovic discloses a method (20) in [0025] and (21) in [0033] and Figs. 1 and 2 for controlling an endoscope (30) in [0026], comprising the steps of:
positioning an endoscope in a body cavity, said endoscope comprising a camera;
displaying in real time images of a field of view (FOV) acquired by the endoscope within the body cavity, said FOV defining at least two axes of an endoscope coordinate system “image coordinate system” (80) in [0047] and Fig. 8 that is fixed with respect to the displayed images and configured to be real time “real time” as described in [0065] updated as at least a tip “tip of the endoscope” in [0044] of said endoscope moves and said FOV changes (images and their respective axes being updated during normal operation of image acquisition as said endoscope moves);
receiving from a touch screen via (50) in [0025] and [0030] FOV commands of motion, said FOV commands of motion (52) in [0036] after performing inverse kinematics described in [0033] and [0035] comprising a user input directing movement of said at least a tip of said endoscope in real time in a desired direction relative to said endoscope coordinate system as real time displayed in said image;
converting via (51) in [0032] and Fig. 2 said FOV commands of motion to maneuvering system commands of motions (52) in [0036] after performing inverse kinematics described in [0033] and [0035] for a maneuvering system (40) in [0027]-[0029] configured to maneuver at least a tip of said endoscope in at least two DOF, said maneuvering system defining an x-axis, a y-axis and a z-axis “robot coordinate system” (90) in [0048] and Fig. 8, said maneuvering system commands of motion being commands of motion relative to said x-axis, said y-axis and said z-axis; and
causing said maneuvering system to maneuver said at least a tip of said endoscope according to said maneuvering system commands of motions via visual servo (51) with tracking vector in [0032] and Fig. 2.
Although Popovic discloses robot control commands of the robot controller (50) in [0025] and [0030] wherein provisory manual camera movements are utilized in [0029] and automatic tracking in [0069], Popovic does not explicitly disclose further details as to manual commands of motion in real time in a desired direction specified using the FOV commands of motion, or the specifics of the manual movement controller. Diolaiti teaches a manual movement controller (108, 213) in [0052] and Fig. 22 configured to receive FOV commands of motion from a user via (108) in [0052] to maneuver said at least a tip of said endoscope in real time in a desired direction specified using the FOV commands of motion, said FOV commands of motion being commands of motion relative to at least two axes selected from said FOVx-axis, said FOVy-axis and said FOVz-axis, as real time displayed in said image as described in [0093], [0095], and [0101]-[0102]; and said manual movement controller selected from a group consisting of a joystick in [0038], a lever, a button, a vocal command, a touchscreen, typing a command into a keyboard, and any combination thereof. This evidences the level of ordinary skill in the art in recognizing the movement controller of Popovic and the manual movement controller of Diolaiti as equivalent alternatives for providing the same predictable result of transmitting user input to move the endoscope. It would have been obvious to one of ordinary skill in the art at the time of the invention to have provided the movement controller of Popovic, which has automatic tracking capabilities maintaining positional mapping between the image and instrument reference frames, with the manual input taught by Diolaiti in order to provide the predictable result of fulfilling the same function of moving the endoscope in a manner intuitive to the operator looking at the image (Diolaiti, [0012]-[0013]).
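For illustration only, and forming no part of the record: the conversion step mapped above (image-frame FOV commands converted into robot-frame maneuvering commands, with the robot's x-, y- and z-axes held fixed while the image axes update as the scope moves) is, at its core, a change of coordinate frames. A minimal sketch under the simplifying assumption of a single planar rotation between the two frames follows; the function name, angle value, and command vector are hypothetical examples, not drawn from Popovic or Diolaiti.

```python
import math

def fov_to_robot(cmd, theta):
    """Rotate an image-frame (x, y, z) command by theta into the robot frame.

    This planar sketch rotates only in the x-y plane; the z component
    passes through unchanged. theta is the assumed angle between the
    image coordinate system and the fixed robot coordinate system.
    """
    x, y, z = cmd
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta),
            z)

# "Move right on the screen" (+x in the image frame) with the scope
# rolled 90 degrees relative to the robot frame:
cmd_robot = fov_to_robot((1.0, 0.0, 0.0), math.pi / 2)
```

Under this assumption the on-screen "+x" command maps to "+y" in the robot frame: the robot axes stay constant while the image axes rotate with the scope, which is the sense in which the user's command is intuitive relative to the displayed image regardless of scope orientation.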
As to Claim 2, Popovic discloses the method according to claim 1, wherein the method includes causing said maneuvering system to maneuver said at least a tip of said endoscope according to said maneuvering system commands of motions relative to said at least two axes of said endoscope coordinate system regardless of the orientation of said endoscope with respect to said maneuvering system as shown in Fig. 8.
As to Claim 3, Popovic discloses the method according to claim 1, wherein the converting step includes converting said commands of motions relative to said at least two axes of an endoscope coordinate system to commands of motions relative to said x-axis, said y-axis and said z-axis, and wherein the method additionally comprises instructing said maneuvering system to maneuver said endoscope according to said commands of motions relative to said at least two axes of an endoscope coordinate system as real time “real time” as described in [0065] displayed in said image, regardless of said x-axis, said y-axis and said z-axis as defined by said maneuvering system as shown in Fig. 8.
As to Claim 4, Popovic discloses the method according to claim 1, wherein receiving the user input from the touch screen comprises receiving the user input from controller (50) in [0025] and [0050]. Diolaiti teaches that Popovic's controller can be a manual movement controller (213) with input device (108) in [0052], responsive to a touch and movement on the touch screen in view of [0038] of Diolaiti, for the reasons discussed above with respect to claim 1.
As to Claim 5, Popovic in view of Diolaiti discloses the method according to claim 1, wherein the method includes maintaining the x-axis, y-axis and z-axis constant as the maneuvering system is caused to maneuver said at least a tip of said endoscope according to said maneuvering system commands of motion, as described in [0093], [0095], and [0101]-[0102] of Diolaiti.
As to Claim 6, Popovic discloses a system for controlling an endoscope (30) in [0026] positionable in a body cavity, the system comprising:
a display configured to display in real time images of a field of view (FOV) acquired by the endoscope within the body cavity, said FOV defining at least two axes “x axis and a y axis” in [0047] of an endoscope coordinate system “image coordinate system” (80) in [0047] and Fig. 8 that is fixed with respect to the displayed images and configured to be real time “real time” as described in [0065] updated as at least a tip “tip of the endoscope” in [0044] of said endoscope moves and said FOV changes (images and their respective axes being updated during normal operation of image acquisition as said endoscope moves);
a touch screen controller configured to receive user input directing movement of said at least a tip of said endoscope in real time in a desired direction relative to said endoscope coordinate system as real time displayed in said image via (50) in [0025] and [0030];
a maneuvering system (40) in [0027]-[0029] for maneuvering at least a tip of said endoscope in at least two DOF, said maneuvering system defining an x-axis, a y-axis and a z-axis “robot coordinate system” (90) in [0048] and Fig. 8; and
a processor configured to receive FOV commands of motion corresponding to said user input, and to convert via (51) in [0032] and Fig. 2 said FOV commands of motion (52) in [0036] after performing inverse kinematics described in [0033] and [0035] to maneuvering system commands of motions for a maneuvering system configured to maneuver at least a tip of said endoscope in at least two DOF, said maneuvering system commands of motion being commands of motion relative to said x-axis, said y-axis and said z-axis; and
wherein the processor is further configured to cause said maneuvering system to maneuver said at least a tip of said endoscope according to said maneuvering system commands of motions via visual servo (51) with tracking vector in [0032] and Fig. 2.
Although Popovic discloses robot control commands of the robot controller (50) in [0025] and [0030] wherein provisory manual camera movements are utilized in [0029] and automatic tracking in [0069], Popovic does not explicitly disclose further details as to manual commands of motion in real time in a desired direction specified using the FOV commands of motion, or the specifics of the manual movement controller. Diolaiti teaches a manual movement controller (108, 213) in [0052] and Fig. 22 configured to receive FOV commands of motion from a user via (108) in [0052] to maneuver said at least a tip of said endoscope in real time in a desired direction specified using the FOV commands of motion, said FOV commands of motion being commands of motion relative to at least two axes selected from said FOVx-axis, said FOVy-axis and said FOVz-axis, as real time displayed in said image as described in [0093], [0095], and [0101]-[0102]; and said manual movement controller selected from a group consisting of a joystick in [0038], a lever, a button, a vocal command, a touchscreen, typing a command into a keyboard, and any combination thereof. This evidences the level of ordinary skill in the art in recognizing the movement controller of Popovic and the manual movement controller of Diolaiti as equivalent alternatives for providing the same predictable result of transmitting user input to move the endoscope. It would have been obvious to one of ordinary skill in the art at the time of the invention to have provided the movement controller of Popovic, which has automatic tracking capabilities maintaining positional mapping between the image and instrument reference frames, with the manual input taught by Diolaiti in order to provide the predictable result of fulfilling the same function of moving the endoscope in a manner intuitive to the operator looking at the image (Diolaiti, [0012]-[0013]).
As to Claim 7, Popovic in view of Diolaiti discloses the system of claim 6, wherein the maneuvering system defines an x-axis, a y-axis and a z-axis that remain constant as the maneuvering system maneuvers said at least a tip of said endoscope according to said maneuvering system commands of motion, as described in [0093], [0095], and [0101]-[0102] of Diolaiti.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See the enclosed PTO-892 form. U.S. Publication 2010/0234857 is cited to show inverse kinematics maintaining a known positional relationship mapping between an input device and end effector movement between different scopes in [0253]. U.S. Publications 2008/0159653, 2008/0281467, 2009/0055023, 2015/0094856, 2017/0156808, 2018/0303558, and 2019/0056693 are cited to show similar display/image coordinate space mapping means. This prior art should be considered when amending the claims to define over the art of record.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to WILLIAM B CHOU whose telephone number is (571) 270-3367. The examiner can normally be reached on M-F 9 am - 6 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Michael Carey can be reached on (571) 270-7235. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/WILLIAM CHOU/
Examiner, Art Unit 3795
/MICHAEL J CAREY/Supervisory Patent Examiner, Art Unit 3795