DETAILED ACTION
Claims 1-16 are pending.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statements provided comply with the provisions of MPEP § 609. They have been placed in the application file, and the information referred to therein has been considered as to the merits. A signed copy of the form is attached.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-16 are rejected under 35 U.S.C. 103 as being unpatentable over Mack (Minimally Invasive and Robotic Surgery).
As per claim 1, Mack teaches a surgical robot arm control system (see page 571 for the Robotic Arms, and particularly the Surgeon Console/Surgeon Interface), comprising: a surgical robot arm (see page 571 for the Robotic Arms); a spatial positioning information acquisition unit, configured to acquire spatial coordinate data (see page 571, the Surgeon Console/Surgeon Interface acquires information as well; see page 570, second col. of the table, fifth row, for Real-Time Data Acquisition and Nonvisual Imaging); a depth image acquisition unit, configured to acquire a panoramic depth image (see page 571, the Surgeon Console/Surgeon Interface acquires images, and the upper left for the depth image); and a processor (see page 571, first col., third par., placing a microprocessor between the surgeon’s hand and the tip of the surgical instrument), coupled to the surgical robot arm, the spatial positioning information acquisition unit, and the depth image acquisition unit, wherein the processor performs image recognition on the panoramic depth image to recognize the surgical robot arm and locates a position of the surgical robot arm according to the spatial coordinate data (see page 571, the upper left depth image, which locates a position), wherein the processor defines an environmental space according to the position of the surgical robot arm and plans a movement path of the surgical robot arm in the environmental space (see page 571, the upper left depth image and environmental space), and wherein the processor controls the surgical robot arm according to the movement path of the surgical robot arm (see page 571 for the Robotic Arms). The above limitations, however, are disclosed in different embodiments of Mack.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the different embodiments of Mack to achieve the intended end result, thereby improving the surgical robot arm control system as a whole.
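For context on the claimed control pipeline (acquire a panoramic depth image, recognize and locate the arm, define an environmental space, plan a path, and drive the arm), the following is a minimal Python sketch. It is illustrative only: every function name, threshold, and dimension below is a hypothetical placeholder, not drawn from Mack or from the application.

```python
import numpy as np

def acquire_depth_image():
    # Stand-in for the depth image acquisition unit: a panoramic
    # depth map in meters (random values for illustration).
    return np.random.rand(480, 640) * 2.0

def recognize_arm(depth_image):
    # Stand-in for image recognition: segment the robot arm by a
    # simple depth threshold and return its pixel mask.
    return depth_image < 0.5

def locate_arm(mask, spatial_coords):
    # Fuse the recognized mask with spatial coordinate data from the
    # positioning unit to estimate the arm's position (centroid here).
    ys, xs = np.nonzero(mask)
    offset = np.array([xs.mean(), ys.mean(), 0.0]) * 1e-3
    return spatial_coords["arm_origin"] + offset

def define_environmental_space(arm_position, radius=0.3):
    # Environmental space: an axis-aligned box around the arm.
    return arm_position - radius, arm_position + radius

def plan_path(arm_position, target, steps=10):
    # Trivial straight-line planner within the environmental space.
    return [arm_position + (target - arm_position) * t / steps
            for t in range(1, steps + 1)]

# One control cycle: sense, recognize, locate, define space, plan, move.
depth = acquire_depth_image()
mask = recognize_arm(depth)
position = locate_arm(mask, {"arm_origin": np.zeros(3)})
low, high = define_environmental_space(position)
for waypoint in plan_path(position, target=np.array([0.2, 0.1, 0.05])):
    pass  # a real controller would command the arm toward each waypoint
```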
As per claim 2, Mack teaches wherein the panoramic depth image comprises at least one tracking ball and an image of the surgical robot arm, and the at least one tracking ball is disposed on a surgical subject (see page 572, third col., first par., for swallowable cameras...that can be navigated remotely; see page 571 for the Robotic Arms and the upper left depth image), wherein the processor performs the image recognition on the panoramic depth image to recognize the at least one tracking ball, the surgical subject, and the surgical robot arm (see page 571, first col., third par., placing a microprocessor between the surgeon’s hand and the tip of the surgical instrument), and the spatial coordinate data comprise a plurality of coordinates of the at least one tracking ball, the surgical subject, and the surgical robot arm (see page 570, second col. of the table, fifth row, for Real-Time Data Acquisition and Nonvisual Imaging).
As per claim 3, Mack teaches wherein the environmental space is centered around an end mechanism of the surgical robot arm, and the environmental space is updated together with a movement of the end mechanism of the surgical robot arm (see page 571 for the Robotic Arms, and the upper left depth image and environmental space).
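The environmental space recited in claim 3 can be pictured as a bounding region that re-centers on the end mechanism whenever it moves. A minimal sketch under that assumption (the box model, class name, and extents are hypothetical):

```python
import numpy as np

class EnvironmentalSpace:
    """Axis-aligned box centered on the arm's end mechanism (hypothetical model)."""

    def __init__(self, center, half_extent=0.25):
        self.half_extent = half_extent
        self.update(center)

    def update(self, end_effector_position):
        # Re-center the space whenever the end mechanism moves.
        self.low = np.asarray(end_effector_position) - self.half_extent
        self.high = np.asarray(end_effector_position) + self.half_extent

    def contains(self, point):
        return bool(np.all(self.low <= point) and np.all(point <= self.high))

space = EnvironmentalSpace(center=np.array([0.0, 0.0, 0.0]))
space.update(np.array([0.05, 0.02, 0.0]))          # end mechanism moved; space follows
print(space.contains(np.array([0.1, 0.0, 0.0])))   # True: inside the re-centered box
```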
As per claim 4, Mack teaches wherein the processor trains a real path model corresponding to the environmental space based on a virtual path model through transfer learning to acquire the movement path of the surgical robot arm through the real path model (see page 571, first col., third par., for the microprocessor; see page 571 for the Robotic Arms and the upper left image).
As per claim 5, Mack teaches wherein the virtual path model and the real path model are respectively a densely connected convolutional network model (see page 571 for the Robotic Arms and the upper left image).
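Claims 4 and 5 recite training a real path model from a virtual path model through transfer learning, with both models being densely connected convolutional networks. A minimal PyTorch sketch of that general pattern follows; the DenseNet-121 backbone, the frozen feature blocks, the three-output waypoint head, and the training details are illustrative assumptions, not taken from Mack or from the application.

```python
import torch
import torch.nn as nn
from torchvision.models import densenet121

# Virtual path model: a densely connected convolutional network assumed
# to have been trained in simulation (checkpoint path is hypothetical).
virtual_model = densenet121(weights=None)
# virtual_model.load_state_dict(torch.load("virtual_path_model.pt"))

# Real path model: start from the virtual model's weights (transfer
# learning), freeze the dense feature blocks, and fine-tune a new head
# that regresses a waypoint (x, y, z) for the movement path.
real_model = densenet121(weights=None)
real_model.load_state_dict(virtual_model.state_dict())
for param in real_model.features.parameters():
    param.requires_grad = False
real_model.classifier = nn.Linear(real_model.classifier.in_features, 3)

optimizer = torch.optim.Adam(real_model.classifier.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# One illustrative fine-tuning step on a fake depth-image batch.
depth_batch = torch.rand(4, 3, 224, 224)   # panoramic depth images (stand-in)
target_waypoints = torch.rand(4, 3)        # ground-truth waypoints (stand-in)
loss = loss_fn(real_model(depth_batch), target_waypoints)
loss.backward()
optimizer.step()
```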
As per claim 6, Mack teaches wherein, after the surgical robot arm is moved, the processor recognizes a surgical environment around the surgical robot arm to determine whether the surgical robot arm reaches a target region, so as to decide whether to re-define a new environmental space and plan another movement path of the surgical robot arm in the new environmental space (see page 571 for the Robotic Arms, the microprocessor at first col., third par., and the upper left depth image and environmental space).
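Claim 6's decide-and-replan behavior reduces to a loop: after each move, check whether the target region is reached and, if not, re-define a new environmental space and plan another path. A hedged sketch, with the environment recognition abstracted to a simple distance test (step size and tolerance are hypothetical):

```python
import numpy as np

def reached_target(position, target_center, tolerance=0.01):
    # Abstracts "recognize the surgical environment around the arm"
    # to a distance check against the target region's center.
    return np.linalg.norm(position - target_center) <= tolerance

position = np.array([0.0, 0.0, 0.0])
target = np.array([0.2, 0.1, 0.05])

while not reached_target(position, target):
    # Re-define a new environmental space around the current position
    # and plan another movement path inside it (one 5 cm step here).
    remaining = target - position
    distance = np.linalg.norm(remaining)
    if distance <= 0.05:
        position = target                                # final approach
    else:
        position = position + remaining / distance * 0.05
```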
As per claim 7, Mack teaches wherein an end mechanism of the surgical robot arm is equipped with a reference tracking ball (see page 572, third col., first par., for swallowable cameras...that can be navigated remotely), and the processor locates the position of the surgical robot arm based on the reference tracking ball, wherein the processor controls the surgical robot arm according to the movement path of the surgical robot arm, so as to make the end mechanism of the surgical robot arm approach the target region (see page 571 for the Robotic Arms and the microprocessor at first col., third par.).
As per claim 8, Mack teaches wherein, before the surgical robot arm is moved, the processor re-defines a target coordinate point to extend the target coordinate point to a line segment and convert the line segment into a reference target region, wherein the processor determines whether the reference target region matches the target region to decide whether to control the surgical robot arm according to the movement path of the surgical robot arm (see page 571 for the Robotic Arms and the microprocessor at first col., third par.).
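The geometry recited in claim 8 (extending a re-defined target coordinate point to a line segment and converting the segment into a reference target region, then testing it against the target region) can be illustrated as follows; the extension direction, segment length, capsule model, and matching metric are all illustrative assumptions:

```python
import numpy as np

def point_to_segment(point, direction, half_length=0.02):
    # Extend the re-defined target coordinate point to a line segment.
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    return point - d * half_length, point + d * half_length

def segment_to_region(seg_start, seg_end, radius=0.005):
    # Convert the segment into a reference target region, modeled here
    # as a capsule (all points within `radius` of the segment).
    return {"start": seg_start, "end": seg_end, "radius": radius}

def region_matches(region, target_center, tolerance=0.01):
    # Decide whether the reference target region matches the target
    # region: does the target center lie close to the capsule's axis?
    a, b = region["start"], region["end"]
    ab = b - a
    t = np.clip(np.dot(target_center - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    closest = a + t * ab
    return np.linalg.norm(target_center - closest) <= region["radius"] + tolerance

point = np.array([0.20, 0.10, 0.05])
start, end = point_to_segment(point, direction=[0.0, 0.0, 1.0])
region = segment_to_region(start, end)
print(region_matches(region, target_center=np.array([0.20, 0.10, 0.06])))  # True
```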
As per claim 9, Mack teaches a surgical robot arm control method (see page 571 for the Robotic Arms, and particularly the Surgeon Console/Surgeon Interface), comprising: acquiring spatial coordinate data by a spatial positioning information acquisition unit (see page 570, second col. of the table, fifth row, for Real-Time Data Acquisition and Nonvisual Imaging; see page 571, the Surgeon Console/Surgeon Interface acquires information as well); acquiring a panoramic depth image by a depth image acquisition unit (see page 571, the Surgeon Console/Surgeon Interface and the upper left for the depth image); performing image recognition on the panoramic depth image by a processor (see page 571, first col., third par., placing a microprocessor between the surgeon’s hand and the tip of the surgical instrument) to recognize a surgical robot arm; locating a position of the surgical robot arm by the processor according to the spatial coordinate data; by the processor, defining an environmental space according to the position of the surgical robot arm and planning a movement path of the surgical robot arm in the environmental space (see page 571, the upper left depth image and environmental space); and controlling the surgical robot arm by the processor according to the movement path of the surgical robot arm (see page 571 for the Robotic Arms). The above limitations, however, are disclosed in different embodiments of Mack.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the different embodiments of Mack to achieve the intended end result, thereby improving the surgical robot arm control method as a whole.
As per claim 10, Mack teaches wherein the panoramic depth image comprises at least one tracking ball and an image of the surgical robot arm, and the at least one tracking ball is disposed on a surgical subject (see page 572, third col., first par., for swallowable cameras...that can be navigated remotely; see page 571 for the Robotic Arms and the upper left depth image), wherein the step of performing the image recognition on the panoramic depth image comprises: performing the image recognition on the panoramic depth image by the processor to recognize the at least one tracking ball, the surgical subject, and the surgical robot arm (see page 571, first col., third par., placing a microprocessor between the surgeon’s hand and the tip of the surgical instrument), wherein the spatial coordinate data comprise a plurality of coordinates of the at least one tracking ball, the surgical subject, and the surgical robot arm (see page 570, second col. of the table, fifth row, for Real-Time Data Acquisition and Nonvisual Imaging).
As per claim 11, Mack teaches wherein the environmental space is centered around an end mechanism of the surgical robot arm, and the environmental space is updated together with a movement of the end mechanism of the surgical robot arm (see page 571 for the Robotic Arms, and the upper left depth image and environmental space).
As per claim 12, Mack teaches wherein the step of planning the movement path of the surgical robot arm in the environmental space comprises: training a real path model corresponding to the environmental space by the processor based on a virtual path model through transfer learning to acquire the movement path of the surgical robot arm through the real path model (see page 571, first col., third par., for the microprocessor; see page 571 for the Robotic Arms and the upper left image).
As per claim 13, Mack teaches wherein the virtual path model and the real path model are respectively a densely connected convolutional network model (see page 571 for the Robotic Arms and the upper left image, as noted above).
As per claim 14, Mack teaches the method further comprising: after moving the surgical robot arm, recognizing a surgical environment around the surgical robot arm by the processor to determine whether the surgical robot arm reaches a target region, so as to decide whether to re-define a new environmental space and plan another movement path of the surgical robot arm in the new environmental space (see page 571 for the Robotic Arms, the microprocessor at first col., third par., and the upper left depth image and environmental space).
As per claim 15, Mack teaches wherein an end mechanism of the surgical robot arm is equipped with a reference tracking ball (see page 572, third col., first par., for swallowable cameras...that can be navigated remotely), wherein the step of controlling the surgical robot arm comprises: locating the position of the surgical robot arm by the processor according to the reference tracking ball; and controlling the surgical robot arm by the processor according to the movement path of the surgical robot arm to make the end mechanism of the surgical robot arm approach the target region (see page 571 for the Robotic Arms, the microprocessor at first col., third par., and the upper left image).
As per claim 16, Mack teaches wherein the step of controlling the surgical robot arm comprises: before moving the surgical robot arm, re-defining a target coordinate point by the processor to extend the target coordinate point to a line segment and convert the line segment into a reference target region; and determining whether the reference target region matches the target region by the processor, so as to decide whether to control the surgical robot arm according to the movement path of the surgical robot arm (see page 571 for the Robotic Arms and the microprocessor at first col., third par.).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MCDIEUNEL MARC, whose telephone number is (571) 272-6964. The examiner can normally be reached on workdays from 9:00 AM to 7:30 PM.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, WADE MILES, can be reached at (571) 270-7777. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-3976.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/McDieunel Marc/
Primary Examiner, Art Unit 3665