DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Gregerson et al. (US 2020/0078097) (“Gregerson”) in view of Hansen et al. (US 2018/0014851) (“Hansen”).
With regard to claims 1 and 11, Gregerson discloses a surgical robot system (Abstract; FIG. 1) comprising:
a workstation ([0055]; “…a workstation (e.g., a workstation located on a cart 120).”)(FIG. 13; [0124]; “…a computing device providing the functional capabilities of the computer device 1300 may be implemented as a workstation computer,…”) comprising a housing (cart 120), a computation and control center (computer 113), a display apparatus (display device 119) and an input device ([0055]; “…a user interface device may alternately or additionally include one or more of a button, a keyboard, a joystick, a mouse, a touchpad, etc. which may be located on a display device 119, 401 and/or on a workstation (e.g., a workstation located on a cart 120).”);
a robotic arm 101 comprising a plurality of arm segments connected by joints ([0030]; “The robotic arm 101 may comprise a multi-joint arm that includes a plurality of linkages connected by joints…”);
a scanning module ([0035]; motion tracking system 105) configured to collect information for a target space ([0035]; surgical field), wherein the scanning module comprises a projecting component ([0035]; “The optical sensor device 111 may include one or more radiation sources (e.g., diode ring(s)) that direct radiation (e.g., IR radiation) into the surgical field…”) and an image acquiring apparatus ([0035]; “…a stereoscopic optical sensor device 111 that includes two or more cameras (e.g., IR cameras).”), wherein the projecting component is configured to project radiation (e.g., IR radiation) to a target space [0035] and the image acquiring apparatus is configured to collect the image to acquire a three-dimensional structure of the target space through a corresponding decoding algorithm ([0035]; “A computer 113 may be coupled to the sensor device 111 and may determine the transformations between each of the marker devices 119, 202, 115 and the cameras using, for example, triangulation techniques. A 3D model of the surgical space in a common coordinate system may be generated and continually updated using motion tracking software implemented by the computer 113.”)[0046][0126][0128]; and
a guiding module configured to guide a surgical instrument to move in a trajectory, wherein the guiding module is connectable to the robotic arm, wherein the guiding module comprises a through hole and is configured to assist the surgical instruments to move along an axial direction of the through hole ([0050]; “…the end effector 102 of the robotic arm 101 may include a hollow tube or cannula that may be configured to hold one or more tools, such as a surgical instrument, and may be used to guide an instrument as it is inserted into the patient's body.”)[0068][0074][0100],
wherein information collected by the scanning module is processed by the workstation to acquire three-dimensional information of the target space ([0034]; “A 3D model of the space may be constructed in software based on the signals detected by the sensing device.”)([0035]; “A 3D model of the surgical space in a common coordinate system may be generated and continually updated using motion tracking software implemented by the computer 113.”).
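For illustration only (neither cited reference provides source code): the triangulation technique invoked in Gregerson’s paragraph [0035] can be sketched as a linear two-view reconstruction. The function name, the direct-linear-transform formulation, and the projection-matrix inputs below are expository assumptions, not Gregerson’s disclosed implementation.

```python
import numpy as np

def triangulate_point(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3D point from two calibrated views.

    P1, P2   : (3, 4) camera projection matrices (intrinsics @ extrinsics).
    uv1, uv2 : (2,) pixel coordinates of the same marker in each camera image.
    Returns the 3D point in the cameras' common (e.g., surgical-space) frame.
    """
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # Homogeneous least-squares solution: right singular vector of A
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

Repeating this per tracked marker, frame by frame, is one standard way a 3D model of the tracked space can be built and continually updated.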
Gregerson discloses that the optical sensor device 111 includes one or more radiation sources (e.g., diode rings) that direct IR radiation into the surgical field [0035].
Gregerson does not disclose that the projecting component (the diode-ring radiation sources) is configured to project structured light to a target space.
In the same field of endeavor, Hansen discloses a minimally invasive surgery system comprising a robot, a cannula assembly and a computer system, suitable for performing robotic surgery or robot-assisted surgery [0002]. Hansen teaches “The pattern generating member comprises a pattern light source and a projector temporarily or permanently fixed to the cannula shaft portion, wherein the pattern light source is operatively connected to the projector for projecting a light pattern. The computer system is configured for in real time receiving image data representing light pattern reflections from a surgical surface and for determine a real-time spatial position of the cannula assembly relative to the surgical surface.” [0030]. Also, the reference teaches that the projector of the pattern generating member is configured to emit a coded structured light configuration [0157], which is suitable for determining a topographic shape of the target surface [0161]. Further, Hansen teaches the use of LEDs (light emitting diodes) that can generate well-defined light patterns which are visible and enhanced on the monitor and/or easily detectable for computer recognition, decoding and/or vision processing [0124]. Also, the reference teaches that the relative position between the cannula assembly and the surgical tool may also be tracked using other distance and positioning sensors, such as structured light sensors [0052]. Finally, the reference teaches that the pattern light source can be configured for emitting at least one electromagnetic wavelength within the IR range from about 700 nm to about 2500 nm [0130].
In view of Hansen, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify or replace the IR diode-ring radiation sources of Gregerson’s surgical robot system with LEDs capable of generating IR coded structured light for the purpose of obtaining the topographical shape of a target space. The motivation is to use coded structured light to obtain the topographical shape of the surgical space, thereby enabling the insertion and position tracking of a surgical tool in a patient.
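Neither reference supplies an implementation of its structured-light processing; as a purely hypothetical sketch of the coded-structured-light principle Hansen describes in [0157] and [0161], the following decodes a stack of binary Gray-code stripe patterns into per-pixel projector column indices (the Gray-code scheme, array shapes, and names are assumptions for exposition). With a calibrated projector-camera pair, each decoded index supports a plane-ray triangulation, i.e., recovery of the topographical shape of the target space.

```python
import numpy as np

def decode_gray_code(captures, threshold=0.5):
    """Decode Gray-code pattern images into projector column indices.

    captures : (N, H, W) float array of N captured stripe-pattern images,
               coarsest stripe (most significant bit) first, scaled to [0, 1].
    Returns an (H, W) integer map of projector columns per camera pixel.
    """
    bits = (captures > threshold).astype(np.uint32)  # binarize each pattern
    # Gray -> binary: b[0] = g[0]; b[i] = b[i-1] XOR g[i] (MSB first).
    binary = np.empty_like(bits)
    binary[0] = bits[0]
    for i in range(1, bits.shape[0]):
        binary[i] = binary[i - 1] ^ bits[i]
    # Pack the per-pixel bit planes into one integer index, MSB first.
    weights = 1 << np.arange(bits.shape[0] - 1, -1, -1)
    return np.tensordot(weights, binary, axes=(0, 0))
```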
With regard to claim 2, Gregerson, in view of Hansen, discloses the system according to claim 1, wherein the projecting component is further configured to project an image to the target space. (Gregerson; [0034]; surgical field, surgical space)
With regard to claim 3, Gregerson, in view of Hansen, discloses the system according to claim 1, wherein the projecting component and the image acquiring apparatus have a predetermined relative spatial position relationship. (Gregerson; [0067]) (Hansen; [0037][0189])
With regard to claim 4, Gregerson, in view of Hansen, discloses the system according to claim 1, wherein the projecting component comprises a light source (Hansen; [0069]; see the rejection of claim 1), a lens group (Hansen; [0106]), a digital micromirror device (Hansen; [0106]) and a control module (Gregerson; [0057]; robotic control system 405).
With regard to claim 5, Gregerson, in view of Hansen, discloses the system according to claim 1, wherein a position of the scanning module in a coordinate system of the robotic arm is determined by the robotic arm. (Gregerson; [0035][0057])
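Purely as an illustrative sketch of how the robotic arm itself can determine the scanning module’s position, i.e., by forward kinematics from the arm’s joint encoders plus a fixed mount calibration: the Denavit-Hartenberg convention, parameter names, and hand-eye transform below are assumptions, not Gregerson’s disclosed implementation.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform of one joint, standard Denavit-Hartenberg."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.,       sa,       ca,      d],
        [0.,       0.,       0.,     1.],
    ])

def scanner_pose_in_base(joint_angles, dh_params, T_flange_scanner):
    """Pose of the scanning module in the arm's base coordinate system.

    joint_angles     : joint positions read from the arm's encoders.
    dh_params        : per-joint (d, a, alpha) tuples for the specific arm.
    T_flange_scanner : fixed 4x4 mount transform from a hand-eye calibration.
    """
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T @ T_flange_scanner
```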
With regard to claim 6, Gregerson, in view of Hansen, discloses the system according to claim 1, further comprising a position tracking module, wherein the position tracking module comprises an optical tracking apparatus or an electromagnetic tracking apparatus. (Gregerson; [0035]) (see the rejection of claim 1)
With regard to claim 7, Gregerson, in view of Hansen, discloses the system according to claim 6, wherein the optical tracking apparatus comprises a light-traceable marker (Gregerson; [0034]), a camera unit (Gregerson; [0034]) and a light emitting unit (Gregerson; [0034]; see the rejection of claim 1).
With regard to claim 8, Gregerson, in view of Hansen, discloses the system according to claim 1, wherein a force applied to the robotic arm is calculated with a current of a motor or at least one force sensor is provided. (Gregerson; [0057][0066][0110])
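As a minimal sketch of the current-based alternative recited in the claim (a textbook statics method, assuming known motor torque constants and gear ratios; not the implementation disclosed in Gregerson):

```python
import numpy as np

def end_effector_wrench(currents, torque_constants, gear_ratios, jacobian):
    """Estimate the external wrench on the arm from motor currents alone.

    currents         : (n,) measured motor currents [A].
    torque_constants : (n,) motor torque constants Kt [N*m/A].
    gear_ratios      : (n,) joint gear reductions.
    jacobian         : (6, n) geometric Jacobian at the current configuration.
    Statics: tau = J^T @ F, so F = pinv(J^T) @ tau. Friction and gravity
    compensation are omitted for brevity.
    """
    tau = torque_constants * gear_ratios * currents  # joint torques [N*m]
    return np.linalg.pinv(jacobian.T) @ tau  # wrench [Fx, Fy, Fz, Mx, My, Mz]
```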
With regard to claim 9, Gregerson, in view of Hansen, discloses the system according to claim 1, wherein the robotic arm has 6, 7, 8, 9, or 10 degrees of freedom. (Gregerson; [0117])
With regard to claim 10, Gregerson, in view of Hansen, discloses a method for using the surgical robot system according to claim 1, comprising the following steps:
a) using the surgical robot system to receive image data (Gregerson; [0031][0035]), and making a surgical plan (Gregerson; [0002][0038]);
b) the user manually drags the robotic arm so that the scanning module reaches a desired position and acquires the scanning information (Gregerson; [0068]), or the workstation (Gregerson; [0055]) calculates an appropriate position for the scanning module to scan and plans an appropriate movement trajectory of the robotic arm based on parameters of the scanning module (Gregerson; [0038][0055][0064]), and then a scanning step is executed automatically to control movement of the robotic arm and drive the scanning module to reach a predetermined position in a predetermined order (Gregerson; [0066][0067][0069]), generating a three-dimensional structure from the scanned data via the workstation and registering it with the image data acquired in step a) (Gregerson; [0035][0039][0043]); and
c) mounting the guiding module at an end of the robotic arm and executing the predetermined surgical plan (Gregerson; [0050][0074]).
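Step b) culminates in registering the scanned three-dimensional structure with the image data received in step a). One standard rigid registration is the Kabsch/SVD method, sketched below under the assumption of known point correspondences; this is illustrative only, not the registration Gregerson discloses.

```python
import numpy as np

def rigid_register(scan_pts, image_pts):
    """Least-squares rigid transform aligning scanned points to image points.

    scan_pts, image_pts : (N, 3) corresponding 3D points, e.g., surface
    points from the scan matched to points from preoperative image data.
    Returns (R, t) such that image_pts ~= scan_pts @ R.T + t.
    """
    mu_s, mu_i = scan_pts.mean(axis=0), image_pts.mean(axis=0)
    H = (scan_pts - mu_s).T @ (image_pts - mu_i)  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_i - R @ mu_s
    return R, t
```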
With regard to claim 12, Gregerson, in view of Hansen, discloses the system according to claim 11, wherein the position tracking module is further configured to track the position of the guiding module by a traceable structure mounted on the guiding module (Gregerson; [0035][0036]) and/or track the position of the robotic arm by a traceable structure mounted on the robotic arm (Gregerson; [0035]).
With regard to claim 13, Gregerson, in view of Hansen, discloses the system according to claim 11, wherein the projecting component is further configured to project an image to the target space. (Gregerson; [0096][0098])
With regard to claim 14, Gregerson, in view of Hansen, discloses the system according to claim 11, wherein the projecting component comprises a light source (Hansen; [0069]; see the rejection of claim 1), a lens group (Hansen; [0106]), a digital micromirror device (Hansen; [0106]) and a control module (Gregerson; [0057]; robotic control system 405).
With regard to claim 15, Gregerson, in view of Hansen, discloses the system according to claim 11, wherein a force applied to the robotic arm is calculated with a current of a motor or at least one force sensor is provided. (Gregerson; [0057][0066][0110])
With regard to claim 16, Gregerson, in view of Hansen, discloses the system according to claim 11, wherein the position tracking module is an optical tracking apparatus. (Gregerson; [0035]) (see the rejection of claim 1)
With regard to claim 17, Gregerson, in view of Hansen, discloses the system according to claim 11, wherein the position tracking module is an electromagnetic tracking apparatus. (Gregerson; [0035]) (see the rejection of claim 1)
With regard to claim 18, Gregerson, in view of Hansen, discloses the system according to claim 11, wherein the robotic arm has 6, 7, 8, 9, or 10 degrees of freedom. (Gregerson; [0117])
With regard to claim 19, Gregerson, in view of Hansen, discloses the system according to claim 11, wherein the system is configured to perform the following steps:
a) using the surgical robot system to receive image data (Gregerson; [0031][0035]), and making a surgical plan (Gregerson; [0002][0038]);
b) the user manually drags the robotic arm so that the scanning module reaches a desired position and acquires the scanning information (Gregerson; [0068]), or the workstation (Gregerson; [0055]) calculates an appropriate position for the scanning module to scan and plans an appropriate movement trajectory of the robotic arm based on parameters of the scanning module (Gregerson; [0038][0055][0064]), and then a scanning step is executed automatically to control movement of the robotic arm and drive the scanning module to reach a predetermined position in a predetermined order (Gregerson; [0066][0067][0069]), generating a three-dimensional structure from the scanned data via the workstation and registering it with the image data acquired in step a) (Gregerson; [0035][0039][0043]); and
c) mounting the guiding module at an end of the robotic arm and executing the predetermined surgical plan (Gregerson; [0050][0074]).
With regard to claim 20, Gregerson, in view of Hansen, discloses the method for using the surgical robot system according to claim 11, comprising the following steps:
a) using the surgical robot system to receive image data (Gregerson; [0031][0035]), and making a surgical plan (Gregerson; [0002][0038]);
b) the user manually drags the robotic arm so that the scanning module reaches a desired position and acquires the scanning information (Gregerson; [0068]), or the workstation (Gregerson; [0055]) calculates an appropriate position for the scanning module to scan and plans an appropriate movement trajectory of the robotic arm based on parameters of the scanning module (Gregerson; [0038][0055][0064]), and then a scanning step is executed automatically to control movement of the robotic arm and drive the scanning module to reach a predetermined position in a predetermined order (Gregerson; [0066][0067][0069]), generating a three-dimensional structure from the scanned data via the workstation and registering it with the image data acquired in step a) (Gregerson; [0035][0039][0043]); and
c) mounting the guiding module at an end of the robotic arm and executing the predetermined surgical plan (Gregerson; [0050][0074]).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Gregerson et al. (US 2018/0185113)
Bourlion et al. (US 2021/0282862)
Crampton (US 2008/0235970)
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HUGH H MAUPIN whose telephone number is (571) 270-1495. The examiner can normally be reached M-F 7:30 am - 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Uzma Alam can be reached at 571-272-3995. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HUGH MAUPIN/ Primary Examiner, Art Unit 2884