DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claim 20 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim does not fall within at least one of the four categories of patent eligible subject matter because claim 20’s preamble simply recites a “Computer Program” as opposed to a “Non-Transitory Computer Readable Medium”. Appropriate correction to the claim language is required to overcome the 101 rejection.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claim(s) 1, 3, 5, 7-8, 10-14, and 16-20 are rejected under 35 U.S.C. 102(a)(1) and 102(a)(2) as being anticipated by Malackowski (WO 2021003401 A1).
Regarding claim 1, Malackowski discloses a method for supporting clinical personnel with multiple navigation views, the method being performed by at least one processor and comprising: obtaining patient tracking information indicative of tracked poses of two or more patient trackers, each of the patient trackers having a fixed spatial relationship relative to one or more anatomical elements of a patient’s body; obtaining instrument tracking information indicative of tracked poses of two or more medical instruments; and triggering, based on the patient tracking information and the instrument tracking information, simultaneous display of multiple navigation views, each of the navigation views being determined based on a respective set comprising (i) at least one of the medical instruments and (ii) at least one of the patient trackers, wherein the sets differ from one another in the at least one of the patient trackers (The surgical navigation system 12 may display the relative positions of objects tracked during a surgical procedure to aid the surgeon[0047]. During a surgical procedure, the surgical navigation system 12 may track the position (location and orientation) of objects of interest within a surgical workspace using a combination of tracker-based localization and machine vision. The surgical workspace for a surgical procedure may be considered to include the target volume of patient tissue being treated and the area immediately surrounding the target volume being treated in which an obstacle to treatment may be present. The tracked objects may include, but are not limited to, anatomical structures of the patient, target volumes of anatomical structures to be treated, surgical instruments such as the surgical instrument 16, and anatomical structures of surgical personnel such as a surgeon’s hand or fingers.
The tracked anatomical structures of the patient and target volumes may include soft tissue such as ligaments, muscle, and skin, and may include hard tissue such as bone. The tracked surgical instruments may include retractors, cutting tools, and waste management devices used during a surgical procedure[0042]). The “set” of the tracked anatomical tissue and the tracked medical instrument are the objects of interest in the surgical procedure, and the sets or pairings of tissue and instrument differ based on the surgical procedure of the navigation system.
Regarding claim 3, Malackowski discloses the method of claim 1, wherein each of the at least one of the patient trackers within a same set has a fixed spatial relationship relative to a common anatomical element of the patient’s body (For example, when the target volume to be treated is located at a patient’s knee area, a tracker 34 may be firmly affixed to the femur F of the patient, a tracker 36 may be firmly affixed to the tibia T of the patient, and a tracker 38 may be firmly affixed to the surgical instrument 16[0058]).
Regarding claim 5, Malackowski discloses the method of claim 1, wherein the sets differ from one another in the at least one of the medical instruments (The target site 200 may also include surgical tools, such as retractors 208 positioned to retract the epidermal tissue 206 and provide access to the patient’s femur F[0110]. The tracked surgical instruments may include retractors, cutting tools, and waste management devices used during a surgical procedure[0042]). There are multiple tools eligible for use in Malackowski, and the tool set selected is based on the surgical procedure.
Regarding claim 7, Malackowski discloses the method of claim 1, wherein at least one of the sets is determined based on the patient tracking information and the instrument tracking information (For instance, the trackers may be firmly affixed to patient bones and surgical instruments, such as retractors and the surgical instrument 16. In this way, responsive to determining a position of a tracker in the surgical workspace using the localizer 18, the navigation controller 22 may infer the position of the object to which the tracker is affixed based on the determined position of the tracker[0057]).
Regarding claim 8, Malackowski discloses the method of claim 1, wherein at least one of the sets is determined based on a relative pose between (i) the at least one of the medical instruments and (ii) at least one of the one or more anatomical elements having the fixed spatial relationship relative to the at least one of the patient trackers (For example, the transformation data 83 may set forth the fixed positional relationships between the trackers 34, 36, 38 and the objects firmly affixed to the trackers 34, 36, 38, and a positional relationship between localizer 18 and the vision device 40. The surgical plan 84 may identify patient anatomical structures and target volumes involved in the surgical procedure, may identify the instruments being used in the surgical procedure, and may define the planned trajectories of instruments and the planned movements of patient tissue during the surgical procedure[0092]).
Regarding claim 10, Malackowski discloses the method of claim 1, wherein at least one of the sets is determined based on a user input (The user interface 24 may also include one or more input devices that enable user-input to the surgical navigation system 12. The input devices may include a keyboard, mouse, and/or touch screen 28 that can be interacted with by a user to input surgical parameters and control aspects of the navigation controller 22[0049]).
Regarding claim 11, Malackowski discloses the method of claim 1, wherein at least one of the navigation views indicates a relative pose between (i) a medical instrument of the set used to determine the at least one of the navigation views and (ii) at least one of the one or more anatomical elements having the fixed spatial relationship relative to one or more of the at least one of the patient trackers of the same set (During a surgical procedure, the navigation controller 22, such as via the surgical navigator 81, may be configured to display the illustration of FIG. 14 along with an image or virtual model for the surgical instrument 16 at the current position of the surgical instrument 16 in the common coordinate system, such as tracked with the localizer 18, to aid a surgeon in guiding the surgical instrument 16 to the target volume 202[0143]. Furthermore, the spatial relationship between the arrangement of features of the object and the rest of the object may be fixed[0147]).
Regarding claim 12, Malackowski discloses the method of claim 11, wherein the at least one of the navigation views further indicates a pose of another medical instrument of the set or of another set (For instance, the trackers may be firmly affixed to patient bones and surgical instruments, such as retractors and the surgical instrument 16. In this way, responsive to determining a position of a tracker in the surgical workspace using the localizer 18, the navigation controller 22 may infer the position of the object to which the tracker is affixed based on the determined position of the tracker[0057]).
Regarding claim 13, Malackowski discloses the method of claim 1, wherein at least one of the following conditions is fulfilled: two or more of the navigation views are triggered to be displayed on a common display unit; and at least one of the navigation views is triggered to be displayed on an individual display unit (The navigation controller 22 may be in operative communication with a user interface 24 of the surgical navigation system 12. The user interface 24 may facilitate user interaction with the surgical navigation system 12 and navigation controller 22. For example, the user interface 24 may include one or more output devices that provide information to a user, such as from the navigation controller 22. The output devices may include a display 25 adapted to be situated outside of a sterile field including the surgical workspace and may include a display 26 adapted to be situated inside the sterile field. The displays 25, 26 may be adjustably mounted to the navigation cart assembly 20[0049]).
Regarding claim 14, Malackowski discloses the method of claim 1, further comprising associating a registration to each of the patient trackers that is usable to transform coordinates from patient image data into real-world coordinates (According to one implementation, the controller can identify a position of the virtual model in the first coordinate system based on the detected position of the tracker in the first coordinate system and a positional relationship between the tracker and the first object in the first coordinate system. According to one implementation, the controller transforms the position of the virtual model in the first coordinate system to a position of the virtual model in a second coordinate system specific to the vision device based on the position of the virtual model in the first coordinate system and a positional relationship between the localizer and the vision device in the second coordinate system[0015]).
Regarding claim 16, Malackowski teaches the method of claim 14, wherein the registrations of all patient trackers having a fixed spatial relationship relative to a common anatomical element differ from one another, but may remain constant relative to one another even if anatomical elements of the patient’s body move over time (Given the fixed spatial relationships between the femur F and tibia T and their trackers 34, 36, the navigation controller 22, such as via the transformation engine 78, may transform the position of the femur F in the femur coordinate system FBONE to a position of the femur F in the bone tracker coordinate system BTRK1, and may transform the position of the tibia T in the tibia coordinate system TBONE to a position of the tibia T in the bone tracker coordinate system BTRK2[0101]. For example, the transformation data 83 may set forth the fixed positional relationships between the trackers 34, 36, 38 and the objects firmly affixed to the trackers 34, 36, 38, and a positional relationship between localizer 18 and the vision device 40. The surgical plan 84 may identify patient anatomical structures and target volumes involved in the surgical procedure, may identify the instruments being used in the surgical procedure, and may define the planned trajectories of instruments and the planned movements of patient tissue during the surgical procedure[0092]).
Regarding claim 17, Malackowski discloses the method of claim 1, wherein one or more of the following conditions is fulfilled: at least one of the medical instruments is handled by a robot; and at least two of the medical instruments are handled by different persons (The surgical system 10 may include a surgical navigation system 12 and a robotic manipulator 14. The robotic manipulator 14 may be coupled to a surgical instrument 16, and may be configured to maneuver the surgical instrument 16 to treat a target volume of patient tissue, such as at the direction of a surgeon and/or the surgical navigation system 12[0041]).
Regarding claim 18, Malackowski discloses a system comprising at least one processor configured to: obtain patient tracking information indicative of tracked poses of two or more patient trackers, each of the patient trackers having a fixed spatial relationship relative to one or more anatomical elements of a patient’s body; obtain instrument tracking information indicative of tracked poses of two or more medical instruments; and trigger, based on the patient tracking information and the instrument tracking information, simultaneous display of multiple navigation views, each of the navigation views being determined based on a respective set comprising (i) at least one of the medical instruments and (ii) at least one of the patient trackers, wherein the sets differ from one another in the at least one of the patient trackers (The surgical navigation system 12 may display the relative positions of objects tracked during a surgical procedure to aid the surgeon[0047]. During a surgical procedure, the surgical navigation system 12 may track the position (location and orientation) of objects of interest within a surgical workspace using a combination of tracker-based localization and machine vision. The surgical workspace for a surgical procedure may be considered to include the target volume of patient tissue being treated and the area immediately surrounding the target volume being treated in which an obstacle to treatment may be present. The tracked objects may include, but are not limited to, anatomical structures of the patient, target volumes of anatomical structures to be treated, surgical instruments such as the surgical instrument 16, and anatomical structures of surgical personnel such as a surgeon’s hand or fingers. The tracked anatomical structures of the patient and target volumes may include soft tissue such as ligaments, muscle, and skin, and may include hard tissue such as bone.
The tracked surgical instruments may include retractors, cutting tools, and waste management devices used during a surgical procedure[0042]). The “set” of the tracked anatomical tissue and the tracked medical instrument are the objects of interest in the surgical procedure, and the sets or pairings of tissue and instrument differ based on the surgical procedure of the navigation system.
Regarding claim 19, Malackowski discloses the system of claim 18, further comprising at least one of the following components: one or more display units configured to simultaneously display the navigation views; a tracking unit configured to track at least one entity selected from the patient trackers and the medical instruments; and a robot configured to handle at least one of the medical instruments (The user interface 24 may facilitate user interaction with the surgical navigation system 12 and navigation controller 22. For example, the user interface 24 may include one or more output devices that provide information to a user, such as from the navigation controller 22. The output devices may include a display 25 adapted to be situated outside of a sterile field including the surgical workspace and may include a display 26 adapted to be situated inside the sterile field. The displays 25, 26 may be adjustably mounted to the navigation cart assembly 20[0049]).
Regarding claim 20, Malackowski teaches a computer program comprising instructions which, when performed by at least one processor, configure the at least one processor to: obtain patient tracking information indicative of tracked poses of two or more patient trackers, each of the patient trackers having a fixed spatial relationship relative to one or more anatomical elements of a patient’s body; obtain instrument tracking information indicative of tracked poses of two or more medical instruments; and trigger, based on the patient tracking information and the instrument tracking information, simultaneous display of multiple navigation views, each of the navigation views being determined based on a respective set comprising (i) at least one of the medical instruments and (ii) at least one of the patient trackers, wherein the sets differ from one another in the at least one of the patient trackers (The surgical navigation system 12 may display the relative positions of objects tracked during a surgical procedure to aid the surgeon[0047]. During a surgical procedure, the surgical navigation system 12 may track the position (location and orientation) of objects of interest within a surgical workspace using a combination of tracker-based localization and machine vision. The surgical workspace for a surgical procedure may be considered to include the target volume of patient tissue being treated and the area immediately surrounding the target volume being treated in which an obstacle to treatment may be present. The tracked objects may include, but are not limited to, anatomical structures of the patient, target volumes of anatomical structures to be treated, surgical instruments such as the surgical instrument 16, and anatomical structures of surgical personnel such as a surgeon’s hand or fingers. The tracked anatomical structures of the patient and target volumes may include soft tissue such as ligaments, muscle, and skin, and may include hard tissue such as bone.
The tracked surgical instruments may include retractors, cutting tools, and waste management devices used during a surgical procedure[0042]). The “set” of the tracked anatomical tissue and the tracked medical instrument are the objects of interest in the surgical procedure, and the sets or pairings of tissue and instrument differ based on the surgical procedure of the navigation system.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim(s) 2, 4, 6, and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Malackowski in view of Wassall (EP 3861957 A1).
Regarding claim 2, Malackowski discloses the method of claim 1, but Malackowski fails to explicitly state wherein each of the navigation views is specific for the respective set used to determine the respective navigation view.
However, Wassall teaches “Referring to Figure 6, the tracking cameras 46 on the auxiliary tracking bar has a navigation field-of-view 600 in which the pose (e.g., position and orientation) of the reference array 602 attached to the patient, the reference array 604 attached to the surgical instrument, and the robot arm 20 are tracked. The tracking cameras 46 may be part of the camera tracking system component 6' of Figures 3B and 3C, which includes the computer platform 910 configured to perform the operations described below(see attached copy, page 9, paragraph 4)”.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to configure the surgical navigation system of Malackowski with the navigation field of view of the camera tracking system of Wassall. Doing so would ensure that the navigation views of the system are specific to the navigation set at each moment.
Regarding claim 4, Malackowski discloses the method of claim 3, but fails to explicitly state wherein the common anatomical element differs from one set to another. However, Wassall teaches “The computer platform 910, in some embodiments, can allow planning for use of standard surgical tools and/or implants, e.g., posterior stabilized implants and cruciate retaining implants, cemented and cementless implants, revision systems for surgeries related to, for example, total or partial knee and/or hip replacement and/or trauma(see attached copy, page 14, paragraph 6)”.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to configure the surgical navigation system of Malackowski with the specific anatomical examples of the camera tracking system of Wassall. Doing so would provide multiple anatomical tissues or elements that differ from one another and can be targeted by the navigation system.
Regarding claim 6, Malackowski discloses the method of claim 1, but fails to explicitly state wherein at least one of the following conditions is fulfilled: at least one of the sets comprises at least one of the medical instruments that is not comprised in any other one of the sets; and at least one of the sets comprises at least one of the medical instruments that is comprised in another one of the sets.
However, Wassall teaches “The computer platform 910 can generate navigation information which provides visual guidance to the surgeon for performing the surgical procedure. When used with the surgical robot 4, the computer platform 910 can provide guidance that allows the surgical robot 4 to automatically move the end effector 26 to a target pose so that the surgical tool is aligned with a target location to perform the surgical procedure on an anatomical structure(see attached copy, page 14, paragraph 2). In some embodiments, the surgical system 900 can use two DRAs to track patient anatomy position, such as one connected to patient tibia and one connected to patient femur. The system 900 may use standard navigated instruments for the registration and checks (e.g. a pointer similar to the one used in Globus ExcelsiusGPS system for spine surgery)(see attached copy, page 14, paragraph 3)”.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to configure the surgical navigation system of Malackowski with the instrument usage of the camera tracking system of Wassall. Doing so would provide for different medical instruments being used during different procedures for different target tissues.
Regarding claim 9, Malackowski discloses the method of claim 8, but fails to explicitly state wherein the at least one of the sets is determined based on the relative pose indicating that at least one of the following conditions is fulfilled: the at least one of the medical instruments is aligned with the at least one of the one or more anatomical elements having the fixed spatial relationship relative to the at least one of the patient trackers; and the at least one of the medical instruments is positioned within a predefined region that is associated with the at least one of the one or more anatomical elements having the fixed spatial relationship relative to the at least one of the patient trackers.
However, Wassall teaches “Using both visible and near infrared tracking coordinate systems can enable any one or more of: (a) identifying tools that would not be identified using a single coordinate system; (b) increased pose tracking accuracy; (c) enabling a wider range of motion without losing tracking of surgical instruments, patient anatomy, and/or a robotic end effector; and (d) naturally track an XR headset in the same coordinate system as the navigated surgical instruments(see attached copy, page 21, paragraph 2)”.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to configure the surgical navigation system of Malackowski with the pose alignment of the camera tracking system of Wassall. Doing so would align the tools and track their poses through a common coordinate system to ensure accurate movement and placement of instruments during the procedure.
Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Malackowski.
Regarding claim 15, Malackowski teaches the method of claim 14, wherein the registrations differ from one set to another (According to one implementation, the controller can identify a position of the virtual model in the first coordinate system based on the detected position of the tracker in the first coordinate system and a positional relationship between the tracker and the first object in the first coordinate system. According to one implementation, the controller transforms the position of the virtual model in the first coordinate system to a position of the virtual model in a second coordinate system specific to the vision device based on the position of the virtual model in the first coordinate system and a positional relationship between the localizer and the vision device in the second coordinate system[0015]). While Malackowski does not explicitly state that the registrations differ from one another, this would have been obvious to one of ordinary skill in the art. The registrations represent the tracked position coordinates of each set, and since the positions change from one set to another, the coordinates and registrations would obviously differ as well.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MARIA CATHERINE ANTHONY whose telephone number is (703)756-4514. The examiner can normally be reached 7:30 am - 4:30 pm, EST, M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, CARL LAYNO can be reached at (571) 272-4949. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MARIA CATHERINE ANTHONY/Examiner, Art Unit 3796
/CARL H LAYNO/Supervisory Patent Examiner, Art Unit 3796