DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Interpretation - 35 USC § 101
The limitation “triggering a first visual characteristic of the dynamic navigation guide in response to determining the one or more detected changes correspond to the threshold amount of difference away from the proper alignment” is considered a practical application because it visualizes the accuracy of the instrument relative to the desired planned path in augmented reality.
Claim Objections
Claims 1, 9, and 17 are objected to because of the following informalities: the quotation marks around the limitation (“dynamic navigation guide”) should be removed. The examiner notes that an even better option would be to remove the parenthetical shorthand entirely and simply use the full, consistent language “dynamic navigation guide virtual object” in the independent and dependent claims, for improved clarity, since the guide is a virtual object. Appropriate correction is required.
Priority
The instant application is a continuation-in-part of Application No. 18/208,136. The independent claims reciting the limitations "determining the one or more detected changes correspond to a threshold amount of difference away from a proper alignment of the physical instrument, the proper alignment representing alignment of the physical instrument with a virtual trajectory towards a target point; and triggering a first visual characteristic of the dynamic navigation guide in response to determining the one or more detected changes correspond to the threshold amount of difference away from the proper alignment" have an effective filing date of 07/08/2024 (see figures 5 and 6 of the drawings dated 07/08/2024 for this feature). Thus, the earliest priority date given is 07/08/2024.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Amanatullah et al. (US 2017/0312031) (hereinafter referred to as Amanatullah).
Regarding claim 1, Amanatullah teaches A computer-implemented method (One variation of a method for augmenting a surgical field with virtual guidance content includes: accessing a scan representing a tissue of a patient; combining the scan with a generic virtual anatomical model to define a custom virtual anatomical model of the tissue; defining a cut trajectory along an intersection between a virtual model of a surgical implant and the custom virtual anatomical model of the tissue; aligning a virtual cut surface to the cut trajectory to locate the virtual model of the surgical guide relative to the custom virtual anatomical model; accessing an image of a surgical field; detecting the tissue in the image; aligning the custom virtual anatomical model to the tissue detected in the image; defining a target real location for a real surgical guide in the surgical field; and generating a frame depicting the target real location of the surgical guide in the surgical field. See abstract), comprising:
rendering, in an Augmented Reality (AR) display, a dynamic navigation guide virtual object (“dynamic navigation guide”) (As shown in FIGS. 5, 6A, 6B, and 8, a second method S200 for augmenting a surgical field with virtual guidance content includes, during the surgical operation on a tissue of interest of a patient: at a first time, accessing an image of a surgical field captured by a sensor coupled to a computing device in the surgical field in Block S210; detecting the tissue of interest in the image in Block S220; accessing a virtual model of a surgical implant corresponding to the tissue of interest in Block S230; aligning a generic virtual anatomical model with the tissue of interest in the image to define a custom virtual anatomical model in Block S240; locating the virtual model of the surgical implant within the custom virtual anatomical model in Block S250; defining a virtual cut trajectory along a boundary of an intersection between the virtual model of the surgical implant and the custom virtual anatomical model of the tissue of interest in Block S260; defining a target real cut trajectory of a surgical tool in the surgical field based on the virtual cut trajectory in Block S270; generating a frame depicting the target real cut trajectory in Block S280; and, at approximately the first time, publishing, to a display, the frame depicting the target real cut trajectory in Block S290. See paragraph [0084])(In one implementation of the second method S200 shown in FIGS. 6A and 6B, in response to the real location of the surgical tool within a threshold distance from the target real cut trajectory, the computing system can render a cut approval graphic in the display in Block S292; and, in response to a real location of the surgical tool outside the threshold distance from the target real cut trajectory, rendering a guide frame indicating a direction and a distance of the surgical tool from the target real cut trajectory in Block S294. See paragraph [0128])(See figures 6A and 6B)(The virtual surgical guide can be oriented from a perspective of the surgeon viewing a real human feature-a tissue of interest-within the surgical field environment. The computer system can present these AR frames to the surgeon through an AR device substantially in real-time, thereby guiding placement of a real surgical guide in the real surgical field and, thus, guiding the surgeon's application of real tools within the real surgical environment with virtual AR objects cooperating with real surgical guides, jigs, and fixtures within the real surgical field. See paragraph [0018]);
detecting one or more changes in at least one of an instrument angular distance and an instrument position of a physical instrument in a unified three-dimensional (3D) coordinate space ( In one implementation of the second method S200 shown in FIGS. 6A and 6B, in response to the real location of the surgical tool within a threshold distance from the target real cut trajectory, the computing system can render a cut approval graphic in the display in Block S292; and, in response to a real location of the surgical tool outside the threshold distance from the target real cut trajectory, rendering a guide frame indicating a direction and a distance of the surgical tool from the target real cut trajectory in Block S294. See paragraph [0128]) (Current position of surgical tool changes, see figures 6A and 6B)( For example, the virtual cut trajectory can be a cut plane, traversing the tissue of interest, a drilled bore aligned with an axis of the tissue of interest, a 3D cut surface, or any other cut geometry. In one example implementation, the computer system can then serve the patient specific virtual tissue model to the surgeon and then interface with the surgeon to locate a surgical jog model, a virtual cut plane, a cutting tool trajectory, and/or any other virtual surgical object relative to these discrete tissues to define a surgical plan for the upcoming surgery. See paragraph [0035]); and
determining the one or more detected changes correspond to a threshold amount of difference away from a proper alignment of the physical instrument, the proper alignment representing alignment of the physical instrument with a virtual trajectory towards a target point ( The virtual surgical guide can be oriented from a perspective of the surgeon viewing a real human feature-a tissue of interest-within the surgical field environment. The computer system can present these AR frames to the surgeon through an AR device substantially in real-time, thereby guiding placement of a real surgical guide in the real surgical field and, thus, guiding the surgeon's application of real tools within the real surgical environment with virtual AR objects cooperating with real surgical guides, jigs, and fixtures within the real surgical field. See paragraph [0018]) ( In one implementation of the second method S200 shown in FIGS. 6A and 6B, in response to the real location of the surgical tool within a threshold distance from the target real cut trajectory, the computing system can render a cut approval graphic in the display in Block S292; and, in response to a real location of the surgical tool outside the threshold distance from the target real cut trajectory, rendering a guide frame indicating a direction and a distance of the surgical tool from the target real cut trajectory in Block S294. See paragraph [0128]) ( For example, the virtual cut trajectory can be a cut plane, traversing the tissue of interest, a drilled bore aligned with an axis of the tissue of interest, a 3D cut surface, or any other cut geometry. In one example implementation, the computer system can then serve the patient specific virtual tissue model to the surgeon and then interface with the surgeon to locate a surgical jog model, a virtual cut plane, a cutting tool trajectory, and/or any other virtual surgical object relative to these discrete tissues to define a surgical plan for the upcoming surgery. See paragraph [0035]) (Current position of surgical tool, see figures 6A and 6B)(Tool off of Target Cut Trajectory inside threshold); and
triggering a first visual characteristic of the dynamic navigation guide in response to determining the one or more detected changes correspond to the threshold amount of difference away from the proper alignment (Go, see figure 6B, S292, tool location is off of trajectory, but within threshold from trajectory).
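For clarity of the above mapping only, the threshold comparison of Amanatullah's paragraph [0128] (Blocks S292 and S294) may be summarized in the following illustrative sketch. The sketch is the examiner's own shorthand; all identifiers (e.g., select_guidance_graphic) are hypothetical and appear in neither the claims nor the reference.

```python
import numpy as np

def select_guidance_graphic(tool_position, target_point, threshold):
    """Choose a guidance graphic per the logic of paragraph [0128]:
    render a cut approval ("GO") graphic when the tool is within the
    threshold distance of the target cut trajectory, otherwise render a
    guide frame indicating direction and distance back to the target."""
    offset = np.asarray(tool_position, dtype=float) - np.asarray(target_point, dtype=float)
    distance = float(np.linalg.norm(offset))
    if distance <= threshold:
        return {"graphic": "GO"}  # cut approval graphic, Block S292
    return {
        "graphic": "GUIDE_FRAME",  # guide frame, Block S294
        "direction": (offset / distance).tolist(),
        "distance": distance,
    }

# Example with arbitrary numbers: the tool is 0.2 units from the target,
# within a 0.5-unit threshold, so the "GO" graphic is selected.
print(select_guidance_graphic([1.0, 2.0, 3.2], [1.0, 2.0, 3.0], threshold=0.5))
```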
Regarding claim 2, Amanatullah teaches The computer-implemented method of claim 1, wherein the threshold amount of difference corresponds to a buffer zone between both the proper alignment of the physical instrument with the virtual trajectory and a misalignment of the physical instrument with the virtual trajectory (See figures 6A and 6B, threshold offset from target, target cut trajectory).
Regarding claim 3, Amanatullah teaches The computer-implemented method of claim 2, wherein the buffer zone is defined as a portion of an alignment zone; wherein a first boundary of the alignment zone represents alignment between a respective physical instrument and a virtual trajectory path and a second boundary of the alignment zone represents misalignment between a respective physical instrument and a virtual trajectory path; wherein a first boundary of the buffer zone is within the alignment zone and different than the first boundary of the alignment zone; and wherein a second boundary of the buffer zone is the same as the second boundary of the alignment zone (See figure 6B; as currently drafted, the claim recites arbitrary regions within regions that are not tied to any distinct display operation, and such regions therefore read on arbitrary regions within figure 6B because they do not change the operation of the display).
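To illustrate the examiner's reading of the claimed zone geometry, the following hypothetical sketch parameterizes an alignment zone and a buffer zone by inner and outer angular boundaries, where the buffer zone's first boundary lies inside the alignment zone and its second boundary is shared with the alignment zone. The numeric values and identifiers are arbitrary examples, not disclosures of the reference.

```python
from dataclasses import dataclass

@dataclass
class Zone:
    inner_deg: float  # first boundary (angular distance from the trajectory)
    outer_deg: float  # second boundary

    def contains(self, angular_distance_deg: float) -> bool:
        return self.inner_deg <= angular_distance_deg <= self.outer_deg

# Alignment zone: first boundary represents alignment with the trajectory,
# second boundary represents misalignment (values are arbitrary).
alignment_zone = Zone(inner_deg=0.0, outer_deg=5.0)

# Buffer zone: a portion of the alignment zone; its first boundary sits inside
# the alignment zone, and its second boundary is shared with the alignment zone.
buffer_zone = Zone(inner_deg=3.0, outer_deg=alignment_zone.outer_deg)

assert buffer_zone.outer_deg == alignment_zone.outer_deg   # shared second boundary
assert alignment_zone.inner_deg < buffer_zone.inner_deg    # distinct first boundaries
```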
Regarding claim 4, Amanatullah teaches The computer-implemented method of claim 2, further comprising: maintaining display of the first visual characteristic of the dynamic navigation guide while one or more subsequent detected angular distances of the physical instrument continue to fall within boundaries of the buffer zone, wherein a respective angular distance comprises an angular difference between one or more current axis of the physical instrument and one or more of a current axis of the virtual trajectory’s path (See figure 6B; the current position of the surgical tool falls within the threshold offset from the target, and “GO” remains displayed).
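As an illustration of the recited "angular distance" (the angular difference between the instrument's current axis and the axis of the virtual trajectory's path), the following hypothetical sketch computes the angle between two axis vectors. The identifiers are the examiner's own and are not taken from the reference.

```python
import numpy as np

def angular_distance_deg(instrument_axis, trajectory_axis):
    """Angle, in degrees, between the instrument's current axis and the
    virtual trajectory's axis, computed from the normalized dot product."""
    a = np.asarray(instrument_axis, dtype=float)
    b = np.asarray(trajectory_axis, dtype=float)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

# Example: an instrument axis tilted slightly off a vertical trajectory axis.
print(angular_distance_deg([0.05, 0.0, 1.0], [0.0, 0.0, 1.0]))  # roughly 2.9 degrees
```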
Regarding claim 5, Amanatullah teaches The computer-implemented method of claim 2, further comprising: maintaining display of the first visual characteristic of the dynamic navigation guide while one or more subsequent detected positions of the physical instrument continue to fall within boundaries of the buffer zone, the respective subsequent detected positions representing a distance between a tip of the physical instrument and the virtual trajectory’s path (See figure 6B; the current position of the surgical tool falls within the threshold offset from the target, and “GO” remains displayed).
Regarding claim 6, Amanatullah teaches The computer-implemented method of claim 1, wherein triggering the first visual characteristic of the dynamic navigation guide comprises: changing at least a portion of the dynamic navigation guide according to the first visual characteristic (Current tool position changes and “GO” is displayed when tool moves to within threshold offset from target cut trajectory).
Regarding claim 7, Amanatullah teaches The computer-implemented method of claim 1, further comprising: terminating display of the first visual characteristic in response to determining one or more subsequent changes in at least one of: (a) the instrument angular distance and the (b) instrument position of the physical instrument exceeds a shared boundary of a buffer zone and an alignment zone (See figure 6A, “NO GO”, GO no longer displayed).
Regarding claim 8, Amanatullah teaches The computer-implemented method of claim 7, further comprising: triggering display of a second visual characteristic in response to determining one or more subsequent changes in at least one of: (a) the instrument angular distance and the (b) instrument position of the physical instrument exceeds the shared boundary of a buffer zone and an alignment zone (See figure 6A, “NO GO”).
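Taken together, claims 4-8 recite maintaining the first visual characteristic while subsequent readings remain within the buffer zone, and terminating it (and triggering a second characteristic) once a reading exceeds the boundary shared by the buffer zone and the alignment zone. The following hypothetical sketch illustrates that behavior as mapped to the "GO"/"NO GO" graphics of figures 6A and 6B; all identifiers and numeric values are illustrative only and do not come from the claims or the reference.

```python
def update_guide(angular_distance_deg, buffer_inner_deg=3.0, buffer_outer_deg=5.0):
    """Select the displayed characteristic based on where the detected
    angular distance falls relative to the buffer zone boundaries."""
    if angular_distance_deg > buffer_outer_deg:
        # Exceeds the boundary shared by the buffer zone and the alignment zone:
        # terminate the first visual characteristic and trigger the second one
        # (mapped to "NO GO" in figure 6A; claims 7 and 8).
        return "NO GO"
    if angular_distance_deg >= buffer_inner_deg:
        # Subsequent readings still fall within the buffer zone boundaries:
        # maintain the first visual characteristic (mapped to "GO" in figure 6B;
        # claims 4 and 5).
        return "GO"
    # Inside the alignment zone but short of the buffer zone; claims 4-8 do not
    # dictate a different behavior here, so the first characteristic is kept.
    return "GO"

for reading in (2.0, 4.5, 5.5):
    print(reading, update_guide(reading))
```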
Regarding claim 9, Amanatullah teaches A system comprising one or more processors, and a non-transitory computer-readable medium including one or more sequences of instructions that, when executed by the one or more processors, cause the system to perform operations (Generally, a computer system can execute Blocks of the first method S100: to generate a sequence of augmented reality ("AR") frames containing a virtual surgical guide and depicting a surgeon's field of view (a surgical field) or a selected perspective through an AR headset, AR glasses, another AR device, and/or a display (in the surgical field or remote from the surgical field). See paragraph [0018])(The computer system can present these AR frames to the surgeon through an AR device substantially in real-time, thereby guiding placement of a real surgical guide in the real surgical field and, thus, guiding the surgeon's application of real tools within the real surgical environment with virtual AR objects cooperating with real surgical guides, jigs, and fixtures within the real surgical field. See paragraph [0018])(The computer systems and methods described herein can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions can be executed by computer-executable components integrated with the application, applet, host, server, network, website, communication service, communication interface, hardware/firmware/software elements of a user computer or mobile device, wristband, smartphone, or any suitable combination thereof. See paragraph [0160]) comprising:
rendering, in an Augmented Reality (AR) display, a dynamic navigation guide virtual object (“dynamic navigation guide”) (As shown in FIGS. 5, 6A, 6B, and 8, a second method S200 for augmenting a surgical field with virtual guidance content includes, during the surgical operation on a tissue of interest of a patient: at a first time, accessing an image of a surgical field captured by a sensor coupled to a computing device in the surgical field in Block S210; detecting the tissue of interest in the image in Block S220; accessing a virtual model of a surgical implant corresponding to the tissue of interest in Block S230; aligning a generic virtual anatomical model with the tissue of interest in the image to define a custom virtual anatomical model in Block S240; locating the virtual model of the surgical implant within the custom virtual anatomical model in Block S250; defining a virtual cut trajectory along a boundary of an intersection between the virtual model of the surgical implant and the custom virtual anatomical model of the tissue of interest in Block S260; defining a target real cut trajectory of a surgical tool in the surgical field based on the virtual cut trajectory in Block S270; generating a frame depicting the target real cut trajectory in Block S280; and, at approximately the first time, publishing, to a display, the frame depicting the target real cut trajectory in Block S290. See paragraph [0084])(In one implementation of the second method S200 shown in FIGS. 6A and 6B, in response to the real location of the surgical tool within a threshold distance from the target real cut trajectory, the computing system can render a cut approval graphic in the display in Block S292; and, in response to a real location of the surgical tool outside the threshold distance from the target real cut trajectory, rendering a guide frame indicating a direction and a distance of the surgical tool from the target real cut trajectory in Block S294. See paragraph [0128])(See figures 6A and 6B)(The virtual surgical guide can be oriented from a perspective of the surgeon viewing a real human feature-a tissue of interest-within the surgical field environment. The computer system can present these AR frames to the surgeon through an AR device substantially in real-time, thereby guiding placement of a real surgical guide in the real surgical field and, thus, guiding the surgeon's application of real tools within the real surgical environment with virtual AR objects cooperating with real surgical guides, jigs, and fixtures within the real surgical field. See paragraph [0018]);
detecting one or more changes in at least one of an instrument angular distance and an instrument position of a physical instrument in a unified three-dimensional (3D) coordinate space ( In one implementation of the second method S200 shown in FIGS. 6A and 6B, in response to the real location of the surgical tool within a threshold distance from the target real cut trajectory, the computing system can render a cut approval graphic in the display in Block S292; and, in response to a real location of the surgical tool outside the threshold distance from the target real cut trajectory, rendering a guide frame indicating a direction and a distance of the surgical tool from the target real cut trajectory in Block S294. See paragraph [0128]) (Current position of surgical tool changes, see figures 6A and 6B)( For example, the virtual cut trajectory can be a cut plane, traversing the tissue of interest, a drilled bore aligned with an axis of the tissue of interest, a 3D cut surface, or any other cut geometry. In one example implementation, the computer system can then serve the patient specific virtual tissue model to the surgeon and then interface with the surgeon to locate a surgical jog model, a virtual cut plane, a cutting tool trajectory, and/or any other virtual surgical object relative to these discrete tissues to define a surgical plan for the upcoming surgery. See paragraph [0035]); and
determining the one or more detected changes correspond to a threshold amount of difference away from a proper alignment of the physical instrument, the proper alignment representing alignment of the physical instrument with a virtual trajectory towards a target point ( The virtual surgical guide can be oriented from a perspective of the surgeon viewing a real human feature-a tissue of interest-within the surgical field environment. The computer system can present these AR frames to the surgeon through an AR device substantially in real-time, thereby guiding placement of a real surgical guide in the real surgical field and, thus, guiding the surgeon's application of real tools within the real surgical environment with virtual AR objects cooperating with real surgical guides, jigs, and fixtures within the real surgical field. See paragraph [0018]) ( In one implementation of the second method S200 shown in FIGS. 6A and 6B, in response to the real location of the surgical tool within a threshold distance from the target real cut trajectory, the computing system can render a cut approval graphic in the display in Block S292; and, in response to a real location of the surgical tool outside the threshold distance from the target real cut trajectory, rendering a guide frame indicating a direction and a distance of the surgical tool from the target real cut trajectory in Block S294. See paragraph [0128]) ( For example, the virtual cut trajectory can be a cut plane, traversing the tissue of interest, a drilled bore aligned with an axis of the tissue of interest, a 3D cut surface, or any other cut geometry. In one example implementation, the computer system can then serve the patient specific virtual tissue model to the surgeon and then interface with the surgeon to locate a surgical jog model, a virtual cut plane, a cutting tool trajectory, and/or any other virtual surgical object relative to these discrete tissues to define a surgical plan for the upcoming surgery. See paragraph [0035]) (Current position of surgical tool, see figures 6A and 6B)(Tool off of Target Cut Trajectory inside threshold); and
triggering a first visual characteristic of the dynamic navigation guide in response to determining the one or more detected changes correspond to the threshold amount of difference away from the proper alignment (Go, see figure 6B, S292, tool location is off of trajectory, but within threshold from trajectory).
Regarding claim 10, Amanatullah teaches The system of claim 9, wherein the threshold amount of difference corresponds to a buffer zone between both the proper alignment of the physical instrument with the virtual trajectory and a misalignment of the physical instrument with the virtual trajectory (See figures 6A and 6B, threshold offset from target, target cut trajectory).
Regarding claim 11, Amanatullah teaches The system of claim 10, wherein the buffer zone is defined as a portion of an alignment zone; wherein a first boundary of the alignment zone represents alignment between a respective physical instrument and a virtual trajectory path and a second boundary of the alignment zone represents misalignment between a respective physical instrument and a virtual trajectory path; wherein a first boundary of the buffer zone is within the alignment zone and different than the first boundary of the alignment zone; and wherein a second boundary of the buffer zone is the same as the second boundary of the alignment zone (See figure 6B; as currently drafted, the claim recites arbitrary regions within regions that are not tied to any distinct display operation, and such regions therefore read on arbitrary regions within figure 6B because they do not change the operation of the display).
Regarding claim 12, Amanatullah teaches The system of claim 10, further comprising: maintaining display of the first visual characteristic of the dynamic navigation guide while one or more subsequent detected angular distances of the physical instrument continue to fall within boundaries of the buffer zone (See figure 6B; the current position of the surgical tool falls within the threshold offset from the target, and “GO” remains displayed).
Regarding claim 13, Amanatullah teaches The system of claim 10, further comprising: maintaining display of the first visual characteristic of the dynamic navigation guide while one or more subsequent detected positions of the physical instrument continue to fall within boundaries of the buffer zone, the respective subsequent detected positions representing a distance between a tip of the physical instrument and the virtual trajectory’s path (See figure 6B; the current position of the surgical tool falls within the threshold offset from the target, and “GO” remains displayed).
Regarding claim 14, Amanatullah teaches The system of claim 9, wherein triggering the first visual characteristic of the dynamic navigation guide comprises: changing at least a portion of the dynamic navigation guide according to the first visual characteristic (Current tool position changes and “GO” is displayed when tool moves to within threshold offset from target cut trajectory).
Regarding claim 15, Amanatullah teaches The system of claim 9, further comprising: terminating display of the first visual characteristic in response to determining one or more subsequent changes in at least one of: (a) the instrument angular distance and the (b) instrument position of the physical instrument exceeds a shared boundary of a buffer zone and an alignment zone (See figure 6A, “NO GO”, GO no longer displayed).
Regarding claim 16, Amanatullah teaches The system of claim 15, further comprising: triggering display of a second visual characteristic in response to determining one or more subsequent changes in at least one of: (a) the instrument angular distance and the (b) instrument position of the physical instrument exceeds the shared boundary of a buffer zone and an alignment zone (See figure 6A, “NO GO”).
Regarding claim 17, Amanatullah teaches A computer program product comprising a non-transitory computer-readable medium having a computer-readable program code embodied therein to be executed by one or more processors, the program code including instructions (Generally, a computer system can execute Blocks of the first method S100: to generate a sequence of augmented reality ("AR") frames containing a virtual surgical guide and depicting a surgeon's field of view (a surgical field) or a selected perspective through an AR headset, AR glasses, another AR device, and/or a display (in the surgical field or remote from the surgical field). See paragraph [0018])(The computer system can present these AR frames to the surgeon through an AR device substantially in real-time, thereby guiding placement of a real surgical guide in the real surgical field and, thus, guiding the surgeon's application of real tools within the real surgical environment with virtual AR objects cooperating with real surgical guides, jigs, and fixtures within the real surgical field. See paragraph [0018])(The computer systems and methods described herein can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions can be executed by computer-executable components integrated with the application, applet, host, server, network, website, communication service, communication interface, hardware/firmware/software elements of a user computer or mobile device, wristband, smartphone, or any suitable combination thereof. See paragraph [0160]) to:
rendering, in an Augmented Reality (AR) display, a dynamic navigation guide virtual object (“dynamic navigation guide”) (As shown in FIGS. 5, 6A, 6B, and 8, a second method S200 for augmenting a surgical field with virtual guidance content includes, during the surgical operation on a tissue of interest of a patient: at a first time, accessing an image of a surgical field captured by a sensor coupled to a computing device in the surgical field in Block S210; detecting the tissue of interest in the image in Block S220; accessing a virtual model of a surgical implant corresponding to the tissue of interest in Block S230; aligning a generic virtual anatomical model with the tissue of interest in the image to define a custom virtual anatomical model in Block S240; locating the virtual model of the surgical implant within the custom virtual anatomical model in Block S250; defining a virtual cut trajectory along a boundary of an intersection between the virtual model of the surgical implant and the custom virtual anatomical model of the tissue of interest in Block S260; defining a target real cut trajectory of a surgical tool in the surgical field based on the virtual cut trajectory in Block S270; generating a frame depicting the target real cut trajectory in Block S280; and, at approximately the first time, publishing, to a display, the frame depicting the target real cut trajectory in Block S290. See paragraph [0084])(In one implementation of the second method S200 shown in FIGS. 6A and 6B, in response to the real location of the surgical tool within a threshold distance from the target real cut trajectory, the computing system can render a cut approval graphic in the display in Block S292; and, in response to a real location of the surgical tool outside the threshold distance from the target real cut trajectory, rendering a guide frame indicating a direction and a distance of the surgical tool from the target real cut trajectory in Block S294. See paragraph [0128])(See figures 6A and 6B)(The virtual surgical guide can be oriented from a perspective of the surgeon viewing a real human feature-a tissue of interest-within the surgical field environment. The computer system can present these AR frames to the surgeon through an AR device substantially in real-time, thereby guiding placement of a real surgical guide in the real surgical field and, thus, guiding the surgeon's application of real tools within the real surgical environment with virtual AR objects cooperating with real surgical guides, jigs, and fixtures within the real surgical field. See paragraph [0018]);
detecting one or more changes in at least one of an instrument angular distance and an instrument position of a physical instrument in a unified three-dimensional (3D) coordinate space ( In one implementation of the second method S200 shown in FIGS. 6A and 6B, in response to the real location of the surgical tool within a threshold distance from the target real cut trajectory, the computing system can render a cut approval graphic in the display in Block S292; and, in response to a real location of the surgical tool outside the threshold distance from the target real cut trajectory, rendering a guide frame indicating a direction and a distance of the surgical tool from the target real cut trajectory in Block S294. See paragraph [0128]) (Current position of surgical tool changes, see figures 6A and 6B)( For example, the virtual cut trajectory can be a cut plane, traversing the tissue of interest, a drilled bore aligned with an axis of the tissue of interest, a 3D cut surface, or any other cut geometry. In one example implementation, the computer system can then serve the patient specific virtual tissue model to the surgeon and then interface with the surgeon to locate a surgical jog model, a virtual cut plane, a cutting tool trajectory, and/or any other virtual surgical object relative to these discrete tissues to define a surgical plan for the upcoming surgery. See paragraph [0035]); and
determining the one or more detected changes correspond to a threshold amount of difference away from a proper alignment of the physical instrument, the proper alignment representing alignment of the physical instrument with a virtual trajectory towards a target point ( The virtual surgical guide can be oriented from a perspective of the surgeon viewing a real human feature-a tissue of interest-within the surgical field environment. The computer system can present these AR frames to the surgeon through an AR device substantially in real-time, thereby guiding placement of a real surgical guide in the real surgical field and, thus, guiding the surgeon's application of real tools within the real surgical environment with virtual AR objects cooperating with real surgical guides, jigs, and fixtures within the real surgical field. See paragraph [0018]) ( In one implementation of the second method S200 shown in FIGS. 6A and 6B, in response to the real location of the surgical tool within a threshold distance from the target real cut trajectory, the computing system can render a cut approval graphic in the display in Block S292; and, in response to a real location of the surgical tool outside the threshold distance from the target real cut trajectory, rendering a guide frame indicating a direction and a distance of the surgical tool from the target real cut trajectory in Block S294. See paragraph [0128]) ( For example, the virtual cut trajectory can be a cut plane, traversing the tissue of interest, a drilled bore aligned with an axis of the tissue of interest, a 3D cut surface, or any other cut geometry. In one example implementation, the computer system can then serve the patient specific virtual tissue model to the surgeon and then interface with the surgeon to locate a surgical jog model, a virtual cut plane, a cutting tool trajectory, and/or any other virtual surgical object relative to these discrete tissues to define a surgical plan for the upcoming surgery. See paragraph [0035]) (Current position of surgical tool, see figures 6A and 6B)(Tool off of Target Cut Trajectory inside threshold); and
triggering a first visual characteristic of the dynamic navigation guide in response to determining the one or more detected changes correspond to the threshold amount of difference away from the proper alignment (Go, see figure 6B, S292, tool location is off of trajectory, but within threshold from trajectory).
Regarding claim 18, Amanatullah teaches The system of claim 17, wherein the threshold amount of difference corresponds to a buffer zone between both the proper alignment of the physical instrument with the virtual trajectory and a misalignment of the physical instrument with the virtual trajectory (See figures 6A and 6B, threshold offset from target, target cut trajectory).
Regarding claim 19, Amanatullah teaches The system of claim 18, wherein the buffer zone is defined as a portion of an alignment zone; wherein a first boundary of the alignment zone represents alignment between a respective physical instrument and a virtual trajectory path and a second boundary of the alignment zone represents misalignment between a respective physical instrument and a virtual trajectory path; wherein a first boundary of the buffer zone is within the alignment zone and different than the first boundary of the alignment zone; and wherein a second boundary of the buffer zone is the same as the second boundary of the alignment zone (See figure 6B; as currently drafted, the claim recites arbitrary regions within regions that are not tied to any distinct display operation, and such regions therefore read on arbitrary regions within figure 6B because they do not change the operation of the display).
Regarding claim 20, Amanatullah teaches The system of claim 18, further comprising: maintaining display of the first visual characteristic of the dynamic navigation guide while one or more subsequent detected angular distances of the physical instrument continue to fall within boundaries of the buffer zone (See figure 6B; the current position of the surgical tool falls within the threshold offset from the target, and “GO” remains displayed).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NICHOLAS R WILSON whose telephone number is (571) 272-0936. The examiner can normally be reached M-F, 7:30 AM to 5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kee Tung, can be reached at (572) 272-7794. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/NICHOLAS R WILSON/Primary Examiner, Art Unit 2611