DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
The limitation “based on the detected first and second coordinates, determining a distance from a tip of the first physical instrument to a predefined calibration location of the second physical instrument” in the independent claims is not included in the claims or specification of corresponding parent applications 18/208,136, 18/244,138, or 18/380,076, and thus the earliest priority date given in this application is 08/13/2024.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Regarding claim 1, it recites A computer-implemented method, comprising: detecting first coordinates for a first physical instrument that correspond to at least one pose of the first physical instrument in a unified three-dimensional (3D) coordinate space; detecting second coordinates for a second physical instrument that correspond to at least one pose of the second physical instrument in a unified 3D coordinate space; and based on the detected first and second coordinates, determining a distance from a tip of the first physical instrument to a predefined calibration location of the second physical instrument.
MPEP 2106, Section III, provides a flowchart for the subject matter eligibility test for products and processes. The analysis following the flowchart is as follows:
Step 1: Is the claim to a process, machine, manufacture or composition of matter?
Yes. It recites a method, which is a process.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. The claim recites an abstract idea.
detecting first coordinates
detecting second coordinates
determining a distance
Detecting first coordinates is merely a mental process of looking at a first object and mentally determining where it is in real space, performed in the human mind, as the detection techniques are not described and the limitation is merely a statement of detecting without limit as to how the detection is performed.
Detecting second coordinates is merely a mental process of looking at a second object and mentally determining where it is in real space, performed in the human mind, as the detection techniques are not described and the limitation is merely a statement of detecting without limit as to how the detection is performed.
Determining a distance can be considered a mental process of visualizing a distance between the two points in the human mind, as there is no description of how the specific determination is made or of the accuracy of the distance; thus, it could be a user mentally estimating an approximate distance between the two points.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No.
This judicial exception is not integrated into a practical application. In particular, the claim only recites the additional element of a computer, which merely detects and determines. The computer is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using generic computer components. Accordingly, this additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No.
These elements are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. See MPEP 2106.05(f). MPEP 2106.05(f) provides the following considerations for determining whether a claim simply recites a judicial exception with the words “apply it” (or an equivalent), such as mere instructions to implement an abstract idea on a computer: (1) whether the claim recites only the idea of a solution or outcome, i.e., the claim fails to recite details of how a solution to a problem is accomplished; (2) whether the claim invokes computers or other machinery merely as a tool to perform an existing process; and (3) the particularity or generality of the application of the judicial exception. In the instant claim, the use of the computer only presents the idea of a solution while failing to describe how the computer is used or structured to achieve the solution. Accordingly, these additional elements do not amount to significantly more than the judicial exception because they do not impose any meaningful limits on practicing the abstract idea.
Regarding claim 2, it recites The computer-implemented method of claim 1, wherein the detecting the first and the second coordinates occurs during a predefined range of time, which merely further defines the time of detecting in the abstract idea of detecting the first and second coordinates. A narrow abstract idea is still an abstract idea and therefore it does not amount to significantly more.
Regarding claim 3, it recites The computer-implemented method of claim 2, further comprising: initiating the predefined range of time in response to a request.
MPEP 2106, Section III, provides a flowchart for the subject matter eligibility test for products and processes. The analysis following the flowchart is as follows:
Step 1: Is the claim to a process, machine, manufacture or composition of matter?
Yes. It recites a method, which is a process.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. The claim recites an abstract idea.
Initiating the predefined range
Initiating the predefined range is merely a mental process of deciding that now is the time to determine the coordinates, performed in the human mind, as the initiation techniques are not described and the limitation is merely a statement of initiating without limit as to how the initiation is performed.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No.
This judicial exception is not integrated into a practical application. In particular, the claim only recites the additional element of a computer, which merely detects and determines. The computer is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using generic computer components. Accordingly, this additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No.
These elements are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. See MPEP 2106.05(f). MPEP 2106.05(f) provides the following considerations for determining whether a claim simply recites a judicial exception with the words “apply it” (or an equivalent), such as mere instructions to implement an abstract idea on a computer: (1) whether the claim recites only the idea of a solution or outcome, i.e., the claim fails to recite details of how a solution to a problem is accomplished; (2) whether the claim invokes computers or other machinery merely as a tool to perform an existing process; and (3) the particularity or generality of the application of the judicial exception. In the instant claim, the use of the computer only presents the idea of a solution while failing to describe how the computer is used or structured to achieve the solution. Accordingly, these additional elements do not amount to significantly more than the judicial exception because they do not impose any meaningful limits on practicing the abstract idea.
Regarding claim 4, it recites The computer-implemented method of claim 1, wherein the detecting the first and the second coordinates occurs concurrently, which merely further defines the time of detecting concurrently in the abstract idea of detecting the first and second coordinates. A narrow abstract idea is still an abstract idea and therefore it does not amount to significantly more.
Regarding claim 5, it recites The computer-implemented method of claim 1, wherein the detecting the first and the second coordinates comprises: detecting a first set of coordinates that correspond to successive poses of the first physical instrument; and detecting a second set of coordinates that correspond to successive poses of the second physical instrument.
MPEP 2106, Section III, provides a flowchart for the subject matter eligibility test for products and processes. The analysis following the flowchart is as follows:
Step 1: Is the claim to a process, machine, manufacture or composition of matter?
Yes. It recites a method, which is a process.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. The claim recites an abstract idea.
detecting a first set of coordinates
detecting a second set of coordinates
Detecting a first set of coordinates is merely a mental process of looking at a first object and mentally determining where it is in real space, performed in the human mind, as the detection techniques are not described and the limitation is merely a statement of detecting without limit as to how the detection is performed.
Detecting a second set of coordinates is merely a mental process of looking at a second object and mentally determining where it is in real space, performed in the human mind, as the detection techniques are not described and the limitation is merely a statement of detecting without limit as to how the detection is performed.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No.
This judicial exception is not integrated into a practical application. In particular, the claim only recites the additional element of a computer, which merely detects and determines. The computer is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using generic computer components. Accordingly, this additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No.
These elements are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. See MPEP 2106.05(f). MPEP 2106.05(f) provides the following considerations for determining whether a claim simply recites a judicial exception with the words “apply it” (or an equivalent), such as mere instructions to implement an abstract idea on a computer: (1) whether the claim recites only the idea of a solution or outcome, i.e., the claim fails to recite details of how a solution to a problem is accomplished; (2) whether the claim invokes computers or other machinery merely as a tool to perform an existing process; and (3) the particularity or generality of the application of the judicial exception. In the instant claim, the use of the computer only presents the idea of a solution while failing to describe how the computer is used or structured to achieve the solution. Accordingly, these additional elements do not amount to significantly more than the judicial exception because they do not impose any meaningful limits on practicing the abstract idea.
Regarding claim 6, it recites The computer-implemented method of claim 5, wherein the first and the second coordinates respectively comprise: coordinates of reference markers located on each of the first and the second physical instruments, which merely further defines the object the user is looking at in the abstract idea of detecting the first and second coordinates. A narrow abstract idea is still an abstract idea and therefore it does not amount to significantly more.
Regarding claim 7, it recites The computer-implemented method of claim 6, wherein the reference markers located on the second physical instrument are located at respective different locations than the calibration location, which merely further defines the object the user is looking at in the abstract idea of detecting the first and second coordinates. A narrow abstract idea is still an abstract idea and therefore it does not amount to significantly more.
Regarding claim 8, it recites A system comprising one or more processors, and a non-transitory computer-readable medium including one or more sequences of instructions that, when executed by the one or more processors, cause the system to perform operations comprising: detecting first coordinates for a first physical instrument that correspond to at least one pose of the first physical instrument in a unified three-dimensional (3D) coordinate space; detecting second coordinates for a second physical instrument that correspond to at least one pose of the second physical instrument in a unified 3D coordinate space; and based on the detected first and second coordinates, determining a distance from a tip of the first physical instrument to a predefined calibration location of the second physical instrument.
MPEP 2106, Section III, provides a flowchart for the subject matter eligibility test for products and processes. The analysis following the flowchart is as follows:
Step 1: Is the claim to a process, machine, manufacture or composition of matter?
Yes. It recites a system, which is a machine.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. The claim recites an abstract idea.
detecting first coordinates
detecting second coordinates
determining a distance
Detecting first coordinates is merely a mental process of looking at a first object and mentally determining where it is in real space, performed in the human mind, as the detection techniques are not described and the limitation is merely a statement of detecting without limit as to how the detection is performed.
Detecting second coordinates is merely a mental process of looking at a second object and mentally determining where it is in real space, performed in the human mind, as the detection techniques are not described and the limitation is merely a statement of detecting without limit as to how the detection is performed.
Determining a distance can be considered a mental process of visualizing a distance between the two points in the human mind, as there is no description of how the specific determination is made or of the accuracy of the distance; thus, it could be a user mentally estimating an approximate distance between the two points.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No.
This judicial exception is not integrated into a practical application. In particular, the claim only recites the additional elements of one or more processors and a non-transitory computer-readable medium, which merely detect and determine. The one or more processors and the non-transitory computer-readable medium are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No.
These elements are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. See MPEP 2106.05(f). MPEP 2106.05(f) provides the following considerations for determining whether a claim simply recites a judicial exception with the words “apply it” (or an equivalent), such as mere instructions to implement an abstract idea on a computer: (1) whether the claim recites only the idea of a solution or outcome, i.e., the claim fails to recite details of how a solution to a problem is accomplished; (2) whether the claim invokes computers or other machinery merely as a tool to perform an existing process; and (3) the particularity or generality of the application of the judicial exception. In the instant claim, the use of the one or more processors and the non-transitory computer-readable medium only presents the idea of a solution while failing to describe how the one or more processors and the non-transitory computer-readable medium are used or structured to achieve the solution. Accordingly, these additional elements do not amount to significantly more than the judicial exception because they do not impose any meaningful limits on practicing the abstract idea.
Regarding claim 9, it recites The system of claim 8, wherein the detecting the first and the second coordinates occurs during a predefined range of time, which merely further defines the time of detecting in the abstract idea of detecting the first and second coordinates. A narrow abstract idea is still an abstract idea and therefore it does not amount to significantly more.
Regarding claim 10, it recites The system of claim 9, further comprising: initiating the predefined range of time in response to a request.
MPEP 2106, Section III, provides a flowchart for the subject matter eligibility test for products and processes. The analysis following the flowchart is as follows:
Step 1: Is the claim to a process, machine, manufacture or composition of matter?
Yes. It recites a system, which is a machine.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. The claim recites an abstract idea.
Initiating the predefined range
Initiating the predefined range is merely a mental process of deciding that now is the time to determine the coordinates, performed in the human mind, as the initiation techniques are not described and the limitation is merely a statement of initiating without limit as to how the initiation is performed.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No.
This judicial exception is not integrated into a practical application. In particular, the claim only recites the additional elements of one or more processors and a non-transitory computer-readable medium, which merely detect and determine. The one or more processors and the non-transitory computer-readable medium are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No.
These elements are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. See MPEP 2106.05(f). MPEP 2106.05(f) provides the following considerations for determining whether a claim simply recites a judicial exception with the words “apply it” (or an equivalent), such as mere instructions to implement an abstract idea on a computer: (1) whether the claim recites only the idea of a solution or outcome, i.e., the claim fails to recite details of how a solution to a problem is accomplished; (2) whether the claim invokes computers or other machinery merely as a tool to perform an existing process; and (3) the particularity or generality of the application of the judicial exception. In the instant claim, the use of the one or more processors and the non-transitory computer-readable medium only presents the idea of a solution while failing to describe how the one or more processors and the non-transitory computer-readable medium are used or structured to achieve the solution. Accordingly, these additional elements do not amount to significantly more than the judicial exception because they do not impose any meaningful limits on practicing the abstract idea.
Regarding claim 11, it recites The system of claim 8, wherein the detecting the first and the second coordinates occurs concurrently, which merely further defines the time of detecting concurrently in the abstract idea of detecting the first and second coordinates. A narrow abstract idea is still an abstract idea and therefore it does not amount to significantly more.
Regarding claim 12, it recites The system of claim 8, wherein the detecting the first and the second coordinates comprises: detecting a first set of coordinates that correspond to successive poses of the first physical instrument; and detecting a second set of coordinates that correspond to successive poses of the second physical instrument.
MPEP 2106, Section III, provides a flowchart for the subject matter eligibility test for products and processes. The analysis following the flowchart is as follows:
Step 1: Is the claim to a process, machine, manufacture or composition of matter?
Yes. It recites a system, which is a machine.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. The claim recites an abstract idea.
detecting a first set of coordinates
detecting a second set of coordinates
Detecting a first set of coordinates is merely a mental process of looking at a first object and mentally determining where it is in real space, performed in the human mind, as the detection techniques are not described and the limitation is merely a statement of detecting without limit as to how the detection is performed.
Detecting a second set of coordinates is merely a mental process of looking at a second object and mentally determining where it is in real space, performed in the human mind, as the detection techniques are not described and the limitation is merely a statement of detecting without limit as to how the detection is performed.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No.
This judicial exception is not integrated into a practical application. In particular, the claim only recites the additional elements of one or more processors and a non-transitory computer-readable medium, which merely detect and determine. The one or more processors and the non-transitory computer-readable medium are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No.
These elements are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. See MPEP 2106.05(f). MPEP 2106.05(f) provides the following considerations for determining whether a claim simply recites a judicial exception with the words “apply it” (or an equivalent), such as mere instructions to implement an abstract idea on a computer: (1) whether the claim recites only the idea of a solution or outcome, i.e., the claim fails to recite details of how a solution to a problem is accomplished; (2) whether the claim invokes computers or other machinery merely as a tool to perform an existing process; and (3) the particularity or generality of the application of the judicial exception. In the instant claim, the use of the one or more processors and the non-transitory computer-readable medium only presents the idea of a solution while failing to describe how the one or more processors and the non-transitory computer-readable medium are used or structured to achieve the solution. Accordingly, these additional elements do not amount to significantly more than the judicial exception because they do not impose any meaningful limits on practicing the abstract idea.
Regarding claim 13, it recites The system of claim 12, wherein the first and the second coordinates respectively comprise: coordinates of reference markers located on each of the first and the second physical instruments, which merely further defines the object the user is looking at in the abstract idea of detecting the first and second coordinates. A narrow abstract idea is still an abstract idea and therefore it does not amount to significantly more.
Regarding claim 14, it recites The system of claim 13, wherein the reference markers located on the second physical instrument are located at respective different locations than the calibration location, which merely further defines the object the user is looking at in the abstract idea of detecting the first and second coordinates. A narrow abstract idea is still an abstract idea and therefore it does not amount to significantly more.
Regarding claim 15, it recites A computer program product comprising a non-transitory computer-readable medium having a computer-readable program code embodied therein to be executed by one or more processors, the program code including instructions for: detecting first coordinates for a first physical instrument that correspond to at least one pose of the first physical instrument in a unified three-dimensional (3D) coordinate space; detecting second coordinates for a second physical instrument that correspond to at least one pose of the second physical instrument in a unified 3D coordinate space; and based on the detected first and second coordinates, determining a distance from a tip of the first physical instrument to a predefined calibration location of the second physical instrument.
MPEP 2106, Section III, provides a flowchart for the subject matter eligibility test for products and processes. The analysis following the flowchart is as follows:
Step 1: Is the claim to a process, machine, manufacture or composition of matter?
Yes. It recites a computer program product comprising a non-transitory computer-readable medium, which is a manufacture.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. The claim recites an abstract idea.
detecting first coordinates
detecting second coordinates
determining a distance
Detecting first coordinates is merely a mental process of looking at a first object and mentally determining where it is in real space, performed in the human mind, as the detection techniques are not described and the limitation is merely a statement of detecting without limit as to how the detection is performed.
Detecting second coordinates is merely a mental process of looking at a second object and mentally determining where it is in real space, performed in the human mind, as the detection techniques are not described and the limitation is merely a statement of detecting without limit as to how the detection is performed.
Determining a distance can be considered a mental process of visualizing a distance between the two points in the human mind, as there is no description of how the specific determination is made or of the accuracy of the distance; thus, it could be a user mentally estimating an approximate distance between the two points.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No.
This judicial exception is not integrated into a practical application. In particular, the claim only recites the additional elements of one or more processors and a non-transitory computer-readable medium, which merely detect and determine. The one or more processors and the non-transitory computer-readable medium are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No.
These elements are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. See MPEP 2106.05(f). MPEP 2106.05(f) provides the following considerations for determining whether a claim simply recites a judicial exception with the words “apply it” (or an equivalent), such as mere instructions to implement an abstract idea on a computer: (1) whether the claim recites only the idea of a solution or outcome, i.e., the claim fails to recite details of how a solution to a problem is accomplished; (2) whether the claim invokes computers or other machinery merely as a tool to perform an existing process; and (3) the particularity or generality of the application of the judicial exception. In the instant claim, the use of the one or more processors and the non-transitory computer-readable medium only presents the idea of a solution while failing to describe how the one or more processors and the non-transitory computer-readable medium are used or structured to achieve the solution. Accordingly, these additional elements do not amount to significantly more than the judicial exception because they do not impose any meaningful limits on practicing the abstract idea.
Regarding claim 16, it recites The computer program product of claim 15, wherein the detecting the first and the second coordinates occurs during a predefined range of time, which merely further defines the time of detecting in the abstract idea of detecting the first and second coordinates. A narrow abstract idea is still an abstract idea and therefore it does not amount to significantly more.
Regarding claim 17, it recites The computer program product of claim 16, further comprising: initiating the predefined range of time in response to a request.
MPEP 2106, Section III, provides a flowchart for the subject matter eligibility test for products and processes. The analysis following the flowchart is as follows:
Step 1: Is the claim to a process, machine, manufacture or composition of matter?
Yes. It recites a computer program product comprising a non-transitory computer-readable medium, which is a manufacture.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. The claim recites an abstract idea.
Initiating the predefined range
Initiating the predefined range is merely a mental process of deciding that now is the time to determine the coordinates, performed in the human mind, as the initiation techniques are not described and the limitation is merely a statement of initiating without limit as to how the initiation is performed.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No.
This judicial exception is not integrated into a practical application. In particular, the claim only recites the additional elements of one or more processors and a non-transitory computer-readable medium, which merely detect and determine. The one or more processors and the non-transitory computer-readable medium are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No.
These elements are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using a generic computer component. See MPEP 2106.05(f). MPEP 2106.05(f) provides the following considerations for determining whether a claim simply recites a judicial exception with the words “apply it” (or an equivalent), such as mere instructions to implement an abstract idea on a computer: (1) whether the claim recites only the idea of a solution or outcome, i.e., the claim fails to recite details of how a solution to a problem is accomplished; (2) whether the claim invokes computers or other machinery merely as a tool to perform an existing process; and (3) the particularity or generality of the application of the judicial exception. In the instant claim, the use of one or more processors and a non-transitory computer-readable medium only presents the idea of a solution while failing to describe how the one or more processors and the non-transitory computer-readable medium are used or structured to achieve the solution. Accordingly, these additional elements, considered individually and in combination, do not amount to significantly more than the judicial exception.
Regarding claim 18, it recite(s) The computer program product of claim 15, wherein the detecting the first and the second coordinates occurs concurrently, which merely further defines the timing of the detecting in the abstract idea of detecting the first and second coordinates. A narrow abstract idea is still an abstract idea and therefore it does not amount to significantly more.
Regarding claim 19, it recite(s) The computer program product of claim 15, wherein the detecting the first and the second coordinates comprises: detecting a first set of coordinates that correspond to successive poses of the first physical instrument; and detecting a second set of coordinates that correspond to successive poses of the second physical instrument.
MPEP 2106 III provides a flowchart for the subject matter eligibility test for products and processes. The analysis following the flowchart is as follows:
Step 1: Is the claim to a process, machine, manufacture or composition of matter?
Yes. It recites a computer program product comprising a non-transitory computer-readable medium, which is a manufacture.
Step 2A, Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Yes. The claim recites an abstract idea.
detecting a first set of coordinates
detecting a second set of coordinates
Detecting a first set of coordinates is merely a mental process of looking at a first object and mentally determining where it is in real space in the human mind, as the detection techniques are not described; the claim merely states detecting without limiting how the detection is performed.
Detecting a second set of coordinates is merely a mental process of looking at a second object and mentally determining where it is in real space in the human mind, as the detection techniques are not described; the claim merely states detecting without limiting how the detection is performed.
Step 2A, Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
No.
This judicial exception is not integrated into a practical application. In particular, the claim only recites the additional elements of one or more processors and a non-transitory computer-readable medium, which merely detect and determine. The one or more processors and the non-transitory computer-readable medium are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
No.
These elements are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using a generic computer component. See MPEP 2106.05(f). MPEP 2106.05(f) provides the following considerations for determining whether a claim simply recites a judicial exception with the words “apply it” (or an equivalent), such as mere instructions to implement an abstract idea on a computer: (1) whether the claim recites only the idea of a solution or outcome, i.e., the claim fails to recite details of how a solution to a problem is accomplished; (2) whether the claim invokes computers or other machinery merely as a tool to perform an existing process; and (3) the particularity or generality of the application of the judicial exception. In the instant claim, the use of one or more processors and a non-transitory computer-readable medium only presents the idea of a solution while failing to describe how the one or more processors and the non-transitory computer-readable medium are used or structured to achieve the solution. Accordingly, these additional elements, considered individually and in combination, do not amount to significantly more than the judicial exception.
Regarding claim 20, it recite(s) The computer program product of claim 19, wherein the first and the second coordinates respectively comprise: coordinates of reference markers located on each of the first and the second physical instruments, the reference markers located on the second physical instrument are located at respective different locations than the calibration location, which merely further defines the object the user is looking at in the abstract idea of detecting the first and second coordinates. A narrow abstract idea is still an abstract idea and therefore it does not amount to significantly more.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA ) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1-20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Chen et al. (US 2023/0352133)(Hereinafter referred to as Chen) in view of Khare (US 2023/0240759)(Hereinafter referred to as Khare).
Regarding claim 1, Chen teaches A computer-implemented method (The one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. The distance between the first tool and the second tool may be measured based at least in part on a geometry of the first tool and the second tool. The distance between the first tool and the second tool may be measured based at least in part on a relative position or a relative orientation of a scope that is used to perform the one or more live surgical procedures. The method may further comprise detecting one or more edges of the first tool or the second tool to determine a position and an orientation of the first tool relative to the second tool. The method may further comprise determining a three-dimensional position of a tool tip of the first tool and a three-dimensional position of a tool tip of the second tool. See paragraph [0011])( As shown in FIG. 6B, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a three-dimensional (3D) position of the tool tips 412a and 412b relative to the scope 420. The position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 431 and 432 between the scope 420 and the one or more medical tools 410a and 410b. In some cases, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 433 between the tool tips 412a and 412b of the one or more medical tools 410a and 410b. FIG. 7 illustrates an augmented reality view of the surgical scene showing a tip-to-tip distance 433 between the one or more medical tools and tip-to-scope distances 431 and 432 between the scope and the one or more medical tools. 
The tip-to-tip distance 433 between the one or more medical tools and the tip-to-scope distances 431 and 432 between the scope and the one or more medical tools may be computed and/or updated in real-time as the surgical video of the surgical scene is being captured or obtained.
See paragraph [0182])( In some cases, the one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. In some cases, the distance between the first tool and the second tool may be measured based at least in part on a geometry (e.g., a size and/or a shape) of the first tool and the second tool See paragraph [0146]), comprising: detecting first coordinates for a first physical instrument that correspond to at least one pose of the first physical instrument in a unified three-dimensional (3D) coordinate space ( As shown in FIG. 6B, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a three-dimensional (3D) position of the tool tips 412a and 412b relative to the scope 420. The position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 431 and 432 between the scope 420 and the one or more medical tools 410a and 410b. In some cases, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 433 between the tool tips 412a and 412b of the one or more medical tools 410a and 410b. FIG. 7 illustrates an augmented reality view of the surgical scene showing a tip-to-tip distance 433 between the one or more medical tools and tip-to-scope distances 431 and 432 between the scope and the one or more medical tools. The tip-to-tip distance 433 between the one or more medical tools and the tip-to-scope distances 431 and 432 between the scope and the one or more medical tools may be computed and/or updated in real-time as the surgical video of the surgical scene is being captured or obtained.
See paragraph [0182]);
detecting second coordinates for a second physical instrument that correspond to at least one pose of the second physical instrument in a unified 3D coordinate space ( As shown in FIG. 6B, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a three-dimensional (3D) position of the tool tips 412a and 412b relative to the scope 420. The position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 431 and 432 between the scope 420 and the one or more medical tools 410a and 410b. In some cases, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 433 between the tool tips 412a and 412b of the one or more medical tools 410a and 410b. FIG. 7 illustrates an augmented reality view of the surgical scene showing a tip-to-tip distance 433 between the one or more medical tools and tip-to-scope distances 431 and 432 between the scope and the one or more medical tools. The tip-to-tip distance 433 between the one or more medical tools and the tip-to-scope distances 431 and 432 between the scope and the one or more medical tools may be computed and/or updated in real-time as the surgical video of the surgical scene is being captured or obtained. See paragraph [0182]), and based on the detected first and second coordinates, determining a distance from a tip of the first physical instrument to a location of the second physical instrument ( As shown in FIG. 6B, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a three-dimensional (3D) position of the tool tips 412a and 412b relative to the scope 420. 
The position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 431 and 432 between the scope 420 and the one or more medical tools 410a and 410b. In some cases, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 433 between the tool tips 412a and 412b of the one or more medical tools 410a and 410b. FIG. 7 illustrates an augmented reality view of the surgical scene showing a tip-to-tip distance 433 between the one or more medical tools and tip-to-scope distances 431 and 432 between the scope and the one or more medical tools. The tip-to-tip distance 433 between the one or more medical tools and the tip-to-scope distances 431 and 432 between the scope and the one or more medical tools may be computed and/or updated in real-time as the surgical video of the surgical scene is being captured or obtained. See paragraph [0182]) but is silent to a predefined calibration location.
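For illustration only (this sketch is not part of either cited reference), the tip-to-tip distance 433 that Chen computes from the estimated 3D tip positions reduces to a Euclidean distance between two points in a shared coordinate space; a minimal sketch with hypothetical coordinate values:

```python
import math

def tip_to_tip_distance(tip_a, tip_b):
    """Euclidean distance between two 3D tip positions given as (x, y, z) tuples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(tip_a, tip_b)))

# Hypothetical tip positions in a shared 3D coordinate space (units arbitrary).
d = tip_to_tip_distance((0.0, 0.0, 0.0), (3.0, 4.0, 12.0))  # 13.0
```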
Khare teaches a calibration process for surgical tools that utilizes touching fiducial markers and provides the surgeon with a step-by-step calibration process (The calibration process may comprise, for example, touching the tip of the tool 310 at a predetermined spot such that the position of the fiducial markers 202 attached to the handheld device 105B are visible to the tracking system 115. The software may provide instructions to the surgeon as to the exact steps need to calibrate each module and may further provide a verification of the successful completion of the calibration. At 416, it is determined whether all modules which been prepared in anticipation of the surgical procedure have been calibrated. For example, the software may inquire to the surgeon if there are more modules requiring calibration. If there are more modules, control returns to box 413 where the software may request that the surgeon calibrate the next module. The process is repeated until, at 416, all modules have been calibrated, at which time the calibration process exits at 417. See paragraph [0062]).
Chen and Khare both teach the use of augmented reality with medical devices, and Khare teaches that the system can provide a sequence of steps for the user to follow to achieve a proper calibration. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the system of Chen with the calibration techniques of Khare such that the system could provide the user with accurate calibration and positioning for surgical procedures.
Regarding claim 2, Chen in view of Khare teaches The computer-implemented method of claim 1, wherein the detecting the first and the second coordinates occurs during a predefined range of time (Chen; In some cases, the one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. See paragraph [0146]) (Coordinates are detected in real time while the tools are in use, which is considered the predefined range of time, as the distance is based on the current position).
Regarding claim 3, Chen in view of Khare teaches The computer-implemented method of claim 2, further comprising: initiating the predefined range of time in response to a request (Chen; The one or more trained medical models may be configured to (i) receive a set of inputs corresponding to the one or more live surgical procedures or one or more surgical subjects of the one or more live surgical procedures and (ii) implement or perform one or more surgical applications, based at least in part on the set of inputs, to enhance a medical operator's ability to perform the one or more live surgical procedures. See paragraph [0128])(Chen; In some cases, the one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. See paragraph [0146]).
Regarding claim 4, Chen in view of Khare teaches The computer-implemented method of claim 1, wherein the detecting the first and the second coordinates occurs concurrently (Chen; In some cases, the one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. See paragraph [0146]) (Chen; It is clear that the detection occurs concurrently as it is in real-time and needs both points to determine the distance.) ( As shown in FIG. 6B, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a three-dimensional (3D) position of the tool tips 412a and 412b relative to the scope 420. The position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 431 and 432 between the scope 420 and the one or more medical tools 410a and 410b. In some cases, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 433 between the tool tips 412a and 412b of the one or more medical tools 410a and 410b. FIG. 7 illustrates an augmented reality view of the surgical scene showing a tip-to-tip distance 433 between the one or more medical tools and tip-to-scope distances 431 and 432 between the scope and the one or more medical tools. The tip-to-tip distance 433 between the one or more medical tools and the tip-to-scope distances 431 and 432 between the scope and the one or more medical tools may be computed and/or updated in real-time as the surgical video of the surgical scene is being captured or obtained. See paragraph [0182]).
Regarding claim 5, Chen in view of Khare teaches The computer-implemented method of claim 1, wherein the detecting the first and the second coordinates comprises: detecting a first set of coordinates that correspond to successive poses of the first physical instrument; and detecting a second set of coordinates that correspond to successive poses of the second physical instrument (Chen; In some cases, the one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. See paragraph [0146]).
Regarding claim 6, Chen in view of Khare teaches The computer-implemented method of claim 5, wherein the first and the second coordinates respectively comprise: coordinates of reference markers located on each of the first and the second physical instruments (Khare; The calibration process may comprise, for example, touching the tip of the tool 310 at a predetermined spot such that the position of the fiducial markers 202 attached to the handheld device 105B are visible to the tracking system 115. The software may provide instructions to the surgeon as to the exact steps need to calibrate each module and may further provide a verification of the successful completion of the calibration. At 416, it is determined whether all modules which been prepared in anticipation of the surgical procedure have been calibrated. See paragraph [0062]) (Khare; The position and orientation of the handheld device 105B and, more particularly, the position of the tip of the tool 206 held by the handheld device, can be tracked using, for example, an array of fiducial markers 202 attached to the handheld device 105B and tracked by one or more cameras in the surgical theater. See paragraph [0004]) (Khare; In some examples, certain markers, such as fiducial markers that identify individuals, important tools, or bones in the theater may include passive or active identifiers that can be picked up by a camera or camera array associated with the tracking system. For example, an IR LED can flash a pattern that conveys a unique identifier to the source of that pattern, providing a dynamic identification mark. Similarly, one or two dimensional optical codes (barcode, QR code, etc.) can be affixed to objects in the theater to provide passive identification that can occur based on image analysis. If these codes are placed asymmetrically on an object, they can also be used to determine an orientation of an object by comparing the location of the identifier with the extents of an object in an image. 
For example, a QR code may be placed in a corner of a tool tray, allowing the orientation and identity of that tray to be tracked. Other tracking modalities are explained throughout. For example, in some examples, augmented reality headsets can be worn by surgeons and other staff to provide additional camera angles and tracking capabilities. See paragraph [0046]).
Regarding claim 7, Chen in view of Khare teaches The computer-implemented method of claim 6, wherein the reference markers located on the second physical instrument are located at respective different locations than the calibration location (Khare; The calibration process may comprise, for example, touching the tip of the tool 310 at a predetermined spot such that the position of the fiducial markers 202 attached to the handheld device 105B are visible to the tracking system 115. The software may provide instructions to the surgeon as to the exact steps need to calibrate each module and may further provide a verification of the successful completion of the calibration. At 416, it is determined whether all modules which been prepared in anticipation of the surgical procedure have been calibrated. See paragraph [0062]) (Khare; The position and orientation of the handheld device 105B and, more particularly, the position of the tip of the tool 206 held by the handheld device, can be tracked using, for example, an array of fiducial markers 202 attached to the handheld device 105B and tracked by one or more cameras in the surgical theater. See paragraph [0004])(Khare; In some examples, certain markers, such as fiducial markers that identify individuals, important tools, or bones in the theater may include passive or active identifiers that can be picked up by a camera or camera array associated with the tracking system. For example, an IR LED can flash a pattern that conveys a unique identifier to the source of that pattern, providing a dynamic identification mark. Similarly, one or two dimensional optical codes (barcode, QR code, etc.) can be affixed to objects in the theater to provide passive identification that can occur based on image analysis. If these codes are placed asymmetrically on an object, they can also be used to determine an orientation of an object by comparing the location of the identifier with the extents of an object in an image. 
For example, a QR code may be placed in a corner of a tool tray, allowing the orientation and identity of that tray to be tracked. Other tracking modalities are explained throughout. For example, in some examples, augmented reality headsets can be worn by surgeons and other staff to provide additional camera angles and tracking capabilities. See paragraph [0046]).
Regarding claim 8, Chen teaches A system comprising one or more processors, and a non-transitory computer-readable medium including one or more sequences of instructions that, when executed by the one or more processors, cause the system to perform operations (The computer system 2001 may include a central processing unit (CPU, also “processor” and “computer processor” herein) 2005, which can be a single core or multi core processor, or a plurality of processors for parallel processing. The computer system 2001 also includes memory or memory location 2010 (e.g., random-access memory, read-only memory, flash memory), electronic storage unit 2015 (e.g., hard disk), communication interface 2020 (e.g., network adapter) for communicating with one or more other systems, and peripheral devices 2025, such as cache, other memory, data storage and/or electronic display adapters See paragraph [0189])( The CPU 2005 can execute a sequence of machine-readable instructions, which can be embodied in a program or software. The instructions may be stored in a memory location, such as the memory 2010. The instructions can be directed to the CPU 2005, which can subsequently program or otherwise configure the CPU 2005 to implement methods of the present disclosure. Examples of operations performed by the CPU 2005 can include fetch, decode, execute, and writeback. See paragraph [0190]) comprising:
detecting first coordinates for a first physical instrument that correspond to at least one pose of the first physical instrument in a unified three-dimensional (3D) coordinate space ( As shown in FIG. 6B, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a three-dimensional (3D) position of the tool tips 412a and 412b relative to the scope 420. The position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 431 and 432 between the scope 420 and the one or more medical tools 410a and 410b. In some cases, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 433 between the tool tips 412a and 412b of the one or more medical tools 410a and 410b. FIG. 7 illustrates an augmented reality view of the surgical scene showing a tip-to-tip distance 433 between the one or more medical tools and tip-to-scope distances 431 and 432 between the scope and the one or more medical tools. The tip-to-tip distance 433 between the one or more medical tools and the tip-to-scope distances 431 and 432 between the scope and the one or more medical tools may be computed and/or updated in real-time as the surgical video of the surgical scene is being captured or obtained. See paragraph [0182]); detecting second coordinates for a second physical instrument that correspond to at least one pose of the second physical instrument in a unified 3D coordinate space ( As shown in FIG. 6B, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a three-dimensional (3D) position of the tool tips 412a and 412b relative to the scope 420. 
The position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 431 and 432 between the scope 420 and the one or more medical tools 410a and 410b. In some cases, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 433 between the tool tips 412a and 412b of the one or more medical tools 410a and 410b. FIG. 7 illustrates an augmented reality view of the surgical scene showing a tip-to-tip distance 433 between the one or more medical tools and tip-to-scope distances 431 and 432 between the scope and the one or more medical tools. The tip-to-tip distance 433 between the one or more medical tools and the tip-to-scope distances 431 and 432 between the scope and the one or more medical tools may be computed and/or updated in real-time as the surgical video of the surgical scene is being captured or obtained. See paragraph [0182]), and based on the detected first and second coordinates, determining a distance from a tip of the first physical instrument to a location of the second physical instrument ( As shown in FIG. 6B, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a three-dimensional (3D) position of the tool tips 412a and 412b relative to the scope 420. The position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 431 and 432 between the scope 420 and the one or more medical tools 410a and 410b. 
In some cases, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 433 between the tool tips 412a and 412b of the one or more medical tools 410a and 410b. FIG. 7 illustrates an augmented reality view of the surgical scene showing a tip-to-tip distance 433 between the one or more medical tools and tip-to-scope distances 431 and 432 between the scope and the one or more medical tools. The tip-to-tip distance 433 between the one or more medical tools and the tip-to-scope distances 431 and 432 between the scope and the one or more medical tools may be computed and/or updated in real-time as the surgical video of the surgical scene is being captured or obtained. See paragraph [0182]), but is silent to a predefined calibration location.
Khare teaches a calibration process for surgical tools that utilizes touching fiducial markers and provides the surgeon with a step-by-step calibration process (The calibration process may comprise, for example, touching the tip of the tool 310 at a predetermined spot such that the position of the fiducial markers 202 attached to the handheld device 105B are visible to the tracking system 115. The software may provide instructions to the surgeon as to the exact steps need to calibrate each module and may further provide a verification of the successful completion of the calibration. At 416, it is determined whether all modules which been prepared in anticipation of the surgical procedure have been calibrated. For example, the software may inquire to the surgeon if there are more modules requiring calibration. If there are more modules, control returns to box 413 where the software may request that the surgeon calibrate the next module. The process is repeated until, at 416, all modules have been calibrated, at which time the calibration process exits at 417. See paragraph [0062]).
Chen and Khare both teach the use of augmented reality with medical devices, and Khare teaches that the system can provide a sequence of steps for the user to follow to achieve a proper calibration. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the system of Chen with the calibration techniques of Khare such that the system could provide the user with accurate calibration and positioning for surgical procedures.
Regarding claim 9, Chen in view of Khare teaches The system of claim 8, wherein the detecting the first and the second coordinates occurs during a predefined range of time (Chen; In some cases, the one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. See paragraph [0146]) (Coordinates are detected in real time while the tools are in use, which is considered the predefined range of time, as the distance is based on the current position).
Regarding claim 10, Chen in view of Khare teaches The system of claim 9, further comprising: initiating the predefined range of time in response to a request (Chen; The one or more trained medical models may be configured to (i) receive a set of inputs corresponding to the one or more live surgical procedures or one or more surgical subjects of the one or more live surgical procedures and (ii) implement or perform one or more surgical applications, based at least in part on the set of inputs, to enhance a medical operator's ability to perform the one or more live surgical procedures. See paragraph [0128])(Chen; In some cases, the one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. See paragraph [0146]).
Regarding claim 11, Chen in view of Khare teaches The system of claim 8, wherein the detecting the first and the second coordinates occurs concurrently (Chen; In some cases, the one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. See paragraph [0146]) (Chen; It is clear that the detection occurs concurrently as it is in real-time and needs both points to determine the distance.) ( As shown in FIG. 6B, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a three-dimensional (3D) position of the tool tips 412a and 412b relative to the scope 420. The position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 431 and 432 between the scope 420 and the one or more medical tools 410a and 410b. In some cases, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 433 between the tool tips 412a and 412b of the one or more medical tools 410a and 410b. FIG. 7 illustrates an augmented reality view of the surgical scene showing a tip-to-tip distance 433 between the one or more medical tools and tip-to-scope distances 431 and 432 between the scope and the one or more medical tools. The tip-to-tip distance 433 between the one or more medical tools and the tip-to-scope distances 431 and 432 between the scope and the one or more medical tools may be computed and/or updated in real-time as the surgical video of the surgical scene is being captured or obtained. See paragraph [0182]).
Regarding claim 12, Chen in view of Khare teaches The system of claim 8, wherein the detecting the first and the second coordinates comprises: detecting a first set of coordinates that correspond to successive poses of the first physical instrument; and detecting a second set of coordinates that correspond to successive poses of the second physical instrument (Chen; In some cases, the one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. See paragraph [0146]).
Regarding claim 13, Chen in view of Khare teaches The system of claim 12, wherein the first and the second coordinates respectively comprise: coordinates of reference markers located on each of the first and the second physical instruments (Khare; The calibration process may comprise, for example, touching the tip of the tool 310 at a predetermined spot such that the position of the fiducial markers 202 attached to the handheld device 105B are visible to the tracking system 115. The software may provide instructions to the surgeon as to the exact steps need to calibrate each module and may further provide a verification of the successful completion of the calibration. At 416, it is determined whether all modules which been prepared in anticipation of the surgical procedure have been calibrated. See paragraph [0062]) (Khare; The position and orientation of the handheld device 105B and, more particularly, the position of the tip of the tool 206 held by the handheld device, can be tracked using, for example, an array of fiducial markers 202 attached to the handheld device 105B and tracked by one or more cameras in the surgical theater. See paragraph [0004]) (Khare; In some examples, certain markers, such as fiducial markers that identify individuals, important tools, or bones in the theater may include passive or active identifiers that can be picked up by a camera or camera array associated with the tracking system. For example, an IR LED can flash a pattern that conveys a unique identifier to the source of that pattern, providing a dynamic identification mark. Similarly, one or two dimensional optical codes (barcode, QR code, etc.) can be affixed to objects in the theater to provide passive identification that can occur based on image analysis. If these codes are placed asymmetrically on an object, they can also be used to determine an orientation of an object by comparing the location of the identifier with the extents of an object in an image. 
For example, a QR code may be placed in a corner of a tool tray, allowing the orientation and identity of that tray to be tracked. Other tracking modalities are explained throughout. For example, in some examples, augmented reality headsets can be worn by surgeons and other staff to provide additional camera angles and tracking capabilities. See paragraph [0046]).
Regarding claim 14, Chen in view of Khare teaches The system of claim 13, wherein the reference markers located on the second physical instrument are located at respective different locations than the calibration location (Khare; The calibration process may comprise, for example, touching the tip of the tool 310 at a predetermined spot such that the position of the fiducial markers 202 attached to the handheld device 105B are visible to the tracking system 115. The software may provide instructions to the surgeon as to the exact steps need to calibrate each module and may further provide a verification of the successful completion of the calibration. At 416, it is determined whether all modules which been prepared in anticipation of the surgical procedure have been calibrated. See paragraph [0062]) (Khare; The position and orientation of the handheld device 105B and, more particularly, the position of the tip of the tool 206 held by the handheld device, can be tracked using, for example, an array of fiducial markers 202 attached to the handheld device 105B and tracked by one or more cameras in the surgical theater. See paragraph [0004])(Khare; In some examples, certain markers, such as fiducial markers that identify individuals, important tools, or bones in the theater may include passive or active identifiers that can be picked up by a camera or camera array associated with the tracking system. For example, an IR LED can flash a pattern that conveys a unique identifier to the source of that pattern, providing a dynamic identification mark. Similarly, one or two dimensional optical codes (barcode, QR code, etc.) can be affixed to objects in the theater to provide passive identification that can occur based on image analysis. If these codes are placed asymmetrically on an object, they can also be used to determine an orientation of an object by comparing the location of the identifier with the extents of an object in an image. 
For example, a QR code may be placed in a corner of a tool tray, allowing the orientation and identity of that tray to be tracked. Other tracking modalities are explained throughout. For example, in some examples, augmented reality headsets can be worn by surgeons and other staff to provide additional camera angles and tracking capabilities. See paragraph [0046]).
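As a purely illustrative aside, and not as part of the claim mapping: the arrangement addressed by claims 13 and 14, in which reference markers sit at locations different from the calibration location, is consistent with standard rigid-body tracking. The marker array fixes the tool's pose (rotation and translation), and a tip offset learned during calibration, e.g., by touching the tip to a predetermined spot as in Khare's paragraph [0062], then places the tip in the unified space. A minimal sketch with hypothetical values:

```python
import numpy as np

# Hypothetical rigid-body pose recovered from the fiducial markers:
# rotation R and translation t map tool-frame points into the unified space.
R = np.eye(3)                  # assumed orientation (identity for illustration)
t = np.array([5.0, 0.0, 1.0])  # assumed marker-array origin in unified space

# Tip offset expressed in the tool's own frame, determined once during
# calibration; the markers themselves need not be at the tip.
tip_offset = np.array([0.0, 0.0, 12.0])

tip_in_unified = R @ tip_offset + t  # tip position in the unified 3D space
```

Because the offset is rigid, tracking only the markers suffices to locate the tip thereafter.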
Regarding claim 15, Chen teaches A computer program product comprising a non-transitory computer-readable medium having a computer-readable program code embodied therein to be executed by one or more processors, the program code including instructions (The computer system 2001 may include a central processing unit (CPU, also “processor” and “computer processor” herein) 2005, which can be a single core or multi core processor, or a plurality of processors for parallel processing. The computer system 2001 also includes memory or memory location 2010 (e.g., random-access memory, read-only memory, flash memory), electronic storage unit 2015 (e.g., hard disk), communication interface 2020 (e.g., network adapter) for communicating with one or more other systems, and peripheral devices 2025, such as cache, other memory, data storage and/or electronic display adapters See paragraph [0189])( The CPU 2005 can execute a sequence of machine-readable instructions, which can be embodied in a program or software. The instructions may be stored in a memory location, such as the memory 2010. The instructions can be directed to the CPU 2005, which can subsequently program or otherwise configure the CPU 2005 to implement methods of the present disclosure. Examples of operations performed by the CPU 2005 can include fetch, decode, execute, and writeback. See paragraph [0190]) for:
detecting first coordinates for a first physical instrument that correspond to at least one pose of the first physical instrument in a unified three-dimensional (3D) coordinate space ( As shown in FIG. 6B, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a three-dimensional (3D) position of the tool tips 412a and 412b relative to the scope 420. The position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 431 and 432 between the scope 420 and the one or more medical tools 410a and 410b. In some cases, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 433 between the tool tips 412a and 412b of the one or more medical tools 410a and 410b. FIG. 7 illustrates an augmented reality view of the surgical scene showing a tip-to-tip distance 433 between the one or more medical tools and tip-to-scope distances 431 and 432 between the scope and the one or more medical tools. The tip-to-tip distance 433 between the one or more medical tools and the tip-to-scope distances 431 and 432 between the scope and the one or more medical tools may be computed and/or updated in real-time as the surgical video of the surgical scene is being captured or obtained. See paragraph [0182]);
detecting second coordinates for a second physical instrument that correspond to at least one pose of the second physical instrument in a unified 3D coordinate space ( As shown in FIG. 6B, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a three-dimensional (3D) position of the tool tips 412a and 412b relative to the scope 420. The position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 431 and 432 between the scope 420 and the one or more medical tools 410a and 410b. In some cases, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 433 between the tool tips 412a and 412b of the one or more medical tools 410a and 410b. FIG. 7 illustrates an augmented reality view of the surgical scene showing a tip-to-tip distance 433 between the one or more medical tools and tip-to-scope distances 431 and 432 between the scope and the one or more medical tools. The tip-to-tip distance 433 between the one or more medical tools and the tip-to-scope distances 431 and 432 between the scope and the one or more medical tools may be computed and/or updated in real-time as the surgical video of the surgical scene is being captured or obtained. See paragraph [0182]); and based on the detected first and second coordinates, determining a distance from a tip of the first physical instrument to a location of the second physical instrument ( As shown in FIG. 6B, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a three-dimensional (3D) position of the tool tips 412a and 412b relative to the scope 420. 
The position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 431 and 432 between the scope 420 and the one or more medical tools 410a and 410b. In some cases, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 433 between the tool tips 412a and 412b of the one or more medical tools 410a and 410b. FIG. 7 illustrates an augmented reality view of the surgical scene showing a tip-to-tip distance 433 between the one or more medical tools and tip-to-scope distances 431 and 432 between the scope and the one or more medical tools. The tip-to-tip distance 433 between the one or more medical tools and the tip-to-scope distances 431 and 432 between the scope and the one or more medical tools may be computed and/or updated in real-time as the surgical video of the surgical scene is being captured or obtained. See paragraph [0182]), but is silent to a predefined calibration location.
Khare teaches a calibration process for surgical tools that utilizes touching fiducial markers and provides the surgeon with a step by step calibration process (The calibration process may comprise, for example, touching the tip of the tool 310 at a predetermined spot such that the position of the fiducial markers 202 attached to the handheld device 105B are visible to the tracking system 115. The software may provide instructions to the surgeon as to the exact steps need to calibrate each module and may further provide a verification of the successful completion of the calibration. At 416, it is determined whether all modules which been prepared in anticipation of the surgical procedure have been calibrated. For example, the software may inquire to the surgeon if there are more modules requiring calibration. If there are more modules, control returns to box 413 where the software may request that the surgeon calibrate the next module. The process is repeated until, at 416, all modules have been calibrated, at which time the calibration process exits at 417. See paragraph [0062]).
Chen and Khare both teach the use of augmented reality with medical devices, and Khare further teaches that the system can provide a sequence of steps for the user to follow to achieve a proper calibration. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the system of Chen with the calibration techniques of Khare such that the system could provide the user with accurate calibration and positioning for surgical procedures.
Regarding claim 16, Chen in view of Khare teaches The computer program product of claim 15, wherein the detecting the first and the second coordinates occurs during a predefined range of time (Chen; In some cases, the one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. See paragraph [0146]) (Coordinates are detected in real time while the tools are in use, which is considered the predefined range of time, as the distance is based on the current position).
Regarding claim 17, Chen in view of Khare teaches The computer program product of claim 16, further comprising: initiating the predefined range of time in response to a request (Chen; The one or more trained medical models may be configured to (i) receive a set of inputs corresponding to the one or more live surgical procedures or one or more surgical subjects of the one or more live surgical procedures and (ii) implement or perform one or more surgical applications, based at least in part on the set of inputs, to enhance a medical operator's ability to perform the one or more live surgical procedures. See paragraph [0128])(Chen; In some cases, the one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. See paragraph [0146]).
Regarding claim 18, Chen in view of Khare teaches The computer program product of claim 15, wherein the detecting the first and the second coordinates occurs concurrently (Chen; In some cases, the one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. See paragraph [0146]) (Chen; It is clear that the detection occurs concurrently as it is in real-time and needs both points to determine the distance.) ( As shown in FIG. 6B, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a three-dimensional (3D) position of the tool tips 412a and 412b relative to the scope 420. The position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 431 and 432 between the scope 420 and the one or more medical tools 410a and 410b. In some cases, the position of the tool tips 412a and 412b may be used in combination with the detected tool edges and a known diameter of the plurality of surgical tools to estimate a distance 433 between the tool tips 412a and 412b of the one or more medical tools 410a and 410b. FIG. 7 illustrates an augmented reality view of the surgical scene showing a tip-to-tip distance 433 between the one or more medical tools and tip-to-scope distances 431 and 432 between the scope and the one or more medical tools. The tip-to-tip distance 433 between the one or more medical tools and the tip-to-scope distances 431 and 432 between the scope and the one or more medical tools may be computed and/or updated in real-time as the surgical video of the surgical scene is being captured or obtained. See paragraph [0182]).
Regarding claim 19, Chen in view of Khare teaches The computer program product of claim 15, wherein the detecting the first and the second coordinates comprises: detecting a first set of coordinates that correspond to successive poses of the first physical instrument; and detecting a second set of coordinates that correspond to successive poses of the second physical instrument (Chen; In some cases, the one or more surgical applications may comprise measuring a distance between a first tool and a second tool in real time. See paragraph [0146]).
Regarding claim 20, Chen in view of Khare teaches The computer program product of claim 19, wherein the first and the second coordinates respectively comprise: coordinates of reference markers located on each of the first and the second physical instruments, the reference markers located on the second physical instrument are located at respective different locations than the calibration location (Khare; The calibration process may comprise, for example, touching the tip of the tool 310 at a predetermined spot such that the position of the fiducial markers 202 attached to the handheld device 105B are visible to the tracking system 115. The software may provide instructions to the surgeon as to the exact steps need to calibrate each module and may further provide a verification of the successful completion of the calibration. At 416, it is determined whether all modules which been prepared in anticipation of the surgical procedure have been calibrated. See paragraph [0062]) (Khare; The position and orientation of the handheld device 105B and, more particularly, the position of the tip of the tool 206 held by the handheld device, can be tracked using, for example, an array of fiducial markers 202 attached to the handheld device 105B and tracked by one or more cameras in the surgical theater. See paragraph [0004]) (Khare; In some examples, certain markers, such as fiducial markers that identify individuals, important tools, or bones in the theater may include passive or active identifiers that can be picked up by a camera or camera array associated with the tracking system. For example, an IR LED can flash a pattern that conveys a unique identifier to the source of that pattern, providing a dynamic identification mark. Similarly, one or two dimensional optical codes (barcode, QR code, etc.) can be affixed to objects in the theater to provide passive identification that can occur based on image analysis. 
If these codes are placed asymmetrically on an object, they can also be used to determine an orientation of an object by comparing the location of the identifier with the extents of an object in an image. For example, a QR code may be placed in a corner of a tool tray, allowing the orientation and identity of that tray to be tracked. Other tracking modalities are explained throughout. For example, in some examples, augmented reality headsets can be worn by surgeons and other staff to provide additional camera angles and tracking capabilities. See paragraph [0046]).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NICHOLAS R WILSON whose telephone number is (571)272-0936. The examiner can normally be reached M-F 7:30-5:00PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kee Tung, can be reached at (572) 272-7794. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/NICHOLAS R WILSON/Primary Examiner, Art Unit 2611