Prosecution Insights
Last updated: April 19, 2026
Application No. 18/706,452

SURGICAL NAVIGATION SYSTEM HAVING IMPROVED INSTRUMENT TRACKING AND NAVIGATION METHOD

Non-Final OA: §102, §103, §112
Filed: May 01, 2024
Examiner: EDUN, DEAN NAWAAB
Art Unit: 3797
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: B. Braun New Ventures GmbH
OA Round: 4 (Non-Final)

Grant Probability: 43% (Moderate)
OA Rounds: 4-5
To Grant: 3y 5m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 43% (grants 43% of resolved cases; 15 granted / 35 resolved; -27.1% vs TC avg)
Interview Lift: strong, +65.0% (resolved cases with an interview vs. without)
Typical Timeline: 3y 5m avg prosecution; 48 applications currently pending
Career History: 83 total applications across all art units
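As a quick sanity check, the headline figures above are internally consistent. A minimal sketch using only the numbers quoted in this report (plain arithmetic, no external data):

```python
# Examiner career statistics quoted above.
granted = 15
resolved = 35

# The dashboard's 43% career allow rate is granted / resolved, rounded.
allow_rate_pct = 100 * granted / resolved
print(f"Career allow rate: {allow_rate_pct:.1f}%")  # 42.9%, displayed as 43%

# The "-27.1% vs TC avg" delta then implies a Tech Center average of
# roughly 43% + 27.1% = 70% for this examiner's peer group.
tc_avg_pct = round(allow_rate_pct) + 27.1
print(f"Implied TC average: {tc_avg_pct:.1f}%")
```

The implied ~70% Tech Center baseline is an inference from the displayed delta, not a figure stated in the report.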

Statute-Specific Performance

§101: 6.3% (-33.7% vs TC avg)
§103: 48.1% (+8.1% vs TC avg)
§102: 16.7% (-23.3% vs TC avg)
§112: 27.5% (-12.5% vs TC avg)

Deltas are relative to a Tech Center average estimate. Based on career data from 35 resolved cases.
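The four statute rows can be cross-checked the same way: subtracting each delta from the examiner's rate recovers the Tech Center average estimate, and all four rows point at the same baseline. A minimal sketch using only the numbers quoted above:

```python
# (examiner rate %, delta vs. TC average %) per statute, as shown above.
statute_rows = {
    "101": (6.3, -33.7),
    "103": (48.1, 8.1),
    "102": (16.7, -23.3),
    "112": (27.5, -12.5),
}

for statute, (rate, delta) in statute_rows.items():
    tc_avg = rate - delta  # rate = TC avg + delta, so TC avg = rate - delta
    print(f"Sec. {statute}: examiner {rate:.1f}% vs implied TC avg {tc_avg:.1f}%")

# Every row implies the same ~40.0% Tech Center average estimate,
# so the displayed deltas are internally consistent.
```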

Office Action

Statutes cited: §102, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 10/21/2025 has been entered.

Priority

Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in parent Application No. DE10 2021 128 478.3, filed on November 2, 2021.

Status of Claims

This Office Action is responsive to the claims filed on 10/21/2025. Claims 1-14 were previously cancelled. Claims 16, 29, and 30 have been cancelled. Claims 15, 19, and 21 have been amended. Claims 35 and 36 are newly presented. Claims 15, 17-28, and 31-36 are presently pending in this application.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “a data provision unit” in claim 15, ln. 7; “a tracking system” in claim 15, ln. 13; “a control unit” in claim 15, ln. 19.

The corresponding structure for the “data provision unit” defined within the specification is an SSD memory (page 13, Line 6) and any functional equivalents. The corresponding structure for the “storage unit” defined within the specification is SSD memory (page 13, Line 6) and any functional equivalents.
The corresponding structure for the “tracking system” defined within the specification is “an infrared-based camera system and/or electromagnetic-based system and/or inertial measurement unit based system” (page 9, Lines 18-19) and any functional equivalents. The corresponding structure for the “control unit” defined within the specification is “a computer” (page 9, Line 28) and any functional equivalents.

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.

If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 15, 17-28, and 31-36 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

Claim 15 recites the claimed limitation “an arm, the arm comprising the end effector” in line 4, further recites “a medical imaging device comprising… a medical endoscope” in line 11, and further recites “the medical imaging device having an imaging head mounted to the end effector of the arm” in line 13. The claimed medical imaging device being an endoscope mounted to the end effector of the arm is not described in sufficient detail that it would be clear that the applicant had possession of the claimed invention. Specification pg. 6, ln. 6-20 describes a medical imaging device being an endoscope in general, but fails to describe how the medical imaging device is mounted in any way, and further fails to describe the endoscope having an imaging head mounted to the end effector of the arm as claimed. Furthermore, the associated Figures 1 and 8 depict the microscope mounted on an arm, but the further Figures and Specification do not include any other specific mention of the arrangement of an endoscope as claimed. Thus, such limitations are rendered as new matter.
As such, the claims are not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, at the time the application was filed, had possession of the claimed invention. Therefore, the claims are rejected for including new matter.

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 15, 17-28, and 31-36 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 15 recites the claimed limitation “an arm, the arm comprising the end effector” in line 4, further recites “a medical imaging device comprising… a medical endoscope” in line 11, and further recites “the medical imaging device having an imaging head mounted to the end effector of the arm” in line 13. The claim is indefinite because it is unclear whether this endoscope is mounted to the end effector in any way, or whether the system includes other imaging devices and the endoscope is not part of the imaging head attached to the end effector.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 15, 17, 18, 21-24, 28, 29, 31, 33, and 34 are rejected under 35 U.S.C. 103 as being unpatentable over Stopp (US 20200193622) in view of Polchin (US 20220401178) and Calloway (US 20210228281).

Regarding claim 15, Stopp teaches a surgical navigation system (Abstract; A medical tracking method for tracking a spatial position of a medical instrument; Paragraph [0052]; tracking system, Sheet 1 Fig. 1) for navigation during a surgical intervention (Paragraph [0009]; tracking a spatial position of at least one medical instrument within a medical workspace) on a patient (Paragraph [0013]; a medical microscope that may in particular look down on a patient lying on a patient couch), and tracking of at least one medical instrument (Paragraph [0009]; tracking a spatial position of at least one medical instrument), the surgical navigation system comprising:

an arm (Paragraph [0052]; articulated arm, Sheet 1 Fig. 1), the arm comprising an end effector (Paragraph [0052]; microscope 8 is coupled to an articulated arm);

a display device for displaying visual content (Paragraph [0085]; The computer is preferably operatively coupled to a display device which allows information outputted by the computer to be displayed);

the at least one medical instrument (Paragraph [0052]; instrument 1 within the medical workspace; Sheet 1 Fig. 1), the at least one medical instrument comprising an optically visible outer side (Paragraph [0052]; determined by analyzing the contours of the instrument 1) and a predetermined optical pattern directly integrated into the optically visible outer side (Paragraph [0062]; a medical instrument having a body section and at least three tracking markers which run circumferentially around different parts of the body section; Paragraph [0099]; Each of the tracking markers 2 and 3 is provided directly on the outer surface of the instrument body 1 and may be for example printed or lasered onto the surface; Sheet 3, Fig. 1), the at least one medical instrument having a respective local coordinate system (Paragraph [0052]; Rather the spatial position of instrument 1 within the coordinate system of camera 4 is determined);

a data provision unit (Paragraph [0033]; computer for example comprises at least one processor and for example at least one memory in order to (technically) process the data, for example electronically and/or optically), which is adapted to provide digital three-dimensional (3D) imaging data of the patient (Paragraph [0057]; a three-dimensional scan of the patient's body/anatomical structure is performed, for obtaining a three-dimensional image dataset);

a medical imaging device comprising a surgical microscope (Paragraph [0052]; medical microscope 8, Sheet 1 Fig. 1) adapted to perform detection for tracking (Paragraph [0052]; the microscope-integrated camera 4 can be utilized for tracking the instrument 1 within the medical workspace), the medical imaging device having an imaging head mounted to the end effector of the arm (Paragraph [0052]; the microscope-integrated camera 4, Sheet 1 Fig. 1), the imaging head being adapted to create an image of a portion of an intervention region of the patient (Paragraph [0057]; that the microscope-integrated camera 4 can observe the medical workspace containing the anatomical structure 2 and the instrument 1) as well as to detect and to track the at least one medical instrument with respect to the imaging head (Paragraph [0058]; as pointer 1 can be seen in the microscope video image, computer 3 connected to both cameras 4 and 5 is able to calculate the spatial position of each of the landmarks with respect to the microscope), the imaging head being adapted to detect the predetermined optical pattern and track a position and/or orientation of the medical instrument relative to the imaging head (Paragraph [0052]; Rather the spatial position of instrument 1 within the coordinate system of camera 4 is determined by analyzing the contours of the instrument 1 within a monoscopic video image obtained from camera 4.; Paragraph [0099]; tracking-portion with three circumferential tracking markers 2 having a smaller diameter d and two circumferential tracking markers 3… example printed or lasered onto the surface; Paragraph [0100]; This perspective effect will help an optical tracking system to recognize and calculate the spatial orientation more accurately by measuring the distance between the tracking markers) via machine vision (Paragraph [0117]; at least some of the information as to the spatial orientation of objects seen on the image can be derived by computer vision algorithms including edge detection), the medical imaging device having a respective local coordinate system (Paragraph [0053]; the coordinate system of camera 4; Paragraph [0058]; spatial position of each of the landmarks with respect to the microscope.
As the relative position of the microscope; positions with respect to the microscope are considered to be a respective coordinate system of the microscope as understood in its broadest reasonable interpretation);

a tracking system (Paragraph [0053]; a second video camera 5, Sheet 1 Fig. 1), the tracking system being movably mounted relative to the arm (Paragraph [0054]; Additionally, camera 5 can be pivoted between two known orientations; Paragraph [0057]; Further, the second video camera 5 is adjusted to observe the corresponding tracking marker), the tracking system being adapted to detect and track the imaging head of the medical imaging device (Paragraph [0054]; While tracking marker 7 is fixedly coupled to the microscope 8 and camera 4; allowing camera 5 to recognize tracking marker 7; Paragraph [0053]; the position of the tracking marker 7 within the coordinate system of tracking camera 5 can be calculated; Sheet 1 Fig. 2) as well as to detect and track at least a partial portion of the patient for registration (Paragraph [0054]; allowing camera 5 to recognize the anatomical structure 2 and instrument 1 for registration purposes) to the 3D imaging data provided by the data provision unit (Paragraph [0019]-[0023]; the step of acquiring registration data may involve… a video registration using the medical microscope or a second video camera), the tracking system having a respective local coordinate system (Paragraph [0053]; the coordinate system of tracking camera 5); and

a control unit (Paragraph [0052]; contains a computer 3 the processor of which is adapted to perform all of the method-steps described), which is adapted to process data of the medical imaging device (Paragraph [0058]; computer 3 connected to both cameras 4 and 5 is able to calculate the spatial position of each of the landmarks with respect to the microscope), the data of the tracking system (Paragraph [0058]; computer 3 connected to both cameras 4 and 5… a patient-invariant coordinate system can be calculated from the video image of camera 5) as well as the 3D imaging data (Claim 8; at the computer, registration data describing a spatial correspondence of a pre-acquired image dataset of the anatomical structure and the anatomical structure within the medical workspace) and to determine a position and/or orientation of the at least one medical instrument (Paragraph [0052]; the spatial position of instrument 1 within the coordinate system of camera 4 is determined) by linking the tracking from the tracking system (Paragraph [0054]; allowing camera 5 to recognize the anatomical structure 2 and instrument 1 for registration purposes.) as well as the tracking from the imaging head to the at least one medical instrument and to create a correlation representation with the 3D imaging data registered for the patient and the position and/or orientation of the at least one medical instrument (Paragraph [0017]; step of acquiring registration data that describes a spatial correspondence of a pre-acquired dataset of the anatomical structure and the actual anatomical structure within the medical workspace.
Such image registration allows for calculating and displaying a virtual representation of the at least one medical instrument in a correct spatial arrangement relative to a two-dimensional or three-dimensional image obtained from a pre-acquired image dataset of the patient showing the anatomical structure.; Paragraph [0057]; The microscope zoom may then be set to a minimum so that it can observe the patient's head during the following registration procedure), and to output this by the display device (Paragraph [0053]; a medical navigation system is able to calculate and display a correct positional alignment of a virtual representation of the instrument 1 with respect to an image-representation of the anatomical structure (head) 2.),

wherein the tracking system is adapted to provide a first transformation of the respective local coordinate system of the medical imaging device and the respective local coordinate system of the tracking system (Paragraph [0053]; as soon as the position of camera 4 with respect to camera 5 is known, the position of both, instrument 1 and tracking marker 7 can be transformed into one common coordinate system; Paragraph [0058]; As the relative position of the microscope and a patient-invariant coordinate system can be calculated from the video image of camera 5, the spatial position of each of the landmarks can be transformed into a patient-invariant coordinate system),

wherein the imaging head is adapted to provide a second transformation of the respective local coordinate system of the at least one medical instrument and the respective local coordinate system of the medical imaging device (Paragraph [0061]; As the relative position of the microscope and a patient-invariant coordinate system can be calculated from the video image of camera 5, the spatial position of each of the landmarks can be transformed into a patient-invariant coordinate system), and

wherein the control unit is adapted to perform a sequential tracking from the at least one medical instrument via the medical imaging device to the tracking system by processing the first transformation and the second transformation into a total transformation of the respective local coordinate system of the at least one medical instrument and the respective local coordinate system of the tracking system.

Stopp does not explicitly teach the surgical microscope is adapted to perform three-dimensional detection; linking the tracking from the tracking system to the imaging head; and the control unit is adapted to perform a sequential tracking from the at least one medical instrument via the medical imaging device to the tracking system by processing the first transformation and the second transformation into a total transformation of the respective local coordinate system of the at least one medical instrument and the respective local coordinate system of the tracking system.

Polchin, however, teaches a surgical navigation system (Paragraph [0001]; robotic surgical navigation system) comprising a medical imaging device (Paragraph [0053]; a second camera 130 (or microscope), Fig. 1 and 9) that is a surgical microscope for surgery (Paragraph [0053]; microscope; Paragraph [0003]; surgical camera or microscope) adapted to perform three-dimensional detection for tracking (Paragraph [0062]; The trackers 120 and 150 may be imaged by the stereoscopic camera 130 and used by the navigation computer system 302 to provide patient and/or tool tip registration and/or tracking).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the surgical microscope of Stopp to have been adapted to perform three-dimensional detection as taught by Polchin because it would improve 3D registration of the imaged surgical site with the pre-surgical images by allowing matching of 3D structures detected in the stereoscopic view with the 3D model (Paragraph [0150]-[0151]).
Together Stopp and Polchin do not explicitly teach linking the tracking from the tracking system to the imaging head; and the control unit is adapted to perform a sequential tracking from the at least one medical instrument via the medical imaging device to the tracking system by processing the first transformation and the second transformation into a total transformation of the respective local coordinate system of the at least one medical instrument and the respective local coordinate system of the tracking system.

Calloway, however, teaches a surgical navigation system (Paragraph [0004]-[0005]; computer assisted navigation during surgery; surgical system) for navigation during a surgical intervention (Paragraph [0035]; assist surgeons during medical procedures) on a patient (Paragraph [0033]; defining a target pose for a surgical tool to be used during a surgical procedure on an anatomical structure of the patient; Paragraph [0036]; to track the present pose and movement of the pose of tracked portions of the surgical robot 4 and the patient), and tracking of at least one medical instrument (Paragraph [0113]; real-time tracking of surgical instruments relative to floating patient anatomy); linking the tracking (Paragraph [0145]; various coordinate systems can be chained together by virtue of independent observations the various camera systems) from the tracking system to the imaging head (Paragraph [0072]-[0073]; include position sensor 832 and camera converter 834. Tracking subsystem 830 may correspond to the camera tracking system component 6 of FIG. 3; tracking subsystem 830 and the computer subsystem 820 can be included in the computer platform 910, which can be transported by the camera tracking system component 6′ of FIGS. 3A and 3B); and the control unit is adapted to perform a sequential tracking (Paragraph [0125]; computer operations that combine (chain) measured poses in ways that can improve optimization of one or more of the above three parameters by incorporating additional navigation cameras mounted to one or more XR headsets; Paragraph [0145]; various coordinate systems can be chained together) from the at least one medical instrument via the medical imaging device to the tracking system (Paragraph [0145]; The location of (E) with respect to (N3) can still be computed) by processing the first transformation and the second transformation into a total transformation of the respective local coordinate system of the at least one medical instrument and the respective local coordinate system of the tracking system (Paragraph [0145]-[0146]; If the patient reference (R) is seen by (N3) and either one of (N) or (N2), the pose of (E) with respect to (N3) can be solved directly by either one of the following two equations; The second equation describes using three transformations between the tracking system and imaging camera to obtain the coordinate system of the medical instrument E).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Stopp in view of Polchin to have further included linking the tracking from the tracking system to the imaging head, and to have adapted the control unit to perform a sequential tracking from the at least one medical instrument via the medical imaging device to the tracking system by processing the first transformation and the second transformation into a total transformation of the respective local coordinate system of the at least one medical instrument and the respective local coordinate system of the tracking system, as taught by Calloway, because these were known methods of synchronizing the poses between multiple imaging methods, thereby providing a higher degree of accuracy while tracking (Paragraphs [0150]-[0151]) while further allowing accurate tracking of objects in the scene when the view of one of the cameras is partially obstructed (Paragraph [0169]). Furthermore, transforming the coordinates from the medical imaging device to the coordinates of the tracking system would allow all objects in the scene to be registered in a single coordinate system and would allow displays to be overlapped in an extended reality image (Paragraph [0066]), which would improve the ability to determine the position of objects in the display during the operation.

Regarding claim 17, together Stopp, Polchin, and Calloway teach all of the limitations of claim 15 as noted above. Stopp does not explicitly teach the imaging head of the medical imaging device comprises a stereo camera for a stereo image, and the control unit is adapted to detect a position and/or orientation of the at least one medical instrument relative to the imaging head from the stereo image via machine vision.
Calloway further teaches the imaging head of the medical imaging device comprises a stereo camera for a stereo image (Paragraph [0092]; Each XR headset 1200 and 1210 can include one or more cameras; Paragraph [0146]; The chains can be arbitrarily long and are enabled by having more than one stereo camera system (e.g., N, N2, N3); Paragraph [0127]; using stereo matching to jointly identify pose of the DRA fiducials), and the control unit is adapted to detect a position and/or orientation of the at least one medical instrument relative to the imaging head from the stereo image via machine vision (Paragraph [0124]; navigated surgery can include computer vision tracking and determination of pose (e.g., position and orientation in a six degree-of-freedom coordinate system) of surgical instruments).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Stopp in view of Polchin and Calloway such that the imaging head of the medical imaging device comprises a stereo camera for a stereo image, and the control unit is adapted to detect a position and/or orientation of the at least one medical instrument relative to the imaging head from the stereo image via machine vision, as further taught by Calloway, because it would have been a known method for detecting medical instruments relative to a camera and further would allow combining the observed poses or partial poses in ways that can improve accuracy, robustness, and/or ergonomics during navigated surgery (Paragraph [0125]).

Regarding claim 18, together Stopp, Polchin, and Calloway teach all of the limitations of claim 17 as noted above. Stopp does not explicitly teach the control unit is adapted to determine the position and/or orientation of the at least one medical instrument via triangulation of the stereo image and/or via reconstruction of a disparity overlap of the stereo image.
Calloway, however, further teaches the control unit is adapted to determine the position and/or orientation of the at least one medical instrument via triangulation of the stereo image (Paragraph [0127]; aligned sufficiently to perform 3D DRA fiducials triangulation operations using stereo matching to jointly identify pose of the DRA fiducials).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the control unit of Stopp in view of Polchin and Calloway to be adapted to determine the position and/or orientation of the at least one medical instrument via triangulation of the stereo image as further taught by Calloway because it would have been a well-known and understood method of tracking objects from images and further can enable any one or more of identifying tools that would not be identified using a single coordinate system; increased pose tracking accuracy; and enabling a wider range of motion without losing tracking of surgical instruments, patient anatomy, and/or a robotic end effector (Paragraph [0127]).

Regarding claim 21, together Stopp, Polchin, and Calloway teach all of the limitations of claim 15 as noted above. Stopp does not explicitly teach the imaging head comprises infrared markers, and the tracking system comprises an infrared-based camera system configured to detect the infrared markers of the imaging head to track the position and/or orientation of the imaging head relative to the tracking system.
Calloway further teaches the imaging head comprises infrared markers (Paragraph [0095]; A set of DRA fiducials, e.g., dots are painted or attached in a spaced apart known arrangement on one or both sides of the headset; Paragraph [0072]; This tracking may be conducted in a manner consistent with the present disclosure including the use of infrared… technology that tracks the location of active or passive elements of DRAs 52), and the tracking system comprises an infrared-based camera system (Paragraph [0072]; This tracking may be conducted in a manner consistent with the present disclosure including the use of infrared light technology; Paragraph [0123]; Any plural number of near infrared cameras can be used) configured to detect the infrared markers of the imaging head to track the position and/or orientation of the imaging head relative to the tracking system (Paragraph [0127]; include near infrared tracking cameras and/or visible light tracking cameras that are configured to track fiducials of DRAs connected to surgical instruments, patient anatomy, other XR headset(s), and/or a robotic end effector; near infrared tracking coordinate systems enables the coordinate systems to be aligned sufficiently to perform 3D DRA fiducials triangulation operations using stereo matching to jointly identify pose of the DRA fiducials between the visible and near infrared tracking coordinate systems).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Stopp in view of Polchin and Calloway such that the imaging head further comprises infrared markers, and the tracking system comprises an infrared-based camera system configured to detect the infrared markers of the imaging head to track the position and/or orientation of the imaging head relative to the tracking system as further taught by Calloway because it would have provided additional tracking volume coverage beyond what cameras on a single auxiliary tracking bar can provide. Adding near infrared tracking cameras to the existing auxiliary tracking system allows for the medical imager location to be tracked more robustly but less accurately than in visible light (Paragraph [0127]). Regarding claim 22, together Stopp, Polchin, and Calloway teach all of the limitations of claim 15 as noted above. Stopp further teaches the surgical navigation system comprises a camera system (Paragraph [0053]; camera 4, a second video camera 5). Stopp does not explicitly teach the control unit is adapted to perform a spatial three-dimensional detection of a pose of the at least one medical instrument from at least two image perspectives provided by the camera system via machine vision.
Calloway further teaches the surgical navigation system comprises a camera system (Paragraph [0123]; auxiliary tracking bar 46 having two pairs of stereo navigation cameras), and the control unit is adapted to perform a spatial three-dimensional detection of a pose of the at least one medical instrument (Paragraph [0058]; overlaid graphical representations of models of instruments that are positioned in the display screens relative to the anatomical structure; Paragraph [0111]; and 3D models on the display screen 1302) from at least two image perspectives provided by the camera system via machine vision (Paragraph [0124]; navigated surgery can include computer vision tracking and determination of pose (e.g., position and orientation in a six degree-of-freedom coordinate system) of surgical instruments). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Stopp in view of Polchin and Calloway such that the control unit is adapted to perform a spatial three-dimensional detection of a pose of the at least one medical instrument from at least two image perspectives provided by the camera system via machine vision as further taught by Calloway because it would have been a well-known and understood method of determining the pose of the medical instrument that further would have allowed registering the pose of the instrument in the visual overlay of the model during the operation, thereby improving the navigation during the operation. Regarding claim 23, together Stopp, Polchin, and Calloway teach all of the limitations of claim 15 as noted above. Stopp further teaches a mobile medical navigation tower (Paragraph [0052]; a mobile trolley which contains a computer 3, Sheet 1 Fig.
1), comprising: the surgical navigation system according to claim 15 as described above; and a mobile cart with wheels for mobile placement of the mobile medical navigation tower (Paragraph [0052]; a mobile trolley, Sheet 1 Fig. 1). Regarding claim 24, together Stopp, Polchin, and Calloway teach all of the limitations of claim 15 as noted above. Calloway further teaches a navigation method for tracking of at least one medical instrument in the surgical navigation system (Paragraph [0006]; Related methods by a camera tracking system and related computer program products are disclosed) according to claim 15 as noted above, the navigation method comprising the steps of: registering a partial portion of a patient with respect to 3D imaging data of the patient (Paragraph [0073]; the computer platform 910 can also include a navigation controller that is configured to use the determined poses to provide navigation information to users that guides their movement of tracked tools relative to position-registered patient images and/or tracked anatomical structures during a planned surgical procedure; Paragraph [0112]; a graphical representation 1600 of the tool can be displayed in 2D and/or 3D images in relation to a graphical representation 1610 of the anatomical structure, Fig. 16); detecting and tracking the at least one medical instrument by an imaging head of a medical imaging device (Paragraph [0046]; computer platform 910 in combination with the camera tracking system component 6 or other 3D localization system are configured to track in real-time the pose (e.g., positions and rotational orientations) of the DRA; Dynamic reference arrays, also referred to as “DRAs” herein, are rigid bodies which may be disposed on XR headsets being worn by personnel in the operating room, 1310, Figs.
12 and 13); detecting and tracking the imaging head by a tracking system (Paragraph [0046]; The computer platform 910 in combination with the camera tracking system component 6 or other 3D localization system are configured to track in real-time the pose (e.g., positions and rotational orientations)…; This tracking of 3D coordinates of the DRA can allow the surgical system 2 to determine the pose of the DRA in any multidimensional space in relation to the target anatomical structure of the patient 50); determining a position and/or orientation of the at least one medical instrument by linking the tracking of the imaging head and the tracking of the at least one medical instrument (Paragraph [0145]-[0146]; If the patient reference (R) is seen by (N3) and either one of (N) or (N2), the pose of (E) with respect to (N3) can be solved directly by either one of the following two equations; The second equation describes using three transformations between the tracking system and imaging camera to obtain the coordinate system of the medical instrument E); and outputting a correlation representation with the 3D imaging data and with the position and/or orientation of the at least one medical instrument by a display device (Paragraph [0111]-[0115]; a graphical representation 1600 of the tool can be displayed in 2D and/or 3D images in relation to a graphical representation 1610 of the anatomical structure.).
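The pose-linking step mapped above, combining the tracked pose of the imaging head with the instrument pose observed by that head per Calloway's transformation equations, can be sketched as a composition of rigid homogeneous transforms; the frame names and numeric offsets below are illustrative assumptions, not values from the reference:

```python
import numpy as np

# Hedged sketch of transform chaining between coordinate systems:
# the tracker observes the imaging head, and the imaging head
# observes the instrument; composing the two transforms yields the
# instrument pose in tracker coordinates.

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative poses (identity rotations, placeholder translations).
T_tracker_head = make_pose(np.eye(3), [0.0, 0.0, 1.0])  # head 1 m from tracker
T_head_instr   = make_pose(np.eye(3), [0.1, 0.0, 0.2])  # instrument offset

# Linking the two tracked poses gives the instrument in tracker coordinates.
T_tracker_instr = T_tracker_head @ T_head_instr
```

The same composition extends to longer chains (e.g., tracker to headset to second headset to instrument), which is what the cited "arbitrarily long" chains of stereo camera systems rely on.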
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Stopp in view of Polchin and Calloway to include the method steps of registering a partial portion of a patient with respect to 3D imaging data of the patient; detecting and tracking the at least one medical instrument by an imaging head of a medical imaging device; detecting and tracking the imaging head by a tracking system; determining a position and/or orientation of the at least one medical instrument by linking the tracking of the imaging head and the tracking of the at least one medical instrument; and outputting a correlation representation with the 3D imaging data and with the position and/or orientation of the at least one medical instrument by a display device as further taught by Calloway because it would have been a well-known and understood method of registering imaging data from different sources and using the registered image data to output a 3D image of the registered scene and further would have allowed combining visualizations of images captured by the cameras and the 3D models obtained prior to the operation, thereby assisting with navigation and view of the surgical site during the operation (Paragraph [0032]). Regarding claim 28, together Stopp, Polchin, and Calloway teach all of the limitations of claim 24 as noted above. Calloway further teaches a computer-readable storage medium comprising instructions stored in a non-transitory manner which, when executed by a computer, cause the computer to perform the navigation method (Paragraph [0175]; computer-implemented methods, apparatus (systems and/or devices) and/or computer program products… can be implemented by computer program instructions that are performed by one or more computer circuits) according to claim 24 as noted above.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Stopp in view of Polchin and Calloway to have further included the computer-readable storage medium comprising instructions stored in a non-transitory manner which, when executed by a computer, cause the computer to perform the navigation method according to claim 24 as further taught by Calloway because it would have been a well-known and understood method of registering imaging data from different sources and using the registered image data to output a 3D image of the registered scene and further would have allowed combining visualizations of images captured by the cameras and the 3D models obtained prior to the operation, thereby assisting with navigation and view of the surgical site during the operation (Paragraph [0032]). Regarding claim 31, together Stopp, Polchin, and Calloway teach all of the limitations of claim 24 as noted above. Stopp does not explicitly teach transferring the position and/or orientation of the at least one medical instrument to the 3D imaging data.
Calloway further teaches transferring the position and/or orientation of the at least one medical instrument to the 3D imaging data (Paragraph [0145]-[0146]; If the patient reference (R) is seen by (N3) and either one of (N) or (N2), the pose of (E) with respect to (N3) can be solved directly by either one of the following two equations; The second equation describes using three transformations between the tracking system and imaging camera to obtain the coordinate system of the medical instrument E; Paragraph [0073]; the computer platform 910 can also include a navigation controller that is configured to use the determined poses to provide navigation information to users that guides their movement of tracked tools relative to position-registered patient images and/or tracked anatomical structures during a planned surgical procedure; Paragraph [0112]; a graphical representation 1600 of the tool can be displayed in 2D and/or 3D images in relation to a graphical representation 1610 of the anatomical structure, Fig. 16). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the method of Stopp in view of Polchin and Calloway to have included transferring the position and/or orientation of the at least one medical instrument to the 3D imaging data as further taught by Calloway because it would have improved the ability to graphically represent the 3D location and image of the tool with respect to the visualizations of the anatomical structure. Regarding claim 33, together Stopp, Polchin, and Calloway teach all of the limitations of claim 15 as noted above. Stopp does not explicitly teach the tracking system comprises an inertial measurement unit (IMU)-based tracking system.
Calloway further teaches the tracking system comprises an inertial measurement unit (IMU)-based tracking system (Paragraph [0103]-[0104]; Electrical components of the XR headset 920 can include a pose sensor (e.g., inertial measurement unit (IMU)); the pose sensor 1446, e.g., IMU, may include a multi-axis accelerometer, a tilt sensor, and/or another sensor that can sense rotation and/or acceleration of the XR headset 920 along one or more defined coordinate axes.). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Stopp in view of Polchin and Calloway to have further comprised an inertial measurement unit (IMU)-based tracking system as further taught by Calloway because it would have been a known method of tracking objects in a navigation system that further would have improved the detection and measurement of the pose and motion of the medical imager during the operation. Regarding claim 34, together Stopp, Polchin, and Calloway teach all of the limitations of claim 15 as noted above. Stopp further teaches the medical imaging device is arranged closer to the at least one medical instrument than the tracking system (Paragraph [0053]; relative position between the anatomical structure 2 and camera 4, a second video camera 5 is provided at a fixed position with respect to the microscope 8 and video camera 4; Sheet 1 Fig. 1 shows the camera 4 is located closer to the instrument 1 than the second camera 5). Regarding claim 36, together Stopp, Polchin, and Calloway teach all of the limitations of claim 15 as noted above. 
Stopp further teaches the arm and the tracking system are connected to a single medical navigation tower (Paragraph [0052]; microscope 8 is coupled to an articulated arm of a mobile trolley which contains a computer 3; Paragraph [0053]; the relative position between the anatomical structure 2 and camera 4, a second video camera 5 is provided at a fixed position with respect to the microscope 8 and video camera 4; Sheet 1 Fig. 1 shows the microscope and tracking camera are connected to a single medical navigation tower.). Claims 19, 26, 27, and 35 are rejected under 35 U.S.C. 103 as being unpatentable over Stopp in view of Polchin and Calloway as applied to claims 15 and 24 above, respectively, and further in view of Sidar (US 20160235340). Regarding claim 19, together Stopp, Polchin, and Calloway teach all of the limitations of claim 15 as noted above. Together Stopp, Polchin, and Calloway do not teach the control unit is adapted to decode the predetermined optical pattern or to compare it with a reference stored in the data provision unit and to determine a position of an instrument tip relative to the predetermined optical pattern or a geometry of the at least one medical instrument based on the predetermined optical pattern. Sidar, however, teaches a surgical navigation system (Paragraph [0010]; systems for detecting the location of an endoscopic device inside a patient's body) comprising a pre-determined optical pattern that is a QR code (Paragraph [0116]; specific graphical pattern 247; The two dimensional barcodes shown in Fig. 2C are QR codes) and/or a barcode (Paragraph [0116]; plurality of codes 227, two dimensional barcode, Fig.
2B and 2C), and the control unit is adapted to decode the pre-determined optical pattern (Paragraph [0121]; optical reader automatically scans the images of the codes/marks… processing unit configured for decoding the codes/marks) or to compare it with a reference stored in the data provision unit (Paragraph [0117]; read coded markings with codes in a look-up table) and to determine a position of an instrument tip relative to the pre-determined optical pattern (Paragraph [0082]; set of codes that aid in establishing the insertion depth of the device with respect to a reference point… a reference point for measuring the insertion depth of an endoscope is a front panel of the tip section of the endoscope, Figs. 2B and 2C) or a geometry of the at least one medical instrument based on the pre-determined optical pattern (Paragraph [0112]; 208 records the codes 207 at different locations, indicative of insertion depth and relative position around the periphery; Paragraph [0121]; uses this image information to decode the exact insertion depth). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the control unit of Stopp in view of Polchin and Calloway to be adapted to decode the pre-determined optical pattern or to compare it with a reference stored in the data provision unit and to determine a position of an instrument tip relative to the pre-determined optical pattern or a geometry of the at least one medical instrument based on the pre-determined optical pattern as taught by Sidar because doing so would enhance the reliability of location data estimation, thereby allowing calculation of the exact insertion depth and rotational angle of the instrument (Sidar, Paragraph [0121]). Regarding claim 26, together Stopp, Polchin, and Calloway teach all of the limitations of claim 24 as noted above.
Calloway further teaches detecting a pre-determined optical pattern (Paragraph [0127]; near infrared tracking cameras and/or visible light tracking cameras that are configured to track fiducials of DRAs connected to surgical instruments,… enables the coordinate systems to be aligned sufficiently to perform 3D DRA fiducials). Together Stopp, Polchin, and Calloway do not teach the pre-determined optical pattern is an information carrier for a distance from the pre-determined optical pattern to an instrument tip or for a geometry of the at least one medical instrument; and determining the position of an instrument tip of the at least one medical instrument based on information derived from the pre-determined optical pattern. Sidar, however, teaches a navigation method (Paragraph [0047]; method of obtaining a real time image map of an endoscope tip) comprising detecting a pre-determined optical pattern (Paragraph [0107]; processor 209 which is configured to receive the information on codes), in particular a QR code (Paragraph [0116]; specific graphical pattern 247; The two dimensional barcodes shown in Fig. 2C are QR codes) as an information carrier for a distance from the pre-determined optical pattern to an instrument tip (Paragraph [0082]; set of codes that aid in establishing the insertion depth of the device with respect to a reference point… a reference point for measuring the insertion depth of an endoscope is a front panel of the tip section of the endoscope, Figs. 
2B and 2C) or for a geometry of the at least one medical instrument (Paragraph [0112]; 208 records the codes 207 at different locations, indicative of insertion depth and relative position around the periphery; Paragraph [0121]; uses this image information to decode the exact insertion depth); and determining the position of an instrument tip of the at least one medical instrument based on information derived from the pre-determined optical pattern (Paragraph [0082]; set of codes that aid in establishing the insertion depth of the device with respect to a reference point… a reference point for measuring the insertion depth of an endoscope is a front panel of the tip section of the endoscope, Figs. 2B and 2C). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the method of Stopp in view of Polchin and Calloway to have further included using the pre-determined optical pattern as an information carrier for a distance from the pre-determined optical pattern to an instrument tip or for a geometry of the at least one medical instrument; and determining the position of an instrument tip of the at least one medical instrument based on information derived from the pre-determined optical pattern as taught by Sidar because doing so would enhance the reliability of location data estimation, thereby allowing calculation of the exact insertion depth and rotational angle of the instrument (Sidar, Paragraph [0121]). Regarding claim 27, together Stopp, Polchin, and Calloway teach all of the limitations of claim 26 as noted above.
Stopp discloses the invention as claimed and discussed above, but fails to explicitly disclose the pre-determined optical pattern is a Quick Response (QR) code and a distance from the QR code to the instrument tip is encoded in the QR code, and the navigation method comprises the steps of: decoding the QR code; reading the distance to the instrument tip; and determining the position of the instrument tip relative to the imaging head and via the imaging head relative to the 3D imaging data. Sidar, however, teaches the pre-determined optical pattern is a QR code (Paragraph [0116]; specific graphical pattern 247; The two dimensional barcodes shown in Fig. 2C are QR codes) and a distance from the QR code to the instrument tip is encoded in the QR code (Paragraph [0121]; codes/marks calculates the exact insertion depth), and the navigation method comprises the steps of: decoding the QR code and reading the distance to the instrument tip and via the imaging head relative to the 3D imaging data (Paragraph [0117]; coded markings and a detector/reading unit to determine position… matched to images to associate specific images with a specific position in the lumen; associate the image with an insertion depth; Paragraph [0121]; uses this image information to decode the exact insertion depth). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the method of Stopp in view of Polchin and Calloway such that the pre-determined optical pattern is a QR code and a distance from the QR code to the instrument tip is encoded in the QR code, and the navigation method comprises the steps of: decoding the QR code; reading the distance to the instrument tip; and determining the position of the instrument tip relative to the imaging head and via the imaging head relative to the 3D imaging data as taught by Sidar because doing so would enhance the reliability of location data estimation, thereby allowing calculation of the exact insertion depth and rotational angle of the instrument (Sidar, Paragraph [0121]). Regarding claim 35, together Stopp, Polchin, and Calloway teach all of the limitations of claim 15 as noted above. Stopp does not explicitly teach the predetermined optical pattern comprises a data matrix code. Sidar, however, teaches the predetermined optical pattern comprises a data matrix code (Paragraph [0116]; specific graphical pattern 247; two dimensional barcodes are used which comprise layers of a specific graphical pattern 247 marked over the outer surface of insertion tube 240. The encoding schemes comprising two dimensional barcodes provide a more advanced/detailed level of encoding that puts less load on the processor; The two dimensional barcodes shown in Fig. 2C are QR codes; QR codes are considered to be a data matrix code as understood in its broadest reasonable interpretation).
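The claimed use of a pattern as an information carrier for the pattern-to-tip distance can be sketched as follows; the payload format `TIP_OFFSET_MM=<d>` and all names here are hypothetical, invented for illustration rather than taken from Sidar:

```python
import numpy as np

# Hedged sketch of tip localization from an encoded pattern: the code
# on the shaft carries the pattern-to-tip distance, and once the
# pattern's position and the shaft axis are known, the tip lies along
# that axis at the decoded offset. Payload format is hypothetical.

def tip_from_pattern(payload, pattern_origin, shaft_axis):
    """Decode 'TIP_OFFSET_MM=<d>' and extrapolate the tip position."""
    key, value = payload.split("=")
    if key != "TIP_OFFSET_MM":
        raise ValueError("unrecognized payload")
    offset_m = float(value) / 1000.0                # millimeters to meters
    axis = np.asarray(shaft_axis, dtype=float)
    axis /= np.linalg.norm(axis)                    # unit vector along shaft
    return np.asarray(pattern_origin, dtype=float) + offset_m * axis

# Pattern at the origin, shaft pointing along +Z, 85 mm encoded offset.
tip = tip_from_pattern("TIP_OFFSET_MM=85", [0.0, 0.0, 0.0], [0.0, 0.0, 1.0])
```

The same idea works whether the payload is a distance (claims 19, 26, 27) or a fuller instrument geometry; the pattern only needs to be decodable and its pose observable by the imaging head.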
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the predetermined optical pattern of Stopp in view of Polchin and Calloway to have further comprised a data matrix code as taught by Sidar because it would have allowed encoding various information about the instrument on the instrument which could be read by the computer, and two dimensional barcodes provide a more advanced/detailed level of encoding that puts less load on the processor (Paragraphs [0114]-[0118]). Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Stopp in view of Polchin and Calloway as applied to claim 15 above, and further in view of Colmenares (US 10949986). Regarding claim 20, together Stopp, Polchin, and Calloway teach all of the limitations of claim 15 as noted above. Together Stopp, Polchin, and Calloway do not teach a geometric shape of the at least one medical instrument is stored in the data provision unit, and the control unit determines the position of an instrument tip based on a partial portion of the at least one medical instrument detected by the imaging head and a stored geometric form. Colmenares, however, teaches a surgical navigation system (Col. 1, ln. 66-Col. 2, ln. 4; mediated-reality imaging systems, such as for use in surgical procedures, Figs. 1 and 2) wherein a geometric shape of the at least one medical instrument (Col. 13, ln. 55-Col. 14, ln. 18; calibration data C, Fig. 9A; The calibration data from the markers to the tip is considered to read on the claimed limitation of a geometric shape of the medical instrument in its broadest reasonable interpretation) is stored in the data provision unit (Col. 6, ln. 46-65; stored or distributed on computer-readable media) (Col. 13, ln. 55-Col. 14, ln. 18; Data from at least two of the trackers 114 is needed so that the position of the markers 105 can be triangulated from the positional data, Fig.
9A), and the control unit determines the position of an instrument tip (Col. 13, ln. 55-Col. 14, ln. 18; the system 100 estimates a position of the tip 103, Fig. 9A) based on a partial portion of the at least one medical instrument detected by the imaging head (Col. 15, ln. 13-20; the 3D position of the tip 103 can be determined by projecting the position of the tip 103 in the 2D image from the camera 112) and a stored geometric form (Col. 13, ln. 55-Col. 14, ln. 18; estimates a position of the tip 103 (shown as tip position 103′) based on a calibrated offset C from the markers 105). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Stopp in view of Polchin and Calloway to have included a geometric shape of the at least one medical instrument is stored in the data provision unit and the control unit determines the position of an instrument tip based on a partial portion of the at least one medical instrument detected by the imaging head and a stored geometric form as taught by Colmenares because it would have allowed for more precise position estimation of the tip of the instrument with reduced processing requirements and thus allow rendering of the tool in the virtual environment in real-time with high precision (Col. 15, ln. 53-Col. 16, Ln. 4). Claim 25 is rejected under 35 U.S.C. 103 as being unpatentable over Stopp in view of Polchin and Calloway as applied to claim 24 above, and further in view of Aghdasi (US 11295460). Regarding claim 25, together Stopp, Polchin, and Calloway teach all of the limitations of claim 24 as noted above. 
Calloway further teaches creating a stereo image by the medical imaging device (Paragraph [0092]; Each XR headset 1200 and 1210 can include one or more cameras; Paragraph [0146]; The chains can be arbitrarily long and are enabled by having more than one stereo camera system (e.g., N, N2, N3); Paragraph [0127]; using stereo matching to jointly identify pose of the DRA fiducials). Together Stopp, Polchin, and Calloway do not explicitly teach creating, based on the stereo image, a depth map with depth information by triangulation and/or by reconstruction of a disparity overlap; and determining the position and/or orientation of the at least one medical instrument based on the stereo image and the depth map. Aghdasi, however, teaches a surgical navigation system (Col. 2, Ln. 32-45; mediated-reality imaging systems, such as for use in surgical procedures) for navigation during a surgical intervention on a patient (Col. 2, Ln. 32-53; portion of a patient in the scene, such as a spine of a patient undergoing a spinal surgical procedure), and tracking of at least one medical instrument (Col. 5, Ln. 44-Col. 6, Ln. 3; a tool 101, e.g., a surgical tool, Fig. 1); creating a stereo image by the medical imaging device (Col. 7, ln. 6-36; synthesize the output image as a 3D (or stereoscopic 2D) rendering of the scene); creating, based on the stereo image, a depth map with depth information (Col. 7, ln. 44-67; processing device 102 can select a stereoscopic pair of images…; the image processing device 103 (and/or the depth sensor 114) is configured to estimate a depth for each surface point of the scene 108) by reconstruction of a disparity overlap (Col. 7, ln. 44-67; using techniques such as stereo block matching, correspondence, defocus…); and determining the position and/or orientation of the at least one medical instrument based on the stereo image and the depth map (Col. 8, Ln. 
29-53; process positional data captured by the trackers 113 to track objects, e.g., the tool 101; can compute the 3D position of the markers 111; Col. 8, Ln. 32-36; determine the position of the markers 111 in the 2D images… compute the 3D position of the markers 111 via triangulation of the 2D positional data). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have configured the system of Stopp in view of Polchin and Calloway to have created a stereo image by the medical imaging device, performed steps of creating, based on the stereo image, a depth map with depth information by triangulation and/or by reconstruction of a disparity overlap; and determining the position and/or orientation of the at least one medical instrument based on the stereo image and the depth map as taught by Aghdasi because it would allow the registration to be based on a point cloud depth map, which would allow the registration to be run in parallel to the generation of the 3D mesh and subsequent synthesis of the 3D virtual image, thereby increasing the processing speed of the imaging system (Aghdasi, Col. 3, ln. 52-Col. 4, ln. 3). Claim 32 is rejected under 35 U.S.C. 103 as being unpatentable over Stopp in view of Polchin and Calloway as applied to claim 15 above, and further in view of McKinnon (US 20200275976). Regarding claim 32, together Stopp, Polchin, and Calloway teach all of the limitations of claim 15 as noted above. Together Stopp, Polchin, and Calloway do not explicitly teach the tracking system comprises an electromagnetic-based system. McKinnon, however, teaches a surgical navigation system (Paragraph [0064]; surgical procedures that utilize surgical navigation systems) comprising a tracking system (Paragraph [0075]; The Tracking System 115, Fig. 1), wherein the tracking system comprises an electromagnetic-based system (Paragraphs [0075] and [0077]-[0079]; electromagnetic (EM) tracking systems).
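Returning to the depth-map step Aghdasi is cited for in claim 25: converting a per-pixel disparity map (e.g., from stereo block matching) into depth follows the single relation Z = f·B/d. The sketch below assumes illustrative camera parameters; names are not from the reference:

```python
import numpy as np

# Hedged sketch of building a depth map from a stereo disparity map.
# Pixels with no valid stereo match (disparity <= 0) are marked inf.

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Depth = f * B / d per valid pixel; invalid pixels become inf."""
    disparity = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(disparity, np.inf)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# 2x2 toy disparity map, 800 px focal length, 10 cm baseline.
depth_map = disparity_to_depth([[80.0, 0.0], [40.0, 160.0]], 800.0, 0.10)
```

Instrument pose can then be estimated against this depth map together with the stereo image, which is the combination the claim recites.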
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the tracking system of Stopp in view of Polchin and Calloway to have included an electromagnetic-based system as taught by McKinnon because it would allow tracking without restricting the movement of a surgeon or medical professional (McKinnon, Paragraph [0077]) and further have the advantage of allowing tracking of surgical tools without requiring a line of sight. Response to Arguments Claim Interpretation under 35 U.S.C. § 112(f) Examiner maintains claim interpretations under 35 U.S.C. § 112(f) for the “data provision unit”, “tracking system”, and “control unit”. Claim Rejections under 35 U.S.C. § 112 Examiner acknowledges the amendments to the claims and withdraws all previous rejections under 35 USC 112(a) and 35 USC 112(b). The amendments to the claims raise new rejections under 35 USC 112(b), which are now presented. Claim Rejections under 35 U.S.C. § 102 and 103 Applicant’s arguments with respect to the previous 35 U.S.C. § 103 rejections have been considered but are moot in view of the updated grounds of rejection necessitated by amendments. Applicant’s arguments with respect to new claim 35 have been fully considered but they are not persuasive. Applicant presents new claim 35 with limitations toward “the predetermined optical pattern comprises a data matrix code” and further asserts the reference of Sidar does not teach a matrix and only teaches a linear code. Examiner respectfully disagrees. Examiner would like to point out that Sidar explicitly teaches “two dimensional barcodes” on the outer surface of the instrument as described in at least paragraph [0116]. Furthermore, Sidar depicts the 2D barcodes as being QR codes as depicted in Figure 2C.
These QR codes, as 2D barcodes, are considered to be data matrix codes under the broadest reasonable interpretation of the claim term, and in view of Applicant's specification and Figure 6, which describe the optical patterns as QR codes. For these reasons, the rejections of new claim 35 are presented.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Dean N Edun, whose telephone number is (571) 270-3745. The examiner can normally be reached M-F, 8am-5:30pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Anh Tuan Nguyen, can be reached at (571) 272-4963. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DEAN N EDUN/
Examiner, Art Unit 3797

/ANH TUAN T NGUYEN/
Supervisory Patent Examiner, Art Unit 3795

03/02/26
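The depth-map limitation at issue in the §103 rejection over Aghdasi (creating a depth map from a stereo image by triangulation or disparity reconstruction) boils down to the standard rectified-stereo relation Z = f·B/d. A minimal sketch of that step, with purely illustrative focal length and baseline values (not taken from any of the cited references):

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m, eps=1e-6):
    """Recover a depth map Z from a stereo disparity map d via Z = f * B / d.

    disparity_px : per-pixel horizontal offset between rectified left/right images
    focal_px     : camera focal length in pixels (illustrative value below)
    baseline_m   : distance between the two camera centers (illustrative value)
    """
    # Clamp the disparity to avoid division by zero in textureless regions.
    return focal_px * baseline_m / np.maximum(disparity_px, eps)

# Toy 2x2 disparity map: larger disparity means the point is closer to the camera.
disp = np.array([[20.0, 40.0],
                 [10.0, 80.0]])
depth = depth_from_disparity(disp, focal_px=800.0, baseline_m=0.1)
# depth[0, 0] == 4.0 m, depth[1, 1] == 1.0 m
```

In a real navigation pipeline the stereo pair would first be rectified and the disparity map estimated by block matching or a similar correspondence search; the relation above then converts each disparity into metric depth, yielding the point cloud the rejection refers to.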

Prosecution Timeline

May 01, 2024
Application Filed
Aug 15, 2024
Non-Final Rejection — §102, §103, §112
Nov 22, 2024
Response Filed
Dec 30, 2024
Non-Final Rejection — §102, §103, §112
Apr 14, 2025
Response Filed
May 13, 2025
Final Rejection — §102, §103, §112
Oct 21, 2025
Request for Continued Examination
Oct 24, 2025
Response after Non-Final Action
Feb 25, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications with similar technology granted by this examiner

Patent 12582376
CONSTITUTIVE EQUATION FOR NON-INVASIVE BLOOD PRESSURE MEASUREMENT SYSTEMS AND METHODS
2y 5m to grant Granted Mar 24, 2026
Patent 12575750
ASYMMETRIC SENSORS FOR RING WEARABLE
2y 5m to grant Granted Mar 17, 2026
Patent 12543967
APPARATUS AND METHOD FOR QUANTIFICATION OF THE MAPPING OF THE SENSORY AREAS OF THE BRAIN
2y 5m to grant Granted Feb 10, 2026
Patent 12521019
SYSTEMS AND METHODS OF RELATIVE ONSET FLUORESCENCE DELAY FOR MEDICAL IMAGING
2y 5m to grant Granted Jan 13, 2026
Patent 12426852
CATHETER WITH ACOUSTIC LENS ARRANGEMENT FOR LOCALIZED ULTRASONIC WAVE TRANSMISSION
2y 5m to grant Granted Sep 30, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

4-5
Expected OA Rounds
43%
Grant Probability
99%
With Interview (+65.0%)
3y 5m
Median Time to Grant
High
PTA Risk
Based on 35 resolved cases by this examiner. Grant probability derived from career allow rate.
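As a sanity check, the headline figures above can be reproduced from the stated career data (15 allowances out of 35 resolved cases). Note that the ~70% Tech Center average used here is back-solved from the displayed "-27.1% vs TC avg" delta, not an independently sourced number:

```python
granted, resolved = 15, 35

# Career allow rate: 15 / 35 = 42.857...%, displayed on the page as "43%".
career_allow_rate = 100 * granted / resolved

# Implied Tech Center average, back-solved from the "-27.1% vs TC avg" figure.
tc_average = 70.0
delta_vs_tc = round(career_allow_rate - tc_average, 1)

print(round(career_allow_rate), delta_vs_tc)  # 43 -27.1
```

The same arithmetic explains why the statute-specific bars are noisy: with only 35 resolved cases, each statute's sub-sample is small, so single outcomes move the percentages by several points.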
