Prosecution Insights
Last updated: April 19, 2026
Application No. 18/100,826

METHOD OF MEDICAL CALIBRATION

Non-Final OA (§102, §103)

Filed: Jan 24, 2023
Examiner: ZAK, JACQUELINE ROSE
Art Unit: 2666
Tech Center: 2600 (Communications)
Assignee: Carl Zeiss Meditec AG
OA Round: 3 (Non-Final)
Grant Probability: 67% (Favorable)
OA Rounds: 3-4
To Grant: 2y 10m
With Interview: 55%
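As a sanity check, the headline figures above are internally consistent: the 67% rate corresponds to 8 grants out of 12 resolved cases, and the 55% with-interview figure matches the unrounded career rate plus the -11.4% interview lift reported below. A minimal sketch; the additive model for combining rate and lift is an assumption about how the dashboard computes its estimate:

```python
# Career allow rate from the examiner's resolved cases (8 granted / 12 resolved)
career_rate = 8 / 12 * 100          # ~66.7, displayed as 67%

# Assumed model: the with-interview estimate is the unrounded career rate
# plus the interview "lift" of -11.4 percentage points
interview_lift = -11.4
with_interview = career_rate + interview_lift

print(round(career_rate))      # matches the displayed 67%
print(round(with_interview))   # matches the displayed 55%
```

Note that the subtraction only reproduces the displayed 55% when applied to the unrounded rate; starting from the rounded 67% would give 56%.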

Examiner Intelligence

Career Allow Rate: 67% (8 granted / 12 resolved; +4.7% vs TC avg), above average
Interview Lift: -11.4% (minimal), comparing resolved cases with vs. without an interview
Avg Prosecution: 2y 10m (typical timeline)
Total Applications: 58 across all art units (46 currently pending)

Statute-Specific Performance

§101: 5.7% (-34.3% vs TC avg)
§103: 56.3% (+16.3% vs TC avg)
§102: 21.1% (-18.9% vs TC avg)
§112: 13.8% (-26.2% vs TC avg)

Tech Center average is an estimate; figures based on career data from 12 resolved cases.
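Backing the Tech Center baseline out of each row (rate minus delta) gives the same value for every statute, which suggests the chart's single average line sits at roughly 40%. A quick check, using only the per-statute figures from the table above:

```python
# Win rate per statute and its stated delta vs the Tech Center average
rates  = {"101": 5.7, "103": 56.3, "102": 21.1, "112": 13.8}
deltas = {"101": -34.3, "103": 16.3, "102": -18.9, "112": -26.2}

# Implied Tech Center average per statute: rate - delta
implied_tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(implied_tc_avg)   # every statute backs out the same ~40.0% baseline
```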

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/02/2025 has been entered.

Claim Status

Claims 1-17 are pending for examination in the application filed 12/02/2025. Claims 1, 14-15, and 17 have been amended.

Priority

Acknowledgement is made of Applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in parent application DE102022102698.1 filed on 02/04/2022. The certified copy has been filed in parent application DE102022125798.3 filed on 10/06/2022.

Response to Arguments and Amendments

Applicant's arguments filed 12/02/2025 have been fully considered but they are not persuasive. Applicant argues on page 7 of the Remarks filed 12/02/2025 that Feilkas does not disclose the amended claim language of a method and/or system that determines a spatial pose of a medical marker, the spatial pose including a position and an orientation of the medical marker.
As stated on pages 4-5 of the Final Office Action filed 09/02/2025: Feilkas teaches…determining a spatial pose of a medical marker on the basis of an image representation of the medical marker and of a medical instrument connected therewith ([0009] At least one marker and preferably three markers (e.g., a reference star) or a number of markers having a known geometry (e.g., fixed or variable in accordance with the configuration of the instrument) can be attached to the instrument, wherein the spatial position of the instrument can be ascertained based on the markers via a navigation system using a known method. The markers can be formed as reflecting surfaces or spheres, for example. A camera that can detect infrared light emitted or reflected from the markers can be provided for detecting the position of the instrument); Feilkas additionally teaches the spatial pose including a position and an orientation of the medical marker ([0010] Preferably, the respective views or outlines of the instrument model that correspond to the orientation or position of the actual instrument relative to the camera, as measured via the markers, can be calculated from a three-dimensional data model or software model of the instrument).

Applicant further argues on pages 7-8 of the Remarks that “Feilkas does not teach or suggest that the compared features include the spatial pose including a position and an orientation of the medical marker”. It is unclear what Applicant is referring to because the claims do not use the language of “compared features”. The claims describe determining the deviation between the positions of the tracking point and adjusting information relating to the spatial pose relationship between the medical marker and the tracking point based on this deviation. Thus, the position and orientation of the medical marker, which Applicant seemingly refers to as “the compared features”, are distinct from the positions of the tracking point.
Examiner further notes that “information relating to a spatial pose relationship between the medical marker and the tracking point” is distinct from the earlier claimed “spatial pose”. Arguments related to the adjustment step were additionally previously addressed in the Response to Arguments and Amendments in the Final Office Action filed 09/02/2025: For further clarification, Feilkas describes the pre-calibration data as: [0018] the pre-calibration data can include information regarding the geometry, the dimensions, the spatial arrangement of combinable elements (e.g., an instrument with exchangeable tips or an instrument for placing implants in connection with the selected implant), and/or regarding possible degrees of freedom (e.g., joints or possible deformations) of the instrument. The information relating to the spatial pose relationship between the medical marker and the tracking point is thus taught by Feilkas as the pre-calibration data (see Fig. 4 for instrument with medical markers and tracking point). Feilkas further explains that the pre-calibration data can be adjusted based on a deviation from image data: [0019] By comparing the image data detected by a camera and the pre-calibration data, it is possible to check whether an instrument is within a given specification, for example, which can be indicated in the pre-calibration data as a tolerance relating to dimensions of the instrument. [0020] It is further possible for the data recorded by the camera with respect to the actual condition or the configuration of the instrument to be used to adapt or modify the pre-calibration data, such that the data ascertained by means of the camera with respect to the actual configuration of an instrument can be made available to the navigation system, for example, to precisely navigate said instrument. 
Therefore, as cited in the Non-Final Office Action, Feilkas teaches adjusting the information relating to the spatial pose relationship between medical marker and tracking point on the basis of the determined deviation. Thus, the rejections of independent claim 1 and all depending claims are maintained. Please see below for the 35 U.S.C. 102(a)(2) and 35 U.S.C. 103 rejections of claims 1-17 including newly added amendments.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-3, 6-10, and 12-17 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Feilkas (US20060173356A1).
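As context for the claim-by-claim mapping that follows, the claimed steps (determine the marker pose, predict the tracking point from the stored marker-to-tracking-point relationship, detect it in the image, and adjust the stored relationship by the deviation) amount to a predict-measure-correct update. A minimal sketch, assuming a rigid-transform model; the function name, the unit-gain update rule, and the toy numbers are illustrative assumptions, not taken from the application or from Feilkas:

```python
def mat_vec(M, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def transpose(M):
    return [[M[j][i] for j in range(3)] for i in range(3)]

def adjust_offset(R, t, offset, observed_tip, gain=1.0):
    """One iteration of the claimed calibration update (illustrative sketch).

    R, t         -- marker pose (rotation, translation) from the tracker
    offset       -- stored marker-to-tracking-point vector (the claimed
                    'information relating to a spatial pose relationship')
    observed_tip -- tracking-point position detected directly in the image
    """
    # First position: predicted from the pose and the stored relationship
    predicted = [p + ti for p, ti in zip(mat_vec(R, offset), t)]
    # Deviation between the second (observed) and first (predicted) positions
    deviation = [o - p for o, p in zip(observed_tip, predicted)]
    # Adjust the stored relationship by the deviation, mapped into marker frame
    correction = mat_vec(transpose(R), deviation)
    new_offset = [o + gain * c for o, c in zip(offset, correction)]
    error = sum(d * d for d in deviation) ** 0.5
    return new_offset, error

# Toy check: a slightly wrong pre-calibration offset is corrected in one step
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # identity rotation for simplicity
t = [10.0, 0.0, 0.0]
true_offset = [0.0, 5.0, 0.0]
stored = [0.0, 4.5, 0.0]                 # mis-calibrated stored relationship
observed = [p + ti for p, ti in zip(mat_vec(R, true_offset), t)]
stored, err = adjust_offset(R, t, stored, observed)
```

With a unit gain the update absorbs the full deviation at once; a gain below 1, or averaging over several views as in claim 2, would trade convergence speed for robustness to detection noise.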
Regarding claim 1, Feilkas teaches a method of medical calibration ([0002] The present invention relates to a device and a method for calibrating an instrument), comprising the method steps of: determining a spatial pose of a medical marker on the basis of an image representation of the medical marker and of a medical instrument connected therewith ([0009] At least one marker and preferably three markers (e.g., a reference star) or a number of markers having a known geometry (e.g., fixed or variable in accordance with the configuration of the instrument) can be attached to the instrument, wherein the spatial position of the instrument can be ascertained based on the markers via a navigation system using a known method. The markers can be formed as reflecting surfaces or spheres, for example. A camera that can detect infrared light emitted or reflected from the markers can be provided for detecting the position of the instrument), the spatial pose including a position and an orientation of the medical marker ([0010] Preferably, the respective views or outlines of the instrument model that correspond to the orientation or position of the actual instrument relative to the camera, as measured via the markers, can be calculated from a three-dimensional data model or software model of the instrument); determining a first position of a tracking point of the medical instrument on the basis of the determined spatial pose and on the basis of information relating to a spatial pose relationship between medical marker and tracking point, and determining a second position of the tracking point on the basis of the captured image representation ([0040] FIG. 3 shows the principle of superimposing the video of the pre-calibration data image onto an actual image 8 detected by the video camera 4. In a video recording 9a, the actual image 8 of the instrument 5 is shown in front of the video camera 4. 
The computational unit 1 calculates a similar image 9b based on the positional information of the instrument 5 calculated by the tracking system 2 and the calibration of the video camera 4. This image 9b contains a computer-generated version of the representation of the instrument 10. [0041] FIG. 4 shows an enlarged representation of the superimposed image 11 shown in FIG. 3, wherein the actual image 8 and the computer-generated image 10 do not completely align up to one another (e.g., they are not in the same location in the superimposed image 11). In order to obtain a direct superimposition or comparison of the images 8 and 10, corresponding corner points 14 and 15 or 17 and 18 of the images 8 and 10 can be used, and the differences 16 and 19 between the corresponding corner points can be calculated); determining a deviation between the first position and the second position ([0041] In order to obtain a direct superimposition or comparison of the images 8 and 10, corresponding corner points 14 and 15 or 17 and 18 of the images 8 and 10 can be used, and the differences 16 and 19 between the corresponding corner points can be calculated. If a number of differences with respect to corresponding corner points or edges are calculated, then it is possible to ascertain the preciseness in which the two images 8 and 10 match); and adjusting the information relating to the spatial pose relationship between medical marker and tracking point on the basis of the determined deviation ([0010] If a difference or deviation from the pre-calibration data is established, however, an error message can be generated, e.g., the instrument requires calibration, or the pre-calibration data, which may be used for subsequent navigation, can be adapted to the optically detected data).

Regarding claim 2, Feilkas teaches the method of claim 1.
Feilkas further teaches capturing a multiplicity of image representations of the medical marker and of the medical instrument connected therewith ([0014] If an instrument having a more complex shape is to be calibrated, the instrument may be moved or turned to record a number of views of the object using the camera, wherein software can provide a corresponding instruction for moving the instrument); determining, for each of the captured image representations, the first position and the second position of the tracking point in the captured image representation ([0016] Although a single view may be used, it is preferable that at least two or more lateral views of the instrument are detected by the optical camera, wherein the instrument can be turned, shifted or rotated in the viewing range of the camera. Preferably, the visibility of certain points, such as the tip of an instrument, for example, can be checked or verified. To this end, it is possible to check whether specific points defined in the pre-calibration data, such as corner points, edges or tips of the instrument, for example, also are visible or concealed in the optically detected recording) and a deviation between the first position and the second position ([0015] Further, the method can ensure that only calibrated instruments or implants are used for surgical procedures, since if the shape of an instrument deviates from the pre-calibration data, an error message can be output or the navigation system may not enable navigation due to detection of a faulty or incorrect instrument); and adjusting the information relating to the spatial pose relationship between medical marker and tracking point on the basis of the multiplicity of determined deviations ([0010] If a difference or deviation from the pre-calibration data is established, however, an error message can be generated, e.g., the instrument requires calibration, or the pre-calibration data, which may be used for subsequent navigation, can be adapted 
to the optically detected data).

Regarding claim 3, Feilkas teaches the method of claim 1. Feilkas further teaches wherein calibrated information relating to the spatial pose relationship between medical marker and tracking point is available if a deviation between the first position and the second position is determined as being below a predetermined threshold value (zero) ([0010] These stored, so-called pre-calibration data can be compared with the geometry of the instrument detected by the camera (e.g., the optically detected data) to establish whether the optically detected data representing the actual geometry or form of the instrument match the pre-calibration data. Thus, so-called tracking data and a camera image data can be assigned, wherein if the camera image data match the pre-calibration data, the instrument may be said to be calibrated, verified and/or validated. If a difference or deviation from the pre-calibration data is established, however, an error message can be generated).

Regarding claim 6, Feilkas teaches the method of claim 1. Feilkas further teaches wherein the medical marker has a geometry and the spatial pose of the medical marker is determined on the basis of the captured image representation of the medical marker and on the basis of information relating to the geometry of the medical marker ([0009] A method for calibrating, verifying and/or validating an instrument or implant for medical use (referred to below as an instrument) is provided. At least one marker and preferably three markers (e.g., a reference star) or a number of markers having a known geometry (e.g., fixed or variable in accordance with the configuration of the instrument) can be attached to the instrument, wherein the spatial position of the instrument can be ascertained based on the markers via a navigation system using a known method).

Regarding claim 7, Feilkas teaches the method of claim 6.
Feilkas further teaches wherein the image representation of the medical marker contains image representations of at least three marker elements, the information relating to the geometry of the medical marker including information relating to the relative spatial pose of the marker elements with respect to one another and determining the spatial pose of the medical marker including the determination of the spatial poses of the marker elements ([0009] A method for calibrating, verifying and/or validating an instrument or implant for medical use (referred to below as an instrument) is provided. At least one marker and preferably three markers (e.g., a reference star) or a number of markers having a known geometry (e.g., fixed or variable in accordance with the configuration of the instrument) can be attached to the instrument, wherein the spatial position of the instrument can be ascertained based on the markers via a navigation system using a known method. [0012] The data can be stored in a database, for example, as pre-calibration data (e.g., as a description of the three-dimensional object for a navigation system). Further, the data can represent a three-dimensional model that describes the precise shape of an object or instrument and the position of each marker or reference array on the object or instrument). Regarding claim 8, Feilkas teaches the method of claim 7. Feilkas further teaches wherein the information relating to the spatial pose relationship between medical marker and tracking point defines a relative spatial pose of the tracking point with respect to the marker elements ([0012] The data can be stored in a database, for example, as pre-calibration data (e.g., as a description of the three-dimensional object for a navigation system). 
Further, the data can represent a three-dimensional model that describes the precise shape of an object or instrument and the position of each marker or reference array on the object or instrument [0043] For the points 15 and 18, these positions are known from the pre-calibration data) and/or with respect to a target point that has a fixed spatial relationship with respect to the marker elements.

Regarding claim 9, Feilkas teaches the method of claim 6. Feilkas further teaches wherein the information relating to the geometry of the medical marker is predetermined ([0009] At least one marker and preferably three markers (e.g., a reference star) or a number of markers having a known geometry (e.g., fixed or variable in accordance with the configuration of the instrument) can be attached to the instrument).

Regarding claim 10, Feilkas teaches the method of claim 6. Feilkas further teaches capturing a first image representation of the medical marker in a first perspective; capturing a second image representation of the medical marker in a second perspective ([0014] If an instrument having a more complex shape is to be calibrated, the instrument may be moved or turned to record a number of views of the object using the camera, wherein software can provide a corresponding instruction for moving the instrument); and determining the information relating to the geometry of the medical marker on the basis of the first image representation and the second image representation ([0012] The data can be stored in a database, for example, as pre-calibration data (e.g., as a description of the three-dimensional object for a navigation system). Further, the data can represent a three-dimensional model that describes the precise shape of an object or instrument and the position of each marker or reference array on the object or instrument).

Regarding claim 12, Feilkas teaches the method of claim 1.
Feilkas further teaches wherein the image representations of the medical marker and of the medical instrument connected therewith are captured by means of a monocular system ([0009] A camera that can detect infrared light emitted or reflected from the markers can be provided for detecting the position of the instrument. The camera preferably is calibrated and the spatial position of the camera may be unknown, known or defined. Furthermore, the geometry (e.g., one or more views, mappings or outlines of the instrument from one or more various directions) can be optically detected via a camera, wherein the camera can be the same camera used for detecting the position of the markers). Regarding claim 13, Feilkas teaches the method of claim 1. Feilkas further teaches wherein at least one image representation of the medical marker and of the medical instrument connected therewith is captured by means of a first camera and a second camera ([0036] The optical tracking system 2 includes two infrared cameras 2a and 2b that can detect light signals reflected by the three markers 6a of the reference star 6, so as to detect the position of the medical instrument 5), or image representations of the medical marker and of the medical instrument connected therewith are captured immediately before and after a defined robotic actuation of a capturing camera or of the medical marker and of the medical instrument connected therewith. 
Regarding claim 14, Feilkas teaches a medical calibration system ([0002] The present invention relates to a device and a method for calibrating an instrument) comprising: an imaging sensor ([0009] A camera that can detect infrared light emitted or reflected from the markers can be provided for detecting the position of the instrument); a medical marker ([0009] At least one marker and preferably three markers (e.g., a reference star) or a number of markers having a known geometry (e.g., fixed or variable in accordance with the configuration of the instrument) can be attached to the instrument); a medical instrument ([0009] A method for calibrating, verifying and/or validating an instrument or implant for medical use (referred to below as an instrument) is provided); a control unit (computer) connected to the imaging sensor ([0013] A calibrated video signal may be an input obtained from a standard video camera, the properties or parameters of which (e.g., the position and/or the detection function of the camera) can be determined and calculated for a so-called "virtual camera". This virtual camera can be used by the computer to calculate images based on three-dimensional objects (e.g., by projecting in the detection direction of the actual camera), which match the views or objects actually present or detected); and a storage unit connected to the control unit and comprising commands which, upon execution by the control unit, cause the control unit to ([0023] In accordance with another aspect, the invention provides a computer program which, when loaded onto a computer or running on a computer, carries out one or more of the method steps described above. 
[0024] The invention further provides a program storage medium or computer program product comprising such a program): determine a spatial pose of a medical marker on the basis of an image representation of the medical marker and of a medical instrument connected therewith ([0009] At least one marker and preferably three markers (e.g., a reference star) or a number of markers having a known geometry (e.g., fixed or variable in accordance with the configuration of the instrument) can be attached to the instrument, wherein the spatial position of the instrument can be ascertained based on the markers via a navigation system using a known method. The markers can be formed as reflecting surfaces or spheres, for example. A camera that can detect infrared light emitted or reflected from the markers can be provided for detecting the position of the instrument), the spatial pose including a position and an orientation of the medical marker ([0010] Preferably, the respective views or outlines of the instrument model that correspond to the orientation or position of the actual instrument relative to the camera, as measured via the markers, can be calculated from a three-dimensional data model or software model of the instrument); determine a first position of a tracking point of the medical instrument on the basis of the determined spatial pose and on the basis of information relating to a spatial pose relationship between medical marker and tracking point, and determine a second position of the tracking point on the basis of the captured image representation ([0040] FIG. 3 shows the principle of superimposing the video of the pre-calibration data image onto an actual image 8 detected by the video camera 4. In a video recording 9a, the actual image 8 of the instrument 5 is shown in front of the video camera 4. 
The computational unit 1 calculates a similar image 9b based on the positional information of the instrument 5 calculated by the tracking system 2 and the calibration of the video camera 4. This image 9b contains a computer-generated version of the representation of the instrument 10. [0041] FIG. 4 shows an enlarged representation of the superimposed image 11 shown in FIG. 3, wherein the actual image 8 and the computer-generated image 10 do not completely align up to one another (e.g., they are not in the same location in the superimposed image 11). In order to obtain a direct superimposition or comparison of the images 8 and 10, corresponding corner points 14 and 15 or 17 and 18 of the images 8 and 10 can be used, and the differences 16 and 19 between the corresponding corner points can be calculated); determine a deviation between the first position and the second position ([0041] In order to obtain a direct superimposition or comparison of the images 8 and 10, corresponding corner points 14 and 15 or 17 and 18 of the images 8 and 10 can be used, and the differences 16 and 19 between the corresponding corner points can be calculated. If a number of differences with respect to corresponding corner points or edges are calculated, then it is possible to ascertain the preciseness in which the two images 8 and 10 match); and adjust the information relating to the spatial pose relationship between medical marker and tracking point on the basis of the determined deviation ([0010] If a difference or deviation from the pre-calibration data is established, however, an error message can be generated, e.g., the instrument requires calibration, or the pre-calibration data, which may be used for subsequent navigation, can be adapted to the optically detected data). 
Regarding claim 15, Feilkas teaches a computer program comprising commands which, upon execution by the control unit of a system, cause the control unit to carry out steps of ([0023] In accordance with another aspect, the invention provides a computer program which, when loaded onto a computer or running on a computer, carries out one or more of the method steps described above. [0024] The invention further provides a program storage medium or computer program product comprising such a program): determining a spatial pose of a medical marker on the basis of an image representation of the medical marker and of a medical instrument connected therewith ([0009] At least one marker and preferably three markers (e.g., a reference star) or a number of markers having a known geometry (e.g., fixed or variable in accordance with the configuration of the instrument) can be attached to the instrument, wherein the spatial position of the instrument can be ascertained based on the markers via a navigation system using a known method. The markers can be formed as reflecting surfaces or spheres, for example. A camera that can detect infrared light emitted or reflected from the markers can be provided for detecting the position of the instrument), the spatial pose including a position and an orientation of the medical marker ([0010] Preferably, the respective views or outlines of the instrument model that correspond to the orientation or position of the actual instrument relative to the camera, as measured via the markers, can be calculated from a three-dimensional data model or software model of the instrument); determining a first position of a tracking point of the medical instrument on the basis of the determined spatial pose and on the basis of information relating to a spatial pose relationship between medical marker and tracking point, and determining a second position of the tracking point on the basis of the captured image representation ([0040] FIG. 
3 shows the principle of superimposing the video of the pre-calibration data image onto an actual image 8 detected by the video camera 4. In a video recording 9a, the actual image 8 of the instrument 5 is shown in front of the video camera 4. The computational unit 1 calculates a similar image 9b based on the positional information of the instrument 5 calculated by the tracking system 2 and the calibration of the video camera 4. This image 9b contains a computer-generated version of the representation of the instrument 10. [0041] FIG. 4 shows an enlarged representation of the superimposed image 11 shown in FIG. 3, wherein the actual image 8 and the computer-generated image 10 do not completely align up to one another (e.g., they are not in the same location in the superimposed image 11). In order to obtain a direct superimposition or comparison of the images 8 and 10, corresponding corner points 14 and 15 or 17 and 18 of the images 8 and 10 can be used, and the differences 16 and 19 between the corresponding corner points can be calculated); determining a deviation between the first position and the second position ([0041] In order to obtain a direct superimposition or comparison of the images 8 and 10, corresponding corner points 14 and 15 or 17 and 18 of the images 8 and 10 can be used, and the differences 16 and 19 between the corresponding corner points can be calculated. 
If a number of differences with respect to corresponding corner points or edges are calculated, then it is possible to ascertain the preciseness in which the two images 8 and 10 match); and adjusting the information relating to the spatial pose relationship between medical marker and tracking point on the basis of the determined deviation ([0010] If a difference or deviation from the pre-calibration data is established, however, an error message can be generated, e.g., the instrument requires calibration, or the pre-calibration data, which may be used for subsequent navigation, can be adapted to the optically detected data).

Regarding claim 16, Feilkas teaches the method of claim 1. Feilkas further teaches wherein the image representation images a spatial shape of the medical marker ([0009] A method for calibrating, verifying and/or validating an instrument or implant for medical use (referred to below as an instrument) is provided. At least one marker and preferably three markers (e.g., a reference star) or a number of markers having a known geometry (e.g., fixed or variable in accordance with the configuration of the instrument) can be attached to the instrument, wherein the spatial position of the instrument can be ascertained based on the markers via a navigation system using a known method. The markers can be formed as reflecting surfaces or spheres, for example. A camera that can detect infrared light emitted or reflected from the markers can be provided for detecting the position of the instrument).

Regarding claim 17, Feilkas teaches the method of claim 1.
Feilkas further teaches wherein the spatial pose relationship comprises a position and an alignment of the tracking point relative to the medical marker ([0018] the pre-calibration data can include information regarding the geometry, the dimensions, the spatial arrangement of combinable elements (e.g., an instrument with exchangeable tips or an instrument for placing implants in connection with the selected implant), and/or regarding possible degrees of freedom (e.g., joints or possible deformations) of the instrument. [0040] FIG. 3 shows the principle of superimposing the video of the pre-calibration data image onto an actual image 8 detected by the video camera 4. Fig. 3 shows instrument with medical markers and tracking point).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Feilkas in view of Burger (US20090099445A1).

Regarding claim 4, Feilkas teaches the method of claim 3. Burger, in the same field of endeavor of medical instrument calibration, teaches a continuous determination of first positions of the tracking point on the basis of continuously captured image representations of the medical marker and of the medical instrument connected therewith ([0047] The position and orientation of the marker array is continuously determined by the tracking system. The position of the centre of the marker array is known at any time, the relative position of the further marker and centre of the marker array has been determined and corresponds to vector 117, the position of the tip relative to the further marker is stored and known by the tracking system and corresponds to vector 111.
Hence, the position of the tip in the reference frame of the tracking system at any point in time corresponds to the position of the centre of the marker array plus vector 117 plus vector 111) and on the basis of the calibrated information relating to the spatial pose relationship between medical marker and tracking point ([0043] Hence the tracking system can determine the position of the working part of the instrument, tip 120, relative to the fixed marker 110 and stores this calibration information for the instrument). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Feilkas with the teachings of Burger to continuously determine the first positions of the tracking point because "If reference array 112 is moved at any time during use, for example, to make it easier to hold or use the instrument, the tracking system simply re-determines the positions of the centre of the marker array…and hence the position of the tip can continue to be accurately tracked" (Burger 0048). Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Feilkas in view of Burger and Kostler (US20240074815A1). Regarding claim 5, Feilkas teaches the method of claim 1. Feilkas further teaches determining second positions of the tracking point ([0041] In order to obtain a direct superimposition or comparison of the images 8 and 10, corresponding corner points 14 and 15 or 17 and 18 of the images 8 and 10 can be used, and the differences 16 and 19 between the corresponding corner points can be calculated); determining deviations between the first position and the second position of the tracking point ([0041] In order to obtain a direct superimposition or comparison of the images 8 and 10, corresponding corner points 14 and 15 or 17 and 18 of the images 8 and 10 can be used, and the differences 16 and 19 between the corresponding corner points can be calculated.
If a number of differences with respect to corresponding corner points or edges are calculated, then it is possible to ascertain the preciseness in which the two images 8 and 10 match); and adjusting the information relating to the spatial pose relationship between medical marker and tracking point if one of the determined deviations exceeds a predetermined limit value ([0010] If a difference or deviation from the pre-calibration data is established, however, an error message can be generated, e.g., the instrument requires calibration, or the pre-calibration data, which may be used for subsequent navigation, can be adapted to the optically detected data). Feilkas does not teach a selection of continuously captured image representations of the medical marker and of the medical instrument connected therewith. Burger teaches a selection of continuously captured image representations of the medical marker and of the medical instrument connected therewith ([0047] The position and orientation of the marker array is continuously determined by the tracking system. The position of the centre of the marker array is known at any time, the relative position of the further marker and centre of the marker array has been determined and corresponds to vector 117, the position of the tip relative to the further marker is stored and known by the tracking system and corresponds to vector 111. Hence, the position of the tip in the reference frame of the tracking system at any point in time corresponds to the position of the centre of the marker array plus vector 117 plus vector 111). 
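The geometry quoted above from Burger [0047] (tip position = centre of marker array plus vector 117 plus vector 111) and the limit check from Feilkas [0010] reduce to simple vector arithmetic. A minimal sketch, assuming 3-D coordinates in the tracking system's frame; the function names and example values are illustrative assumptions, not figures from the record:

```python
import math

def tip_position(centre, v117, v111):
    # Burger [0047]: the tip position at any point in time corresponds to the
    # position of the centre of the marker array plus vector 117 plus vector
    # 111, composed component-wise in the tracking system's reference frame.
    return [c + a + b for c, a, b in zip(centre, v117, v111)]

def needs_recalibration(measured_tip, precalibrated_tip, limit):
    # Feilkas [0010]: if the deviation from the pre-calibration data exceeds
    # a predetermined limit value, raise an error or adapt the stored data.
    deviation = math.dist(measured_tip, precalibrated_tip)
    return deviation > limit

tip = tip_position([10.0, 0.0, 0.0], [0.0, 5.0, 0.0], [0.0, 0.0, 2.0])
# tip == [10.0, 5.0, 2.0]
```

A deviation of 2.0 mm against a 1.0 mm limit would trigger the adjustment branch; the same comparison with a deviation under the limit leaves the calibration untouched.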
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Feilkas with the teachings of Burger to continuously capture image representations because "If reference array 112 is moved at any time during use, for example, to make it easier to hold or use the instrument, the tracking system simply re-determines the positions of the centre of the marker array…and hence the position of the tip can continue to be accurately tracked" (Burger 0048). Feilkas does not teach intermittently determining second positions of the tracking point. Kostler, in the same field of endeavor of medical instrument calibration, teaches intermittently determining second positions of the tracking point ([0035] For example, the computer may recognize the occurrence of a system modification when such modification is indicated by a surgeon via a manual input on a man-machine-interface of a medical navigation system… As soon as an image analysis of the camera images shows that the medical system has been modified, for example by recognizing that structures which move in accordance with a tracking marker array assigned to the system have their geometry altered, have been added to the system and/or have been removed from the system, a recalibration may be automatically initiated, or its necessity may at least be indicated to a surgeon operating the system. [0036] In order to reduce the time and effort required for recalibration to a necessary minimum, the modification data may describe, particularly exclusively describe a spatial relation between a first section of interest and a second section of interest of at least one, particularly of each component of the modified setup [0037] which was not included in the initial setup; and/or [0038] a geometric property of which has been altered as compared to the initial setup).
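Kostler's modification-triggered recalibration ([0035]) can be sketched as a comparison of the current setup against the stored one. Representing each component by a single geometric length is a simplifying assumption for illustration; the component names and values below are hypothetical:

```python
def modification_detected(current_setup, stored_setup, tol=1e-6):
    # Kostler [0035]: recalibration is initiated when components are added to
    # or removed from the system, or when a tracked structure's geometry
    # (here abstracted as one length per component) has been altered.
    if set(current_setup) != set(stored_setup):
        return True  # component added or removed
    return any(abs(current_setup[k] - stored_setup[k]) > tol
               for k in current_setup)

stored = {"pointer": 120.0, "tip": 35.0}
modified = {"pointer": 120.0, "tip": 42.0}  # exchangeable tip replaced
# modification_detected(modified, stored) is True: trigger an intermittent
# second determination of the tracking point's position
```

Only a detected modification triggers the second determination, which is what makes the determination intermittent rather than continuous.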
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Feilkas with the teachings of Kostler to intermittently determine second positions of the tracking point because "For devices and instruments which are adapted to connect to a plurality of objects in an ongoing procedure…a calibration procedure needs to be performed many times as the overall geometry of the multi-component system changes with each new system setup and components, implants or screws of the system being added, removed or replaced" (Kostler 0003). Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Feilkas in view of Ishrak (US20200397511A1). Regarding claim 11, Feilkas teaches the method of claim 1. Ishrak, in the same field of endeavor of medical image analysis, teaches wherein the second position of the tracking point is determined by means of image recognition and/or a machine learning algorithm in the captured image representation ([0164] Once the imaging of the area of interest of the patient becomes obstructed or unavailable, the earlier reference (clean) image of the same area of interest of the patient may be overlaid, underlaid, merged or otherwise displayed by the controller, e.g., in a semi-transparent fashion, on the display device using one or more common anatomical reference points (e.g., the coronary sinus or the aorta) between the live image and the stored reference image, thereby anchoring the two images together. The selection of common anatomical regions or features between the two images may be accomplished using image recognition techniques or software or machine learning (e.g., utilizing image recognition/machine learning)).
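The image-anchoring step Ishrak [0164] describes can be illustrated with matched 2-D reference points. Real registration would also handle rotation and scale, so this translation-only sketch is a simplifying assumption; the point coordinates are invented for the example:

```python
def anchor_offset(live_points, reference_points):
    # Ishrak [0164]: anchor a stored reference image to the live image via
    # common reference points. For a pure translation, the least-squares
    # estimate is the mean point-wise difference between matched points.
    diffs = [(lx - rx, ly - ry)
             for (lx, ly), (rx, ry) in zip(live_points, reference_points)]
    n = len(diffs)
    return (sum(d[0] for d in diffs) / n, sum(d[1] for d in diffs) / n)

live = [(12.0, 7.0), (22.0, 17.0)]
ref = [(10.0, 5.0), (20.0, 15.0)]
# anchor_offset(live, ref) == (2.0, 2.0): shift the reference image by this
# translation before overlaying it on the live image
```

In practice the matched points would come from the image-recognition or machine-learning step the reference describes, not from hand-picked coordinates.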
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Feilkas with the teachings of Ishrak to use image recognition or machine learning for tracking point determination because "When using an ultrasound imaging system for navigation in a region of a patient's body, visual obstructions such as image shadowing or other artifacts may be caused due to reflections of ultrasound energy by a medical instrument" (Ishrak 0007) and "registration of the markers with the physiological landmarks may aid the clinician in guiding the medical instrument or medical device 130 to the target region of the patient" (Ishrak 0126).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jacqueline R Zak whose telephone number is (571) 272-4077. The examiner can normally be reached M-F 9-5. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Emily Terrell, can be reached at (571) 270-3717. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /JACQUELINE R ZAK/Examiner, Art Unit 2666 /EMILY C TERRELL/Supervisory Patent Examiner, Art Unit 2666

Prosecution Timeline

Jan 24, 2023
Application Filed
Apr 16, 2025
Non-Final Rejection — §102, §103
Jul 21, 2025
Response Filed
Aug 27, 2025
Final Rejection — §102, §103
Dec 02, 2025
Request for Continued Examination
Dec 17, 2025
Response after Non-Final Action
Feb 09, 2026
Non-Final Rejection — §102, §103
Mar 31, 2026
Examiner Interview Summary
Mar 31, 2026
Applicant Interview (Telephonic)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586340
PIXEL PERSPECTIVE ESTIMATION AND REFINEMENT IN AN IMAGE
2y 5m to grant — Granted Mar 24, 2026
Patent 12462343
MEDICAL DIAGNOSTIC APPARATUS AND METHOD FOR EVALUATION OF PATHOLOGICAL CONDITIONS USING 3D OPTICAL COHERENCE TOMOGRAPHY DATA AND IMAGES
2y 5m to grant — Granted Nov 04, 2025
Patent 12373946
ASSAY READING METHOD
2y 5m to grant — Granted Jul 29, 2025
Based on the 3 most recent grants.

Prosecution Projections

3-4
Expected OA Rounds
67%
Grant Probability
55%
With Interview (-11.4%)
2y 10m
Median Time to Grant
High
PTA Risk
Based on 12 resolved cases by this examiner. Grant probability derived from career allow rate.
