Prosecution Insights
Last updated: April 17, 2026
Application No. 18/244,826

COMPUTER ASSISTED SURGICAL SYSTEMS AND METHODS

Final Rejection: §102, §103, §112
Filed: Sep 11, 2023
Examiner: EDUN, DEAN NAWAAB
Art Unit: 3797
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: unknown
OA Round: 2 (Final)

Grant Probability: 43% (Moderate)
OA Rounds: 3-4
To Grant: 3y 5m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 43% (15 granted / 35 resolved), -27.1% vs TC avg
Interview Lift: +65.0% (allowance rate of resolved cases with an interview vs. without)
Avg Prosecution: 3y 5m (48 applications currently pending)
Total Applications: 83, across all art units

Statute-Specific Performance

§101: 6.3% (-33.7% vs TC avg)
§103: 48.1% (+8.1% vs TC avg)
§102: 16.7% (-23.3% vs TC avg)
§112: 27.5% (-12.5% vs TC avg)

Tech Center averages are estimates. Based on career data from 35 resolved cases.
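The headline figures in this report are simple ratios over the examiner's 35 resolved cases. A minimal sketch of the arithmetic, where the per-interview case splits are hypothetical (only the aggregate 43% allow rate and +65.0% lift appear in the report; the splits below were chosen solely to be consistent with those totals):

```python
# Career allow rate: grants divided by resolved cases.
granted, resolved = 15, 35
allow_rate = granted / resolved            # 15/35 ≈ 0.429, shown as 43%

# Interview lift: allowance rate of resolved cases that had an examiner
# interview minus the rate of those that did not. These splits are
# hypothetical; only the aggregate +65.0% lift is given in the report.
with_iv = {"granted": 12, "resolved": 15}      # hypothetical: 80% allowed
without_iv = {"granted": 3, "resolved": 20}    # hypothetical: 15% allowed
lift = (with_iv["granted"] / with_iv["resolved"]
        - without_iv["granted"] / without_iv["resolved"])

print(f"allow rate: {allow_rate:.0%}")     # allow rate: 43%
print(f"interview lift: {lift:+.1%}")      # interview lift: +65.0%
```

The lift here is an absolute percentage-point difference between the two sub-rates, which is how the "Without / With" comparison in the dashboard reads.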

Office Action

§102 §103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Acknowledgement is made to Applicant’s claim to priority to U.S. Provisional App. No. 63/405,450 filed 09/11/2022.

Status of Claims

This Office Action is responsive to the claims filed on 10/29/2025. Claims 1, 6, 9, 13, 15, and 18-20 have been amended. Claims 1-20 are presently pending in this application.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph: (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and (C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function. Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function. Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. 
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “at least one registering computing device” in claim 1, line 2. The corresponding structure for the “registering computing device” defined within the specification is a computer (Paragraph [0083]-[0084]) and any functional equivalents. Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 
112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim 5 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Claim 5, line 3 recites the claim limitation “a fluoroscopy imaging device” which is indefinite because it is unclear if this “fluoroscopy imaging device” is the same fluoroscopy imaging device that is already recited in claim 1, lines 12-13, or a different fluoroscopy imaging device. For the purpose of examination, this is understood to mean the same fluoroscopy imaging device that is already recited in claim 1, lines 12-13.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1 and 5 are rejected under 35 U.S.C. 
103 as being unpatentable over Berend (US 20160278868) in view of Kienzle (US 6285902 B1). Regarding claim 1, Berend teaches a computer assisted medical procedure system (Paragraph [0011]; A system for planning and performing a surgical procedure comprises a motion capture system that records a video of anatomical movements of a patient; Fig. 1), comprising: at least one registering computing device (Paragraph [0028]; Computer 112 that comprises processor 126, working memory 128, core surgical navigation utilities 130; Core surgical navigation utilities 130 are the basic operating programs, and include image registration, image acquisition, location algorithms, orientation algorithms; Fig. 2); a medical device locator (Paragraph [0032]-[0034]; Tracking system 120… incorporating markers; Arrays 122, which can include arrays 26A-26C… can have any number of markers; Fig. 1) operatively coupled to a medical device (Paragraph [0039]; Positional information obtained from arrays 26A-26C corresponding to the instruments or tools; Paragraph [0072]; Surgical instrument 610 includes marker array 626A; Fig. 6) placed within a medical procedure theater (Paragraph [0020]; operating room 10 in which surgeon 12, clinician 14 and patient 16 can be located to perform a medical procedure, Fig. 
1), wherein the at least one registering computing device determines the location of the medical device locator and medical device relative to a patient (Paragraph [0039]; Positional information obtained from arrays 26A-26C corresponding to the instruments or tools or patient anatomy associated with or attached to each array is used by computer 112… , computer 112 compares or correlates the location of one or more instruments or tools within the three-dimensional space and can then consult the surgical plan to determine at which step the surgical procedure is at in the surgical plan; Paragraph [0074]; tracking system detects the location of surgical instrument 610, in this case an acetabular impactor, relative to femur 612 and pelvis 614 by referencing the position of marker arrays 626A as it moves with respect to reference arrays 626B and 626C, which are fixably attached to the patient.); and a patient position fiducial associated with anatomy of the patient (Paragraph [0074]; to reference arrays 626B and 626C, which are fixably attached to the patient; Fig. 6) being subjected to a medical procedure (Paragraph [0080]; With the systems and methods described herein, surgeon 16 is better able to perform surgical operations. In particular, the entire surgical procedure can be more accurately and more rapidly completed); wherein the at least one registering computing device detects the position of the patient position fiducial relative to the medical device locator in order to determine a relative position of the medical device to the anatomy of the patient (Paragraph [0074]; tracking system detects the location of surgical instrument 610, in this case an acetabular impactor, relative to femur 612 and pelvis 614 by referencing the position of marker arrays 626A as it moves with respect to reference arrays 626B and 626C, which are fixably attached to the patient). 
Berend does not explicitly teach the at least one registering computing device is further configured to perform a calibration process prior to the medical procedure to determine a scanline geometry of a fluoroscopy device using fiducials coupled to the fluoroscopy device and a fiducial positioned on or near a patient support surface. Kienzle, however, teaches a computer assisted medical procedure system (Abstract; image guided surgery system to enable a surgeon to move a surgical tool into a desired position relative to a body part) comprising: at least one registering computing device (Col. 8, ln. 54-68; the invention comprises a… computer 142, Fig. 2; Col. 18, ln. 27-48; the registration object takes the form of a wand 235 and is held such that at least three non-collinear localizing emitters 236, 237 are seen simultaneously by both sensors 230, 231 and their poses recorded); wherein the at least one registering computing device is further configured to perform a calibration process (Col. 12, ln. 20-32; the mapping function parameters are determined by a calibration process) prior to the medical procedure (Col. 4, ln. 34-45; the calculation of conic projection parameters through interpolation of preoperatively determined pose-specific calibration data) to determine a scanline geometry (Col. 11, ln. 46-Col. 45, ln. 20; mapping functions must be established between points on the input plane (i.e., the input surface of the x-ray imager 114) and their corresponding pixel locations in the acquired image... A coordinate frame, C, is assigned to an arbitrary point on the image plane and the locations of the localizing emitters 153 relative to this frame; The mapping between points between the emitter and imager are considered to be scanlines as understood in its broadest reasonable interpretation) of a fluoroscopy device (Col. 8, ln. 14-39; a mobile fluoroscopic imaging device 110, such as what is commonly referred to as a C-arm, Fig. 
1) using fiducials coupled to the fluoroscopy device (Col. 10, ln. 60-Col. 11, ln. 16; three or more localizing emitters 153 are affixed to the flat panel imager housing 152) and a fiducial (Col. 13, ln. 42-Col. 14, ln. 2; The parameters tx, ty, and f are determined preferably by a calibration process employing a grid 170 similar to the one previously described but with localizing emitters 171 mounted at known locations relative to the grid holes 172) positioned on or near a patient support surface (Col. 13, ln. 42-Col. 14, ln. 2; the grid 170 is held at an intermediate distance between the x-ray source 115 and the x-ray imager 114, Fig. 6; Col. 16, ln. 49-60; a standard fracture table 102, Fig. 1). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the registering computing device of Berend to further include performing a calibration process prior to the medical procedure to determine a scanline geometry of a fluoroscopy device using fiducials coupled to the fluoroscopy device and a fiducial positioned on or near a patient support surface as taught by Kienzle because it would have been a known method of calibrating a fluoroscopy system for use during surgery and further improved the ability to localize objects within the imaging field which would have allowed representing the tools over the x-ray images (Col. 5, ln. 57-Col. 6, ln. 13; Col. 10, ln. 28-52). Regarding claim 5, together Berend and Kienzle teach all of the limitations of claim 1 as noted above. 
Berend further teaches the patient position fiducial is partially opaque to fluoroscopy such that the patient position fiducial is represented in an x-ray image (Paragraph [0052]; Patient 16 can wear a suit having an array of markers, such as visual markers and/or radiopaque markers, that are identifiable by the software application in the video; Paragraph [0083]; comprising obtaining orthogonal x-rays of the patient wearing the body suit having the visual and/or radiopaque markers) captured by a fluoroscopy imaging device (Paragraph [0023]; Operating room 10 can include an imaging system, such as C-arm fluoroscope system 30 with fluoroscope display image 32 to show a real-time image of the patient's knee on computer display). Claims 2-4 and 6-8 are rejected under 35 U.S.C. 103 as being unpatentable over Berend (US 20160278868) in view of Kienzle (US 6285902 B1) as applied to claim 1 above, and further in view of Nikou (US 20210369353). Regarding claim 2, Berend and Kienzle teach all of the limitations of claim 1 as noted above. Berend does not explicitly teach the at least one registering computing device is a head-mounted display device. Nikou, however, teaches at least one registering computing device is a head-mounted display device (Paragraph [0060]; wear an Augmented Reality (AR) Head Mounted Device (HMD); Paragraph [0054]; In addition to the camera array, which in some embodiments is affixed to a cart, additional cameras can be placed throughout the surgical theatre. For example, handheld tools or headsets worn by operators/surgeons can include imaging capability that communicates images back to a central processor to correlate those images with images captured by the camera array; Paragraph [0056]; augmented reality headsets can be worn by surgeons to provide additional tracking capabilities). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the at least one registering computing device of Berend in view of Kienzle to have been a head-mounted display device as it could give a more robust image of the environment for modeling using multiple perspectives (Nikou, Paragraph [0054]) and furthermore allow specific objects to be manually registered by a surgeon with the system preoperatively or intraoperatively (Nikou, Paragraph [0055]). Regarding claim 3, together Berend and Kienzle teach all of the limitations of claim 1 as noted above. Berend does not explicitly teach a medical server configured to maintain a medical image of the patient. Nikou, however, teaches a medical server (Paragraph [0115]; FIG. 2C illustrates a “cloud-based” implementation in which the Surgical Computer 150 is connected to a Surgical Data Server 180 via a Network 175) configured to maintain a medical image of the patient (Paragraph [0115]; It should also be noted that the EMR Database 170 may be used for both pre-operative and post-operative data. For example, assuming that the Patient 160 has given adequate permissions, the Surgical Data Server 180 may collect the EMR of the Patient pre-surgery. Then, the Surgical Data Server 180 may continue to monitor the EMR for any updates post-surgery). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Berend in view of Kienzle to have included a medical server configured to maintain a medical image of the patient as taught by Nikou because it would have been a known method in the art for transferring data for processing (Paragraph [0117]) that further would have allowed more robust computation via cloud computation and thus improve the accuracy of the registration and improved registration through accessing historical data (Paragraphs [0119] and [0142]). 
Regarding claim 4, together Berend, Kienzle, and Nikou teach all of the limitations of claim 3 as noted above. Berend does not explicitly teach the medical server is configured to present the medical image of the patient to a medical professional with an image of the medical device being overlayed on the medical image describing a relative position of the medical device to the anatomy of the patient. Nikou, however, further teaches the medical server is configured to present the medical image of the patient to a medical professional (Paragraph [0061]; The remote server can be used, for example, for storage of data or execution of computationally intensive processing tasks; Paragraph [0115]; It should also be noted that the EMR Database 170 may be used for both pre-operative and post-operative data) with an image of the medical device being overlayed on the medical image describing a relative position of the medical device to the anatomy of the patient (Paragraph [0060]; Surgeon 111 is wearing an AR HMD 155 that may, for example, overlay pre-operative image data on the patient or provide surgical planning suggestions; Paragraph [0057]; By impinging the tip of the tool against the surface of the bone, a three-dimensional surface can be mapped for that bone that is associated with a position and orientation relative to the frame of reference of that fiducial mark. By optically tracking the position and orientation (pose) of the fiducial mark associated with that bone, a model of that surface can be tracked with an environment through extrapolation). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have further modified the system of Berend in view of Kienzle and Nikou such that the medical server is configured to present the medical image of the patient to a medical professional with an image of the medical device being overlayed on the medical image describing a relative position of the medical device to the anatomy of the patient because the remote server can be used, for example, for storage of data or execution of computationally intensive processing tasks (Paragraph [0061]) and furthermore the overlay would provide surgical planning suggestions which would aid the medical professional in making decisions during the surgery (Paragraph [0060]). Regarding claim 6, Berend and Kienzle teach all of the limitations of claim 1 as noted above. Berend does not explicitly teach the medical device locator further comprises a quick response (QR) code configured to identify a location where the registering computing device is located within the medical procedure theater. Nikou, however, teaches the medical device locator further comprises a quick response (QR) code configured to identify a location where a registering computing device is located within the medical procedure theater (Paragraph [0054]-[0056]; some imaging devices may be of suitable resolution or have a suitable perspective on the scene to pick up information stored in quick response (QR) codes or barcodes. This can be helpful in identifying specific objects not manually registered with the system; picked up by a camera or camera array associated with the tracking system… one or two dimensional optical codes (barcode, QR code, etc.) can be affixed to objects in the theater to provide passive identification that can occur based on image analysis). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the medical device locator of Berend in view of Kienzle to have further comprised a quick response (QR) code configured to identify a location where a registering computing device is located within the medical procedure theater as taught by Nikou because it would allow identifying specific objects not manually registered with the system and further conveys a unique identifier to the source of that pattern, providing a dynamic identification mark and be used to determine an orientation of an object by comparing the location of the identifier with the extents of an object in an image (Paragraphs [0054]-[0056]). Regarding claim 7, together Berend and Kienzle teach all of the limitations of claim 1 as noted above. Berend further teaches the at least one registering computing device comprises a plurality of cameras (Paragraph [0022]; two CCD (charge couple device) cameras 27; Fig. 1 shows cameras 27) such that the plurality of cameras capture independent images within the medical procedure theater (Paragraph [0041]; In this particular arrangement, visual data collector 208 can include a camera 212 that is capable of capturing multiple frames of visual information; include such numerical values that represent the angle of the hip bones of patient 204, as well as other joints and bones, the patient's global position and global orientation relative to environment 202); and identifying fixed camera three-dimensional extrinsic points within the medical procedure theater including the medical device locator (Paragraph [0041]; Operations of computer system 210 can also include computing the location of the camera and its visual properties, such as the camera's field of view, lens distortion, and orientation, while a sequence of images is being recorded. 
For example, operations of computer system 210 can derive the position of camera 212 given enough markers 214 and information associated with the markers, such as the number, identification, and position of markers 214 captured by camera 212) and triangulating a location of the fixed camera three-dimensional extrinsic points to identify locations of the medical device and the patient position fiducial relative to each other (Paragraph [0073]-[0074]; To accomplish this, cameras 627 of optical locator 624 detect the position of marker arrays 626A-626C in space by using triangulation methods; an acetabular impactor, relative to femur 612 and pelvis 614 by referencing the position of marker arrays 626A as it moves with respect to reference arrays 626B and 626C). Berend does not explicitly teach the plurality of cameras operatively coupled to a medical server; and the medical server identifies the fixed camera three-dimensional extrinsic points and triangulates the location. Nikou, however, teaches a plurality of cameras (Paragraph [0056]; important tools, or bones in the theater may include passive or active identifiers that can be picked up by a camera or camera array associated with the tracking system) operatively coupled to a medical server (Paragraph [0061]; Surgical Computer 150… collects data from those components, and provides general processing for various data; connected to a remote server over one or more computer networks (e.g., the Internet)); and the medical server identifies fixed camera three-dimensional extrinsic points and triangulates a location (Paragraph [0061]; The remote server can be used, for example, for storage of data or execution of computationally intensive processing tasks). 
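The cited passages describe stereo cameras locating marker arrays "by using triangulation methods." As an illustrative sketch only, not code from the application or from the cited references, the standard midpoint method triangulates a 3-D marker position from two camera rays (camera center plus unit viewing direction):

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint triangulation of two camera rays.

    o1, o2: 3-D camera centers; d1, d2: unit ray directions toward a marker.
    Returns the point halfway between the closest points on the two rays.
    """
    b = o2 - o1
    c = float(np.dot(d1, d2))       # cosine of the angle between the rays
    denom = 1.0 - c * c             # approaches 0 as the rays become parallel
    t1 = (np.dot(b, d1) - c * np.dot(b, d2)) / denom
    t2 = (c * np.dot(b, d1) - np.dot(b, d2)) / denom
    p1 = o1 + t1 * d1               # closest point on ray 1
    p2 = o2 + t2 * d2               # closest point on ray 2
    return (p1 + p2) / 2.0

# Two cameras one unit apart, both sighting a marker at (0, 0, 5).
o1 = np.array([0.0, 0.0, 0.0]); d1 = np.array([0.0, 0.0, 1.0])
o2 = np.array([1.0, 0.0, 0.0])
d2 = np.array([-1.0, 0.0, 5.0]); d2 /= np.linalg.norm(d2)
marker = triangulate_midpoint(o1, d1, o2, d2)
print(marker)   # ≈ [0. 0. 5.]
```

With two or more markers per array recovered this way, a rigid transform between the marker array and the patient-fixed reference arrays gives the relative poses the rejection maps to Berend's tracking system.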
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Berend in view of Kienzle such that the plurality of cameras are operatively coupled to a medical server; and the medical server identifies the fixed camera three-dimensional extrinsic points and triangulates the location as taught by Nikou because it would have been a known method in the art for transferring data for processing (Paragraph [0117]) that further would have allowed more robust computation via cloud computation and thus improve the accuracy of the registration and improved registration through accessing historical data (Paragraphs [0119] and [0142]). Regarding claim 8, together Berend, Kienzle, and Nikou teach all of the limitations of claim 7 as noted above. Berend further teaches the plurality of cameras are fixed to a support structure (Fig. 1 shows cameras 27 fixed to a support structure). Claims 9-17 are rejected under 35 U.S.C. 103 as being unpatentable over Berend (US 20160278868) in view of Nikou (US 20210369353) and Kienzle (US 6285902 B1). Regarding claim 9, Berend teaches a computer assisted medical procedure system (Paragraph [0011]; A system for planning and performing a surgical procedure comprises a motion capture system that records a video of anatomical movements of a patient; Fig. 1), comprising: a plurality of cameras (Paragraph [0022]; two CCD (charge couple device) cameras 27; Fig. 1 shows cameras 27) fixed to a support structure (Fig. 1 shows cameras 27 fixed to a support structure) placed within a medical procedure theater (Paragraph [0020]; operating room 10 in which surgeon 12, clinician 14 and patient 16 can be located to perform a medical procedure, Fig. 
1) that each capture independent images within the medical procedure theater (Paragraph [0041]; In this particular arrangement, visual data collector 208 can include a camera 212 that is capable of capturing multiple frames of visual information; include such numerical values that represent the angle of the hip bones of patient 204, as well as other joints and bones, the patient's global position and global orientation relative to environment 202); a first medical device locator configured to be operatively coupled to a first medical device placed within the medical procedure theater (Paragraph [0023]; The tracking system can also detect the location of diagnostic scope 22A including its reference array 26A, Fig. 1); a second medical device locator configured to be operatively coupled to a second medical device positioned within the medical procedure theater (Paragraph [0023]; as well as reference arrays 26B and 26C, which can be attached to … another surgical instrument; In various examples, surgical instrument 22A can comprise a scope and surgical instrument 22B can comprise a drill, reamer or inserter.); and a patient position fiducial configured to be placed near anatomy of a patient (Paragraph [0023]; as well as reference arrays 26B and 26C, which can be attached to the patient's anatomy; Paragraph [0072]; while femur 612 includes marker array 626B and pelvis includes marker array 626C, Fig. 6); wherein the medical server determines the location of the first medical device and second medical device (Paragraph [0024]; Optical locator 24 is positioned in operating room 10 so that cameras 27 have complete coverage of table 25 and cart 46. Surgical instruments 22A and 22B can be located at various locations throughout operating room 10, such as being held by surgeon 12, or in tray 48 of cart 46.) 
relative to each other (Paragraph [0033]; identify the position of tracked instruments as they move relative to the patient's anatomy… the system is able to determine the position of the surgical instruments in space, as well as superimpose their relative positions onto pre-operatively captured CT images of the patient) by receiving the captured independent images from the plurality of cameras and identifies fixed camera three-dimensional extrinsic points within the captured independent images (Paragraph [0041]; Operations of computer system 210 can also include computing the location of the camera and its visual properties, such as the camera's field of view, lens distortion, and orientation, while a sequence of images is being recorded. For example, operations of computer system 210 can derive the position of camera 212 given enough markers 214 and information associated with the markers, such as the number, identification, and position of markers 214 captured by camera 212); and detecting the position of the patient position fiducial to determine the position of the first medical device and second medical device relative to the anatomy of the patient (paragraph [0074]; tracking system detects the location of surgical instrument 610, in this case an acetabular impactor, relative to femur 612 and pelvis 614 by referencing the position of marker arrays 626A as it moves with respect to reference arrays 626B and 626C, which are fixably attached to the patient; Paragraph [0026]; Guiding of surgical instruments 22A and 22B… The location and orientation of the prosthetic implant is determined from the surgical plan, which can be made using kinematic evaluation of the anatomy of patient 16 captured with visual motion tracking system 200; Paragraph [0033]; the system is able to determine the position of the surgical instruments in space, as well as superimpose their relative positions onto pre-operatively captured CT images of the patient). 
Berend does not explicitly teach a medical server; wherein each of the plurality of cameras are operatively coupled to the medical server; the medical server detects the position of the patient position fiducial to determine the position of the first medical device and second medical device relative to the anatomy of the patient; and wherein the first medical device locator and the second medical device locator each comprise a tracking array having a shape and geometry known to the medical server with the tracking array configured to be coupled to a distal end of each of the first medical device and second medical device; wherein the medical server comprises a medical device look-up table describing the geometry of each of the first medical device and second medical device with the medical server determining, in real-time, locations of distal ends of the first medical device and second medical device based on the geometry of each of the first medical device and second medical device. Nikou, however, teaches a medical server (Paragraph [0115]; FIG. 
2C illustrates a “cloud-based” implementation in which the Surgical Computer 150 is connected to a Surgical Data Server 180 via a Network 175); wherein each of the plurality of cameras (Paragraph [0056]; important tools, or bones in the theater may include passive or active identifiers that can be picked up by a camera or camera array associated with the tracking system) are operatively coupled to the medical server (Paragraph [0061]; Surgical Computer 150… collects data from those components, and provides general processing for various data; connected to a remote server over one or more computer networks (e.g., the Internet)); the medical server detects the position of the patient position fiducial to determine the position of the first medical device and second medical device relative to the anatomy of the patient (Paragraph [0061]; The remote server can be used, for example, for storage of data or execution of computationally intensive processing tasks; Paragraph [0176]; a tracker to be securely and repeatably mounted in a plurality of positions; exact location and dimensions of the two surfaces (i.e., positions) are known relative to one another; Paragraph [0178]; then landmarks on a bone or skeletal structure that are registered relative to the tracker position can be converted when the tracker 703 is placed in the secondary position. In some embodiments, these relative location determinations may be carried out using coordinate transform arithmetic). 
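Nikou's reference to "coordinate transform arithmetic" for re-expressing registered landmarks when a tracker is moved between known mounting positions can be illustrated with a short sketch. The following is not drawn from any cited reference: the poses, the landmark, and all numeric values are hypothetical, and it shows only the standard rigid-transform composition such a conversion would use.

```python
# Illustrative sketch (not from the cited references): a landmark registered
# relative to a tracker in one mounting position is re-expressed relative to
# the tracker's second, known mounting position.

def apply_transform(T, p):
    """Apply a 4x4 homogeneous transform T to a 3-D point p."""
    x, y, z = p
    return tuple(T[i][0]*x + T[i][1]*y + T[i][2]*z + T[i][3] for i in range(3))

def invert_rigid(T):
    """Invert a rigid transform [R|t]: the inverse is [R^T | -R^T t]."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]  # transpose of rotation
    t = [-sum(R[i][j]*T[j][3] for j in range(3)) for i in range(3)]
    return [R[0] + [t[0]], R[1] + [t[1]], R[2] + [t[2]], [0, 0, 0, 1]]

def compose(A, B):
    """Matrix product A @ B for 4x4 transforms."""
    return [[sum(A[i][k]*B[k][j] for k in range(4)) for j in range(4)] for i in range(4)]

# T_a, T_b: hypothetical poses of the tracker's two mounting positions,
# both expressed in the camera (world) frame.
T_a = [[1, 0, 0, 10], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
T_b = [[0, -1, 0, 0], [1, 0, 0, 5], [0, 0, 1, 0], [0, 0, 0, 1]]

landmark_in_a = (1.0, 2.0, 3.0)
# Re-express the landmark in the second tracker frame: p_b = T_b^-1 · T_a · p_a
landmark_in_b = apply_transform(compose(invert_rigid(T_b), T_a), landmark_in_a)
```

The same composition handles any chain of known poses (tracker-to-bone, camera-to-tracker, and so on), which is why a single registration survives repositioning of the tracker.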
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Berend to include a medical server, wherein each of the plurality of cameras are operatively coupled to the medical server and the medical server detects the position of the patient position fiducial to determine the position of the first medical device and second medical device relative to the anatomy of the patient as taught by Nikou because it would have been a known method in the art for transferring data for processing (Paragraph [0117]) that further would have allowed more robust computation via cloud computing, thus improving the accuracy of the registration, and would have improved registration through access to historical data (Paragraphs [0119] and [0142]), and furthermore the remote server can be used, for example, for storage of data or execution of computationally intensive processing tasks (Paragraph [0061]). Together Berend and Nikou do not explicitly teach the first medical device locator and the second medical device locator each comprise a tracking array having a shape and geometry known to the medical server with the tracking array configured to be coupled to a distal end of each of the first medical device and second medical device; and the medical server comprises a medical device look-up table describing the geometry of each of the first medical device and second medical device with the medical server determining, in real-time, locations of distal ends of the first medical device and second medical device based on the geometry of each of the first medical device and second medical device. Kienzle, however, teaches a computer assisted medical procedure system (Abstract; image guided surgery system to enable a surgeon to move a surgical tool into a desired position relative to a body part) comprising: at least one registering computing device (Col. 8, ln. 54-68; the invention comprises a… computer 142, Fig.
2; Col. 18, ln. 27-48; the registration object takes the form of a wand 235 and is held such that at least three non-collinear localizing emitters 236, 237 are seen simultaneously by both sensors 230, 231 and their poses recorded); wherein the first medical device locator (Col. 15, ln. 12-26; the tool's localizing emitters, Fig. 8a) and the second medical device locator each comprise a tracking array (Col. 24, ln. 34-43; The system comprises a controller… an optical localizer, and two surgical tools with localizing emitters, Fig. 11, step 210) having a shape and geometry known to the medical server (Col. 15, ln. 12-26; the tool's localizing emitters… mounted in a known pose relative to localizing emitters 129… stored in the long term memory of the optical localizer controller 124, Fig. 8a) with the tracking array configured to be coupled to a distal end of each of the first medical device and second medical device (Fig. 8 shows the array is coupled to a distal end as understood in its broadest reasonable interpretation); wherein the medical server comprises a medical device look-up table describing the geometry of each of the first medical device and second medical device (Col. 15, ln. 12-26; a description of this relationship, a tool emitter location data file, is encoded into a computer data file and stored in the long term memory of the optical localizer controller 124; Col. 15, ln. 62-Col. 16, ln. 8; The tool model is encoded into a computer data file and stored in the long term memory of the system controller 121; Fig. 9 shows the data appears to be in the form of look-up tables as understood in its broadest reasonable interpretation) with the medical server determining, in real-time, locations of distal ends of the first medical device and second medical device based on the geometry of each of the first medical device and second medical device (Col. 16, ln. 60-Col. 17, ln. 
35; the process of superimposing a tool representation on x-ray images; a representation of the tool at that pose is calculated 206, is passed through the conic projection model 207 and the mapping model 208 and then superimposed on the appropriate image). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Berend in view of Nikou and Kienzle such that the first medical device locator and the second medical device locator each comprise a tracking array having a shape and geometry known to the medical server with the tracking array configured to be coupled to a distal end of each of the first medical device and second medical device; and the medical server comprises a medical device look-up table describing the geometry of each of the first medical device and second medical device with the medical server determining, in real-time, locations of distal ends of the first medical device and second medical device based on the geometry of each of the first medical device and second medical device as taught by Kienzle because it would have been a known method of representing tools within an x-ray image during surgery and further improved the ability to localize objects within the imaging field (Col. 5, ln. 57-Col. 6, ln. 13; Col. 10, ln. 28-52). Regarding claim 10, together Berend, Nikou, and Kienzle teach all of the limitations of claim 9 as noted above. Berend does not explicitly teach the medical server includes a medical image database to provide a medical image of the patient during a medical procedure. 
Nikou, however, further teaches the medical server includes a medical image database to provide a medical image of the patient during a medical procedure (Paragraphs [0115]-[0117]; Surgical Data Server 180… additional data sources: the Patient 160, Healthcare Professional(s) 165, and an EMR Database 170; Paragraph [0195]; Optionally, the system 1100 can also include a display device 1140 and a database 1150. In an example, these components can be combined to provide navigation and control of the implant positioning device 1130 during an orthopedic (or similar) prosthetic implant surgery; Paragraph [0197]; planning module 1112 can be used to manipulate a virtual model of the implant in reference to a virtual implant host model. The implant host model can be constructed from scans of the target patient. Such scans may include computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomographic (PET), or ultrasound scans of the joint and surrounding structure). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have further modified the medical server of Berend in view of Nikou and Kienzle to have included a medical image database to provide a medical image of the patient during a medical procedure as further taught by Nikou because it would allow providing navigation and control of the implant positioning device during an orthopedic (or similar) prosthetic implant surgery and thereby improve the surgeon’s ability to perform the surgical procedure (Nikou, Paragraphs [0195] and [0147]). Regarding claim 11, together Berend, Nikou, and Kienzle teach all of the limitations of claim 10 as noted above. Berend further teaches the medical image of the patient is presented to a medical professional with an image of the second medical device being overlayed on the medical image describing a relative position of the first medical device to the anatomy of the patient. 
Regarding claim 12, together Berend, Nikou, and Kienzle teach all of the limitations of claim 9 as noted above. Berend further teaches the patient position fiducial is partially opaque to fluoroscopy such that the patient position fiducial is represented in an x-ray image (Paragraph [0052]; Patient 16 can wear a suit having an array of markers, such as visual markers and/or radiopaque markers, that are identifiable by the software application in the video; Paragraph [0083]; comprising obtaining orthogonal x-rays of the patient wearing the body suit having the visual and/or radiopaque markers) captured by a fluoroscopy imaging device (Paragraph [0023]; Operating room 10 can include an imaging system, such as C-arm fluoroscope system 30 with fluoroscope display image 32 to show a real-time image of the patient's knee on computer display). Regarding claim 13, together Berend, Nikou, and Kienzle teach all of the limitations of claim 9 as noted above. Berend does not explicitly teach the first medical device locator including a quick response (QR) code used to identify the first medical device and second medical device and a location of the first medical device and second medical device within the medical procedure theater. Nikou, however, further teaches the first medical device locator including a quick response (QR) code used to identify the first medical device and second medical device and a location of the first medical device and second medical device within the medical procedure theater (Paragraph [0054]-[0056]; one or two dimensional optical codes (barcode, QR code, etc.) can be affixed to objects in the theater to provide passive identification that can occur based on image analysis. If these codes are placed asymmetrically on an object, they can also be used to determine an orientation of an object by comparing the location of the identifier with the extents of an object in an image. 
For example, a QR code may be placed in a corner of a tool tray, allowing the orientation and identity of that tray to be tracked). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have further modified the first medical device locator of Berend in view of Nikou and Kienzle to have included a quick response (QR) code used to identify the first medical device and second medical device and a location of the first medical device and second medical device within the medical procedure theater as taught by Nikou because it would allow identifying specific objects not manually registered with the system, would further convey a unique identifier for the source of that pattern, providing a dynamic identification mark, and could be used to determine an orientation of an object by comparing the location of the identifier with the extents of the object in an image (Paragraphs [0054]-[0056]). Regarding claim 14, together Berend, Nikou, and Kienzle teach all of the limitations of claim 9 as noted above. Berend further teaches the plurality of cameras fixed to the support structure are used to calibrate a fluoroscopy device (Paragraph [0052]; At step 414, the patient is video recorded wearing the body suit. In one example, system 200 of FIG. 3 is used, as discussed, to capture motion data of patient 16 using body suit 2016 with markers 214… another example, the posture of patient 16 can be used to correlate and combine the implant to the anatomy and the anatomy to the visual markers; Paragraph [0054]; orthogonal x-rays of patient 16 wearing the suit are obtained in order to calibrate the video motion to the bones of patient 16) used to capture an image of the anatomy of the patient for use by medical personnel in performing a medical procedure (Paragraph [0025]; The imaging system can be any system capable of producing images that represent the patient's anatomy such as a fluoroscope producing x-ray two-dimensional images).
Regarding claim 15, Berend teaches a computer assisted medical device calibration system (Paragraph [0011]; A system for planning and performing a surgical procedure comprises a motion capture system that records a video of anatomical movements of a patient; Fig. 1) comprising: a plurality of cameras (Paragraph [0022]; two CCD (charge couple device) cameras 27; Fig. 1 shows cameras 27) fixed to a support structure (Fig. 1 shows cameras 27 fixed to a support structure) placed within a medical procedure theater (Paragraph [0020]; operating room 10 in which surgeon 12, clinician 14 and patient 16 can be located to perform a medical procedure, Fig. 1) that each capture independent images within the medical procedure theater (Paragraph [0041]; In this particular arrangement, visual data collector 208 can include a camera 212 that is capable of capturing multiple frames of visual information; include such numerical values that represent the angle of the hip bones of patient 204, as well as other joints and bones, the patient's global position and global orientation relative to environment 202); and a medical device locator (Paragraphs [0032]-[0034]; Tracking system 120… incorporating markers; Arrays 122, which can include arrays 26A-26C… can have any number of markers; Fig. 1) operatively coupled to a medical device (Paragraph [0039]; Positional information obtained from arrays 26A-26C corresponding to the instruments or tools; Paragraph [0072]; Surgical instrument 610 includes marker array 626A; Fig. 6) placed within the medical procedure theater (Paragraph [0020]; operating room 10 in which surgeon 12, clinician 14 and patient 16 can be located to perform a medical procedure, Fig.
1) and determining the location of the medical device (Paragraph [0039]; Positional information obtained from arrays 26A-26C corresponding to the instruments or tools or patient anatomy associated with or attached to each array is used by computer 112… , computer 112 compares or correlates the location of one or more instruments or tools within the three-dimensional space and can then consult the surgical plan to determine at which step the surgical procedure is at in the surgical plan) relative to the plurality of cameras fixed to the support structure by receiving the captured independent images from the plurality of cameras and identifying fixed camera three-dimensional extrinsic points within the captured independent images (Paragraph [0041]; Operations of computer system 210 can also include computing the location of the camera and its visual properties, such as the camera's field of view, lens distortion, and orientation, while a sequence of images is being recorded. For example, operations of computer system 210 can derive the position of camera 212 given enough markers 214 and information associated with the markers, such as the number, identification, and position of markers 214 captured by camera 212). 
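Computationally, the marker-array tracking described above reduces to combining an array's tracked pose with a stored description of the instrument it is attached to. A minimal sketch of that step follows; the tool names, offsets, and poses are invented for illustration and do not come from Berend, Nikou, or Kienzle.

```python
# Hypothetical sketch: a stored per-tool tip offset, expressed in the tracking
# array's own coordinate frame, is combined with the array's tracked pose to
# locate the instrument's working end. All values are invented.

TOOL_GEOMETRY = {
    # tool id -> tip position in the tracking-array coordinate frame (mm)
    "impactor": (0.0, 0.0, -180.0),
    "drill_guide": (0.0, 12.5, -95.0),
}

def tip_location(tool_id, array_rotation, array_origin):
    """Locate the tool's working end in the camera/world frame.

    array_rotation: 3x3 rotation of the array frame; array_origin: the array's
    tracked 3-D position. Both would be supplied by the optical tracking system.
    """
    offset = TOOL_GEOMETRY[tool_id]  # stored geometry entry for this tool
    return tuple(
        sum(array_rotation[i][j] * offset[j] for j in range(3)) + array_origin[i]
        for i in range(3)
    )

# Identity orientation: the tip sits 180 mm below the array along z.
tip = tip_location("impactor", [[1, 0, 0], [0, 1, 0], [0, 0, 1]], (100.0, 50.0, 300.0))
```

Because the array geometry is rigid, one pose measurement per frame suffices to place the tip continuously, which is what makes real-time tip localization from a proximally mounted array possible.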
Berend does not explicitly teach a medical server; each of the plurality of cameras are operatively coupled to the medical server; and the medical server determines the location of the medical device; wherein the medical server is further configured to perform a calibration process prior to a computer assisted medical procedure to determine a scanline geometry of a fluoroscopy device using fiducials coupled to the fluoroscopy device and a fiducial positioned on or near a patient support surface; wherein the medical device locator comprises a tracking array having a shape and geometry known to the medical server with the tracking array configured to be operatively coupled to a distal end of the medical device; and wherein the medical server comprises a medical device look-up table describing the geometry of the medical device with the medical server determining a location of a distal end of the medical device based on the geometry of the medical device. Nikou, however, teaches a medical server (Paragraph [0115]; FIG. 
2C illustrates a “cloud-based” implementation in which the Surgical Computer 150 is connected to a Surgical Data Server 180 via a Network 175); each of the plurality of cameras (Paragraph [0056]; important tools, or bones in the theater may include passive or active identifiers that can be picked up by a camera or camera array associated with the tracking system) are operatively coupled to the medical server (Paragraph [0061]; Surgical Computer 150… collects data from those components, and provides general processing for various data; connected to a remote server over one or more computer networks (e.g., the Internet)); and the medical server determines the location of the medical device (Paragraph [0061]; The remote server can be used, for example, for storage of data or execution of computationally intensive processing tasks; Paragraphs [0195]-[0196]; the system 1100 can include a control system 1110, a tracking system 1120… and a database 1150; one or more computing devices configured to coordinate information received from the tracking system 1120 and provide control to the implant positioning device 1130.; Paragraph [0204]; For example, as long as the tracking system 1120 can image three of the tracking spheres on a tracking marker, such as tracking marker 1160, the tracking system 1120 can utilize image processing algorithms to generate points within the 3-D coordinate system… to triangulate an accurate 3-D position and orientation associated with the device to which the tracking marker is affixed). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Berend to have included a medical server wherein each of the plurality of cameras are operatively coupled to the medical server and the medical server determines the location of the medical device as taught by Nikou because it would have been a known method in the art for transferring data for processing (Paragraph [0117]) that further would have allowed more robust computation via cloud computing, thus improving the accuracy of the registration, and would have improved registration through access to historical data (Paragraphs [0119] and [0142]), and furthermore the remote server can be used, for example, for storage of data or execution of computationally intensive processing tasks (Paragraph [0061]). Together Berend and Nikou do not explicitly teach wherein the medical server is further configured to perform a calibration process prior to a computer assisted medical procedure to determine a scanline geometry of a fluoroscopy device using fiducials coupled to the fluoroscopy device and a fiducial positioned on or near a patient support surface; wherein the medical device locator comprises a tracking array having a shape and geometry known to the medical server with the tracking array configured to be operatively coupled to a distal end of the medical device; and wherein the medical server comprises a medical device look-up table describing the geometry of the medical device with the medical server determining a location of a distal end of the medical device based on the geometry of the medical device.
Kienzle, however, teaches a computer assisted medical procedure system (Abstract; image guided surgery system to enable a surgeon to move a surgical tool into a desired position relative to a body part) comprising: at least one registering computing device (Col. 8, ln. 54-68; the invention comprises a… computer 142, Fig. 2; Col. 18, ln. 27-48; the registration object takes the form of a wand 235 and is held such that at least three non-collinear localizing emitters 236, 237 are seen simultaneously by both sensors 230, 231 and their poses recorded); wherein the at least one registering computing device is further configured to perform a calibration process (Col. 12, ln. 20-32; the mapping function parameters are determined by a calibration process) prior to the medical procedure (Col. 4, ln. 34-45; the calculation of conic projection parameters through interpolation of preoperatively determined pose-specific calibration data) to determine a scanline geometry (Col. 11, ln. 46-Col. 45, ln. 20; mapping functions must be established between points on the input plane (i.e., the input surface of the x-ray imager 114) and their corresponding pixel locations in the acquired image... A coordinate frame, C, is assigned to an arbitrary point on the image plane and the locations of the localizing emitters 153 relative to this frame; The mapping between points between the emitter and imager are considered to be scanlines as understood in its broadest reasonable interpretation) of a fluoroscopy device (Col. 8, ln. 14-39; a mobile fluoroscopic imaging device 110, such as what is commonly referred to as a C-arm, Fig. 1) using fiducials coupled to the fluoroscopy device (Col. 10, ln. 60-Col. 11, ln. 16; three or more localizing emitters 153 are affixed to the flat panel imager housing 152) and a fiducial (Col. 13, ln. 42-Col. 14, ln. 
2; The parameters tx, ty, and f are determined preferably by a calibration process employing a grid 170 similar to the one previously described but with localizing emitters 171 mounted at known locations relative to the grid holes 172) positioned on or near a patient support surface (Col. 13, ln. 42-Col. 14, ln. 2; the grid 170 is held at an intermediate distance between the x-ray source 115 and the x-ray imager 114, Fig. 6; Col. 16, ln. 49-60; a standard fracture table 102, Fig. 1); wherein the first medical device locator (Col. 15, ln. 12-26; the tool's localizing emitters, Fig. 8a) and the second medical device locator each comprise a tracking array (Col. 24, ln. 34-43; The system comprises a controller… an optical localizer, and two surgical tools with localizing emitters, Fig. 11, step 210) having a shape and geometry known to the medical server (Col. 15, ln. 12-26; the tool's localizing emitters… mounted in a known pose relative to localizing emitters 129… stored in the long term memory of the optical localizer controller 124, Fig. 8a) with the tracking array configured to be coupled to a distal end of each of the first medical device and second medical device (Fig. 8 shows the array is coupled to a distal end as understood in its broadest reasonable interpretation); wherein the medical server comprises a medical device look-up table describing the geometry of each of the first medical device and second medical device (Col. 15, ln. 12-26; a description of this relationship, a tool emitter location data file, is encoded into a computer data file and stored in the long term memory of the optical localizer controller 124; Col. 15, ln. 62-Col. 16, ln. 8; The tool model is encoded into a computer data file and stored in the long term memory of the system controller 121; Fig.
9 shows the data appears to be in the form of look-up tables as understood in its broadest reasonable interpretation) with the medical server determining, in real-time, locations of distal ends of the first medical device and second medical device based on the geometry of each of the first medical device and second medical device (Col. 16, ln. 60-Col. 17, ln. 35; the process of superimposing a tool representation on x-ray images; a representation of the tool at that pose is calculated 206, is passed through the conic projection model 207 and the mapping model 208 and then superimposed on the appropriate image). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the registering computing device of Berend to further include performing a calibration process prior to the medical procedure to determine a scanline geometry of a fluoroscopy device using fiducials coupled to the fluoroscopy device and a fiducial positioned on or near a patient support surface as taught by Kienzle because it would have been a known method of calibrating a fluoroscopy system for use during surgery and further improved the ability to localize objects within the imaging field which would have allowed representing the tools over the x-ray images (Col. 5, ln. 57-Col. 6, ln. 13; Col. 10, ln. 28-52). 
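Kienzle's quoted superimposition step passes a tracked tool pose through a conic projection model before drawing it on the x-ray image. As a rough illustration only — the focal length, image-plane offsets, and sample point below are hypothetical, and the sketch is simplified to an ideal pinhole that ignores the separate mapping-model (distortion) step the reference also describes:

```python
# Hedged sketch of a conic (pinhole) projection: a tracked 3-D tool point is
# projected onto the imager plane and mapped to pixel coordinates. The x-ray
# source is taken as the origin with the imager at z = f. All values invented.

def conic_project(point, f, tx, ty):
    """Project a 3-D point to pixel coordinates (u, v)."""
    x, y, z = point
    u = f * x / z + tx   # perspective divide plus image-plane offset
    v = f * y / z + ty
    return (u, v)

# Tool tip 500 mm from the source, focal length 1000 mm, image centre (256, 256).
pixel = conic_project((50.0, -25.0, 500.0), f=1000.0, tx=256.0, ty=256.0)
```

Repeating this projection for several points along the tool model yields the overlaid tool representation; calibrating f, tx, and ty per C-arm pose is what the grid-based calibration quoted above accomplishes.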
It further would have been obvious to have modified the system such that the medical device locator comprises a tracking array having a shape and geometry known to the medical server with the tracking array configured to be operatively coupled to a distal end of the medical device; and the medical server comprises a medical device look-up table describing the geometry of the medical device with the medical server determining a location of a distal end of the medical device based on the geometry of the medical device as further taught by Kienzle because it would have been a known method of representing tools within an x-ray image during surgery and further improved the ability to localize objects within the imaging field (Col. 5, ln. 57-Col. 6, ln. 13; Col. 10, ln. 28-52). Regarding claim 16, together Berend, Nikou, and Kienzle teach all of the limitations of claim 15 as noted above. Berend further teaches the fixed camera three-dimensional extrinsic points including edges and points on the medical device locator (Paragraph [0034]; Arrays 122 can have any number of markers, but typically have three or more markers to define real-time position (x, y, and z location) and orientation (rotation around x, y, and z axes)… In some embodiments, there are at least two arms and some embodiments can have three arms, four arms, or more. The arms are typically arranged asymmetrically to facilitate specific array and marker identification by the tracking system) operatively coupled to the medical device (Paragraph [0073]; Surgical instrument 610 includes marker array 626A, Fig. 6). Regarding claim 17, together Berend, Nikou, and Kienzle teach all of the limitations of claim 15 as noted above.
Berend further teaches triangulating the identified fixed camera three-dimensional extrinsic points using a first captured image from a first camera of the plurality of cameras and a second captured image from a second camera of the plurality of cameras (Paragraph [0022]; which can have two CCD (charge couple device) cameras 27 that detect the positions of the arrays in space by using triangulation methods; Paragraph [0032]; Tracking system 120 can be any system that can determine the three-dimensional location of devices carrying or incorporating markers that serve as tracking indicia). Berend does not explicitly teach using the medical server to triangulate the identified fixed camera three-dimensional extrinsic points. Nikou, however, further teaches using the medical server (Paragraph [0061]; The remote server can be used, for example, for storage of data or execution of computationally intensive processing tasks) to triangulate the identified fixed camera three-dimensional extrinsic points (Paragraph [0204]; Subsequently, the tracking system 1120 (or the navigation module 1114 (FIG. 11) within the control system 1110) can use the 3 points to triangulate an accurate 3-D position and orientation associated with the device to which the tracking marker is affixed).
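The two-camera triangulation quoted from Berend can be sketched with the standard midpoint-of-rays construction. The camera origins and ray directions below are hypothetical; a real system would first back-project pixel detections through calibrated camera models to obtain these rays.

```python
# Illustrative two-camera triangulation (midpoint-of-rays method); all
# geometry values are invented for the example.

def closest_point_between_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment between rays o1+t*d1 and o2+s*d2."""
    def dot(a, b): return sum(x*y for x, y in zip(a, b))
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def add(a, b): return tuple(x + y for x, y in zip(a, b))
    def scale(a, k): return tuple(x * k for x in a)
    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a*c - b*b                      # zero only for parallel rays
    t = (b*e - c*d) / denom
    s = (a*e - b*d) / denom
    p1 = add(o1, scale(d1, t))             # closest point on ray 1
    p2 = add(o2, scale(d2, s))             # closest point on ray 2
    return scale(add(p1, p2), 0.5)

# Two cameras 1 m apart, each observing a marker along a known back-projected
# ray; here the rays intersect exactly at the marker's 3-D position.
marker = closest_point_between_rays(
    (0.0, 0.0, 0.0), (0.5, 0.0, 1.0),   # camera 1 origin and ray direction
    (1.0, 0.0, 0.0), (-0.5, 0.0, 1.0),  # camera 2 origin and ray direction
)
```

With noisy detections the two rays are skew rather than intersecting, and the midpoint of the shortest connecting segment is the conventional estimate of the marker position.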
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have further modified the system of Berend in view of Nikou such that the medical server triangulates the identified fixed camera three-dimensional extrinsic points because it would have been a known method in the art for transferring data for processing (Paragraph [0117]) that further would have allowed more robust computation via cloud computing, thus improving the accuracy of the registration, and would have improved registration through access to historical data (Paragraphs [0119] and [0142]), and furthermore the remote server can be used, for example, for storage of data or execution of computationally intensive processing tasks (Paragraph [0061]). Claim 18 is rejected under 35 U.S.C. 103 as being unpatentable over Berend in view of Nikou and Kienzle as applied to claim 15 above, and further in view of Gibby (US 20220291741). Regarding claim 18, together Berend, Nikou, and Kienzle teach all of the limitations of claim 15 as noted above. Berend does not explicitly teach the medical device includes a c-arm type fluoroscopy device with a first fluoroscopy device locator placed at an x-ray emission node, a second medical device locator placed at an x-ray detection node, and a third medical device locator coupled to a c-arm of the c-arm type fluoroscopy device such that the plurality of cameras fixed to the support structure capture independent images of the first medical device locator, the second medical device locator, and third medical device locator in order to determine a scanline between the x-ray emission node and the x-ray detection node.
Gibby, however, teaches the medical device includes a c-arm type fluoroscopy device (Paragraph [0052]; in the fluoroscopic image 654, and optical codes 652, 658 on the fluoroscopic device (e.g., a C-arm device)) with a first fluoroscopy device locator placed at an x-ray emission node (Paragraph [0052]; optical codes 658, Fig. 7 and 8), a second medical device locator placed at an x-ray detection node (Paragraph [0052]; optical codes 652, Fig. 7 and 8), and a third medical device locator coupled to a c-arm of the c-arm type fluoroscopy device (Paragraph [0052]; optical codes on the fluoroscopic device (e.g., a C-arm device); Fig. 8 shows a third optical code on the c-arm device) such that the plurality of cameras fixed to the support structure capture independent images of the first medical device locator, the second medical device locator, and third medical device locator in order to determine a scanline between the x-ray emission node and the x-ray detection node (Paragraph [0021]; fluoroscopic image may be aligned to overlay the portion of the body of the person being imaged with the X-ray beam; Paragraph [0055]; Further, the fluoroscopic image 654 may be positioned and oriented using the position and orientation of the fluoroscopic device 660 with respect to the body of the person 606; Paragraph [0060]; enable the orientation of the fluoroscopic image 654 and the image projection 656 to be aligned in the appropriate orientation with respect to the body of the person 606a as viewed through AR headset… defined by the optical codes on the body, image visible marker, and/or as defined by the modified position and/or orientation of the fluoroscopic device. Thus, position and orientation of the image projection 656 and fluoroscopic image 654 changes when the position and orientation of the X-ray beam changes.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Berend in view of Nikou and Kienzle to have included a c-arm type fluoroscopy device with a first medical device locator placed at an x-ray emission node, a second medical device locator placed at an x-ray detection node, and a third medical device locator coupled to a c-arm of the c-arm type fluoroscopy device such that the plurality of cameras fixed to the support structure capture independent images of the first medical device locator, the second medical device locator, and third medical device locator in order to determine a scanline between the x-ray emission node and the x-ray detection node as taught by Gibby. This would have enabled the AR system to reconstruct the image projection so the image projection is parallel to the fluoroscopic image obtained from the fluoroscopic detector so that a medical professional may see the anatomical structures of the person or patient using the image projection as an overlay to the fluoroscopic image, thereby allowing the medical professional to see fluoroscopic images and get a better view of the surgical area during the medical procedure and thus more accurately perform the medical procedure in the target region (Gibby, Paragraphs [0060]-[0062]). Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Berend in view of Nikou and Kienzle as applied to claim 15 above, and further in view of Gregerson (US 20180185113). Regarding claim 19, together Berend, Nikou, and Kienzle teach all of the limitations of claim 15 as noted above.
Berend does not explicitly teach the medical device includes a needle and the medical device locator includes a needle location array that has a geometry known by the medical server, wherein the captured independent images from the plurality of cameras fixed to the support structure are used by the medical server to determine the relative position of the needle location array to a tip of the needle. Nikou, however, further teaches captured independent images from the plurality of cameras fixed to the support structure are used by the medical server (Paragraphs [0189]-[0193]; the system may detect that, for example, the tracking probe 1004 was in the second divot by determining the probe-to-bone tracker distance; a system that allows for more direct visibility of bone trackers to one or more cameras regardless of the position of the patient) to determine the relative position of a location array to a probe (Paragraph [0191]; as the tracking probe is moved (e.g., pivoted around within the divot), the tracking system will be able to discern which divot the probe tip 1003 is in based on the relative distance from the tracking frame 1002 to the stationary probe tip). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Berend in view of Nikou such that the captured independent images from the plurality of cameras fixed to the support structure are used by the medical server to determine the relative position of a location array to a tip of a probe because it would allow for more direct visibility of bone trackers to one or more cameras regardless of the position of the patient, and further the system may automatically detect that a probe tracker is being used and automatically begin the calibration/calculations as required (Paragraphs [0189] and [0193]), thereby allowing more accurate probe and surgical equipment placement during the procedure.
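The divot-discrimination step Nikou is cited for amounts to a nearest-match lookup: the measured distance from the tracking frame to the stationary (pivoting) probe tip is compared against each divot's known distance. A minimal sketch, with hypothetical names and distances (not from the reference):

```python
def identify_divot(measured_dist, divot_dists):
    """Pick the divot whose known frame-to-tip distance best
    matches the measured distance to the stationary pivot point."""
    return min(divot_dists, key=lambda name: abs(divot_dists[name] - measured_dist))

# Hypothetical known distances (mm) from the tracking frame to each divot.
divots = {"divot_1": 25.0, "divot_2": 40.0, "divot_3": 55.0}
result = identify_divot(41.2, divots)
print(result)  # divot_2
```

Because the tip stays fixed while the probe pivots, the frame-to-tip distance is stable across frames, which is what makes this simple comparison reliable.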
Together Berend and Nikou do not explicitly teach the medical device includes a needle and the medical device locator includes a needle location array that has a geometry known by the medical server at the distal end of the needle. Gregerson, however, teaches a medical device (Paragraph [0049]; a tool 104, Figs. 1 and 10) includes a needle (Paragraph [0049]; The tool 104 may also be a surgical instrument, such as a needle) and the medical device locator includes a needle location array (Paragraph [0114]; an array of markers 1014 may be attached to the opposite end, Fig. 10B) that has a geometry known by the medical server (Paragraph [0049]; graphical depiction may be based on a known geometry of the tool 104, end effector 102 or other object.), wherein the captured independent images from the plurality of cameras fixed to the support structure (Paragraph [0047]; During imaging scans, the optical sensing device 111 may track the position and orientation of the patient 200 with respect to the camera position, which is in a known, fixed geometric relationship with the isocenter of the imaging device 103) are used by the medical server to determine the relative position of the needle location array to a tip of the needle at the distal end of the needle (Paragraph [0063]; The motion tracking system 105 may determine the location of a portion of the end effector 102, such as a tip end 607 of the end effector (e.g., a tip of a cannula 609 or other tool holder), which may have a known fixed geometric relationship to the marker device 202).
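The "known fixed geometric relationship" Gregerson relies on is a rigid-body transform: once the tracker reports the marker array's pose, the tip position follows from the tip's constant offset in the array's own frame. An illustrative sketch (hypothetical values; not from the reference):

```python
import numpy as np

def tip_position(R, t, tip_offset):
    """Map the needle tip's fixed offset in the marker-array frame
    into tracker coordinates, given the array's tracked pose (R, t)."""
    return np.asarray(R) @ np.asarray(tip_offset, dtype=float) + np.asarray(t, dtype=float)

# Identity orientation, array origin at (10, 0, 0) mm,
# tip 120 mm along the array's local x-axis.
tip = tip_position(np.eye(3), [10.0, 0.0, 0.0], [120.0, 0.0, 0.0])
print(tip)  # [130.   0.   0.]
```

The cameras only ever see the marker array; the tip is never imaged directly, which is why the offset must be known (or calibrated) in advance.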
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Berend in view of Nikou and Kienzle such that the medical device includes a needle and the medical device locator includes a needle location array that has a geometry known by the medical server as taught by Gregerson because it would improve the image display by more accurately displaying the tip of the needle with respect to the patient in the image dataset (Paragraphs [0063]-[0067]). Furthermore, it would allow the surgeon to visualize multiple trajectories or paths extending from the patient's skin surface through the patient's anatomy to the target position, and allow the surgeon to view a set of trajectories in multiple planes by moving the tip end (Paragraph [0070]).

Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Berend in view of Nikou, Kienzle, and Gregerson as applied to claim 19 above, and further in view of Lefauconnier (US 20240138931). Regarding claim 20, together Berend, Nikou, Kienzle, and Gregerson teach all of the limitations of claim 19 as noted above. Berend does not explicitly teach determining the relative position of the needle location array to a tip of the needle based on needle type input data provided by a medical professional at the medical server.
Lefauconnier, however, teaches determining the relative position of the needle location array to a tip of the needle (Paragraph [0049]; a snap-lock, or other type of geometrically-defined lock between optical marker part 50 and the screw head SH, so that the optical marker part 50 can be connected to screw head SH at a precisely defined position) based on needle type input data provided by a medical professional at the medical server (Paragraph [0116]; and step U240 receiving input data from surgeon or operator O that actually selects the different pedicle markers PM or guide wires GW that have been detected, analogously to steps D25, D30, U40.; C220 of calculating the geometry can be performed by data processing device 100, where virtual attachment points AP_V can be calculated, being a specific geometric location where fixation rod R will most likely be located with respect to a corresponding pedicle screw PS, the pedicle screw PS not yet being attached or anchored to the vertebrae V). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have modified the system of Berend in view of Nikou, Kienzle, and Gregerson to include determining the relative position of the needle location array to a tip of the needle based on needle type input data provided by a medical professional at the medical server as taught by Lefauconnier because it would have allowed measurement and visualization of the specific needle based on prestored statistical data on the geometric relationship between positions and orientations of different needles (Paragraph [0116]), thereby allowing more accurate visualization when performing surgery with different needles.

Response to Arguments

Claim Objections
Examiner acknowledges the amendments to the claims and withdraws all objections to the claims.

Claim Rejections under 35 U.S.C. § 112(b)
Examiner acknowledges the amendments to claims 1 and 6 and withdraws all previous rejections under 35 U.S.C. § 112(b). The amendments to the claims raise new rejections under 35 U.S.C. § 112(b), which are now presented.

Claim Rejections under 35 U.S.C. §§ 102 and 103
Applicant's arguments with respect to the previous 35 U.S.C. §§ 102 and 103 rejections have been considered but are moot in view of the updated grounds of rejection necessitated by amendments.

Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Dean N Edun, whose telephone number is (571) 270-3745. The examiner can normally be reached M-F, 8am-5:30pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Anh Tuan Nguyen, can be reached at (571) 272-4963. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DEAN N EDUN/
Examiner, Art Unit 3797

/ANH TUAN T NGUYEN/
Supervisory Patent Examiner, Art Unit 3795

1/28/26

Prosecution Timeline

Sep 11, 2023: Application Filed
Jul 23, 2025: Non-Final Rejection — §102, §103, §112
Oct 27, 2025: Applicant Interview (Telephonic)
Oct 27, 2025: Examiner Interview Summary
Oct 29, 2025: Response Filed
Jan 23, 2026: Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12582376
CONSTITUTIVE EQUATION FOR NON-INVASIVE BLOOD PRESSURE MEASUREMENT SYSTEMS AND METHODS
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12575750
ASYMMETRIC SENSORS FOR RING WEARABLE
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12543967
APPARATUS AND METHOD FOR QUANTIFICATION OF THE MAPPING OF THE SENSORY AREAS OF THE BRAIN
Granted Feb 10, 2026 (2y 5m to grant)
Patent 12521019
SYSTEMS AND METHODS OF RELATIVE ONSET FLUORESCENCE DELAY FOR MEDICAL IMAGING
Granted Jan 13, 2026 (2y 5m to grant)
Patent 12426852
CATHETER WITH ACOUSTIC LENS ARRANGEMENT FOR LOCALIZED ULTRASONIC WAVE TRANSMISSION
Granted Sep 30, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 43%
With Interview: 99% (+65.0% lift)
Median Time to Grant: 3y 5m
PTA Risk: Moderate
Based on 35 resolved cases by this examiner. Grant probability derived from career allow rate.
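The headline 43% grant probability follows directly from the examiner's career record quoted in this report (15 granted of 35 resolved). A quick check of the arithmetic, assuming the dashboard simply rounds the raw allow rate:

```python
granted, resolved = 15, 35  # examiner's career record shown in this report
allow_rate = granted / resolved  # 0.4286...
print(f"Career allow rate: {allow_rate:.1%}")  # Career allow rate: 42.9%
```

The interview-adjusted 99% figure comes from the dashboard's own model and is not derivable from the allow rate alone.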
