Prosecution Insights
Last updated: April 19, 2026
Application No. 18/122,802

SURGICAL SYSTEMS, METHODS, AND DEVICES EMPLOYING AUGMENTED REALITY (AR) GRAPHICAL GUIDANCE

Final Rejection under §§ 102, 103, and 112

Filed: Mar 17, 2023
Examiner: CHOI, YOUNHEE JEON
Art Unit: 3797
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: DePuy Synthes Products, Inc.
OA Round: 2 (Final)

Grant Probability: 72% (Favorable)
Expected OA Rounds: 3-4
Estimated Time to Grant: 3y 6m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 72% (above average; 133 granted / 186 resolved; +1.5% vs TC avg)
Interview Lift: +49.5% allowance rate among resolved cases with an interview
Avg Prosecution: 3y 6m typical timeline; 29 applications currently pending
Total Applications: 215 across all art units
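The interview-lift figure above is a difference in allowance rates between resolved cases with and without an examiner interview. A minimal sketch of that arithmetic, using hypothetical case counts (the actual with/without split is not given here):

```python
# Illustrative sketch (not the analytics vendor's model): how an
# "interview lift" figure is derived from per-case outcomes.
def allowance_rate(granted: int, resolved: int) -> float:
    """Fraction of resolved cases that were allowed."""
    return granted / resolved

# Hypothetical split of the examiner's 186 resolved cases.
rate_with_interview = allowance_rate(57, 60)      # cases with an interview
rate_without_interview = allowance_rate(76, 126)  # cases without one

# Interview lift = percentage-point gap between the two allowance rates.
lift_pp = 100 * (rate_with_interview - rate_without_interview)
print(f"lift: +{lift_pp:.1f} percentage points")
```

With these hypothetical counts the lift comes out near +35 points; the page's +49.5% figure would reflect the examiner's actual case split.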

Statute-Specific Performance

§101: 2.5% (-37.5% vs TC avg)
§103: 42.8% (+2.8% vs TC avg)
§102: 16.8% (-23.2% vs TC avg)
§112: 33.5% (-6.5% vs TC avg)
Tech Center averages are estimates. Based on career data from 186 resolved cases.
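The per-statute deltas above are simple differences between the examiner's rate and the Tech Center average estimate, in percentage points. A minimal sketch, with the TC averages back-derived from the listed figures rather than independently sourced:

```python
# Illustrative check of the "vs TC avg" deltas: examiner rate minus the
# Tech Center average estimate, per statute, in percentage points.
# TC averages below are back-derived from the listed deltas (each works
# out to 40.0 here), not independently sourced.
examiner_rate = {"101": 2.5, "103": 42.8, "102": 16.8, "112": 33.5}
tc_average = {"101": 40.0, "103": 40.0, "102": 40.0, "112": 40.0}

deltas = {s: round(examiner_rate[s] - tc_average[s], 1) for s in examiner_rate}
print(deltas)  # {'101': -37.5, '103': 2.8, '102': -23.2, '112': -6.5}
```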

Office Action

Rejections under §§ 102, 103, and 112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed 03 Oct 2025 (see pg. 6) with respect to the claim interpretation under 35 U.S.C. 112(f) have been fully considered but are not persuasive. Applicant merely argues that "For example, 'an augmented reality (AR) system' and 'a position tracking system' are terms that connote definite structure in the context of a computer aided surgery (CAS) system." However, the Examiner respectfully disagrees. As noted in the Non-Final Office Action of 03 Jul 2025, Applicant's original specification discloses structures for the claimed AR system and position tracking system that can be explicitly recited in at least the independent claims to overcome the claim interpretation under 35 U.S.C. 112(f). Since the amended claims of 03 Oct 2025 do not explicitly recite any structures for the claimed functions, the claim interpretation of 03 Jul 2025 under 35 U.S.C. 112(f) is hereby maintained.

Status of Claims

Claims 1-6 and 8-21 are currently under examination. Claim 7 has been cancelled and claim 21 has been newly added since the Non-Final Office Action of 03 Jul 2025.

Drawings

The drawings are objected to under 37 CFR 1.83(a). The drawings must show every feature of the invention specified in the claims. Therefore, the new limitation "wherein the representation is at least one of: a cone; a cuboid" recited in claims 1 and 19, as well as the new limitations "wherein the representation is a cone, ..." recited in claim 2 and "wherein, when the representation is the cone, ... when the representation is the cuboid ..." recited in claim 20, must be shown or the feature(s) canceled from the claim(s). Specifically, Fig. 3 of the instant application discloses a frustum, not a cone, and Fig. 4 discloses a trapezoidal prism, not a cuboid.
Additionally, the new limitations "wherein the representation is an overlay of a virtual patient nerve adjacent to the instrument" recited in claim 16 and "wherein the representation is an overlay of a virtual patient vascular structure adjacent to the instrument" recited in claim 17 must be shown or the feature(s) cancelled from the claim(s). Specifically, Fig. 6 discloses only an overlay of a bone, not a nerve or a vascular structure. No new matter should be entered.

Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as "amended." If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either "Replacement Sheet" or "New Sheet" pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
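The drawing objection turns on a geometric distinction: a cone tapers to an apex (tip radius zero), while the shape shown in Fig. 3 is a frustum, i.e. a cone truncated short of its apex so that both end radii are nonzero. A minimal sketch of that distinction, with illustrative parameter names not drawn from the application:

```python
# Editor's illustration of the cone-vs-frustum distinction at issue in
# the drawing objection. Parameter names are hypothetical.
def classify_solid(tip_radius: float, base_radius: float) -> str:
    """Classify a circular solid of revolution by its two end radii."""
    if tip_radius == 0 and base_radius > 0:
        return "cone"      # tapers all the way to a point
    if 0 < tip_radius < base_radius:
        return "frustum"   # truncated before the apex, as in Fig. 3
    return "cylinder" if tip_radius == base_radius else "unknown"

print(classify_solid(0.0, 2.0))  # cone
print(classify_solid(0.5, 2.0))  # frustum
```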
Claim Objections

Claims 1, 5, and 19 are objected to because of the following informalities: "a cuboid; or a travel path; or" should read "a cuboid; or a travel path" (claims 1 and 19); "the determined position" should read "the determined first position" (claims 1 and 19); "a position on a patient" should read "a position on the patient" (claim 5); and "the tissue of a patient" should read "the tissue of the patient" (claim 19). Appropriate correction is required.

Claim Interpretation

The claim interpretation of 03 Jul 2025 under 35 U.S.C. 112(f) is hereby maintained. See the Non-Final Office Action of 03 Jul 2025.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-6 and 8-21 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement.
The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

Claims 1 and 19 each recite the new limitation "wherein the representation is at least one of: ... a cuboid ...". A review of the original specification of the instant application reveals no disclosure of the representation being a cuboid. Specifically, [0032] of the original specification discloses the representation including a cube, not a cuboid, and Fig. 4 discloses a trapezoidal prism, not a cuboid. Therefore, the new limitation in claims 1 and 19 introduces new matter. Claims 2-5, 8-18, and 20 inherit the deficiency by the nature of their dependency on claims 1 and 19, respectively.

Claim 2 recites the new limitation "the cone's orientation representing a field of view of the instrument extending from a tip of the distal end of the instrument based on a determined axis of the instrument". A review of the original specification reveals no disclosure of the field of view of the instrument being "based on a determined axis of the instrument". Specifically, [0029] of the original specification merely discloses the "field of view of the instrument extending from the distal end of the instrument". Claims 3-4 inherit the deficiency by the nature of their dependency on claim 2.

Claim 16 recites the new limitation "wherein the representation is an overlay of a virtual patient nerve adjacent to the instrument". A review of the original specification reveals no disclosure of the representation being an overlay of a virtual patient nerve adjacent to the instrument.
Specifically, [0045] of the original specification discloses nerve positions, nerve scans, or neuromonitoring results being overlaid, but not a virtual patient nerve. Therefore, the new limitation in claim 16 introduces new matter.

Claim 17 recites the new limitation "wherein the representation is an overlay of a virtual patient vascular structure adjacent to the instrument". A review of the original specification reveals no disclosure of the representation being an overlay of a virtual patient vascular structure adjacent to the instrument. Specifically, [0046] of the original specification discloses a blood vessel position being overlaid, but not a virtual patient vascular structure. Therefore, the new limitation in claim 17 introduces new matter.

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

Claims 1-6 and 8-21 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claims 1 and 19 each recite the new limitation "wherein the representation is at least one of: a cone". The metes and bounds of the representation being a cone are unclear. Specifically, [0029] of the original specification discloses "FIG. 3 depicts a schematic of an AR display with camera orientation visualization ... The controller may be configured to cause the AR system to display (such as in an overlay view or X-ray view) a representation that is a field of view of the instrument extending from the distal end of the instrument. The representation may include an orientation of the camera and a projection of the field of view cone," yet Fig. 3 discloses the representation as a frustum, not a cone.
The original specification does not specially define the term "cone". Therefore, it is unclear whether "a cone" recited in the claims refers to the cone shape well known in the art or to a frustum. Claims 2-5, 8-18, and 20 inherit the deficiency by the nature of their dependency on claims 1 and 19, respectively. For purposes of examination, the limitation in claims 1 and 19 is being given the broadest reasonable interpretation "wherein the representation is at least one of: a cone or frustum".

Claims 1 and 19 each recite the new limitation "wherein the representation is at least one of: ... a cuboid ...". The metes and bounds of the representation being a cuboid are unclear. Specifically, [0032] of the original specification discloses "the predicted working volume may be illustrated as a cube (a projected working volume)," yet Fig. 4 discloses the representation as a trapezoidal prism, not a cuboid. Therefore, it is unclear whether "a cuboid" recited in the claims refers to the cube shape well known in the art or to a trapezoidal prism. Claims 2-5, 8-18, and 20 inherit the deficiency by the nature of their dependency on claims 1 and 19, respectively. For purposes of examination, the limitation in claims 1 and 19 is being given the broadest reasonable interpretation "wherein the representation is at least one of: ... a cuboid or trapezoidal prism ...".

Claim 2 recites the new limitation "wherein the representation is a cone, the cone's orientation representing a field of view of the instrument extending from a tip of the distal end of the instrument based on a determined axis of the instrument". First, it is unclear whether "a cone" recited in the limitation is the same as or different from "a cone" recited in claim 1, from which claim 2 depends. Further, the antecedent basis for "the cone" is unclear as between "a cone" recited in claim 1 and "a cone" also recited in the new limitation of claim 2.
Second, the metes and bounds of the field of view being based on a determined axis of the instrument are unclear, especially because the original specification of the instant application does not disclose any algorithm for determining the axis of the instrument for the field of view of the instrument. Claims 3-4 inherit the deficiency by the nature of their dependency on claim 2. For purposes of examination, the limitation is being given the broadest reasonable interpretation "the cone's orientation representing a field of view of the instrument extending from a tip of the distal end of the instrument".

Claim 3 recites the new limitation "further comprising a three dimensional representation of a field of view of an endoscopic camera overlay". It is unclear whether "a three dimensional representation" is the same as or different from "a cone" recited in claim 1 or 2 or otherwise; a cone is well known in the art to be three dimensional. For purposes of examination, the limitation is being given the broadest reasonable interpretation "wherein the representation is the field of view of an endoscopic camera".

Claim 5 recites the new limitation "the representation is a cuboid which represents a working volume of the instrument based on the planning information". It is unclear whether "a cuboid" recited in the limitation is the same as or different from "a cuboid" recited in claim 1, from which claim 5 depends. Claims 6 and 8 inherit the deficiency by the nature of their dependency on claim 5. For purposes of examination, the limitation is being given the broadest reasonable interpretation "the representation is the cuboid which represents a working volume of the instrument based on the planning information".

Claim 9 recites the limitation "wherein the representation is a travel path of the distal end of the instrument over time, the travel path comprising a series of consecutive positions of a tip of the instrument".
It is unclear whether "a travel path" recited in the limitation is the same as or different from the newly recited "a travel path" of claim 1, from which claim 9 depends. Claims 10-15 inherit the deficiency by the nature of their dependency on claim 9. For purposes of examination, the limitation is being given the broadest reasonable interpretation "wherein the representation is the travel path of the distal end of the instrument over time, the travel path comprising a series of consecutive positions of a tip of the instrument".

Claim 20 recites the new limitation "wherein the representation is a cone, the cone's orientation representing a field of view of the instrument extending from a tip of the distal end of the instrument based on a determined axis of the instrument". The metes and bounds of the field of view being based on a determined axis of the instrument are unclear, especially because the original specification of the instant application does not disclose any algorithm for determining the axis of the instrument for the field of view of the instrument. For purposes of examination, the limitation is being given the broadest reasonable interpretation "the cone's orientation representing a field of view of the instrument extending from a tip of the distal end of the instrument".

Claim 21 recites the new limitation "further comprising providing virtual geofencing, wherein an alert is generated when the distal end of the instrument is about to enter a no-go area of the patient". First, "providing virtual geofencing" recites a method without specifying the structure within the claimed system of claim 21 that is configured to perform the method. Thus, it is unclear whether claim 21 depends from the system claim of claim 1 or the method claim of claim 19.
Second, it is also unclear whether "wherein an alert is generated when the distal end of the instrument is about to enter a no-go area of the patient" recites an intended result or attempts to claim a method without specifying the structure within the claimed system of claim 21 that is configured to perform the method. For purposes of examination, the limitation is being given the broadest reasonable interpretation "wherein the controller is further configured to provide virtual geofencing, and generate an alert when the distal end of the instrument is about to enter a no-go area of the patient".

Claim Rejections - 35 USC § 102

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

Claims 1 and 19 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Qian et al. ("ARssist: augmented reality on a head-mounted display for the first assistant in robotic surgery," Healthc Technol Lett. 2018 Sep 17;5(5):194-200, doi: 10.1049/htl.2018.5065; a copy was provided by the Applicant in the IDS of 02 Oct 2023), hereinafter referred to as Qian.

Regarding claim 1, Qian discloses a computer aided surgery (CAS) system (at least Fig. 2) comprising: an augmented reality (AR) system (Fig. 2: Head-mounted display (HMD)) configured to display augmented reality information (pg. 194: 1. Introduction: use augmented reality (AR), based on an optical see-through head-mounted display (HMD), which provides an unhindered and instantaneous real-world view, with computer graphics presented to the user on top of the real view through optical combiners); a position tracking system (Fig. 2, Table 1, and pg. 194-195: 3.1. Components and transformation map in ARssist) configured to track positions of objects (Fig. 2: Kinematics, simultaneous localisation and mapping (SLAM), Fiducial Tracking; pg. 195: 3.1.
Components and transformation map in ARssist: fiducial tracking, kinematics data, robot model, and pivot calibration for determining marker pose relative to a robot joint); an instrument (Fig. 4f: endoscopy) coupled to a navigational tracker detectable by the position tracking system (Fig. 2: robotic instrument (RI) with marker (M1) and Fig. 4f: fiducial marker on endoscopy; pg. 195: 3.1. Components and transformation map in ARssist: fiducial tracking, kinematics data, robot model, and pivot calibration for determining marker pose relative to a robot joint; pg. 196: 3.3.3. Endoscopy registered with the endoscope frustum: endoscope held by a robotic arm and the kinematics of the endoscopic arm for the pose of the endoscope at runtime; pg. 197: 4.1.2. Microsoft HoloLens: fiducial marker tracking implemented based on ARToolKit by HoloLens); and a controller (Fig. 5-6; pg. 197: 4.1.1. da Vinci research kit (dVRK): desktop computer with Ubuntu 16.04 operating system, Xeon(R) E5-1620 CPU and 28.8 G RAM; 4.1.2. Microsoft HoloLens: Microsoft HoloLens is a binocular OST-HMD featuring a holographic waveguide-based optical system, stable self-localisation capability, sufficient computational power for tracking, and good support from development tools) configured to: determine a first position of the instrument (pg. 195: 3.1. Components and transformation map in ARssist: transformations between the markers and the HMD, T M 1 H and T M 2 H , are computed at runtime through vision-based tracking algorithms; pg. 197: 4.1.2. Microsoft HoloLens: fiducial marker tracking implemented based on ARToolKit by HoloLens); based on the determined position, display augmented reality information using the AR system (Fig. 4f: Endoscopy visualization registered with viewing frustum; pg. 196: 3.3.3. 
Endoscopy registered with the endoscope frustum: Since the endoscope is held by a robotic arm, ARssist obtains the kinematics of the endoscopic arm and calculates the pose of the endoscope at runtime, and ARssist renders a frustum extending the tip of the endoscope and projects the endoscopy on a clipping plane of the frustum), the augmented reality information comprising a representation of a relationship between at least a distal end of the instrument and a tissue of a patient (Fig. 4f: Endoscopy visualization registered with viewing frustum; pg. 197: 3.3.3. Endoscopy registered with the endoscope frustum: ARssist renders a frustum extending the tip of the endoscope and projects the endoscopy on a clipping plane of the frustum to inform about the geometry of the endoscope), wherein the representation is at least one of: a cone; a cuboid; a travel path; or an overlay of a virtual anatomical feature (Fig. 4f: Endoscopy visualization registered with viewing frustum; pg. 197: 3.3.3. Endoscopy registered with the endoscope frustum: ARssist renders a frustum extending the tip of the endoscope and projects the endoscopy on a clipping plane of the frustum); and if the instrument moves to a second position, update the representation (Fig. 7 and pg. 197: 4.1.3. Data flow: data are obtained at runtime and are updated frequently; pg. 195: 3.1. Components and transformation map in ARssist: transformations between the markers and the HMD, T M 1 H and T M 2 H , are computed at runtime through vision-based tracking algorithms; pg. 197: 3.3.3. Endoscopy registered with the endoscope frustum: kinematics of the endoscopic arm obtained and pose of the endoscope is calculated at runtime, and combining the pose and field of view (FOV) of the endoscope, a frustum is rendered). It is noted that Qian discloses displaying the representation based on frequently updated runtime data of marker poses and robot kinematics (see Fig. 7 and pg. 197: 4.1.3. Data flow; pg. 195: 3.1. 
Components and transformation map in ARssist; and pg. 197: 3.3.3. Endoscopy registered with the endoscope frustum). Therefore, Qian inherently discloses updating the representation, since the runtime data is frequently updated.

Regarding claim 19, Qian discloses a method of using a computer aided surgery (CAS) system (at least Fig. 2) comprising: determining a first position of an instrument (Fig. 4f: endoscopy with fiducial marker; pg. 195: 3.1. Components and transformation map in ARssist: transformations between the markers and the HMD, T M 1 H and T M 2 H , are computed at runtime through vision-based tracking algorithms; pg. 197: 4.1.2. Microsoft HoloLens: fiducial marker tracking implemented based on ARToolKit by HoloLens) by a controller (Fig. 5-6; pg. 197: 4.1.1. da Vinci research kit (dVRK): desktop computer with Ubuntu 16.04 operating system, Xeon(R) E5-1620 CPU and 28.8 G RAM; 4.1.2. Microsoft HoloLens: Microsoft HoloLens is a binocular OST-HMD featuring a holographic waveguide-based optical system, stable self-localisation capability, sufficient computational power for tracking, and good support from development tools), wherein the instrument (Fig. 4f: endoscopy) is coupled to a navigational tracker detectable by a position tracking system (Fig. 2: robotic instrument (RI) with marker (M1) and Fig. 4f: fiducial marker on endoscopy; pg. 195: 3.1. Components and transformation map in ARssist: fiducial tracking, kinematics data, robot model, and pivot calibration for determining marker pose relative to a robot joint; pg. 196: 3.3.3. Endoscopy registered with the endoscope frustum: endoscope held by a robotic arm and the kinematics of the endoscopic arm for the pose of the endoscope at runtime; pg. 197: 4.1.2. Microsoft HoloLens: fiducial marker tracking implemented based on ARToolKit by HoloLens), wherein the position tracking system (Fig. 2, Table 1, and pg. 194-195: 3.1.
Components and transformation map in ARssist) is in communication with the controller (pg. 195: 3.1. Components and transformation map in ARssist: transformations between the markers and the HMD, T M 1 H and T M 2 H , are computed at runtime through vision-based tracking algorithms; pg. 197: 4.1.2. Microsoft HoloLens: fiducial marker tracking implemented based on ARToolKit by HoloLens) and is configured to track positions of objects (Fig. 2: Kinematics, simultaneous localisation and mapping (SLAM), Fiducial Tracking; pg. 195: 3.1. Components and transformation map in ARssist: fiducial tracking, kinematics data, robot model, and pivot calibration for determining marker pose relative to a robot joint); based on the determined position, causing, by the controller, an augmented reality (AR) system (Fig. 2: Head-mounted display (HMD)) configured to display augmented reality information (pg. 194: 1. Introduction: use augmented reality (AR), based on an optical see through head-mounted display (HMD), which provides an unhindered and instantaneous real-world view, with computer graphics presented to the user on top of the real-view through optical combiners) and in communication with the controller (pg. 195: 3.1. Components and transformation map in ARssist; pg. 197: 4.1.2. Microsoft HoloLens), to display augmented reality information comprising a representation of a relationship between at least a distal end of the instrument and a tissue of a patient (Fig. 4f: Endoscopy visualization registered with viewing frustum; pg. 196: 3.3.3. 
Endoscopy registered with the endoscope frustum: Since the endoscope is held by a robotic arm, ARssist obtains the kinematics of the endoscopic arm and calculates the pose of the endoscope at runtime, and ARssist renders a frustum extending the tip of the endoscope and projects the endoscopy on a clipping plane of the frustum to inform about the geometry of the endoscope), wherein the representation is at least one of: a cone; a cuboid; a travel path; or an overlay of a virtual anatomical feature (Fig. 4f: Endoscopy visualization registered with viewing frustum; pg. 197: 3.3.3. Endoscopy registered with the endoscope frustum: ARssist renders a frustum extending the tip of the endoscope and projects the endoscopy on a clipping plane of the frustum); and based on a determination by the controller that the instrument has moved to a second position using information from the position tracking system, causing, by the controller, the AR system to display an updated representation of the relationship between the distal end of the instrument and the tissue of a patient (Fig. 7 and pg. 197: 4.1.3. Data flow: data are obtained at runtime and are updated frequently; pg. 195: 3.1. Components and transformation map in ARssist: transformations between the markers and the HMD, T M 1 H and T M 2 H , are computed at runtime through vision-based tracking algorithms; pg. 197: 3.3.3. Endoscopy registered with the endoscope frustum: kinematics of the endoscopic arm obtained and pose of the endoscope is calculated at runtime, and combining the pose and field of view (FOV) of the endoscope, a frustum is rendered). It is noted that Qian discloses displaying the representation based on frequently updated runtime data of marker poses and robot kinematics (see Fig. 7 and pg. 197: 4.1.3. Data flow; pg. 195: 3.1. Components and transformation map in ARssist; and pg. 197: 3.3.3. Endoscopy registered with the endoscope frustum). 
Therefore, Qian inherently discloses updating the representation, since the runtime data is frequently updated.

Claim Rejections - 35 USC § 103

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

Claims 2-3 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Qian, as applied to claims 1 and 19 respectively above, and further in view of Rafii-Tari et al. (US PG Pub No. 2021/0059559), hereinafter referred to as Rafii-Tari ('559).

Regarding claim 2, Qian discloses all limitations of claim 1, as discussed above, and Qian further discloses: wherein the representation is a frustum (Fig. 4f: Endoscopy visualization registered with viewing frustum), the frustum's orientation representing a field of view of the instrument extending from a tip of the distal end of the instrument based on a determined axis of the instrument (pg. 197: 3.3.3. Endoscopy registered with the endoscope frustum: ARssist renders a frustum extending the tip of the endoscope and projects the endoscopy on a clipping plane of the frustum). Qian does not disclose that the representation is a cone. Rafii-Tari ('559), in the same field of providing an augmented reality system, however, teaches: a representation in a cone (Fig. 32 and [0167]: graphical indicator 1020 is a cone formed with an angle defining an aperture of the cone). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian's system to include Rafii-Tari ('559)'s representation of a cone. The combination would have yielded a reasonable expectation of success, since both Qian and Rafii-Tari ('559) are directed to tracking an instrument using an augmented reality system.
The motivation for the combination would have been to provide a suitable indicator, including "an aperture of the cone, where the aperture is based on an estimated error ranges of the orientation of the distal end 910 of the medical instrument", as taught by Rafii-Tari ('559; [0167]).

Regarding claim 3, Qian in view of Rafii-Tari ('559) discloses all limitations of claim 2, as discussed above, and Qian further discloses: a three dimensional representation of a field of view of an endoscopic camera overlay (Fig. 4f: Endoscopy visualization registered with viewing frustum; pg. 196-197: 3.3.3. Endoscopy registered with the endoscope frustum: endoscope comprising a standard camera, and ARssist renders a frustum extending the tip of the endoscope and projects the endoscopy on a clipping plane of the frustum).

Regarding claim 20, Qian discloses all limitations of claim 19, as discussed above, and Qian further discloses: wherein the representation is a frustum (Fig. 4f: Endoscopy visualization registered with viewing frustum), the frustum's orientation representing a field of view of the instrument extending from a tip of the distal end of the instrument based on a determined axis of the instrument (pg. 197: 3.3.3. Endoscopy registered with the endoscope frustum: ARssist renders a frustum extending the tip of the endoscope and projects the endoscopy on a clipping plane of the frustum). Qian does not disclose that the representation is a cone. Rafii-Tari ('559), in the same field of providing an augmented reality system, however, teaches: a representation in a cone (Fig. 32 and [0167]: graphical indicator 1020 is a cone formed with an angle defining an aperture of the cone). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian's method to include Rafii-Tari ('559)'s representation of a cone.
The combination would have yielded a reasonable expectation of success, since both Qian and Rafii-Tari ('559) are directed to tracking an instrument using an augmented reality system. The motivation for the combination would have been to provide a suitable indicator, including "an aperture of the cone, where the aperture is based on an estimated error ranges of the orientation of the distal end 910 of the medical instrument", as taught by Rafii-Tari ('559; [0167]).

Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Qian in view of Rafii-Tari ('559), as applied to claim 2 above, and further in view of Inglis et al. (US PG Pub No. 2021/0137350), hereinafter referred to as Inglis.

Regarding claim 4, Qian in view of Rafii-Tari ('559) discloses all limitations of claim 2, as discussed above, and Qian does not disclose: wherein the controller is further configured to determine a virtual view simulating a view of the tissue of the patient along the field of view; and cause the AR system to display the virtual view simulating the view of the tissue of the patient along the field of view. Inglis, in the same field of providing an augmented reality system, however, teaches: determining a virtual view simulating a view of the tissue of the patient along the field of view ([0030]-[0031]: generation of the AR objects, including augmented floating window 32 of camera image 34 that is a current, real-time video feed from the camera at the distal end of the endoscope); and causing an AR system to display the virtual view simulating the view of the tissue of the patient along the field of view (Fig. 2A-B: floating window 32 comprising camera image 34 of the current, real-time video feed from the camera at the distal end of the endoscope 12 during the clinical procedure).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian’s system to include Inglis’s display of a virtual view simulating a view of the tissue of the patient along a field of view of an endoscope. The combination would have yielded a reasonable expectation of success, since both Qian and Inglis are directed to an augmented reality system for a surgical setting. The motivation for the combination would have been to provide “the current, real-time video feed from the camera at the distal end of the endoscope 12 during the clinical procedure”, as taught by Inglis ([0031]).

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Qian, as applied to claim 1 above, and further in view of Razzaque et al. (US PG Pub No. 2017/0065352) – hereinafter referred to as Razzaque.

Regarding claim 5, Qian discloses all limitations of claim 1, as discussed above, and Qian does not disclose: wherein the controller is further configured to receive planning information regarding a position on a patient where the instrument is to be used, and the representation is a cuboid which represents a working volume of the instrument based on the planning information. Razzaque in the same field of providing a display, however, teaches: receiving planning information regarding a position on a patient where the instrument is to be used (Fig. 26 and [0080]: volumetric region of interest 2670 including a planar CT scan’s region of interest 2660; [0074]: CT scan or other 3D preoperative imaging data obtained), and the representation is a cuboid which represents a working volume of the instrument based on the planning information (Fig. 26 and [0080]: volumetric region of interest 2670 including planar CT scan’s region of interest 2660 defined by medical device 2645, which is a rectilinear or cuboid shape).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian’s system to include Razzaque’s cuboid representation of a working volume of interest. The combination would have yielded a reasonable expectation of success, since both Qian and Razzaque are directed to providing display guidance for a surgical setting. The motivation for the combination would have been to provide a display of the region of interest “defin(ing) a volume or plane within the 3D visualizable medical data” relative to a medical instrument, as taught by Razzaque ([0079]).

Claims 6 and 8 are rejected under 35 U.S.C. 103 as being unpatentable over Qian in view of Razzaque, as applied to claim 5 above, and further in view of Paul et al. (US PG Pub No. 2021/0378752, provided by the Applicant in the IDS of 02 Oct 2023) – hereinafter referred to as Paul.

Regarding claim 6, Qian in view of Razzaque discloses all limitations of claim 5, as discussed above, and Qian does not disclose: wherein the planning information comprises information regarding a vertebral body. Paul in the same field of using an augmented reality system, however, teaches: planning information comprising information regarding a vertebral body ([0129]-[0133]: pre-operatively planned procedure type including vertebral decompression, vertebral diffusion, and lumbar discectomy).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian’s system to include Paul’s planning information regarding a vertebral body. The combination would have yielded a reasonable expectation of success, since both Qian and Paul are directed to using an augmented reality system for a surgical setting. The motivation for the combination would have been to “provid[e] navigation information to users and/or surgical robots for spine surgeries”, as taught by Paul ([0001]).
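As an editorial illustration only, not part of the Office Action record: the cuboid “working volume” of claim 5 and the “virtual geofencing” alert of claim 21 both reduce, on the simplest reading, to an axis-aligned box test on the tracked tip position. The sketch below is a minimal version of that idea; every name (`WorkingVolume`, `contains`, `breach_alert`) and the millimeter coordinates are hypothetical assumptions, not taken from Qian, Razzaque, Paul, or Walen.

```python
# Illustrative sketch only: a cuboid "working volume" as an axis-aligned
# bounding-box containment test on the tracked instrument tip. All names and
# numbers are hypothetical, not drawn from any cited reference.
from dataclasses import dataclass


@dataclass
class WorkingVolume:
    """Axis-aligned cuboid derived from planning information (e.g., a planned disc space)."""
    min_corner: tuple  # (x, y, z) in tracker coordinates, e.g. millimeters
    max_corner: tuple

    def contains(self, tip):
        """True if the tracked distal tip lies inside the planned working volume."""
        return all(lo <= p <= hi for p, lo, hi in zip(tip, self.min_corner, self.max_corner))

    def breach_alert(self, tip, margin=2.0):
        """Geofencing-style warning: flag a tip still inside but within `margin` of any face."""
        inside = self.contains(tip)
        near_face = any(
            p - lo < margin or hi - p < margin
            for p, lo, hi in zip(tip, self.min_corner, self.max_corner)
        )
        return inside and near_face


vol = WorkingVolume(min_corner=(0, 0, 0), max_corner=(40, 30, 20))
print(vol.contains((10, 10, 10)))      # tip well inside the working volume
print(vol.breach_alert((39, 10, 10)))  # inside, but within 2 mm of a face
```

A production system would of course use an oriented box registered to the patient, but the containment test above is the core of both the working-volume display and the no-go alert.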
Regarding claim 8, Qian in view of Razzaque and Paul discloses all limitations of claim 6, as discussed above, and Paul further teaches: wherein the instrument is a disc removal tool ([0147]-[0154]: pre-operatively planned tools including disc box cutter, disc rongeurs, etc.) and the working volume corresponds to boundaries for the distal end of the tool being disposed in an intervertebral disc space between two vertebral bodies ([0153]-[0155]: pre-operatively planned spinal disc preparation and created amount of disc space by removing a portion of the intervertebral disc; Fig. 15: disc space bounded by adjacent vertebrae 1410a-b).

Claims 9-11 are rejected under 35 U.S.C. 103 as being unpatentable over Qian, as applied to claim 1 above, and further in view of Rafii-Tari et al. (US PG Pub No. 2019/0110839) – hereinafter referred to as Rafii-Tari (‘839).

Regarding claim 9, Qian discloses all limitations of claim 1, as discussed above, and Qian does not disclose: wherein the representation is a travel path of the distal end of the instrument over time, the travel path comprising a series of consecutive positions of a tip of the instrument. Rafii-Tari (‘839) in the same field of tracking a travel path of an instrument, however, teaches: a representation being a travel path of a distal end of an instrument over time (Fig. 26 and [0168]: visual indicia 172, illustrated as a darkened triangle, signify that the instrument has traveled down a path), the travel path comprising a series of consecutive positions of a tip of the instrument ([0103]: embedded EM tracker in one or more positions of the medical instrument (e.g., the distal tip of an endoscope) may provide real-time indications of the progression of the medical instrument through the patient's anatomy).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian’s system to include Rafii-Tari (‘839)’s representation of a travel path of a distal end of an instrument over time. The combination would have yielded a reasonable expectation of success, since both Qian and Rafii-Tari (‘839) are directed to tracking a travel path of an instrument. The motivation for the combination would have been “to provide different information to a user regarding the historical positions of the instrument”, as taught by Rafii-Tari (‘839; [0168]-[0169]).

Regarding claim 10, Qian in view of Rafii-Tari (‘839) discloses all limitations of claim 9, as discussed above, and Rafii-Tari (‘839) further teaches (also see claim 9 above): the travel path providing an indication of portions of the travel path that are more heavily traveled and more lightly traveled (Fig. 26 and [0168]: visual indicia 172, illustrated as a darkened triangle, signify that the instrument has traveled down a path, and visual indicia 174, illustrated as an undarkened triangle, signify a path that has not been explored by the instrument). It is noted that Fig. 26 of Rafii-Tari (‘839) discloses a heavily traveled path as a longer trail of darkened circles and triangles and a lightly traveled path as a shorter trail of darkened circles and triangles.

Regarding claim 11, Qian in view of Rafii-Tari (‘839) discloses all limitations of claim 9, as discussed above, and Rafii-Tari (‘839) further teaches (also see claim 9 above): the travel path indicating areas that are predicted as requiring more traversing of the distal end of the instrument (Fig. 26 and [0168]: visual indicia 172, illustrated as a darkened triangle, signify that the instrument has traveled down a path, and visual indicia 174, illustrated as an undarkened triangle, signify a path that has not been explored by the instrument). It is noted that Fig.
26 of Rafii-Tari (‘839) discloses areas requiring more traversing as a shorter trail of darkened circles and triangles.

Claims 12-15 are rejected under 35 U.S.C. 103 as being unpatentable over Qian in view of Rafii-Tari (‘839), as applied to claims 9 and 11 respectively above, and further in view of Paul.

Regarding claim 12, Qian in view of Rafii-Tari (‘839) discloses all limitations of claim 11, as discussed above, and Qian does not disclose: wherein the instrument is a scraper, the tissue of the patient is an intervertebral disc space between two vertebral bodies, and the indication is an area of disc space predicted as requiring more scraping before insertion of an intervertebral implant. Paul in the same field of using an augmented reality system, however, teaches: the instrument being a scraper ([0129]-[0155]: pre-operatively planned or intra-operatively used tools including scrapers), the tissue of the patient being an intervertebral disc space between two vertebral bodies ([0129]-[0155]: pre-operatively planned or intra-operatively created amount of disc space by removing a portion of the intervertebral disc), and an indication being an area of disc space predicted as requiring more scraping before insertion of an intervertebral implant (Fig. 14-15 and [0221]: planned surgical procedure displaying disc gap 1510 during the intraoperative stage for execution of the surgical plan; [0129]-[0154]: planned and created amount of disc space by removing a portion of the intervertebral disc).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian’s system to include Paul’s utilization of the augmented reality system in a surgical setting. The combination would have yielded a reasonable expectation of success, since both Qian and Paul are directed to using an augmented reality system for a surgical setting.
The motivation for the combination would have been to “provid[e] navigation information to users and/or surgical robots for spine surgeries”, as taught by Paul ([0001]).

Regarding claim 13, Qian in view of Rafii-Tari (‘839) discloses all limitations of claim 9, as discussed above, and Qian does not disclose: wherein the controller is further configured to use the travel path to determine a position of a surface of the patient's skin. Paul in the same field of using an augmented reality system, however, teaches: a controller configured to use the travel path to determine a position of a surface of the patient's skin ([0129]-[0149]: machine learning model 1300 trained based on pre-operative and intra-operative stage data, including planned or used incision location on the patient).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian’s system to include Paul’s controller configured to use the travel path to determine a position of a surface of the patient’s skin. The combination would have yielded a reasonable expectation of success, since both Qian and Paul are directed to using an augmented reality system for a surgical setting. The motivation for the combination would have been to “provid[e] navigation information to users and/or surgical robots for spine surgeries”, as taught by Paul ([0001]).

Regarding claim 14, Qian in view of Rafii-Tari (‘839) and Paul discloses all limitations of claim 13, as discussed above, and Qian does not disclose: wherein the controller is further configured to use the travel path to determine a desired position of an incision on the patient.
Paul in the same field of using an augmented reality system, however, teaches: the controller configured to use the travel path to determine a desired position of an incision on the patient ([0129]-[0149]: machine learning model 1300 trained based on pre-operative and intra-operative stage data, including planned or used incision location on the patient).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian’s system to include Paul’s controller configured to use the travel path to determine a desired position of an incision on the patient. The combination would have yielded a reasonable expectation of success, since both Qian and Paul are directed to using an augmented reality system for a surgical setting. The motivation for the combination would have been to “provid[e] navigation information to users and/or surgical robots for spine surgeries”, as taught by Paul ([0001]).

Regarding claim 15, Qian in view of Rafii-Tari (‘839) discloses all limitations of claim 9, as discussed above, and Qian does not disclose: wherein the controller is further configured to use the travel path to determine an envelope of excised tissue from the patient. Paul in the same field of using an augmented reality system, however, teaches: the controller configured to use the travel path to determine an envelope of excised tissue from the patient ([0129]-[0154]: machine learning model 1300 trained based on pre-operative and intra-operative stage data, including planned or created amount of disc space by removing a portion of the intervertebral disc).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian’s system to include Paul’s controller configured to use the travel path to determine an envelope of excised tissue from the patient.
The combination would have yielded a reasonable expectation of success, since both Qian and Paul are directed to using an augmented reality system for a surgical setting. The motivation for the combination would have been to “provid[e] navigation information to users and/or surgical robots for spine surgeries”, as taught by Paul ([0001]).

Claims 16-18 are rejected under 35 U.S.C. 103 as being unpatentable over Qian, as applied to claim 1 above, and further in view of Lang (US PG Pub No. 2022/0079675).

Regarding claim 16, Qian discloses all limitations of claim 1, as discussed above, and Qian does not disclose: wherein the representation is an overlay of a virtual patient nerve adjacent to the instrument. Lang in the same field of tracking an instrument, however, teaches: a representation of an overlay of a virtual patient nerve adjacent to the instrument ([0160]-[0161]: virtual anatomical models and virtual models of instruments in a defined spatial relationship; [0349]-[0352]: virtual image includes nerve roots, and virtual image is spatially registered with live data including physical surgical instruments).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian’s system to include Lang’s representation of an overlay of a virtual patient nerve adjacent to the instrument. The combination would have yielded a reasonable expectation of success, since both Qian and Lang are directed to tracking an instrument. The motivation for the combination would have been to inform the operator of different types of tissues relative to the instrument during a surgical procedure ([0349]-[0351] of Lang).

Regarding claim 17, Qian discloses all limitations of claim 1, as discussed above, and Qian does not disclose: wherein the representation is an overlay of a virtual patient vascular structure adjacent to the instrument.
Lang in the same field of tracking an instrument, however, teaches: a representation of an overlay of a virtual patient vascular structure adjacent to the instrument ([0160]-[0161]: virtual anatomical models and virtual models of instruments in a defined spatial relationship; [0349]-[0352]: virtual image includes vascular structures, and virtual image is spatially registered with live data including physical surgical instruments).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian’s system to include Lang’s representation of an overlay of a virtual patient vascular structure adjacent to the instrument. The combination would have yielded a reasonable expectation of success, since both Qian and Lang are directed to tracking an instrument. The motivation for the combination would have been to inform the operator of different types of tissues relative to the instrument during a surgical procedure ([0349]-[0351] of Lang).

Regarding claim 18, Qian discloses all limitations of claim 1, as discussed above, and Qian does not disclose: wherein the representation is an overlay of a virtual patient bone adjacent to the instrument. Lang in the same field of tracking an instrument, however, teaches: a representation of an overlay of a virtual patient bone adjacent to the instrument ([0160]-[0161]: virtual anatomical models and virtual models of instruments in a defined spatial relationship; [0349]-[0352]: virtual image includes bone, and virtual image is spatially registered with live data including physical surgical instruments).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian’s system to include Lang’s representation of an overlay of a virtual patient bone adjacent to the instrument.
The combination would have yielded a reasonable expectation of success, since both Qian and Lang are directed to tracking an instrument. The motivation for the combination would have been to inform the operator of different types of tissues relative to the instrument during a surgical procedure ([0349]-[0351] of Lang).

Claim 21 is rejected under 35 U.S.C. 103 as being unpatentable over Qian, as applied to claim 1 above, and further in view of Walen et al. (US PG Pub No. 2022/0338938) – hereinafter referred to as Walen.

Regarding claim 21, Qian discloses all limitations of claim 1, as discussed above, and Qian does not disclose: providing virtual geofencing, wherein an alert is generated when the distal end of the instrument is about to enter a no-go area of the patient. Walen in the same field of providing surgical navigational guidance, however, teaches: providing virtual geofencing ([0058]: boundary generator is a software program or module that generates one or more virtual boundaries for constraining movement and/or operation of the surgical instruments), wherein an alert is generated when the distal end of the instrument is about to enter a no-go area of the patient ([0058]: virtual boundaries or alert zones may also be provided to control operation of the surgical instruments 220, 320, 420 relative to critical anatomical features that the surgeon wishes to avoid, target depths and/or target positions). It is noted that a broadest reasonable interpretation has been given to “virtual geofencing” as recited in the limitation, i.e., as defining virtual boundaries or a no-go area.

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Qian’s system to include Walen’s virtual geofencing. The combination would have yielded a reasonable expectation of success, since both Qian and Walen are directed to tracking an instrument during surgical guidance.
The motivation for the combination would have been “to control operation of the surgical instruments 220, 320, 420 relative to critical anatomical features that the surgeon wishes to avoid, target depths and/or target positions”, as taught by Walen ([0058]).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Younhee Choi, whose telephone number is (571) 272-7013. The examiner can normally be reached M-F 9AM-5PM EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anhtuan Nguyen, can be reached at 571-272-4963. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Y.C./
Examiner, Art Unit 3797

/ANH TUAN T NGUYEN/
Supervisory Patent Examiner, Art Unit 3795
02/09/26

Prosecution Timeline

Mar 17, 2023
Application Filed
Jun 28, 2025
Non-Final Rejection — §102, §103, §112
Oct 03, 2025
Response Filed
Feb 04, 2026
Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12588895
SYSTEM, METHOD, COMPUTER-ACCESSIBLE MEDIUM AND APPARATUS FOR FLEXIBLE TWO-DIMENSIONAL ULTRASOUND PHASED ARRAY
2y 5m to grant · Granted Mar 31, 2026
Patent 12588892
ULTRASOUND DIAGNOSTIC SYSTEM AND CONTROL METHOD OF ULTRASOUND DIAGNOSTIC SYSTEM
2y 5m to grant · Granted Mar 31, 2026
Patent 12564473
BIOPSY SITE MARKER HAVING MOVABLE PORTIONS
2y 5m to grant · Granted Mar 03, 2026
Patent 12544147
APPARATUS, SYSTEMS, AND METHODS FOR INTRAOPERATIVE INSTRUMENT TRACKING AND INFORMATION VISUALIZATION
2y 5m to grant · Granted Feb 10, 2026
Patent 12544038
ULTRASOUND PATCH WITH INTEGRATED FLEXIBLE TRANSDUCER ASSEMBLY
2y 5m to grant · Granted Feb 10, 2026
Study what changed in these cases to get past this examiner; based on the 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
72%
Grant Probability
99%
With Interview (+49.5%)
3y 6m
Median Time to Grant
Moderate
PTA Risk
Based on 186 resolved cases by this examiner. Grant probability derived from career allow rate.
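As an editorial aside, the headline figures above are mutually consistent under one plausible reading: a 99% allow rate with an interview minus the stated +49.5-point lift implies a 49.5% rate without one, and 133 granted of 186 resolved yields the 72% career rate. The sketch below is purely a consistency check on those displayed numbers; the tool's actual methodology is not disclosed, so every figure is an assumption taken from the page.

```python
# Hypothetical reconstruction of the dashboard arithmetic; the tool's actual
# methodology is not disclosed, so this only checks the displayed figures
# against one another.
allowed, resolved = 133, 186                 # "133 granted / 186 resolved"
career_allow_rate = allowed / resolved       # displayed as "72%" grant probability
with_interview = 0.99                        # "99% With Interview"
lift = 0.495                                 # "+49.5% Interview Lift"
without_interview = with_interview - lift    # implied allow rate absent an interview

print(f"{career_allow_rate:.1%}")            # 71.5%, shown as 72% on the card
print(f"{without_interview:.1%}")            # 49.5%
```

Note that the "+49.5%" lift reads as percentage points over the without-interview subset, not a multiplier on the 72% career rate.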
