Prosecution Insights
Last updated: April 19, 2026
Application No. 17/688,663

INTRAOPERATIVE DISPLAY FOR SURGICAL SYSTEMS

Status: Non-Final OA (§103)
Filed: Mar 07, 2022
Examiner: BEUTEL, WILLIAM A
Art Unit: 2616
Tech Center: 2600 (Communications)
Assignee: Cilag GmbH International
OA Round: 4 (Non-Final)
Grant Probability: 70% (Favorable)
Expected OA Rounds: 4-5
Estimated Time to Grant: 2y 7m
Grant Probability With Interview: 90%

Examiner Intelligence

Career Allow Rate: 70% (above average; 328 granted / 469 resolved; +7.9% vs TC avg)
Interview Lift: +20.4% (allow rate for resolved cases with an interview vs without; a strong lift)
Typical Timeline: 2y 7m average prosecution; 28 applications currently pending
Career History: 497 total applications across all art units
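The headline figures above are internally consistent; the sketch below reproduces them from the reported counts. The 62% Tech Center average is back-calculated from the +7.9% delta rather than reported directly, so treat it as an inference.

```python
# Consistency check on the examiner dashboard figures quoted above.
granted = 328    # career grants
resolved = 469   # career resolved applications
pending = 28     # currently pending
total = 497      # total applications across all art units

allow_rate = 100 * granted / resolved
print(f"Career allow rate: {allow_rate:.1f}%")   # 69.9%, displayed as 70%

# The +7.9% delta implies a Tech Center average near 62% (back-calculated, not reported).
tc_avg = allow_rate - 7.9
print(f"Implied TC average allow rate: {tc_avg:.1f}%")

# Resolved plus pending accounts for the full career docket.
print(f"Resolved + pending = {resolved + pending} (total reported: {total})")
```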

Statute-Specific Performance

§101: 9.9% (-30.1% vs TC avg)
§103: 49.8% (+9.8% vs TC avg)
§102: 10.7% (-29.3% vs TC avg)
§112: 22.0% (-18.0% vs TC avg)
Tech Center averages are estimates. Figures based on career data from 469 resolved cases.
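As a sanity check on the statute table, each rate-minus-delta pair backs out the same baseline, which suggests the chart compares every statute against a single ~40% Tech Center average estimate. This is an inference from the numbers, not something the data states.

```python
# Back out the Tech Center baseline implied by each statute's rate and delta.
rates  = {"§101": 9.9, "§103": 49.8, "§102": 10.7, "§112": 22.0}
deltas = {"§101": -30.1, "§103": 9.8, "§102": -29.3, "§112": -18.0}

baselines = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(baselines)  # every statute backs out the same 40.0 baseline
```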

Office Action

§103

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after allowance or after an Office action under Ex Parte Quayle, 25 USPQ 74, 453 O.G. 213 (Comm'r Pat. 1935). Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, prosecution in this application has been reopened pursuant to 37 CFR 1.114. Applicant's submission filed on February 11, 2026 has been entered. As a result of reopening prosecution, additional search and consideration results in the application being rejected under 35 U.S.C. 103 in view of newly cited art Parihar et al.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-6, 10, 15-17, 19 and 22 are rejected under 35 U.S.C. 103 as being unpatentable over Fahim et al. (US 2019/0183576 A1) in view of Lennartz et al. (US 2021/0267664 A1) and in further view of Parihar et al. (US 2019/0125457 A1).
Regarding claim 1, Fahim discloses:

An augmented reality display device (Fahim, Abstract; ¶13: AR to enhance cardiovascular surgical mapping, navigation and procedural diagnostics; ¶16: AR display system enhancing surgical procedure; see also ¶¶65-66) comprising:

A processor (Fahim, ¶65) to:

Receive functional data from a surgical instrument in a surgical area (Fahim, ¶61: treatment device 12 may further include one or more sensors to monitor the operating parameters, including, for example, pressure, temperature, flow rates, volume, or the like in the treatment device 12, in addition to monitoring, recording or otherwise conveying measurements or conditions of the ambient environment at the distal portion of the treatment device 12; ¶63: information of one or more sensors within treatment device 12 in communication with AR device 100, e.g. sensors for temperature, pressure, impedance, force, ECG, etc., which is captured, analyzed and displayed via AR display 200; ¶90: rendering virtual object and electrogram data during procedure, providing the physician 1002 with a 3D visual reference within his/her field of view via the augmented reality display system 200 as an immediately accessible visual guide);

Determine a tissue interaction between tissue in the surgical area and at least a portion of the surgical instrument (Fahim, ¶¶58-59: system for treating tissue, used for treating, ablating, diagnosing, and/or mapping one or more target tissue areas; ¶62: one or more pressure sensors 39, coupled to distal portion of treatment device 12, may provide monitoring of the engagement between the treatment device 12 and a designated tissue region during a procedure; ¶71: In addition to sensing impedance (and therefore lesion quality and PV occlusion, as discussed herein), the electrodes 30, 31 may also be configured for mapping cardiac tissue (for example, recording cardiac electrograms) from adjacent tissue);

Generate a first visual representation (Fahim, Fig. 11 and ¶99: generating electrical mapping data related to the procedure);

Generate a second visual representation of the functional data received from the surgical instrument (Fahim, ¶63: information of one or more sensors within treatment device 12 in communication with AR device 100, e.g. sensors for temperature, pressure, impedance, force, ECG, etc., which is captured, analyzed and displayed via AR display 200; ¶71: the additional electrodes may be used for electrical impedance tomography imaging to visualize, i.e., “see” the ice formation of the cryoablation. Information from the thermocouple 37 and/or electrodes 30, 31 may be transmitted to the console 18 and/or an augmented reality device 100, and this information may be used to create or modify a virtual representation of a virtual organ object 1000 (see FIGS. 10-12) to provide device location information and/or procedure guidance or feedback to the physician); and

Generate an overlay of an operational aspect of the surgical instrument by combining the first visual representation and the second visual representation on a real image of the surgical area (Fahim, ¶16: AR display system enhancing surgical procedure; ¶52: camera facing forward to capture images and video of portion of real-world perceived by a user wearing the eyeglasses; Fig. 12 and ¶91: overlaying real-world environment as seen by user with virtual organ object attached to and overlaying patient’s heart 800; ¶94: attach virtual organ to patient’s anatomy, e.g. chest or heart; ¶¶98-99: sensor data from sensors displayable by AR device 100 within physician’s view overlaying real-world environment in real-time during procedure, as in Figs. 6a-6b and 11; Fig. 11, ¶99 and ¶104: sensor of parameter of heart with electrical mapping data displayed during surgical treatment, as well as AR visual display including monitored position of treatment device; Fig. 12, ¶¶90-92 and 94: overlaying virtual organ object on patient’s anatomy; ¶98: In addition, sensor data from sensors in the medical system 10, the navigation system 50, and/or the mapping system 500 may be displayable by the augmented reality device 100 within the physician's 1002 field of view, overlaying the real-world environment. Data, such as temperature readings, pressure, electrical properties, and the like, may be displayed by the augmented reality device 100 via the augmented reality display system 200 in real-time during the procedure in order to assist the physician with being provided a real-time assessment of the efficacy of the procedure so that the physician can adjust his/her approach on-the-fly, as warranted. ¶100: In other embodiments, the medical sensor(s) may include a temperature sensor (e.g., thermocouple 37) or the one or more pressure sensors 39. Such sensor data may allow the physician 1002 to make an informed assessment as to the location of a treatment device, such as for example, the treatment device 12, and/or the quality of a treatment. For example, cryoablation requires extremely low temperatures in order to freeze the tissue. Temperature data from the thermocouple 37 indicating a temperature value beneath an expected predetermined temperature value for the cryoablation may alert the physician 1002 that a lesion may not be well-formed, or that the treatment device 12 may not be properly positioned to fully occlude the PV, as will be explained in more detail herein below. In embodiments where an indication of the sensor data is displayed via the display system 200 (which may also be referred to as an augmented reality display system 200), the physician 1002 may not be required to look away from the patient in order to be alerted. In another embodiment, the sensor may be an external sensor to the patient, such as, for example, the electrode patches 54.).
Fahim alone does not explicitly disclose generating a first visual representation of the tissue interaction.

Lennartz discloses:

Receive functional data from a surgical instrument in a surgical area, wherein the surgical instrument comprises an end effector with jaws (Lennartz, Abstract and Fig. 1; ¶44: jaw members of end effector assembly 100; Fig. 6 and ¶¶43-44: controller receives sensor data from sensor mechanism to generate a tissue indication that is output for display, including for jaw members of an effector assembly, including displaying tissue indication in form of video image during grasping of tissue); and

Generate a first visual representation of the tissue interaction.

Both Fahim and Lennartz are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing utilization of additional types of surgical instruments for improved usability and useful guidance.

Fahim modified by Lennartz fails to teach the visual representation indicating movement or flow of the tissue between the jaws.

Parihar discloses:

Generate a first visual representation of the tissue interaction that indicates movement or flow of the tissue between the jaws (Parihar, ¶¶1274-1275: force-to-close applied to end-effector; ¶1273: FIG. 125 is a graph 6280 of tissue creep clamp stabilization curves 6282, 6284 for two tissue types, according to one aspect of the present disclosure. The clamp stabilization curves 6282, 6284 are plotted as force-to-close (FTC) as a function of time, where FTC (N) is displayed along the vertical axis and Time, t, (Sec) is displayed along the horizontal axis; ¶1276: The end-effector 6234 clamp stabilization is monitored as described above in connection with FIGS. 122-124 and is displayed every second, corresponding to the sampling times of the force-to-close, to provide user feedback regarding the state of the clamped tissue; note further that ¶¶1282-1283 disclose the use of augmented reality vision including presenting an overlay of a status of a device within or around a displayed image).

It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, to include the additional visual guidance related to grasped tissue conditions as provided by Parihar, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data with tissue conditions in place of other surgical tool related data. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by providing additional important surgical guidance information to an operator to provide better feedback and understanding of conditions for improved medical procedures.
Regarding claim 2, Fahim modified by Lennartz further discloses: Wherein the jaws are configured to capture tissue therebetween (Lennartz, Abstract and ¶44), and wherein the operational aspect of the surgical instrument is clamping tissue between the jaws of the end effector (Lennartz, Fig. 6 and ¶¶43-44: controller receives sensor data from sensor mechanism to generate a tissue indication that is output for display, including for jaw members of an effector assembly, including displaying tissue indication in form of video image during grasping of tissue).

Both Fahim and Lennartz are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing utilization of additional types of surgical instruments for improved usability and useful guidance.
Regarding claim 3, Fahim modified by Lennartz further discloses: Wherein the first visual representation of the tissue indication indicates appropriate capture of the tissue between the jaws of the end effector (Lennartz, ¶44: The tissue indication 450, more specifically, may indicate a location of the grasped tissue “T” along the length of jaw member 110, a size of the grasped tissue “T” relative to jaw member 110, and/or properties or features of the grasped tissue “T,” e.g., tissue texture, tissue type, tissue state, etc. Knowing the location of the grasped tissue “T” along the length of jaw member 110, the size of the grasped tissue “T” relative to the jaw member 110, and/or properties or features of the grasped tissue “T” enables a user to determine, for example, whether the grasped tissue “T” is properly positioned between jaw members 110, 120 and/or whether the grasped tissue “T” is too larger or too small, thus allowing the user to determine, for example, whether re-grasping or other remedial action is necessary or whether the user can proceed to treat and/or cut the tissue “T.”; ¶51: The resulting pressure map of jaw member 110 can then be converted to a visual map that is output as a tissue indication for display on display 430).

Both Fahim and Lennartz are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing utilization of additional types of surgical instruments for improved usability and useful guidance.

Regarding claim 4, Fahim modified by Lennartz further discloses: Wherein the first visual representation of the tissue interaction indicates incorrectly positioned tissue between the jaws of the end effector (Lennartz, ¶44: The tissue indication 450, more specifically, may indicate a location of the grasped tissue “T” along the length of jaw member 110, a size of the grasped tissue “T” relative to jaw member 110, and/or properties or features of the grasped tissue “T,” e.g., tissue texture, tissue type, tissue state, etc. Knowing the location of the grasped tissue “T” along the length of jaw member 110, the size of the grasped tissue “T” relative to the jaw member 110, and/or properties or features of the grasped tissue “T” enables a user to determine, for example, whether the grasped tissue “T” is properly positioned between jaw members 110, 120 and/or whether the grasped tissue “T” is too larger or too small, thus allowing the user to determine, for example, whether re-grasping or other remedial action is necessary or whether the user can proceed to treat and/or cut the tissue “T.”; ¶51: The resulting pressure map of jaw member 110 can then be converted to a visual map that is output as a tissue indication for display on display 430).

Both Fahim and Lennartz are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing utilization of additional types of surgical instruments for improved usability and useful guidance.

Regarding claim 5, Fahim modified by Lennartz further discloses: wherein the first visual representation of the tissue interaction indicates insufficiently captured tissue between the jaws of the end effector (Lennartz, ¶44: The tissue indication 450, more specifically, may indicate a location of the grasped tissue “T” along the length of jaw member 110, a size of the grasped tissue “T” relative to jaw member 110, and/or properties or features of the grasped tissue “T,” e.g., tissue texture, tissue type, tissue state, etc. Knowing the location of the grasped tissue “T” along the length of jaw member 110, the size of the grasped tissue “T” relative to the jaw member 110, and/or properties or features of the grasped tissue “T” enables a user to determine, for example, whether the grasped tissue “T” is properly positioned between jaw members 110, 120 and/or whether the grasped tissue “T” is too larger or too small, thus allowing the user to determine, for example, whether re-grasping or other remedial action is necessary or whether the user can proceed to treat and/or cut the tissue “T.”; ¶51: The resulting pressure map of jaw member 110 can then be converted to a visual map that is output as a tissue indication for display on display 430).

Both Fahim and Lennartz are directed to augmented reality display of information to a surgeon during a surgical procedure.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing utilization of additional types of surgical instruments for improved usability and useful guidance.

Regarding claim 6, Fahim modified by Lennartz further discloses: wherein the first visual representation of the tissue interaction indicates tension of the tissue between the jaws of the end effector (Lennartz, ¶¶50-51: pressure resistant panels incorporated into jaw members, “configured to sense a force acting thereon and a location(s) of the applied force. In this manner, controller 410, using the force and location data provided by the one or more pressure-sensitive resistor panels 442 can determine the location(s) along jaw member 110 where tissue is grasped (as the grasped tissue provides the applied force) and the location(s) along jaw member 110 where no tissue is grasped (where no applied force is detected or where a detected applied force is below a minimum threshold). The resulting pressure map of jaw member 110 can then be converted to a visual map that is output as a tissue indication for display on display 430.”).

Both Fahim and Lennartz are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing utilization of additional types of surgical instruments for improved usability and useful guidance.
Regarding claim 10, Lennartz further discloses: The jaws are configured to capture tissue therebetween (Lennartz, Abstract and ¶44), and wherein the operational aspect of the surgical instrument comprises clamping on metal or foreign object between the jaws (Lennartz, ¶43: output sensor information, which information may include the presence and/or location of tissue, a tissue type, a tissue state, the presence and/or location of a foreign object, the presence and/or location of a critical tissue, etc.; ¶56 discloses warning issued when non-tissue object sensed).

Both Fahim and Lennartz are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing utilization of additional types of surgical instruments for improved usability and useful guidance.
Regarding claim 15, Lennartz further discloses: Wherein the jaws are configured to capture tissue therebetween, (Lennartz, Abstract and ¶44) and wherein the operational aspect of the surgical instrument comprises a clamping magnitude of the clamping status of the jaws (Lennartz, ¶¶50-51: pressure resistant panels incorporated into jaw members, “configured to sense a force acting thereon and a location(s) of the applied force. In this manner, controller 410, using the force and location data provided by the one or more pressure-sensitive resistor panels 442 can determine the location(s) along jaw member 110 where tissue is grasped (as the grasped tissue provides the applied force) and the location(s) along jaw member 110 where no tissue is grasped (where no applied force is detected or where a detected applied force is below a minimum threshold). The resulting pressure map of jaw member 110 can then be converted to a visual map that is output as a tissue indication for display on display 430.”’) Both Fahim and Lennartz are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data in place of the other instrument data for guiding a user. 
Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing utilizing of additional types of surgical instruments for improved usability and useful guidance. Regarding claim 16, Fahim modified by Lennartz further discloses: Wherein first visual representation of the tissue interaction further indicates tissue compression (Lennartz, ¶¶50-51: pressure resistant panels incorporated into jaw members, “configured to sense a force acting thereon and a location(s) of the applied force. In this manner, controller 410, using the force and location data provided by the one or more pressure-sensitive resistor panels 442 can determine the location(s) along jaw member 110 where tissue is grasped (as the grasped tissue provides the applied force) and the location(s) along jaw member 110 where no tissue is grasped (where no applied force is detected or where a detected applied force is below a minimum threshold). The resulting pressure map of jaw member 110 can then be converted to a visual map that is output as a tissue indication for display on display 430.” – note that by combining the information of the pressure map while the clamp is physically compressed by the instrument results in the combination of the aspect of the tissue with the overlaid data; ¶51: The resulting pressure map of jaw member 110 can then be converted to a visual map that is output as a tissue indication for display on display 430) Both Fahim and Lennartz are directed to augmented reality display of information to a surgeon during a surgical procedure. 
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing utilizing of additional types of surgical instruments for improved usability and useful guidance. Regarding claim 17, the limitations included from claim 1 are rejected based on the same rationale as claim 1 set forth above. Further regarding claim 17, Lennartz discloses: Wherein the jaws are configured to capture tissue therebetween, (Lennartz, Abstract and ¶44) and wherein the operational aspect of the surgical instrument is sufficiency of clamping of the jaws (Lennartz, ¶44: The tissue indication 450, more specifically, may indicate a location of the grasped tissue “T” along the length of jaw member 110, a size of the grasped tissue “T” relative to jaw member 110, and/or properties or features of the grasped tissue “T,” e.g., tissue texture, tissue type, tissue state, etc. 
Knowing the location of the grasped tissue “T” along the length of jaw member 110, the size of the grasped tissue “T” relative to the jaw member 110, and/or properties or features of the grasped tissue “T” enables a user to determine, for example, whether the grasped tissue “T” is properly positioned between jaw members 110, 120 and/or whether the grasped tissue “T” is too large or too small, thus allowing the user to determine, for example, whether re-grasping or other remedial action is necessary or whether the user can proceed to treat and/or cut the tissue “T.”) Both Fahim and Lennartz are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing the use of additional types of surgical instruments for improved usability and useful guidance.
Regarding claim 19, Fahim further discloses: wherein the surgical instrument comprises an energy device, wherein a tissue parameter is any one of impedances, cautery status, bleeding or magnitude, and wherein the operational aspect of the surgical instrument comprises one or more of energy level, timing, or clamp pressure. (Fahim, ¶58: catheter as ablation electrode using energy for treatment; ¶68 discusses treatment device including one or more distal electrodes including for sensing impedance or temperature; ¶71: sensing electrical impedance used to create or modify a virtual representation of a virtual organ object 1000 (see FIGS. 10-12) to provide device location information and/or procedure guidance or feedback to the physician – i.e. electricity is a type of energy, for which the flow measurement through resistance measurement is an “energy level”) Regarding claim 22, Fahim further discloses: Wherein the processor is further configured to determine at least one of the tissue interaction or the operational aspect based on the functional data (Fahim, ¶61: treatment device including sensors to monitor operating parameters, such as pressure, temperature, flow rates, volume or the like in treatment device 12, where sensors are in communication with control unit of console for initiating or triggering alerts during operation of treatment device; ¶62: pressure sensors in communication with control unit, where information from the one or more pressure sensors 39 may be transmitted to the console 18 and/or an augmented reality device 100, and this information may be used to create or modify a virtual representation of a virtual organ object 1000 (see FIGS. 10-12) to provide procedure guidance or feedback to the physician, also including for temperature, pressure, impedance, force, ECG etc.) Claim(s) 7, 11-13 and 18 is/are rejected under 35 U.S.C. 103 as being unpatentable over Fahim et al. (US 2019/0183576 A1) in view of Lennartz et al. 
(US 2021/0267664 A1) and Parihar et al. (US 2019/0125457 A1) in further view of Messerly et al. (US 2019/0205001 A1). Regarding claim 7, the limitations included from claim 1 are rejected based on the same rationale as claim 1 set forth above and incorporated herein. Further regarding claim 7, Lennartz discloses: Wherein the jaws are configured to capture tissue therebetween (Lennartz, Abstract, Fig 6, and ¶¶43-44: controller receives sensor data from sensor mechanism to generate a tissue indication that is output for display, including for jaw members of an effector assembly, including displaying tissue indication in form of video image during grasping of tissue) Both Fahim and Lennartz are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing the use of additional types of surgical instruments for improved usability and useful guidance.
Fahim modified by Lennartz does not explicitly teach the operational aspect of the instrument as overheating. Messerly teaches: Wherein the operational aspect of the surgical instrument comprises the surgical instrument overheating (Messerly, ¶229: integrated diagnostics provide indications of overtemperature; ¶324 further discloses instrument based temperature sensor; ¶505 also discloses alerts for conditions, including temperature of ultrasonic blade) Fahim, Lennartz and Messerly are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, and using the additional visual guidance related to grasped tissue conditions as provided by Parihar, by utilizing the technique of obtaining sensor data for tool overheating during operation as taught by Messerly, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of medical sensor data for use in displaying on the augmented reality information system for another, yielding predictable results of displaying sensor-based data related to proper tool operation during a procedure. Moreover, the modification results in an improved surgical information system by providing additional useful information to the surgeon. Regarding claim 11, the limitations included from claim 10 are rejected based on the same rationale as the rejection of claim 10 set forth above.
Further regarding claim 11, Messerly discloses: Wherein the first visual representation of the tissue interaction further indicates a geometric relationship of transections to the tissue and to other firings (Messerly, ¶230: positioning system including firing member/bar, tracking linear displacement of firing member; ¶345 discloses procedure involving transection and aligning anvil trocar of stapler; Figs. 23-25 and ¶¶356-357: interactive surgical system including detecting staple line during transection and displaying image centered around the double staple line to ensure overlap, and aligning anvil trocar to overlap staple line) Fahim, Lennartz and Messerly are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, and using the additional visual guidance related to grasped tissue conditions as provided by Parihar, by further using augmented reality for alignment during a transection as provided by Messerly, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing additional guidance data of a surgical tool during an operation. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles.
Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools for different operations and allowing the use of additional types of surgical instruments for improved usability and useful guidance. Regarding claim 12, the limitations included from claim 10 are rejected based on the same rationale as the rejection of claim 10 set forth above. Further regarding claim 12, Messerly discloses: Wherein the operational aspect of the surgical instrument comprises anvil orientation of the surgical instrument (Messerly, ¶433: second jaw including anvil; ¶513: icon shows staple and could be used to communicate to the surgeon that the anvil of the instrument is properly positioned) Fahim, Lennartz and Messerly are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, and using the additional visual guidance related to grasped tissue conditions as provided by Parihar, by further using augmented reality for alignment of an anvil as provided by Messerly, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing additional guidance data of a surgical tool during an operation.
Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools, such as jaws including an anvil or otherwise, for different operations and allowing the use of additional types of surgical instruments for improved usability and useful guidance. Regarding claim 13, the limitations included from claim 10 are rejected based on the same rationale as the rejection of claim 10 set forth above. Further regarding claim 13, Messerly further discloses: Wherein the first visual representation of the tissue interaction indicates tissue thickness (Messerly, ¶239: A magnetic field sensor can be employed to measure the thickness of the captured tissue; ¶284: sensed impedance is indicative of thickness of tissue; ¶400: The secondary screen may display tissue gap, impedance, tissue compression stability (creep), etc.; ¶516: feedback menu includes plurality of feedback categories, such as thickness of clamped tissue) Fahim, Lennartz and Messerly are directed to augmented reality display of information to a surgeon during a surgical procedure.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, and using the additional visual guidance related to grasped tissue conditions as provided by Parihar, by utilizing the technique of obtaining sensor data for tissue thickness as taught by Messerly, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of medical sensor data for use in displaying on the augmented reality information system for another, yielding predictable results of displaying sensor-based data related to an effector-type tool during a procedure. Moreover, the modification results in an improved surgical information system by providing additional useful information to the surgeon. Regarding claim 18, Fahim modified by Lennartz further discloses: Wherein the jaws are configured to capture tissue therebetween (Lennartz, Abstract, Fig 6, and ¶¶43-44: controller receives sensor data from sensor mechanism to generate a tissue indication that is output for display, including for jaw members of an effector assembly, including displaying tissue indication in form of video image during grasping of tissue) Both Fahim and Lennartz are directed to augmented reality display of information to a surgeon during a surgical procedure.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing the use of additional types of surgical instruments for improved usability and useful guidance.
Messerly discloses: Wherein the operational aspect of the surgical instrument comprises firing sufficiency (Messerly, ¶238: load sensor can measure firing force in firing stroke of surgical instrument or tool; ¶240: The measurements of the tissue compression, the tissue thickness, and/or the force required to close the end effector on the tissue, as respectively measured by the sensors 474, 476, can be used by the microcontroller 461 to characterize the selected position of the firing member and/or the corresponding value of the speed of the firing member; ¶516: instrument feedback menu includes different categories of feedback data of surgical instrument, including velocity and force of firing element) Fahim, Lennartz and Messerly are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, and using the additional visual guidance related to grasped tissue conditions as provided by Parihar, by utilizing the technique of obtaining sensor data for instrument firing as related to tissue characteristics as taught by Messerly, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of medical sensor data for use in displaying on the augmented reality information system for another, yielding predictable results of displaying sensor-based data related to a surgical tool during a procedure. Moreover, the modification results in an improved surgical information system by providing additional useful information to the surgeon.
Claim(s) 8 is/are rejected under 35 U.S.C. 103 as being unpatentable over Fahim et al. (US 2019/0183576 A1) in view of Lennartz et al. (US 2021/0267664 A1) and Parihar et al. (US 2019/0125457 A1) in further view of Itkowitz et al. (US 2009/0192524 A1). Regarding claim 8, the limitations included from claim 1 are rejected based on the same rationale as claim 1 set forth above and incorporated herein. Further regarding claim 8, Fahim modified by Lennartz further discloses: Wherein the jaws are configured to capture tissue therebetween (Lennartz, Abstract and ¶44). Both Fahim and Lennartz are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector jaws data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing the use of additional types of surgical instruments for improved usability and useful guidance.
Itkowitz discloses: Wherein the jaws are configured to capture tissue therebetween (Itkowitz, ¶3: effectors including clamps, graspers, scissors, staplers, etc.; See Fig. 13 and ¶83: displaying over clamp for tool), and Wherein the operational aspect of the surgical instrument is jaw closure position (Itkowitz, Fig. 13 and ¶83: display text message “closed” over video image of tool to indicate that the clamp for the tool is closed) Fahim, Lennartz and Itkowitz are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, and using the additional visual guidance related to grasped tissue conditions as provided by Parihar, and using the augmented reality instrument information for effector clamps and graspers in place of the surgical instrument as provided by Itkowitz, using known electronic interfacing and programming techniques. The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector clamp/grasper data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles.
Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing the use of additional types of surgical instruments for improved usability and useful guidance. Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Fahim et al. (US 2019/0183576 A1) in view of Lennartz et al. (US 2021/0267664 A1) and Parihar et al. (US 2019/0125457 A1) in further view of Itkowitz et al. (US 2009/0192524 A1). Regarding claim 14, the limitations included from claim 10 are rejected based on the same rationale as claim 10 set forth above. Further regarding claim 14, Itkowitz discloses: Wherein the operational aspect of the surgical instrument is clamping status of the jaws (Itkowitz, Fig. 13 and ¶83: display text message “closed” over video image of tool to indicate that the clamp for the tool is closed) Fahim, Lennartz and Itkowitz are directed to augmented reality display of information to a surgeon during a surgical procedure. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable expectation of success, to modify the augmented reality system for displaying an overlay of surgical instrument information on a patient during surgery as provided by Fahim, using the augmented reality instrument information for effector jaws in place of the surgical instrument as provided by Lennartz, and using the additional visual guidance related to grasped tissue conditions as provided by Parihar, by further incorporating the additional augmented reality instrument information for effector clamps and graspers in place of the surgical instrument as provided by Itkowitz, using known electronic interfacing and programming techniques.
The modification merely substitutes one known type of instrument data as the instrument data overlay during a surgical procedure for another, yielding predictable results of providing the effector clamp/grasper data in place of the other instrument data for guiding a user. Both techniques were known prior to the applicant’s invention for sensing surgical tool data and displaying related data to a surgeon using augmented reality principles. Furthermore, the modification results in an improved augmented reality surgical device by allowing an increased versatility of surgical tools and allowing the use of additional types of surgical instruments for improved usability and useful guidance. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to WILLIAM A BEUTEL whose telephone number is (571)272-3132. The examiner can normally be reached Monday-Friday 9:00 AM - 5:00 PM (EST). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, DANIEL HAJNIK, can be reached at 571-272-7642. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /WILLIAM A BEUTEL/Primary Examiner, Art Unit 2616

Prosecution Timeline

Mar 07, 2022: Application Filed
Sep 24, 2024: Non-Final Rejection — §103
Jan 27, 2025: Response Filed
Mar 26, 2025: Final Rejection — §103
Jul 01, 2025: Request for Continued Examination
Jul 02, 2025: Response after Non-Final Action
Jul 08, 2025: Non-Final Rejection — §103
Oct 02, 2025: Response Filed
Jan 13, 2026: Response after Non-Final Action
Feb 11, 2026: Request for Continued Examination
Feb 18, 2026: Response after Non-Final Action
Feb 19, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12581262: AUGMENTED REALITY INTERACTION METHOD AND ELECTRONIC DEVICE (granted Mar 17, 2026; 2y 5m to grant)
Patent 12572258: APPARATUS AND METHOD WITH IMAGE PROCESSING USER INTERFACE (granted Mar 10, 2026; 2y 5m to grant)
Patent 12566531: CONFIGURING A 3D MODEL WITHIN A VIRTUAL CONFERENCING SYSTEM (granted Mar 03, 2026; 2y 5m to grant)
Patent 12561927: MEDIA RESOURCE DISPLAY METHOD AND APPARATUS, DEVICE, AND STORAGE MEDIUM (granted Feb 24, 2026; 2y 5m to grant)
Patent 12554384: SYSTEMS AND METHODS FOR IMPROVED CONTENT EDITING AT A COMPUTING DEVICE (granted Feb 17, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 4-5
Grant Probability: 70%
With Interview (+20.4%): 90%
Median Time to Grant: 2y 7m
PTA Risk: High
Based on 469 resolved cases by this examiner. Grant probability derived from career allow rate.
