Prosecution Insights
Last updated: April 19, 2026
Application No. 17/277,414

GRAPHICAL USER INTERFACE FOR DEFINING AN ANATOMICAL BOUNDARY

Status: Final Rejection (§103)
Filed: Mar 18, 2021
Examiner: CWERN, JONATHAN
Art Unit: 3797
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Intuitive Surgical Operations, Inc.
OA Round: 8 (Final)
Grant Probability: 50% (Moderate)
Projected OA Rounds: 9-10
Projected Time to Grant: 4y 2m
Grant Probability With Interview: 87%

Examiner Intelligence

Career Allow Rate: 50% (402 granted / 797 resolved; -19.6% vs TC avg)
Interview Lift: +36.3% for resolved cases with interview
Typical Timeline: 4y 2m average prosecution; 51 applications currently pending
Career History: 848 total applications across all art units
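The headline examiner figures above can be reproduced from the raw counts. A minimal sanity-check sketch, assuming the "vs TC avg" figure is a simple percentage-point gap (the report does not state its methodology explicitly):

```python
# Sanity check of the reported examiner statistics.
# The 402 granted / 797 resolved counts come from the report;
# treating "-19.6% vs TC avg" as a percentage-point gap is an assumption.
granted = 402
resolved = 797

allow_rate = granted / resolved          # career allow rate
tc_delta = -0.196                        # reported gap vs Tech Center average
tc_avg_estimate = allow_rate - tc_delta  # implied TC 3700 average

print(f"Career allow rate: {allow_rate:.1%}")       # ~50.4%, displayed as 50%
print(f"Implied TC average: {tc_avg_estimate:.1%}")  # ~70.0%
```

The counts round to the displayed 50% allow rate, and under the stated assumption the reported -19.6% gap implies a Tech Center average of roughly 70%.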

Statute-Specific Performance

§101: 4.0% (-36.0% vs TC avg)
§103: 48.9% (+8.9% vs TC avg)
§102: 14.0% (-26.0% vs TC avg)
§112: 26.5% (-13.5% vs TC avg)
Based on career data from 797 resolved cases; "vs TC avg" compares against the Tech Center average estimate.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim(s) 1, 5, 15-16, 19, 21-22, 24, 26, 28, and 50-55 is/are rejected under 35 U.S.C. 103 as being unpatentable over Rich (US 2008/0071292) in view of Higgins et al. (US 2008/0183073; hereinafter Higgins), Jenkins et al. (US 2010/0312095; hereinafter Jenkins), Bharadwaj et al. (US 2016/0038248; hereinafter Bharadwaj), and Krimsky (US 2018/0055574; hereinafter Krimsky).
Rich shows a medical system and method comprising: a display system ([0013]); a user input device ([0027]); a medical instrument ([0010]); a manipulator assembly configured to support and operate the medical instrument ([0010]); and a control system ([0014]) communicatively coupled to at least the display system, the user input device, and the manipulator assembly, the control system configured to: determine a location of an anatomical target in a three-dimensional region ([0014]); display image data corresponding to a three-dimensional anatomical region via the display system ([0014]); receive a first user input to generate a first curve in the three-dimensional anatomical region via the user input device (user selects target, [0027]-[0028]); control the manipulator assembly to operate the medical instrument using the determined anatomical boundary to limit movement of the medical instrument (automated system automatically guides instrument, [0010]). Rich also shows wherein the control system is further configured to display guidance information via the display system during placement of the curve (Figures 2-7 illustrate guidance information displayed throughout the procedure); wherein the guidance information includes an extrapolated projection of the anatomical boundary in regions outside of a current range of the anatomical boundary (Figures 5-7 show extrapolated projections regarding where the instrument will intersect anatomical locations away from the target); wherein the control system is further configured to: display the anatomical boundary overlaid on an anatomic model derived from the image data via the display system; and deform the anatomical boundary to conform with deformations of the anatomic model based on movement of a patient anatomy (display is updated continuously throughout the interventional procedure, and will therefore be deformed continuously throughout the procedure to conform with movement of patient anatomy; [0017]); wherein the control system is
further configured to: receive a third user input while a medical instrument is located within the three-dimensional anatomical region and, responsive to the third user input, direct an orientation of a distal end of the medical instrument away from the anatomical boundary (display corrective information during procedure, allowing instrument to be properly oriented to avoid intervening structures, [0014]-[0015]); wherein the control system is further configured to determine a distance between a distal end of a virtual medical instrument and the anatomical boundary ([0014]-[0015]); wherein the control system is further configured to: determine a distance between a distal end of a medical instrument and the anatomical boundary ([0014]-[0015]); wherein the control system is further configured to: provide a visual, audible, or haptic indicator when the distance between the distal end of the medical instrument and the anatomical boundary is less than a predetermined threshold distance (display corrective information, [0014]-[0015]); wherein the first curve is generated at least partially along the intersection to mark the at-risk portion relative to the image data (Figures 4-7). Rich fails to show receive a second user input to generate a second curve in the three-dimensional anatomical region via the user input device; determine an anatomical boundary bounded by the first curve and the second curve, the anatomical boundary indicating a surface of an anatomical structure in the three-dimensional anatomical region. Rich fails to show data corresponding to a lung, receive user inputs to generate a plurality of curves in different slices of the CT image data; interpolating among the plurality of curves to determine an anatomical boundary indicating a location of a pleura of the lung; and providing a variety of planned navigation paths for navigating a medical instrument based on different likelihoods that the medical instrument will breach the anatomical boundary.
Rich fails to show based on the first curve, display guidance information in a second view of the three-dimensional region via the display system to assist with placement of a second curve in the second view of the three-dimensional anatomical region, wherein the guidance information includes a projection overlaid on the second view of the three-dimensional anatomical region, wherein the projection indicates a position of the first curve in the first view; wherein the projection includes a first boundary and a second boundary overlaid on the second slice of image data, and wherein the projection indicates a range of potential deployment locations of a medical instrument, the range of potential deployment locations being between the first and second boundary, wherein the anatomical target is reachable by the medical instrument within the range of potential deployment locations; determine an at-risk portion of the anatomical structure based on an intersection between the surface of the anatomical structure and the range of potential deployment locations of the medical instrument, wherein the location of the anatomical target is positioned between a distal end of the medical instrument and the at-risk portion of the surface of the anatomical structure; display the at-risk portion of the surface of the anatomical structure, wherein the at-risk portion of the surface is displayed as extending between the first boundary and the second boundary of the projection and as extending along the anatomical boundary. Rich also fails to show determine a respective level of risk associated with different locations of the at-risk portion of the surface of the anatomical structure; and display, via the display system, the different locations of the at-risk portion. Rich also fails to show receiving a first manual input, the first manual input tracing a first line along the anatomical boundary, the first line corresponding to the first curve. 
Rich also fails to show wherein the control system is further configured to: receive a third user input to generate a third curve in the three-dimensional anatomical region via the user input device; adjust the anatomical boundary to be bounded by the first curve, the second curve, and the third curve; and display the adjusted anatomical boundary with the image data via the display system; wherein the control system is further configured to: provide one or more suggested deployment locations for a medical instrument, wherein the one or more suggested deployment locations are located at least a threshold distance from the anatomical boundary. Rich also fails to show receiving a user input in a first slice of the image data, the first slice displaying a portion of the three-dimensional anatomical region; display guidance information in a second slice of the image data displaying at least a portion of the three-dimensional anatomical region. Rich fails to show wherein the projection overlaid on the second image slice of the image data indicates an at-risk portion of the surface of the anatomical structure based on an intersection between the surface and a range of potential deployment locations of a medical instrument. Rich fails to show wherein the projection overlaid on the second slice of image data is a two-dimensional projection of a three-dimensional zone, the three-dimensional zone based on an exit point of the medical instrument and a target. Rich fails to show wherein the projection is displayed as extending from an exit point of the medical instrument through a target. Higgins discloses methods and apparatus for 3D route planning. 
Higgins teaches receive a second user input to generate a second curve in the three-dimensional anatomical region via the user input device; determine an anatomical boundary bounded by the first curve and the second curve, the anatomical boundary indicating a surface of an anatomical structure in the three-dimensional anatomical region (plural region of interest, Figure 2; [0052], [0055], [0130]-[0131]). Higgins teaches data corresponding to a lung ([0044], [0057]), receive user inputs to generate a plurality of curves in different slices of the CT image data (MDCT images, [0115], [0130]-[0131], [0133]-[0134]); interpolating among the plurality of curves to determine an anatomical boundary indicating a location of a pleura of the lung (labeling lobes and segments, [0057]; Figures 7-11 illustrate various locations for route planning throughout the lungs indicative of the pleura); and providing a variety of planned navigation paths for navigating a medical instrument based on different likelihoods that the medical instrument will breach the anatomical boundary (route should avoid puncturing major blood vessels, [0060], [0069], [0080], [0104], [0113]-[0114]). Higgins also shows the control system is further configured to: receive a third user input to generate a third curve in the three-dimensional anatomical region via the user input device; adjust the anatomical boundary to be bounded by the first curve, the second curve, and the third curve; and display the adjusted anatomical boundary with the image data via the display system (user can change parameters including ROI dilation and branches from which extensions are allowed, and update the file, [0130]-[0131]); wherein the control system is further configured to: provide one or more suggested deployment locations for a medical instrument, wherein the one or more suggested deployment locations are located at least a threshold distance from the anatomical boundary ([0101], [0133], [0135]).
Jenkins discloses image guided surgical systems with proximity alerts. Jenkins teaches based on the first curve, display guidance information in a second view of the three-dimensional region via the display system to assist with placement of a second curve in the second view of the three-dimensional anatomical region, wherein the guidance information includes a projection overlaid on the second view of the three-dimensional anatomical region, wherein the projection indicates a position of the first curve in the first view (define plurality of avoidance zones/curves to be avoided during insertion of medical device across 3D map, where the user may also rotate the map to access different planning views/slices of the target site while further placing avoidance zones; [0103]-[0106], [0116], [0120]-[0121]; where the various graphical elements are projected onto the images and 3D map to indicate to the user the location of the marked regions, [0105], Fig. 3; also, wherein the images include slices projected forward from the distal tip of the device, [0207]-[0213], [0251]). Jenkins also teaches receiving a first manual input, the first manual input tracing a first line along the anatomical boundary, the first line corresponding to the first curve (user input to identify/select/define/mark avoid zones and treatment sites using the planning model; touch screen input; [0104], [0116]). Bharadwaj discloses a treatment procedure planning system and method. Bharadwaj teaches receiving a user input in a first slice of the image data, the first slice displaying a portion of the three-dimensional anatomical region; display guidance information in a second slice of the image data displaying at least a portion of the three-dimensional anatomical region ([0079]; Figs. 5A-5B). Krimsky discloses systems and methods for interventional procedure planning.
Krimsky teaches wherein the projection overlaid on the second image slice of the image data indicates an at-risk portion of the surface of the anatomical structure based on an intersection between the surface and a range of potential deployment locations of a medical instrument, wherein the location of the anatomical target is positioned between a distal end of the medical instrument and the at-risk portion of the surface of the anatomical structure (3D model of patient’s lungs, [0049]; display model of probability diagnostic and/or treatment zone (PTZ), which shows where the endobronchial tool is capable of reaching and interacting with portions of the region of interest which are displayed within the volume of the PTZ, [0052]; Fig. 6); wherein the projection overlaid on the second slice of image data is a two-dimensional projection of a three-dimensional zone, the three-dimensional zone based on an exit point of the medical instrument and a target (Figs. 4-6). Krimsky also teaches wherein the projection is displayed as extending from an exit point of the medical instrument through a target (display model of probability diagnostic and/or treatment zone (PTZ), [0052]; PTZ model depicted by cone-shaped projection which indicates effective distance the tool can extend beyond the distal tip of the catheter’s extended working channel (EWC), [0066], [0076]; Figs. 4-6); determine a respective level of risk associated with different locations of the at-risk portion of the surface of the anatomical structure; and display, via the display system, the different locations of the at-risk portion (color different portions of the PTZ depending on whether the tool likely allows interaction with the region of interest, where the different colors correspond with the risk of missing the target; [0056], [0073], [0077]).
Krimsky also teaches wherein the projection includes a first boundary and a second boundary overlaid on the second slice of image data, and wherein the projection indicates a range of potential deployment locations of a medical instrument, the range of potential deployment locations being between the first and second boundary, wherein the anatomical target is reachable by the medical instrument within the range of potential deployment locations (PTZ represented as cone, where the cone indicates the range of potential deployment locations into the first and second boundaries of the bronchial anatomy, [0066], [0076], Figs. 4-6); determine an at-risk portion of the anatomical structure based on an intersection between the surface of the anatomical structure and the range of potential deployment locations of the medical instrument; display the at-risk portion of the surface of the anatomical structure, wherein the at-risk portion of the surface is displayed as extending between the first boundary and the second boundary of the projection and as extending along the anatomical boundary (color different portions of the PTZ depending on whether the tool likely allows interaction with the region of interest, where the different colors correspond with the risk of missing the target; [0056], [0073], [0077]). It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the invention of Rich to receive a plurality of user inputs related to the region of interest as taught by Higgins, as this will allow the user to more accurately define the anatomical region. Furthermore, it would have been obvious to have modified the invention of Rich to provide CT images corresponding to a lung as taught by Higgins, as the lung is a known organ for which instrument route planning is desired as taught by Higgins. 
Furthermore, it would have been obvious to have modified the invention of Rich to provide a variety of planned navigation paths as taught by Higgins, as this will provide the user with additional options to select the most desirable path which will avoid damaging critical structures. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the combined invention of Rich and Higgins to provide user controls for planning curves over different views/slices of the three-dimensional anatomical region as taught by Jenkins, as this will allow for a more complete view of the anatomy to be considered by the user, providing a more accurate result. Furthermore, the placement of the curves over the three-dimensional map allows for avoidance zones to be traced by the user, providing more accurate treatment to the patient by delineating regions which, if treated/contacted, would be harmful to the patient. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the combined invention of Rich, Higgins, and Jenkins to display image slices as taught by Bharadwaj, as Bharadwaj teaches that because the target is a three-dimensional object each of the axial, coronal, and sagittal slices is taken from a different direction; manipulation and adjustment of the boundary ring in one of the slices by the clinician may result in a change or adjustment of the boundary ring in one or both of the remaining slices. In this manner the clinician may accurately set the target dimensions and the location of the target in all three views, effectively mapping the target to specific coordinates and dimensions in a 3D coordinate space ([0079]).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the combined invention of Rich, Higgins, Jenkins, and Bharadwaj to indicate an at-risk portion of the surface of the anatomical structure based on an intersection between the surface and a range of potential deployment locations of a medical instrument, display the projection as extending from an exit point of the medical instrument through a target as taught by Krimsky, in order to more accurately display to the user how far the medical instrument will be deployed from the catheter within the medical image, so that the user may avoid critical anatomical structures of the patient while being accurately guided to the target. Furthermore, the use of a graphical cone projection as taught by Krimsky would improve the display of Rich by visually indicating the range of potential tool deployment locations in relation to the patient’s anatomical structures, including coloring the target location and at-risk locations to be avoided. Furthermore, regarding the limitation “wherein the location of the anatomical target is positioned between a distal end of the medical instrument and the at-risk portion of the surface of the anatomical structure”, the examiner notes that Figure 6 of Krimsky illustrates one point in time of the navigation procedure, including the distal tip of the tool 193, the target region of interest 403 within the patient’s airway, and cone 404 representing the range of the deployable tool. The cone encompasses tissue of the patient which is both between the tool and the target, and also encompasses tissue beyond the target. Depending on the patient’s health condition, any of the tissue surrounding the target may be an “at-risk” portion of the tissue, including tissue beyond the target. 
Furthermore, depending on the particular patient’s health condition, the target may be at any other location in the patient’s airway within the displayed image, and the cone 404 may therefore include regions which are near another airway passage, tumor, organ, blood vessel, bone, etc. to be avoided. The computer based medical navigation system is capable of displaying a target, and the specific location of the anatomical target and at-risk portion is dependent upon the patient’s anatomy. Over the course of the medical procedure by which the clinician operates the system and moves the tool through the patient to the target region, the clinician will view many points in time where the cone 404 is displayed in regards to different locations of the patient’s anatomy, including times when the location of the anatomical target is positioned between a distal end of the medical instrument and an at-risk portion. Using the visual display, the clinician will move the device until there is enough clearance to deploy the tool in such a manner that it may reach the target while avoiding the tool contacting an at-risk portion of the tissue. The examiner further notes that the rejection is based upon the combination of references, and Rich teaches displaying warnings to the user when the tool will improperly intersect with at-risk tissue, and it would be obvious to utilize the PTZ and color coding of Krimsky to allow the user to more easily visualize the potential deployment of the tool interacting with different anatomical locations within the patient.

Claim(s) 10 is/are rejected under 35 U.S.C. 103 as being unpatentable over Rich (US 2008/0071292) in view of Higgins et al. (US 2008/0183073; hereinafter Higgins), Jenkins et al. (US 2010/0312095; hereinafter Jenkins), Bharadwaj et al. (US 2016/0038248; hereinafter Bharadwaj), and Krimsky (US 2018/0055574; hereinafter Krimsky) as applied to claim 1 above, and further in view of Chalana et al.
(US 2004/0127796; hereinafter Chalana). Rich fails to show wherein the control system is configured to determine the anatomical boundary based on an intensity gradient associated with the image data. Chalana discloses a 3D ultrasound instrument and method. Chalana teaches wherein the control system is configured to determine the anatomical boundary based on an intensity gradient associated with the image data ([0136]-[0137]). It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the combined invention of Rich, Higgins, Jenkins, Bharadwaj, and Krimsky to determine the anatomical boundary based on an intensity gradient as taught by Chalana, as Chalana teaches that the transition between anatomical regions will correspond with intensity gradients associated with the image data, and thus will provide an effective means of locating an anatomical boundary.

Claim(s) 11 is/are rejected under 35 U.S.C. 103 as being unpatentable over Rich (US 2008/0071292) in view of Higgins et al. (US 2008/0183073; hereinafter Higgins), Jenkins et al. (US 2010/0312095; hereinafter Jenkins), Bharadwaj et al. (US 2016/0038248; hereinafter Bharadwaj), and Krimsky (US 2018/0055574; hereinafter Krimsky) as applied to claim 1 above, and further in view of Maguire et al. (US 2015/0290472; hereinafter Maguire). Rich fails to show wherein the control system is further configured to apply computer vision to the image data to identify a candidate anatomical boundary, and wherein the anatomical boundary is snapped to the candidate anatomical boundary. Maguire discloses a method and apparatus for treatment planning.
Maguire teaches the control system is further configured to apply computer vision to the image data to identify a candidate anatomical boundary (segment image to locate anatomical boundaries, [0064]), and wherein the anatomical boundary is snapped to the candidate anatomical boundary ([0008], [0010], [0014], [0070]-[0071]). It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the combined invention of Rich, Higgins, Jenkins, Bharadwaj, and Krimsky to apply computer vision to identify an anatomical boundary and to snap to the boundary as taught by Maguire, as this will provide an effective automated means of identifying and accurately placing the anatomical boundary in the images.

Claim(s) 18 is/are rejected under 35 U.S.C. 103 as being unpatentable over Rich (US 2008/0071292) in view of Higgins et al. (US 2008/0183073; hereinafter Higgins), Jenkins et al. (US 2010/0312095; hereinafter Jenkins), Bharadwaj et al. (US 2016/0038248; hereinafter Bharadwaj), and Krimsky (US 2018/0055574; hereinafter Krimsky) as applied to claim 1 above, and further in view of Cohen et al. (US 2010/0161023; hereinafter Cohen). Rich fails to show wherein the control system is further configured to: display the anatomical boundary overlaid on fluoroscopic image data obtained during a patient procedure. Cohen discloses automatic tracking of a tool upon a roadmap. Cohen teaches wherein the control system is further configured to: display the anatomical boundary overlaid on fluoroscopic image data obtained during a patient procedure ([0503], [0581], [0604]-[0606], [0613], [0656], [0774]).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the combined invention of Rich, Higgins, Jenkins, Bharadwaj, and Krimsky to display anatomical boundaries overlaid on a fluoroscopic image as taught by Cohen, as a fluoroscopic image may provide an advantageous real-time x-ray based image representative of the patient’s tissue, but may lack detail in regards to softer tissue structures such as vasculature, and thus it is advantageous to overlay complementary information such as the anatomical boundaries so that they may be visualized together with the real-time fluoroscopic image.

Claim(s) 23 is/are rejected under 35 U.S.C. 103 as being unpatentable over Rich (US 2008/0071292) in view of Higgins et al. (US 2008/0183073; hereinafter Higgins), Jenkins et al. (US 2010/0312095; hereinafter Jenkins), Bharadwaj et al. (US 2016/0038248; hereinafter Bharadwaj), and Krimsky (US 2018/0055574; hereinafter Krimsky) as applied to claim 21 above, and further in view of Atarot et al. (US 2015/0238276; hereinafter Atarot). Rich fails to show wherein the control system is further configured to: alter an advancement speed of the medical instrument based on the determined distance. Atarot discloses a device and method for assisting surgery. Atarot teaches wherein the control system is further configured to: alter an advancement speed of the medical instrument based on the determined distance (tracking system with soft control—fast movement when nothing is nearby, slow movement when something is close, [0587]-[0600]).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the combined invention of Rich, Higgins, Jenkins, Bharadwaj, and Krimsky to alter an advancement speed of the instrument based on the distance as taught by Atarot, as this will ensure that as the instrument gets close to a sensitive anatomical structure, it will slow down to avoid accidental puncture of the anatomy.

Response to Arguments

Applicant's arguments filed 8/18/25 have been fully considered but they are not persuasive. In response to applicant’s arguments regarding Krimsky, the examiner respectfully disagrees. As noted in the rejection above: “Furthermore, regarding the limitation “wherein the location of the anatomical target is positioned between a distal end of the medical instrument and the at-risk portion of the surface of the anatomical structure”, the examiner notes that Figure 6 of Krimsky illustrates one point in time of the navigation procedure, including the distal tip of the tool 193, the target region of interest 403 within the patient’s airway, and cone 404 representing the range of the deployable tool. The cone encompasses tissue of the patient which is both between the tool and the target, and also encompasses tissue beyond the target. Depending on the patient’s health condition, any of the tissue surrounding the target may be an “at-risk” portion of the tissue, including tissue beyond the target. Furthermore, depending on the particular patient’s health condition, the target may be at any other location in the patient’s airway within the displayed image, and the cone 404 may therefore include regions which are near another airway passage, tumor, organ, blood vessel, bone, etc. to be avoided. The computer based medical navigation system is capable of displaying a target, and the specific location of the anatomical target and at-risk portion is dependent upon the patient’s anatomy.
Over the course of the medical procedure by which the clinician operates the system and moves the tool through the patient to the target region, the clinician will view many points in time where the cone 404 is displayed in regards to different locations of the patient’s anatomy, including times when the location of the anatomical target is positioned between a distal end of the medical instrument and an at-risk portion. Using the visual display, the clinician will move the device until there is enough clearance to deploy the tool in such a manner that it may reach the target while avoiding the tool contacting an at-risk portion of the tissue. The examiner further notes that the rejection is based upon the combination of references, and Rich teaches displaying warnings to the user when the tool will improperly intersect with at-risk tissue, and it would be obvious to utilize the PTZ and color coding of Krimsky to allow the user to more easily visualize the potential deployment of the tool interacting with different anatomical locations within the patient.” The examiner maintains that the display of a particular target and at-risk tissue depends upon the particular patient’s anatomy. The computer based system of Rich and Krimsky is capable of displaying a medical instrument, a target, and at-risk tissue, wherein the location of the anatomical target is positioned between a distal end of the medical instrument and the at-risk portion. In regards to applicant’s arguments regarding claim 55, examiner respectfully disagrees. Applicant argues that “the system” defines the boundary of the avoid zone. However, as described in Jenkins ([0104]), the planning model can accept a user input via a user interface with at least one UI control to identify, select, define, and/or “mark” at least one avoid zone. The UI control can include a touch screen input. 
The examiner maintains that this language encompasses the claimed “first manual input tracing a first line”, as the user operates a touch interface to identify, select, define, and/or mark the avoid zone. Furthermore, the examiner notes that the claim is directed to the medical system of claim 1, and the examiner maintains that the computer-based system in the combined invention including Jenkins is capable of receiving such manual input.

Furthermore, regarding [0116] of Jenkins, the examiner notes that the cited passage includes the user operating the touch screen to adjust the size and shape of the avoid zone, where the operation of a touch screen in such a manner includes a “first manual input tracing a first line”:

“[0116] The UI typically includes multiple GUI controls 25c that can include a touch screen and/or mouse or other input control to allow a physician to select a region of interest in the map 137p by placing a cursor or by touching the screen at a region of interest. This can cause the system to define a corresponding avoid zone 155, define a target treatment site 55t, calculate boundary limits 55l (FIG. 5) (e.g., maximum acceptable boundary positions for an ablation location), and optionally electronically define preset scan planes 141 (FIG. 1) for use during an interventional procedure. In other embodiments, the system can be configured to provide a list and/or overlay of suggested avoid zones 155 for a particular medical procedure or for target anatomical structure for treatment (and/or the access path used to reach this structure). A user can select one or more of the suggested avoid zones 155 and the system can automatically virtually place these zones on the planning map 137p (using fiducial markers or anatomical markers) and the like.
In some embodiments, the system can show a proposed avoid zone on the planning map 137p and allow a user to use a UI control 25c to move the location or adjust the size or shape of the suggested avoid zone 155 on the planning map 137p which may vary patient to patient as well as procedure to procedure. Once in the position, size and/or shape desired, a user may optionally affirmatively lock the avoid zone to the planning map 137p. A lock icon or other UI control 25c may be used for facilitating the avoid zone selection. In other embodiments, once the select avoid zone input is deselected, the avoid zones can be automatically locked.”

To the extent applicant also argues that it is “the system” of Jenkins which defines the boundary, the examiner notes that in applicant's invention, “the system” also defines the boundary. Applicant's invention (and Jenkins) is directed to a user providing an input to a computer system, where the computer system ultimately generates the desired image. Additionally, the examiner notes that the term “tracing” is not defined further in applicant's specification. The examiner maintains that the manual input of Jenkins, including using touch screen operation to define an avoid zone, encompasses the broadest reasonable interpretation of the claim language with regard to the term “tracing”.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JONATHAN CWERN whose telephone number is (571) 270-1560. The examiner can normally be reached Monday - Friday, 8:00 am - 5:00 pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christopher Koharski, can be reached at (571) 272-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JONATHAN CWERN/
Primary Examiner, Art Unit 3797

Prosecution Timeline

Mar 18, 2021
Application Filed
Mar 18, 2021
Response after Non-Final Action
Aug 04, 2023
Non-Final Rejection — §103
Oct 11, 2023
Response Filed
Oct 16, 2023
Final Rejection — §103
Nov 28, 2023
Examiner Interview Summary
Nov 28, 2023
Applicant Interview (Telephonic)
Dec 05, 2023
Response after Non-Final Action
Dec 08, 2023
Response after Non-Final Action
Jan 12, 2024
Request for Continued Examination
Jan 16, 2024
Response after Non-Final Action
Feb 29, 2024
Non-Final Rejection — §103
May 15, 2024
Applicant Interview (Telephonic)
May 15, 2024
Examiner Interview Summary
Jun 06, 2024
Response Filed
Jun 10, 2024
Final Rejection — §103
Aug 08, 2024
Response after Non-Final Action
Aug 12, 2024
Response after Non-Final Action
Aug 26, 2024
Request for Continued Examination
Aug 28, 2024
Response after Non-Final Action
Oct 30, 2024
Non-Final Rejection — §103
Dec 23, 2024
Applicant Interview (Telephonic)
Dec 23, 2024
Examiner Interview Summary
Jan 29, 2025
Response Filed
Feb 03, 2025
Final Rejection — §103
Mar 17, 2025
Applicant Interview (Telephonic)
Mar 17, 2025
Examiner Interview Summary
Apr 04, 2025
Response after Non-Final Action
Apr 18, 2025
Request for Continued Examination
Apr 21, 2025
Response after Non-Final Action
May 15, 2025
Non-Final Rejection — §103
Jul 17, 2025
Applicant Interview (Telephonic)
Jul 17, 2025
Examiner Interview Summary
Aug 18, 2025
Response Filed
Sep 02, 2025
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12590949
A Radiomics-Based Imaging Tool to Monitor Tumor-Lymphocyte Infiltration and Outcome in Cancer Patients Treated by Anti-PD-1/PD-L1
2y 5m to grant Granted Mar 31, 2026
Patent 12588897
VISCOELASTICITY MEASUREMENT METHOD AND ULTRASONIC IMAGING SYSTEM
2y 5m to grant Granted Mar 31, 2026
Patent 12564313
SYSTEMS AND METHODS OF INTEGRATED REAL-TIME VISUALIZATION
2y 5m to grant Granted Mar 03, 2026
Patent 12564374
SPINAL CEREBRAL ARTERY RUPTURE DETECTOR
2y 5m to grant Granted Mar 03, 2026
Patent 12558576
SYSTEM FOR NERVE MODULATION AND INNOCUOUS THERMAL GRADIENT NERVE BLOCK
2y 5m to grant Granted Feb 24, 2026
Precedents above are drawn from this examiner's 5 most recent grants in similar technology.

Prosecution Projections

9-10
Expected OA Rounds
50%
Grant Probability
87%
With Interview (+36.3%)
4y 2m
Median Time to Grant
High
PTA Risk
Based on 797 resolved cases by this examiner. Grant probability derived from career allow rate.
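The note above says the grant probability is derived from the career allow rate, and the header gives the underlying counts (402 granted / 797 resolved) and the +36.3 pp interview lift. A minimal sketch reproducing the displayed figures from those inputs; the tool's exact methodology is not published, so treating the with-interview figure as base rate plus lift is an assumption:

```python
# Inputs shown on this page.
granted = 402            # career grants ("402 granted / 797 resolved")
resolved = 797           # career resolved cases
interview_lift_pp = 36.3 # percentage-point lift for cases with an interview

# Career allow rate -> displayed "50% Grant Probability".
allow_rate_pct = 100 * granted / resolved  # ~50.4

# Assumption: with-interview probability = base rate + lift -> displayed "87%".
with_interview_pct = allow_rate_pct + interview_lift_pp  # ~86.7

print(f"{allow_rate_pct:.0f}%")     # 50%
print(f"{with_interview_pct:.0f}%") # 87%
```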
