Prosecution Insights
Last updated: April 19, 2026
Application No. 18/430,192

SYSTEMS AND METHODS FOR ACTIVE TRACKING OF ELECTROMAGNETIC NAVIGATION BRONCHOSCOPY TOOLS WITH SINGLE GUIDE SHEATHS

Final Rejection §103
Filed: Feb 01, 2024
Examiner: PARK, PATRICIA JOO YOUNG
Art Unit: 3798
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Covidien LP
OA Round: 2 (Final)
Grant Probability: 56% (Moderate)
Expected OA Rounds: 3-4
Expected Time to Grant: 4y 3m
Grant Probability with Interview: 72%

Examiner Intelligence

Career Allow Rate: 56% (grants 56% of resolved cases; 244 granted / 433 resolved; -13.6% vs TC avg)
Interview Lift: +15.3% (strong)
Typical Timeline: 4y 3m avg prosecution; 27 currently pending
Career History: 460 total applications across all art units
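The headline figures above are simple ratios of the stated counts. A minimal sketch (assuming the dashboard rounds the allow rate to the nearest whole percent, and that pending = total filed minus resolved):

```python
# Reproduce the examiner's headline stats from the raw counts shown above.
granted = 244       # applications granted
resolved = 433      # total resolved cases
total_filed = 460   # total applications across all art units

allow_rate = 100 * granted / resolved
pending = total_filed - resolved

print(round(allow_rate))  # 56  -> the displayed "56% Career Allow Rate"
print(pending)            # 27  -> the "27 currently pending"
```

The exact career rate is 56.35%, so the displayed 56% is consistent with whole-percent rounding.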

Statute-Specific Performance

§101: 5.6% (-34.4% vs TC avg)
§103: 56.5% (+16.5% vs TC avg)
§102: 10.0% (-30.0% vs TC avg)
§112: 22.2% (-17.8% vs TC avg)

Tech Center averages are estimates • Based on career data from 433 resolved cases
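Each delta above is just the examiner's per-statute rate minus the Tech Center average estimate. A small sketch reproducing the stated deltas (the 40.0% baseline is not given directly in the panel; it is implied, since every stated delta equals the rate minus 40.0):

```python
# Examiner's statute-specific overcome rates (%) from the panel above.
rates = {"101": 5.6, "103": 56.5, "102": 10.0, "112": 22.2}
tc_avg = 40.0  # implied Tech Center average estimate

# Delta vs TC average, rounded to one decimal as displayed.
deltas = {s: round(r - tc_avg, 1) for s, r in rates.items()}
print(deltas)  # {'101': -34.4, '103': 16.5, '102': -30.0, '112': -17.8}
```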

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments with respect to amended claims 1, 9, and 16 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Applicant's arguments, see page 8, filed 20 January 2026, with respect to the §112 rejections have been fully considered and are persuasive in view of the amendment. The §112 rejection of 17 October 2025 has been withdrawn.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The following rejection has been modified in view of applicant's arguments and/or amendments.

Claims 1-3, 9-11, 16, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over "Krimsky et al.," US 2022/0015837 (hereinafter Krimsky), and "Bydlon et al.," US 2019/0374130 (hereinafter Bydlon).

Regarding claim 1, Krimsky teaches a system for navigating to a target via a luminal network of a patient (luminal device configured to be inserted into a patient [0007]), the system comprising: an extended working channel (EWC) including a location sensor disposed at a distal portion of the EWC (EWC includes an electromagnetic sensor located on a distal end of the EWC [0040]); at least one processor; and a memory coupled to the at least one processor and having stored thereon instructions, which when executed by the at least one processor, cause the at least one processor to (computing device including a processor and a storage medium, wherein the processor is capable of executing instructions stored on the storage medium [0008]): receive preprocedural computed tomography (CT) images; generate a three-dimensional (3D) model based on the CT images (three-dimensional models from a series of CT images [0005]); cause display of the 3D model in a user interface on a display operatively coupled to the at least one processor (three-dimensional model displayed on a display [0035]; a user interface displayed [0073]); receive an indication of a location of a target in the CT images (a target is identified in the images generated [0058]); cause display of the location of the target in the 3D model (model of the region of interest including a target [0047]); generate a pathway plan to the target for navigation of the EWC (generates a path to the target, pathway plan [0059]-[0060]); receive location information from the location sensor as the EWC is navigated through a luminal network of a patient (location of EM sensors tracked [0061]); register the 3D model with the luminal network of the patient based on the location information (during registration, locations of EM sensors are tracked and the patient and the 3D model are registered to one another [0061]); cause display of the location of the location sensor within the 3D model that substantially corresponds to the location of the location sensor within the luminal network of the patient (display location of EM sensor in the model of the region of interest [0068]); cause display of the navigation of the extended working channel following the pathway plan to a location near the target (displaying guidance for navigating EM sensor to the target, viewing a live video feed from a camera [0062]-[0063]); receive tool information of a tool disposed within the EWC (type and model of the tool input by a clinician or read via a barcode, RFID, or other indicator capable of containing information regarding the tool [0053]), fixed relative to the EWC, and having a distal end portion extending distally beyond the distal end portion of the EWC (tool can be locked to the EWC with a locking mechanism [0040]); determine a location of a portion of the tool based on the location information from the location sensor (tool can be locked into place extending beyond the distal tip of the EWC; if the EM sensor is coupled to the EWC at a known point and the tool extends beyond the EWC at a known distance, the application may determine a location of the tool [0049]); cause display of the tool in the 3D model according to the location information and the tool information (display location of EM sensor/tool in the model of the region of interest [0068]); and cause display of advancement of the tool into the target in the 3D model based on the location information and the tool information (update progression of the tool in the model displayed [0047]).

Krimsky does not teach displaying the tool as "a virtual tool" or "cause a display of a distal end portion of the virtual tool extending beyond a distal end portion of a virtual EWC," as amended.
However, in the analogous field of endeavor of image-guided medical procedures, Bydlon teaches a tool extending from the bronchoscope ([0063]) and displaying a virtual image of the tool and the bronchoscope superimposed upon a 2D/3D image volume ([0062]; a virtual image of the bronchoscope and the tool superimposed upon a 3D volume [0083], Figures 4A-B).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify displaying the tool in the model as taught by Krimsky to incorporate the teaching of Bydlon, since AR display of a virtual surgical tool in the anatomical model was well known in the art as taught by Bydlon. One of ordinary skill in the art could have combined the elements as claimed by Krimsky with no change in their respective functions, configuring its display to be an augmented reality display, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art before the effective filing date of the claimed invention. The motivation would have been to provide a real-time virtual image of the tool with respect to the anatomical region ([0107]), and there was a reasonable expectation of success.

Regarding claims 2-3, Krimsky and Bydlon together teach all limitations of claim 1 as discussed above. Krimsky further teaches the following limitations:

Of claim 2, wherein the tool includes a locatable guide tool including a second location sensor, an ablation tool, or a biopsy tool (biopsy, guidewire, ablation device [0040]).
Of claim 3, wherein the EWC includes an EWC handle (a control handle 91 [0040]), and wherein the tool includes: a fixing member configured to mate with the EWC handle when the tool is disposed within the EWC and a distal portion of the tool extends beyond a distal portion of the EWC, and configured to fix the tool in place with respect to the EWC; and a handle for operating the tool (locking mechanism includes a threaded configuration allowing it to threadably engage with and lock to the EWC [0048]).

Regarding claim 9, Krimsky teaches a method for navigating to a target via a luminal network (bronchoscope [0003]), the method comprising: receiving computed tomography (CT) images (CT [0005]); generating a three-dimensional (3D) model based on the CT images (three-dimensional models generated from a series of CT images [0005]); causing display of the 3D model in a user interface on a display (three-dimensional model displayed on a display [0035]; a user interface displayed [0073]); receiving an indication of a location of a target in the CT images (a target is identified in the images generated [0058]); causing display of the location of the target in the 3D model (model of the region of interest including a target [0047]); generating a pathway plan to the target for navigation of a catheter (catheter advancement [0048]; generates a path to the target, pathway plan [0059]-[0060]); receiving location information from a location sensor disposed at a distal portion of the catheter (location of EM sensors tracked [0061]); registering the 3D model with a luminal network of a patient based on the location information (during registration, locations of EM sensors are tracked and the patient and the 3D model are registered to one another [0061]); causing display of the location of the location sensor within the 3D model (display location of EM sensor in the model of the region of interest [0068]); causing the display of the navigation of the catheter following the pathway plan to a location near the target (displaying guidance for navigating EM sensor to the target, viewing a live video feed from a camera [0062]-[0063]); receiving tool information of a tool disposed within the catheter, fixed in place relative to the catheter (type and model of the tool input by a clinician or read via a barcode, RFID, or other indicator capable of containing information regarding the tool [0053]), and having a distal end portion extending distally beyond the distal end portion of the catheter (tool may be locked into place extending beyond the distal tip of the EWC of the catheter [0049]); determining a location of a portion of the tool based on the location information from the location sensor (if the EM sensor is coupled to the EWC at a known point and the tool extends beyond the EWC at a known distance, the application may determine a location of the tool [0049]); causing display of at least a portion of the tool in the 3D model based on the location information and the tool information (display location of EM sensor/tool in the model of the region of interest [0068]); and causing display of advancement of the tool to the target in the 3D model based on the location information and the tool information (update progression of the tool in the model displayed [0047]).

Krimsky does not teach "cause a display of a distal end portion of a virtual tool extending beyond a distal end portion of a virtual catheter," as amended.

However, in the analogous field of endeavor of image-guided medical procedures, Bydlon teaches a tool extending from the bronchoscope ([0063]) and displaying a virtual image of the tool and the bronchoscope superimposed upon a 2D/3D image volume ([0062]; a virtual image of the bronchoscope and the tool superimposed upon a 3D volume [0083], Figures 4A-B).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify displaying the tool in the model as taught by Krimsky to incorporate the teaching of Bydlon, since AR display of a virtual surgical tool in the anatomical model was well known in the art as taught by Bydlon. One of ordinary skill in the art could have combined the elements as claimed by Krimsky with no change in their respective functions, configuring its display to be an augmented reality display, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art before the effective filing date of the claimed invention. The motivation would have been to provide a real-time virtual image of the tool with respect to the anatomical region ([0107]), and there was a reasonable expectation of success.

Regarding claims 10-11, Krimsky and Bydlon together teach all limitations of claim 9 as discussed above. Krimsky further teaches the following limitations:

Of claim 10, further comprising displaying a distal end portion of the tool as the target is treated, ablated, sampled, or biopsied (when the distal portion of the tool is confirmed and positioned, the treatment, ablation, or biopsy procedure is performed [0070]).

Of claim 11, wherein determining the location of the portion of the tool includes determining a location of a distal end portion of the tool by projecting the location information from the location sensor to the distal end portion of the tool (Fig. 5, [0054]).
Regarding claim 16, Krimsky teaches a system comprising: a guide catheter configured for insertion into a luminal network of a patient (catheter guide for luminal passageway [0042]); a sensor coupled to the guide catheter for sensing a location of a distal end portion of the guide catheter within the luminal network of the patient (EM sensor coupled to catheter guide assembly [0042]); tools (biopsy tools, diagnostic/therapeutic instruments [0003]) configured to be inserted through the guide catheter such that distal end portions of the tools extend distally beyond the distal end portion of the guide catheter (tool advanced through catheter guide [0042]), and configured to be fixed relative to the guide catheter during navigation of the guide catheter (tool may be locked into place extending beyond the distal tip of the EWC of the catheter [0049]); at least one processor (processor [0071]-[0073]); a display operatively coupled to the at least one processor (processor causes display [0073]); and a memory (memory [0073]) coupled to the at least one processor and having stored thereon instructions, which when executed by the at least one processor (computing device including a processor and a storage medium, wherein the processor is capable of executing instructions stored on the storage medium [0008]), cause the at least one processor to: cause display of a target in a 3D model on the display (three-dimensional model displayed on a display [0035]; a user interface displayed [0073]); receive location information from the sensor in the luminal network of the patient; cause display of navigation of a guide catheter near the target in the 3D model based on the location information ([0042]); determine tool information from a tool inserted through the guide catheter and fixed in place relative to the guide catheter (type and model of the tool input by a clinician or read via a barcode, RFID, or other indicator capable of containing information regarding the tool [0053]; tool may be locked into place extending beyond the distal tip of the EWC of the catheter [0049]); determine a location of a distal portion of the tool; cause display, in the 3D model, of the distal portion of the tool based on the location information and the tool information (sensor locations allow tool position to be determined and used to update a location of the tool within a 3D model of the ROI [0046]; update the model and the distance between tool and target [0047]; given type and model of the tool, determine a location of the tool relative to the EM sensor [0053]); and cause display of operation of the tool to perform a procedure on the virtual target in the 3D model based on updated location information and the tool information (update the model and the distance between tool and target [0047]; given type and model of the tool, determine a location of the tool relative to the EM sensor [0053]).

Krimsky does not teach "a virtual tool," "a virtual target," or "cause a display of a distal end portion of the virtual tool extending beyond a distal end portion of a virtual EWC," as amended.

However, in the analogous field of endeavor of image-guided medical procedures, Bydlon teaches a tool extending from the bronchoscope ([0063]) and displaying a virtual image of the tool and the bronchoscope superimposed upon a 2D/3D image volume, as well as a virtual lesion ([0062]; a virtual image of the bronchoscope and the tool superimposed upon a 3D volume [0083], Figures 4A-B; virtual lesion [0095]; a virtual tumor [0099]).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify displaying the tool in the model as taught by Krimsky to incorporate the teaching of Bydlon, since AR display of a virtual surgical tool in the anatomical model was well known in the art as taught by Bydlon.
One of ordinary skill in the art could have combined the elements as claimed by Krimsky with no change in their respective functions, configuring its display to be an augmented reality display, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art before the effective filing date of the claimed invention. The motivation would have been to provide a real-time virtual image of the tool with respect to the anatomical region ([0107]), and there was a reasonable expectation of success.

Regarding claim 20, Krimsky and Bydlon together teach all limitations of claim 16 as discussed above. Krimsky further teaches an electromagnetic generator (electromagnetic field generator [0040]) configured to generate an electromagnetic field, wherein the sensor is an EM sensor configured to sense the electromagnetic field and output an electromagnetic field signal indicating the location of the EM sensor (EM sensor within the magnetic field generated by the field generator [0040]).

Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Krimsky and Bydlon as applied to claim 1 above, and further in view of "Gordon et al.," US 2021/0248922 (hereinafter Gordon).

Regarding claim 4, Krimsky and Bydlon together teach all limitations of claim 1 as discussed above. Krimsky teaches wherein a distal portion of the EWC is curved (Figures 4A-C show the curved distal end of the EWC). Krimsky does not further disclose that the location information is six degrees of freedom. However, Gordon teaches using an electromagnetic sensor to track location, where the sensor is a six-degree-of-freedom sensor ([0050]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the EM sensor as taught by Krimsky to incorporate the teaching of Gordon, since a six-degree-of-freedom sensor was well known in the art as taught by Gordon.
One of ordinary skill in the art could have combined the elements as claimed by Krimsky with no change in their respective functions, configuring the EM sensor to be a six-degree-of-freedom sensor, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art before the effective filing date of the claimed invention. The motivation would have been to provide a six-degree-of-freedom electromagnetic locating or tracking system ([0049]), and there was a reasonable expectation of success.

Claims 5-8 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Krimsky and Bydlon as applied to claims 1 and 16 above, and further in view of "Gonzalez et al.," US 2018/0344389 (hereinafter Gonzalez) and "Gordon et al.," US 2021/0248922 (hereinafter Gordon).

Regarding claims 5 and 18, Krimsky and Bydlon together teach all limitations of claims 1 and 16 as discussed above. Krimsky teaches using a bronchoscope ([0066]) but does not further teach displaying a message to lock or unlock a bronchoscope adapter. However, in the analogous field of endeavor of surgical procedures using a bronchoscope, Gonzalez teaches specific steps of locking the electrosurgical device onto the bronchoscope with a bronchoscope adapter ([0036]) and unlocking the device from the bronchoscope with the bronchoscope adapter to retract the device out of the bronchoscope ([0051]-[0052]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the bronchoscope and the tool/instrument as taught by Krimsky to incorporate the teaching of Gonzalez, since using an adapter to lock and unlock the tool onto the bronchoscope was well known in the art as taught by Gonzalez.
One of ordinary skill in the art could have combined the elements as claimed by Krimsky with no change in their respective functions, configuring an adapter to lock and unlock onto the bronchoscope, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art before the effective filing date of the claimed invention. The motivation would have been to enable the electrosurgical device to be connected to a bronchoscope during the procedure ([0024]), and there was a reasonable expectation of success.

Gonzalez does not explicitly disclose displaying a message prompting a user to operate the bronchoscope. However, in the analogous field of endeavor of bronchoscope navigation systems and methods, Gordon teaches VR-guided navigation of a bronchoscope that can prompt the user to insert the bronchoscope ([0110]), discloses a user interface that prompts the user via message (Figure 19) and is used to insert and withdraw a tool from the bronchoscope ([0082]), discloses using intraoperative images (bronchoscope view as a real-time image [0067]; images collected during a fluoroscopic sweep [0103]), and updates a relative position of the target and the EWC in the 3D model based on the intraprocedural image (clinician extends LG and catheter from the working channel of the bronchoscope and tracks the progress of the LG in a 3D view [0078]). Thus, Gonzalez's procedure steps of locking and unlocking the device onto the bronchoscope with an adapter can incorporate VR navigation of the bronchoscope to prompt the user to take specific locking and unlocking steps during navigation. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the locking and unlocking steps of the device onto the bronchoscope as taught by Gonzalez to incorporate the teaching of Gordon, since a user interface prompting the user to perform user manipulation was well known in the art as taught by Gordon.
One of ordinary skill in the art could have combined the elements as claimed by Gonzalez with no change in their respective functions, implementing a user interface for performing the steps of bronchoscope procedures, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art before the effective filing date of the claimed invention. The motivation would have been to provide navigation of the bronchoscope procedure with user manipulation and a user interface ([0008]), and there was a reasonable expectation of success.

Regarding claim 6, Krimsky, Bydlon, Gonzalez, and Gordon together teach all limitations of claim 5 as discussed above. Gordon further teaches wherein the intraprocedural images are C-arm fluoroscopic images, 3D fluoroscopic images, or cone beam CT images (images collected during the fluoroscopic sweep to form a fluoroscopic 3D reconstruction [0103]).

Regarding claim 7, Krimsky, Bydlon, Gonzalez, and Gordon together teach all limitations of claim 6 as discussed above. Krimsky further teaches wherein the tool information is a tool type, tool characteristics, or tool dimensions (tool type and model [0053]), and wherein the instructions when executed by the at least one processor further cause the at least one processor to determine a location of a distal portion of the tool by projecting the location information according to the tool information (determine a location of the tool relative to the EM sensor based on the tool type and model information [0053]); and cause display of a portion of the tool in the 3D model based on the location of the distal portion of the tool ([0068]). Krimsky does not teach displaying the tool as "a virtual tool" as claimed.
However, in the analogous field of endeavor of image-guided medical procedures, Bydlon teaches a tool extending from the bronchoscope ([0063]) and displaying a virtual image of the tool and the bronchoscope superimposed upon a 2D/3D image volume, as well as a virtual lesion ([0062]; a virtual image of the bronchoscope and the tool superimposed upon a 3D volume [0083], Figures 4A-B; virtual lesion [0095]; a virtual tumor [0099]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify displaying the tool in the model as taught by Krimsky to incorporate the teaching of Bydlon, since AR display of a virtual surgical tool in the anatomical model was well known in the art as taught by Bydlon. One of ordinary skill in the art could have combined the elements as claimed by Krimsky with no change in their respective functions, configuring its display to be an augmented reality display, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art before the effective filing date of the claimed invention. The motivation would have been to provide a real-time virtual image of the tool with respect to the anatomical region ([0107]), and there was a reasonable expectation of success.

Regarding claim 8, Krimsky, Bydlon, Gonzalez, and Gordon together teach all limitations of claim 6 as discussed above.
Krimsky further teaches the following limitations: a tool memory coupled to the tool and configured to store the tool information (library of tools [0053]); and a tool memory reader configured to read the tool information from the tool memory (read via a barcode, RFID, or any other suitable indicator capable of containing information regarding the tool [0053]), wherein the at least one processor is in operative communication with the tool memory reader, and wherein the instructions when executed by the at least one processor further cause the at least one processor to receive the tool information from the tool memory reader (the application receives the sensed distortion and compares it to the library readings for the given tool type and model [0053]).

Claims 12-14 are rejected under 35 U.S.C. 103 as being unpatentable over Krimsky and Bydlon as applied to claim 9 above, and further in view of "Dickhans et al.," US 2017/0156685 (hereinafter Dickhans).

Regarding claim 12, Krimsky and Bydlon together teach all limitations of claim 9 as discussed above. Krimsky does not further explicitly disclose receiving intraoperative images of the catheter and the target and updating a location based on the image data. Dickhans teaches navigating a bronchoscope, comprising: receiving intraprocedural images of the catheter and the target (additional image data, such as CBCT data, collected to show a real-time location of the tool [0039]; intraoperative CBCT scan [0040]); and updating a location of the catheter relative to the target in the 3D model based on the intraprocedural images (3D model of the tracked position of the tool updated based on the image data, showing a confirmed location of the tool and/or the treatment target [0039]; intraoperative CBCT scan [0040]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the live image during the procedure as taught by Krimsky to incorporate the teaching of Dickhans, since intraoperative CBCT was well known in the art as taught by Dickhans. One of ordinary skill in the art could have combined the elements as claimed by Krimsky with no change in their respective functions, using its CT imaging system to acquire images during procedures, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art before the effective filing date of the claimed invention. The motivation would have been to provide accurate confirmation of the location of the tracked tool during procedures ([0039]-[0040]), and there was a reasonable expectation of success.

Regarding claims 13-14, Krimsky, Bydlon, and Dickhans together teach all limitations of claim 12 as discussed above. Dickhans further teaches the following limitations:

Of claim 13, wherein receiving the intraprocedural images includes receiving 2D fluoroscopic images, 3D fluoroscopic images, or cone beam CT images (intraoperative CBCT [0040]).

Of claim 14, further comprising causing display of at least a portion of the pathway plan, at least a portion of the tool, and the target on the intraprocedural images (3D model includes position of the tool, target, and pathway to target [0063]).

Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Krimsky, Bydlon, and Dickhans as applied to claim 12 above, and further in view of "Sohlden et al.," US 2018/0318009 (hereinafter Sohlden).

Regarding claim 15, Krimsky, Bydlon, and Dickhans together teach all limitations of claim 12 as discussed above. Dickhans discloses intraoperative images (intraoperative CBCT) but does not explicitly disclose wherein the intraprocedural images are captured at a decreased radiation dose.
However, in the analogous field of endeavor of planning a surgical instrument path, Sohlden teaches acquiring images during procedures and, due to the exposure of the clinician's hand to radiation while manipulating the surgical instrument, calculating dosimetry of x-ray radiation and determining limits on x-radiation exposure due to continuous imaging ([0066]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the intraoperative image acquisition as taught by Dickhans to incorporate the teaching of Sohlden, since limiting x-ray radiation exposure was well known in the art as taught by Sohlden. One of ordinary skill in the art could have combined the elements as claimed by Dickhans with no change in their respective functions, determining acceptable dosimetry to set limits on x-ray radiation exposure, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art before the effective filing date of the claimed invention. The motivation would have been to provide safe intraoperative imaging by limiting radiation exposure ([0066]), and there was a reasonable expectation of success.

Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Krimsky and Bydlon as applied to claim 16 above, and further in view of "Koyrakh et al.," US 2020/0100843 (hereinafter Koyrakh).

Regarding claim 17, Krimsky and Bydlon together teach all limitations of claim 16 as discussed above. Krimsky further teaches wherein the guide catheter is a smart extended working channel, and wherein the tools include at least two of a locatable guide, a forceps, a biopsy needle, or an ablation antenna (a needle, a guidewire, biopsy tool, ablation device [0040]). Krimsky does not explicitly disclose "a smart" extended working channel.
However, in the analogous field of endeavor of navigating tools using an EM sensor in an extended working channel, Koyrakh teaches "smart extended working channel localization" (title).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the display of the tool in the model as taught by Krimsky to incorporate the teaching of Koyrakh, since the smart extended working channel was well known in the art as taught by Koyrakh. One of ordinary skill in the art could have combined the elements as claimed by Krimsky with no change in their respective functions, configuring its extended working channel as a smart extended working channel, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art before the effective filing date of the claimed invention. The motivation would have been to facilitate electromagnetic functionality throughout the procedure ([0022]), and there was a reasonable expectation of success.

Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Krimsky and Bydlon as applied to claim 16 above, and further in view of "Dickhans et al.," US 2017/0156685 (hereinafter Dickhans).

Regarding claim 19, Krimsky and Bydlon together teach all limitations of claim 16 as discussed above. Krimsky and Bydlon do not further teach that the lengths from the distal end portions of the handles to the distal end portions of the tools are the same. However, in the analogous field of endeavor of bronchoscopy, Dickhans teaches using multiple tools inserted into an extended working channel, which would result in the same length from the handle to the distal portion of the tools that are within the EWC ([0051], Fig. 1).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the extended working channel as taught by Krimsky to incorporate the teaching of Dickhans, since multiple instruments were well known in the art as taught by Dickhans. One of ordinary skill in the art could have combined the elements as claimed by Krimsky with no change in their respective functions, configuring its working channel to accommodate multiple instruments, and the combination would have yielded nothing more than predictable results to one of ordinary skill in the art before the effective filing date of the claimed invention. The motivation would have been to provide the necessary plurality of tools during the procedure ([0047] and [0083]), and there was a reasonable expectation of success.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to PATRICIA J PARK whose telephone number is (571)270-1788.
The examiner can normally be reached Monday-Thursday, 8 am - 3 pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Pascal Bui-Pho, can be reached at 571-272-2714. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/PATRICIA J PARK/
Primary Examiner, Art Unit 3798

Prosecution Timeline

Feb 01, 2024 — Application Filed
Oct 15, 2025 — Non-Final Rejection — §103
Jan 20, 2026 — Response Filed
Feb 17, 2026 — Interview Requested
Feb 19, 2026 — Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12582474
VISUALIZATION OF THREE-DIMENSIONAL IMAGE DATA ON TWO-DIMENSIONAL IMAGES
2y 5m to grant Granted Mar 24, 2026
Patent 12579761
ALIGNMENT OF VIRTUAL OVERLAY BASED ON TRACE GESTURES
2y 5m to grant Granted Mar 17, 2026
Patent 12575802
METHOD AND ELECTRONIC DEVICE FOR CLASSIFYING VESSEL
2y 5m to grant Granted Mar 17, 2026
Patent 12569231
SYSTEM FOR FLUOROSCOPIC TRACKING OF A CATHETER TO UPDATE THE RELATIVE POSITION OF A TARGET AND THE CATHETER IN A 3D MODEL OF A LUMINAL NETWORK
2y 5m to grant Granted Mar 10, 2026
Patent 12564371
SYSTEM AND METHOD FOR DISPLAYING ABLATION ZONE PROGRESSION
2y 5m to grant Granted Mar 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
56%
Grant Probability
72%
With Interview (+15.3%)
4y 3m
Median Time to Grant
Moderate
PTA Risk
Based on 433 resolved cases by this examiner. Grant probability derived from career allow rate.
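The headline figures above are internally consistent: 244 grants out of 433 resolved cases gives the 56% career allow rate, and adding the +15.3 point interview lift gives the 72% "With Interview" estimate. A minimal sketch of that arithmetic, assuming the lift is simply additive to the base rate (the function names here are illustrative, not the product's API):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def with_interview(base_rate: float, lift: float) -> float:
    """Grant probability after adding the interview lift, capped at 100%."""
    return min(base_rate + lift, 100.0)

base = allow_rate(244, 433)                    # 244 granted / 433 resolved
print(round(base))                             # 56  -> "56% Grant Probability"
print(round(with_interview(base, 15.3)))       # 72  -> "72% With Interview"
```

An additive lift is the simplest reading of the dashboard; the underlying model could instead condition on interview outcomes, so treat this as a plausibility check rather than the tool's actual method.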
