Prosecution Insights
Last updated: April 19, 2026
Application No. 18/630,474

SYSTEM AND METHOD FOR ILLUSTRATING A POSE OF AN OBJECT

Non-Final OA §103
Filed: Apr 09, 2024
Examiner: LI, JOHN DENNY
Art Unit: 3798
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: Medtronic Navigation Inc.
OA Round: 1 (Non-Final)
Grant Probability: 64% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 3y 6m
With Interview: 99%

Examiner Intelligence

Grants 64% of resolved cases.

Career Allow Rate: 64% (158 granted / 246 resolved; -5.8% vs TC avg)
Interview Lift: +48.7% (strong; allow rate for resolved cases with vs. without an interview)
Typical Timeline: 3y 6m avg prosecution; 36 applications currently pending
Career History: 282 total applications across all art units

Statute-Specific Performance

§101: 6.5% (-33.5% vs TC avg)
§103: 47.7% (+7.7% vs TC avg)
§102: 12.2% (-27.8% vs TC avg)
§112: 29.7% (-10.3% vs TC avg)
Tech Center averages are estimates. Based on career data from 246 resolved cases.
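The headline figures above are simple ratios over the examiner's resolved cases. As a rough illustration of how a dashboard like this might derive them (the function names and the definition of "interview lift" are assumptions for illustration, not the vendor's documented method):

```python
# Sketch of how the dashboard's headline metrics could be computed.
# Helper names and the lift definition are illustrative assumptions.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def delta_vs_tc(examiner_rate: float, tc_avg: float) -> float:
    """Percentage-point gap between the examiner and the Tech Center average."""
    return examiner_rate - tc_avg

# Career allow rate: 158 granted out of 246 resolved cases -> about 64%.
career = allow_rate(158, 246)

# Interview lift, assuming it is the percentage-point difference between the
# allow rate with an interview (99%) and the allow rate without one:
with_interview = 99.0
lift = 48.7
without_interview = with_interview - lift  # about 50%

# Statute-specific delta, e.g. §103 at 47.7% against a 40% TC average estimate:
sec_103_delta = delta_vs_tc(47.7, 40.0)

print(round(career, 1), round(without_interview, 1), round(sec_103_delta, 1))
```

Under these assumptions, 158/246 reproduces the 64% career rate shown above, and the +48.7% lift implies a without-interview rate of roughly 50%.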

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Election/Restrictions

Claims 7-19 are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to a nonelected invention, there being no allowable generic or linking claim. Election was made without traverse in the reply filed on 1/11/2025.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3 and 5 are rejected under 35 U.S.C. 103 as being unpatentable over Shahidi (US20030032878) and Anderson (US20080287783).
Regarding claim 1, Shahidi discloses a system configured to assist a user with navigating an instrument relative to a subject (Shahidi, Para 14; “The present invention provides an improved system and method for displaying 3D images of anatomical structures in real time during surgery to enable the surgeon to navigate through these structures during the performance of surgical procedures.”), the system comprising: an imaging system configured to image an anatomy, the imaging system including a probe (instrument 109) maneuverable by the user (Shahidi, Para 48; “The surgical instrument 109 may include an ultrasound transducer located at the tip 115, which itself scans and detects ultrasound imaging data when placed in contact with the patient's head.”); a tracking system (position tracking system) including a localizer (sensing unit 105), a probe tracker (LEDS 110 and 111) configured to be mounted to the probe (Shahidi, Para 41; “The apparatus further includes a position tracking system, which is preferably an optical tracking system (hereafter “OTS”) having a sensing unit 105 mounted overhead in view of the operating table scene, and at least two light emitting diodes (LED's) 110, 111 mounted on the surgical instrument 109. These LED's preferably emit continuous streams of pulsed infrared signals which are sensed by a plurality of infrared detectors 106, 107, 108 mounted in the sensing unit 105 in view of the surgical instrument 109. The instrument 109 and the sensing unit 105 are both connected to the computer 101, which controls the timing and synchronization of the pulse emissions by the LED's and the recording and processing of the infrared signals received by the detectors 106-108. The OTS further includes software for processing these signals to generate data indicating the location and orientation of the instrument 109. 
The OTS generates the position detecting data on a real time continuous basis, so that as the surgical instrument 109 is moved, its position and orientation are continually tracked and recorded by the sensing unit 105 in the computer 101.”), and a subject tracker (fiducial markers 113 and 114) configured to be mounted to the subject (Shahidi, Para 41; “Fiducial markers 113, 114 are attached to the head to enable registration of images”); a display device including a display screen (video display device 102) (Shahidi, Para 41; “A computer 101 is connected to user input devices including a keyboard 103 and mouse 104, and a video display device 102 which is preferably a color monitor. The display device 102 is located such that it can be easily viewed by the surgeon during an operation,”); and a processor (computer 101 containing CPU 201) configured to (Shahidi, Para 43; “FIG. 2 shows a schematic block diagram of the computer system connected to the position tracking system. The computer 101 includes a central processing unit (CPU) 201 communicative with a memory 202, the video display 102, keyboard and mouse 103, 104, optical detectors 106-108, and the LED's mounted on the surgical instrument 109. The computer memory contains software means for operating and controlling the position tracking system. 
In an alternative preferred embodiment, the OTS components 105-109 may be connected to and controlled by a separate computer or controller which is connected to the computer 101 and provides continual data indicating the position and orientation of the surgical instrument 109.”): generate an avatar of the subject (head) and display the avatar in a first area of the display screen; display an anatomy icon (brain) of the anatomy on the avatar in the first area of the display screen; display a field of view icon (field of 905) of the probe in the first area of the display screen at a position relative to the avatar corresponding to position of the probe relative to the anatomy (Shahidi, Figure 9 showing this) (Shahidi, Para 60; “a three- dimensional image display 901 obtained by the above system with the surgical probe 109 of FIG. 1 in the position illustrated, pointing toward the target lesion or tumor 117 inside the patient's head 112. The display 901 is a perspective view from the tip 115 of the probe 109. This display is continuously refreshed, so that as the probe 109 is moved the displayed image 901 immediately changes. It will be noted that, although the probe 109 is shown entirely outside the patient's head, the display 901 shows internal anatomical structures such as the brain and the target lesion 117.”) (Shahidi, Para 61-62; “When the surgical instrument 109 is an endoscope or US transducer, the field of view 116 is also indicated in the display 901 by the quasi-circular image 905 indicating the intersection of the conical field of view 116 with the surface of the skin viewed by the endoscope 109. This conical field of view is also superimposed, for completeness, in the 2D displays 902-904. In a preferred embodiment, displays are also presented showing the actual image seen by the endoscope in the field of view 905, and the 3D perspective image for the same region in the field of view 905; these auxiliary displays are not shown in the drawings. 
Similar auxiliary displays are preferably included when the instrument 109 is an ultrasound transducer. […] Again, the endoscope field of view 905 is indicated in the display, and in a preferred embodiment auxiliary displays are also presented showing the actual image seen by the endoscope in the field of view 905”); and display a first image of the anatomy captured by the probe in a second area of the display screen, the anatomy in the first image corresponding to the anatomy icon within the field of view (Shahidi, Para 61; “In a preferred embodiment, displays are also presented showing the actual image seen by the endoscope in the field of view 905, and the 3D perspective image for the same region in the field of view 905; these auxiliary displays are not shown in the drawings.”).

Shahidi does not clearly and explicitly disclose displaying a probe icon. In an analogous probe tracking field of endeavor, Anderson discloses displaying a probe icon with an avatar of the subject (Anderson, Para 62; "Referring to FIG. 5, an embodiment of step 380 includes creating a display 385 of the acquired real-time, partial views of 3D or 4D ICE image data 102 of the anatomical structure in combination with one or more of the following: graphic representation(s) 390 of the identification and position of the imaging probe 105 (e.g., ICE catheter 145)") (Anderson, Para 19; "One embodiment of the image acquisition system 115 includes a generally real-time, intracardiac echocardiography (ICE) imaging system 140 that employs ultrasound to acquire generally real-time, 3D or 4D ultrasound image data of the patient's anatomy and to merge the acquired image data to generate a 3D or 4D image or model 112 of the patient's anatomy relative to time, generally herein referred to as the 4D model or image 112.").
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shahidi to include displaying a probe icon in order to reduce manpower, expense, and time to perform interventional procedures, thereby reducing health risks as taught by Anderson (Anderson, Para 6), as well as improving ease of use.

Regarding claim 2, Shahidi as modified by Anderson above discloses all of the limitations of claim 1 as discussed above. Shahidi further discloses wherein the imaging system is an ultrasound imaging system, and the probe is an ultrasound probe (Shahidi, Para 61; “the instrument 109 is an ultrasound transducer.”) (Shahidi, Para 48; “The surgical instrument 109 may include an ultrasound transducer located at the tip 115, which itself scans and detects ultrasound imaging data when placed in contact with the patient's head. FIG. 4 is a schematic block diagram showing the intra-operative (“intra-op”) ultrasound (“US”) protocol for handling the US image data during surgery. Typically the ultrasound transducer is a phased focusing array which generates data from a planar fan-shaped sector of the anatomical region of interest, where the central axis of the transducer lies in the plane of the scan sector which, in this context, is collinear with the longitudinal axis of the surgical instrument 109.”).

Regarding claim 3, Shahidi as modified by Anderson above discloses all of the limitations of claim 1 as discussed above. Shahidi does not clearly and explicitly disclose wherein the anatomy includes a heart. However, Anderson further discloses wherein an anatomy includes a heart (Anderson, Para 16; "From the acquired generally real-time, partial views of 3D or 4D image data 102, a technical effect of the system 100 includes creating an illustration of a generally real-time 3D or 4D model 112 of a region of interest (e.g., a beating heart) so as to guide a surgical procedure.").
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shahidi wherein the anatomy includes a heart in order to reduce manpower, expense, and time to perform interventional procedures, thereby reducing health risks in surgical procedures involving the heart as taught by Anderson (Anderson, Para 6 and 16), as well as improving ease of use.

Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Shahidi and Anderson as applied to claim 1 above, and further in view of Panescu et al. (US20170181809, hereafter Panescu).

Regarding claim 4, Shahidi as modified by Anderson above discloses all of the limitations of claim 1 as discussed above. Shahidi does not clearly and explicitly disclose wherein the processor is configured to display the first image at a first orientation or a second orientation that is different than the first orientation without the probe being moved. In an analogous visualization of a surgical procedure field of endeavor, Panescu discloses wherein the processor is configured to display the first image at a first orientation or a second orientation that is different than the first orientation without the probe being moved (Panescu, Para 119; "Returning once again to FIG. 23, in response to receipt of a user selection, module 2304 configures the computer to display a menu on a computer console. Decision module 2306 configures the computer to determine whether user input to the menu is received to rotate an image of a selected object. In response to a determination that user input is received to rotate an image, module 2308 configures the computer to display rotate the image to show a different three-dimensional perspective of the object").
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shahidi wherein the processor is configured to display the first image at a first orientation or a second orientation that is different than the first orientation without the probe being moved as taught by Panescu in order to allow a user to more comfortably view the image in the manner of their choosing, which improves ease of use.

Claims 5-6 are rejected under 35 U.S.C. 103 as being unpatentable over Shahidi and Anderson as applied to claim 1 above, and further in view of Markowitz et al. (US20120189173, Markowitz).

Regarding claim 5, Shahidi as modified by Anderson above discloses all of the limitations of claim 1 as discussed above. Shahidi does not clearly and explicitly disclose wherein the processor is further configured to: display a second image of the anatomy in a third area of the display screen, the second image shows a different area of the anatomy than the first image. In an analogous surgical navigation visualization field of endeavor, Markowitz discloses display a second image of an anatomy on a display screen, the second image shows a different area of the anatomy than a first image (Markowitz, Para 27; "display 24 may provide multiple images having different viewpoints or orientations of the same anatomical or physiological construct. For example, a second image 36 may be displayed with a second plurality of pixels, where the second image 36 is an alternative orientation of the first image 32. As shown in FIG. 2, the first image 32 may include an anterior-posterior view of the illustrated structure, while the second image 36 may include an illustration of the same anatomical structure in a right lateral view.").
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shahidi wherein the processor is further configured to: display a second image of the anatomy in a third area of the display screen, the second image shows a different area of the anatomy than the first image in order to display multiple types of patient information in a convenient and readily-legible manner as taught by Markowitz (Markowitz, Para 9), which improves diagnosis.

Regarding claim 6, Shahidi as modified by Anderson and Markowitz above discloses all of the limitations of claim 5 as discussed above. Shahidi does not clearly and explicitly disclose wherein the second image is captured by the probe before or after the probe captures the first image; and wherein the processor is configured to display another field of view icon on the display corresponding to the second image. However, Markowitz further discloses wherein a second image is captured after a first image (Markowitz, Para 26; “information resulting in the first image 32 may be acquired from the patient 14 and displayed in substantially real-time and/or displayed from previously-obtained information recalled from the storage media 22 of the control unit 12.”) (Markowitz, Para 28; "The third image 42 may consist of one or more signal traces or indications of the monitored or measured information, including a periodically-updated image or graphic that streams or sweeps across a portion of the display 24 as the information contributing to the third image 42 is updated or acquired. At least a portion of the third image 42 may traverse a portion of the first and/or second images 32, 36, or the plane of reference 40.
On the portion of the display 24 where the third image 42 traverses, intersects or would otherwise be in the same position on the display 24 as part of the first and/or second images, the third image 42 may visually dominate or appear to overwrite the traversed portion of the first and/or second images 32, 36, as described in more detail below.").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Shahidi wherein the second image is captured by the probe before or after the probe captures the first image in order to display multiple types of patient information in a convenient and readily-legible manner as taught by Markowitz (Markowitz, Para 9), which improves diagnosis. Shahidi as modified by Anderson and Markowitz above discloses wherein the processor is configured to display another field of view icon on the display corresponding to the second image because Shahidi discloses displaying a field of view icon for an image, and Markowitz modifies Shahidi to include an additional image, so the result would be multiple images with corresponding field of view icons.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to John Li whose telephone number is (313)446-4916. The examiner can normally be reached Monday to Thursday, 5:30 AM to 3:30 PM Eastern. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Pascal Bui-Pho, can be reached at (571) 272-2714. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JOHN D LI/
Primary Examiner, Art Unit 3798

Prosecution Timeline

Apr 09, 2024: Application Filed
Dec 01, 2025: Non-Final Rejection (§103)
Mar 24, 2026: Examiner Interview Summary
Mar 24, 2026: Applicant Interview (Telephonic)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12588954: ARTICULATING GUIDE WITH INTEGRAL POSITION SENSOR (granted Mar 31, 2026; 2y 5m to grant)
Patent 12575885: AUGMENTED REALITY GUIDANCE SYSTEM FOR CARDIAC INTERVENTIONAL SURGERY (granted Mar 17, 2026; 2y 5m to grant)
Patent 12569301: SURGICAL NAVIGATION SYSTEM FOR ALIGNMENT OF A SURGICAL INSTRUMENT (granted Mar 10, 2026; 2y 5m to grant)
Patent 12564368: NUCLEAR MEDICINE DIAGNOSIS APPARATUS, ACQUISITION PERIOD EXTENDING METHOD, AND NON-TRANSITORY COMPUTER-READABLE MEDIUM (granted Mar 03, 2026; 2y 5m to grant)
Patent 12558067: TEMPERATURE INSENSITIVE BACKING STRUCTURE FOR INTRALUMINAL IMAGING DEVICES (granted Feb 24, 2026; 2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 64%
With Interview: 99% (+48.7%)
Median Time to Grant: 3y 6m
PTA Risk: Low

Based on 246 resolved cases by this examiner. Grant probability derived from career allow rate.
