Prosecution Insights
Last updated: April 19, 2026
Application No. 18/839,526

SYSTEM AND METHOD FOR ALIGNING MOVEMENT DIRECTION OF INTERVENTIONAL DEVICE IN IMAGE AND CONTROL DIRECTION OF COMMANDS ENTERED BY USER

Non-Final OA: §102, §103, §112
Filed: Aug 19, 2024
Examiner: MCGRATH, ERIN E
Art Unit: 3771
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Koninklijke Philips N.V.
OA Round: 1 (Non-Final)
Grant Probability: 59% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 3y 11m
With Interview: 90%

Examiner Intelligence

Career Allow Rate: 59% (grants 59% of resolved cases: 250 granted / 423 resolved; -10.9% vs TC avg)
Interview Lift: +31.3% (strong lift for resolved cases with an interview)
Avg Prosecution: 3y 11m (typical timeline)
Career History: 468 total applications across all art units; 45 currently pending
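The headline figures above can be reproduced from the raw counts. A minimal sketch, assuming the displayed interview lift is an additive percentage-point figure added to the career allow rate (the dashboard does not state its formula):

```python
# Career allow rate from the displayed counts: 250 granted out of 423 resolved.
granted = 250
resolved = 423
allow_rate = 100 * granted / resolved  # ~59.1%, shown as 59%

# Assumption: "+31.3% interview lift" is an additive percentage-point
# difference, so the with-interview probability is allow rate plus lift.
interview_lift = 31.3
with_interview = allow_rate + interview_lift  # ~90.4%, shown as 90%

print(f"allow rate: {allow_rate:.1f}%, with interview: {with_interview:.0f}%")
```

Under that assumption the 59%, +31.3%, and 90% figures are mutually consistent.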

Statute-Specific Performance

§101: 0.5% (-39.5% vs TC avg)
§102: 19.1% (-20.9% vs TC avg)
§103: 46.0% (+6.0% vs TC avg)
§112: 31.6% (-8.4% vs TC avg)
Tech Center averages are estimates. Based on career data from 423 resolved cases.
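The per-statute deltas are internally consistent with a single Tech Center baseline: backing the average out of each row yields the same figure. A quick check (the 40% baseline is inferred from the displayed numbers, not stated by the source):

```python
# Allowance rate per statute (%), with the displayed "vs TC avg" delta
# in percentage points.
stats = {
    "§101": (0.5, -39.5),
    "§102": (19.1, -20.9),
    "§103": (46.0, +6.0),
    "§112": (31.6, -8.4),
}

# Implied Tech Center average = rate - delta; every statute backs out to 40.0.
implied = {name: round(rate - delta, 1) for name, (rate, delta) in stats.items()}
print(implied)
```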

Office Action

Rejections under §102, §103, and §112.
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 3, 5-9 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 3 recites “the alteration of functionality of the user interface.” There is insufficient antecedent basis for this limitation in the claim.

Claim 5 recites “wherein a length of the future motion vector indicates a number of future frames of the next images are needed for the movement of the interventional device to be fully realized.” It is not clear what is meant by this. This limitation recites that the length indicates future frames are needed. How? And what does it mean for “the movement” of the device to be “fully realized?” Further clarification is required.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1, 3, 10-13, 16, 17 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Graetzel et al. [US 2021/0393338 A1, hereinafter “Graetzel”].

Re. claim 1, Graetzel discloses a system [Fig. 1] for displaying and controlling the progress of an interventional device configured for insertion into an anatomical structure of a subject [Par. 0005], the system comprising: at least one processor [part of control system 150, Pars. 0072 and 0245] coupled to (i) a display [152] and (ii) a user interface [154 and 156, together] to provide control inputs for controlling movements [Par. 0067] of the interventional device [130], the at least one processor configured to: read a determinate coordinate system associated with the user interface [vertical/horizontal axes of image data within an interface, Par. 0085]; receive current image data of a current image of the interventional device in the anatomical structure displayed or to be displayed on a display, the current image showing a current position of the interventional device [Par. 0068]; receive a control input from the user interface for controlling a movement of the interventional device from the current position, the control input being representative of a control direction in the determinate coordinate system of the user interface [Par. 0078]; estimate from at least the current image data a movement direction of the interventional device based on the control input [“the direction in which the control system 150 estimated that the catheter 130 moved,” Par. 0101]; estimate a mismatch between the movement direction of the interventional device and the control direction [a “difference between the direction indicated by the physician 160 in which the catheter 130 moved (with respect to a control/coordinate frame) and the direction in which the control system 150 estimated that the catheter 130 moved (with respect to the control/coordinate frame),” Par. 0101]; determine a change of orientation of the current image displayed or to be displayed or a change of orientation of the coordinate system of the user interface to align the movement direction of the interventional device in the current image and the control direction of the user interface; and implement the change of orientation [“the control system 150 can update the control frame/scheme for the catheter,” Par. 0101, which changes the display orientation; see e.g. Pars. 0092-0093, which discuss the examples of direct and inverted control schemes].

Re. claim 3, as best understood, Graetzel discloses determining a change of orientation of the coordinate system of the user interface comprises controlling the alteration of functionality of the user interface such that a control input corresponding to the control direction matches the movement direction of the interventional device displayed or to be displayed [Par. 0087].

Re. claim 10, Graetzel discloses the user interface for controlling movements of the interventional device based on control inputs, associated with said stored determinate coordinate system, coupled to the at least one processor [Fig. 1, Par. 0067], and the display configured to display images of the interventional device in the anatomical structure of the subject, coupled to the at least one processor [Par. 0068].

Re. claim 11, Graetzel discloses the user interface comprises a control console [156] comprising an input device operable by a user for controlling movement of the interventional device [Fig. 1], the input device being optionally a joy stick or a thumb stick [this optional limitation need not be met, though Fig. 1 does show a thumb stick].

Re. claim 12, Graetzel discloses the at least one processor is further configured to: determine the control direction of the input device relative to the control console based on the control input [Pars. 0134-0136 disclose the two control directions being determined based on control input].

Re. claim 13, Graetzel discloses an imaging system [imaging device 180] configured to acquire the current image of the anatomical structure [Par. 0076], and a robot controller [control electronics in 114 of robotic system 110, Par. 0074] configured to enable control of the robot in accordance with the control input provided through the user interface [Par. 0074].

Re. claim 16, Graetzel discloses the processor configured to perform the steps as set forth with respect to claim 1 above. Thus, Graetzel also inherently discloses performing said steps as claimed in claim 16.

Re. claim 17, Graetzel discloses the processor configured to perform the steps as set forth with respect to claim 1 above. Because the processor must be provided with instructions to perform said steps, Graetzel inherently discloses the non-transitory computer readable medium storing instructions to cause the processor to perform the steps of claim 1, and thus teaches the limitations of claim 17.

Claim(s) 1, 2, 16, 17 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Masaki et al. [US 2021/0369355 A1, hereinafter “Masaki”].

Re. claim 1, Masaki discloses a system [Fig. 1] for displaying and controlling the progress of an interventional device configured for insertion into an anatomical structure of a subject [Par. 0045], the system comprising: at least one processor [410, Fig. 1] coupled to (i) a display [420] and (ii) a user interface [300] to provide control inputs for controlling movements [Par. 0051] of the interventional device [100], the at least one processor configured to: read a determinate coordinate system associated with the user interface [axes in the image, Par. 0078]; receive current image data of a current image of the interventional device in the anatomical structure displayed or to be displayed on a display, the current image showing a current position of the interventional device [image of catheter, with reference marks RM1/RM2, received from camera 180, Pars. 0077-0078]; receive a control input from the user interface for controlling a movement of the interventional device from the current position, the control input being representative of a control direction in the determinate coordinate system of the user interface [receiving input from buttons of game pad controller, including up/down left/right directions corresponding to the determinate coordinate system, Par. 0078]; estimate from at least the current image data a movement direction of the interventional device based on the control input [“At this point, the up/down button of the gamepad controller is mapped with the up/down (vertical) direction of the image and the left/right buttons of the gamepad controller are mapped with the left/right (horizontal) direction of the image, so the user intuitively controls the direction of the catheter to navigate to either the left bronchus (LB) or right bronchus (RB). However, when the orientation of the catheter with respect to the orientation of the camera 180 moves (rotates) in a direction 199 by β degrees (see FIG. 11C), the up button of game pad controller is off by β,” Par. 0078]; estimate a mismatch between the movement direction of the interventional device and the control direction [“To compensate for this mis-mapping, β is calculated using the position of marks RM1 and RM2 shown in the image, by comparing the rotated image to the previous un-rotated image,” Par. 0078]; determine a change of orientation of the current image displayed or to be displayed or a change of orientation of the coordinate system of the user interface to align the movement direction of the interventional device in the current image and the control direction of the user interface [“Then β is used to re-map the direction of the game pad controller,” Par. 0078]; and implement the change of orientation [“After re-mapping, the up button of game pad controller is accurately mapped with the up direction of the image, so the user can intuitively control the direction of the catheter with respect to the anatomy of the patient,” Par. 0078, where the re-mapping includes a change in orientation, Par. 0079].

Re. claim 2, Masaki discloses determining a change of orientation of the current image displayed or to be displayed comprises rotating the current image displayed or to be displayed until the movement direction of the interventional device displayed or to be displayed aligns with the control direction [Par. 0079].

Re. claim 16, Masaki discloses the processor configured to perform the steps as set forth with respect to claim 1 above. Thus, Masaki also inherently discloses performing said steps as claimed in claim 16.

Re. claim 17, Masaki discloses the processor configured to perform the steps as set forth with respect to claim 1 above. Because the processor must be provided with instructions to perform said steps, Masaki inherently discloses the non-transitory computer readable medium storing instructions to cause the processor to perform the steps of claim 1, and thus teaches the limitations of claim 17.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 4 is/are rejected under 35 U.S.C. 103 as being unpatentable over Graetzel or Masaki in view of Hunter et al. [US 2004/0097806 A1, hereinafter “Hunter”].

Re. claim 4, the primary reference teaches the system as set forth with respect to claim 1 above, but fails to teach that estimating the movement direction comprises inferring a shape of the interventional device. However, Hunter teaches estimating a movement direction of an interventional device (catheter) comprises: inferring a shape of the interventional device and surrounding anatomy of the anatomical structure from the current image [“Visualization of the shape and position of a distal portion of the catheter,” Par. 0010] and a plurality of recent past images [“The image 178 further includes a spline or curved projection 182, which is based upon the shape of the curved catheter 52,” Par. 0080]; and estimating the movement direction of the interventional device based on the shape of the interventional device and the surrounding anatomy of the anatomical structure [Par. 0080]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the system of Graetzel or Masaki as taught by Hunter in order to enable estimated curved trajectories of the catheter to be displayed to assist the user [Hunter Par. 0010].
Claim(s) 5 is/are rejected under 35 U.S.C. 103 as being unpatentable over Graetzel or Masaki in view of Gormley et al. [US 2023/0372032 A1, hereinafter “Gormley”].

Re. claim 5, as best understood, Graetzel or Masaki teaches the system of claim 1 above but fails to teach estimating the movement direction in the manner claimed in claim 5. However, Gormley teaches estimating the movement direction of an interventional device comprises: establishing a motion vector for the interventional device using a plurality of recent past images and corresponding control inputs, wherein the motion vector represents a direction and a magnitude of displacement of the interventional device moving through the anatomical structure shown in the plurality of recent past images [Par. 0054]; predicting future motion vectors in corresponding next images indicative of a future direction of the movement of the interventional device using a first neural network model [Par. 0054], wherein a length of the future motion vector indicates a number of future frames of the next images are needed for the movement of the interventional device to be fully realized [see 112(b) above; Gormley, Pars. 0054-0056]; and estimating the movement direction of the interventional device based on the predicted future motion vectors [Par. 0055]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the system of Graetzel or Masaki by configuring the system such that estimating the movement direction comprises establishing a motion vector, predicting future motion vectors using a first neural network model, and estimating the movement direction, as taught by Gormley, because this allows for automatic driving of the instrument [Gormley, abstract].

Claim(s) 6 is/are rejected under 35 U.S.C. 103 as being unpatentable over Graetzel or Masaki in view of Gormley and further in view of Yeung et al. [US 2018/0296281 A1, hereinafter “Yeung”].

Re. claim 6, the modified Graetzel/Masaki teaches initially training the first neural network model using motion vectors and corresponding control inputs associated with a plurality of training images [Pars. 0054-0056] but fails to teach recurrent convolutional layers or transformer architectures. However, Yeung teaches, in an automated steering system for an endoscope, at least one processor is further configured to: initially train the first neural network model using motion vectors and corresponding control inputs associated with a plurality of training images, wherein the first neural network model includes recurrent convolutional layers [Pars. 0007, 0231]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the system of the modified Graetzel/Masaki such that the first neural network model includes recurrent convolutional layers as taught by Yeung because this allows image data to be used directly as input and “the neural network is allowed to formulate the logical processing steps that provide optimal mapping of the input data to an output navigational direction and/or set of steering control instructions” [Yeung Par. 0231].

Claim(s) 7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Graetzel or Masaki in view of Gormley and further in view of Sen et al. [US 2021/0290317 A1, hereinafter “Sen”].

Re. claim 7, the modified Graetzel/Masaki teaches estimating the mismatch between the movement direction of the interventional device and the control direction of the input device of the control console but fails to teach training a second neural network model.
However, Sen teaches, in a system for tracking a position of a surgical instrument, a processor configured to initially train a “second” neural network model for estimating the mismatch between a movement direction of the interventional device and a control direction using a current image, a plurality of recent past images and corresponding control inputs, and an estimated movement direction of the interventional device [Par. 0061]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the system of Graetzel/Masaki such that a second neural network is trained as taught by Sen because this allows for more precise, accurate, efficient, and reliable position tracking of the system [Sen, Par. 0003].

Claim(s) 8-9 is/are rejected under 35 U.S.C. 103 as being unpatentable over Graetzel or Masaki in view of Gormley and Sen, as applied to claim 7, and further in view of Yeung.

Re. claims 8 and 9, Graetzel/Masaki fail to teach the neural networks, which are taught by secondary references. Yeung teaches, in a neural network model, that it may be supervised or unsupervised, or any combination thereof [Par. 0091]. Therefore, selecting each of the first neural network model and the second neural network model to be supervised as in claim 8, or unsupervised as in claim 9, would have been obvious to one of ordinary skill in the art before the effective filing date of the invention, because this amounts to selecting one from a limited list of options re. the type of machine learning training used.

Claim(s) 14-15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Graetzel or Masaki in view of Yeung.

Re. claim 14, Graetzel/Masaki fail to teach predicting future movement. However, Yeung teaches estimating the movement direction of the interventional device comprises predicting future movement [“predicted steering direction,” Par. 0084] of the interventional device.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the system of Graetzel/Masaki by predicting future movement as taught by Yeung in order to allow the position of the device to be determined in real time [Yeung, Par. 0009].

Re. claim 15, Graetzel/Masaki fail to teach estimating the movement direction based on movement in recent past images. However, Yeung teaches the movement direction of the interventional device is estimated based on movement of the interventional device in the plurality of recent past images [“a predicted position for the center of the lumen is calculated based on motion vectors derived from the center positions of the lumen in two or more images previously captured by the first or at least second image sensor,” Par. 0011]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the system of Graetzel/Masaki by configuring the system such that the movement direction of the interventional device is estimated based on movement of the interventional device in the plurality of recent past images as taught by Yeung in order to allow the movement direction to be accurately estimated based on historical data.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ERIN MCGRATH whose telephone number is (571) 270-0674. The examiner can normally be reached M-F 9 am to 5 pm ET.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, JACKIE HO, can be reached at (571) 272-4696.
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ERIN MCGRATH/
Primary Examiner, Art Unit 3771
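The technique at the heart of these rejections is aligning the controller's coordinate frame with the image by the measured mismatch angle β. The cited references describe the re-mapping only in prose; a standard 2-D rotation is one way to sketch it (an illustration, not any reference's actual implementation):

```python
import math

def remap_control(dx: float, dy: float, beta_deg: float) -> tuple[float, float]:
    """Rotate a control input (dx, dy) by beta degrees so that "up" on the
    controller matches "up" in the rotated image frame.

    Illustrative sketch only: the sign convention for beta (and whether the
    image or the control frame is rotated) is a design choice.
    """
    b = math.radians(beta_deg)
    return (dx * math.cos(b) - dy * math.sin(b),
            dx * math.sin(b) + dy * math.cos(b))

# If the camera view has rotated 90 degrees, a rightward input (1, 0)
# must be re-mapped so the device still moves "right" on screen.
dx, dy = remap_control(1.0, 0.0, 90.0)
```

With β = 90°, the rightward input (1, 0) maps to (0, 1), i.e. the command is rotated a quarter turn to compensate for the rotated view.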

Prosecution Timeline

Aug 19, 2024
Application Filed
Jan 07, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594068
MEDICAL SYSTEMS, DEVICES AND METHODS ADAPTED FOR TISSUE FIXATION AND APPROXIMATING TISSUE DEFECTS
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12591986
METHOD AND NAVIGATION SYSTEM FOR REGISTERING TWO-DIMENSIONAL IMAGE DATA SET WITH THREE-DIMENSIONAL IMAGE DATA SET OF BODY OF INTEREST
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12575821
A DEVICE FOR THE STITCHING OF LAPAROSCOPIC INCISIONS AND METHODS THEREOF
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12569246
METHODS AND DEVICES FOR REPAIRING AND ANCHORING DAMAGED TISSUE
Granted Mar 10, 2026 (2y 5m to grant)
Patent 12569232
SURGICAL INSTRUMENT AND STEERING GEAR THEREOF
Granted Mar 10, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 59%
With Interview: 90% (+31.3%)
Median Time to Grant: 3y 11m
PTA Risk: Low
Based on 423 resolved cases by this examiner. Grant probability derived from career allow rate.
