Prosecution Insights
Last updated: April 19, 2026
Application No. 18/050,013

ENDOSCOPE WITH AUTOMATIC STEERING

Final Rejection — §102, §103, §112

Filed: Oct 26, 2022
Examiner: SONG, LI-TING
Art Unit: 3795
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Covidien LP
OA Round: 2 (Final)
Grant Probability: 66% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 2m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 66% — above average (52 granted / 79 resolved; -4.2% vs TC avg)
Interview Lift: +35.1% (resolved cases with interview)
Typical Timeline: 3y 2m avg prosecution; 31 currently pending
Career History: 110 total applications across all art units
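The card figures above are simple ratios; the following is a minimal sketch of how they appear to be derived. The 99% cap on the interview-adjusted figure is an assumption inferred from the displayed numbers, not a documented formula:

```python
# Reconstructing the examiner-intelligence figures shown above.
# The capping rule is an assumption, not a documented methodology.

granted = 52           # from "52 granted / 79 resolved"
resolved = 79
interview_lift = 35.1  # percentage-point lift shown for interviews

allow_rate = 100 * granted / resolved
print(f"Career allow rate: {allow_rate:.0f}%")   # 66%

# 66% + 35.1 points would exceed 100%, so the dashboard apparently
# caps the interview-adjusted probability (99% is the value shown).
with_interview = min(allow_rate + interview_lift, 99.0)
print(f"With interview: {with_interview:.0f}%")  # 99%
```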

Statute-Specific Performance

§101: 0.3% (-39.7% vs TC avg)
§103: 50.9% (+10.9% vs TC avg)
§102: 27.9% (-12.1% vs TC avg)
§112: 20.8% (-19.2% vs TC avg)
Tech Center averages are estimates, based on career data from 79 resolved cases.
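The "vs TC avg" deltas above are internally consistent with a single Tech Center baseline: for every statute, the rate minus its delta comes out to 40.0. A short check of that arithmetic (the 40% baseline is inferred from the displayed numbers, not stated by the source):

```python
# Statute-specific rates and their displayed deltas vs the Tech Center average.
stats = {
    "§101": (0.3, -39.7),
    "§103": (50.9, +10.9),
    "§102": (27.9, -12.1),
    "§112": (20.8, -19.2),
}

# Each implied baseline (rate - delta) is 40.0, i.e. the chart's average
# line appears to sit at a single 40% TC-average estimate.
baselines = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(baselines)
assert all(b == 40.0 for b in baselines.values())
```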

Office Action

Grounds: §102, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim 1 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. Claim 1 recites the term "a bounding box" in lines 7 and 8. It is unclear whether "a bounding box" in line 8 is the same bounding box as "a bounding box" in line 7. For the purpose of examination, the examiner is interpreting "a bounding box" in line 8 as "the bounding box".

Claim Rejections - 35 USC § 102

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

Claims 11-16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Yeung et al. (US2018/0296281).
Regarding claim 11, Yeung discloses an endoscope automatic steering system, comprising: an endoscope comprising a steerable distal end with a camera producing an image signal (control system may comprise first and second image sensors [0006-0007]; plurality of sensors 209 located at or in close proximity to the distal end 207 [0110]…plurality of sensors may include optical image sensors [0112]) and an orientation sensor producing an orientation signal of an orientation of the steerable distal end (plurality of sensors may include touch/proximity sensors, motion sensors or location sensors which provide additional information about a relative position of the body portion relative to a colonic wall or lumen or lumen center or various motion properties of the colonoscope [0112]); an endoscope controller that: receives the image signal and the orientation signal (additional sensor data may be supplied to the steering control module for determining the control signal or target direction [0113]); automatically selects a steering target of the endoscope based on the image signal (navigation direction is determined that directs the robotic colonoscope towards the center position of the colon lumen using a machine learning architecture-based analysis of the center positions and locations [0037]); identifies a distal advancement of the endoscope based on the orientation signal or the image signal or both (additional sensor data may be supplied to the steering control module for determining the control signal or target direction [0113]); and generates instructions to automatically steer the distal end of the endoscope towards the steering target during the distal advancement (input data collected from sensors located at the distal end may be used for generating the control signal [0113]); determines, based on the orientation signal or the image signal or both, that the distal advancement has stopped; and based on the distal advancement stopping, pauses automatically steering the distal end while the distal advancement remains stopped (route may be automatically selected by a machine learning method based on processing of the real-time image data stream and a threshold confidence level, and the route searching process may be stopped when an image with a confidence level at or above the threshold is obtained [0206-0207], therefore pausing distal advancement and steering; steering control system may be configured to guide the advancing endoscope with no input from the operator [0084], thus if steering control is stopped or paused, then it will pause both advancement and steering).

Regarding claim 12, Yeung discloses the system of claim 11, further disclosing wherein the endoscope controller: determines, based on the orientation signal or the image signal or both, that the distal advancement has resumed; and resumes automatically steering the distal end based on the distal advancement resuming (similar to the rejection of claim 11, the steering control system may be configured to guide the advancing endoscope with no input from the operator [0084]; thus, if steering control is stopped or paused, then it will pause both advancement and steering, and if steering control is resumed, then distal advancement and steering would resume).

Regarding claim 13, Yeung discloses the system of claim 11, further disclosing wherein the endoscope controller receives one or more user steering inputs while the distal advancement has stopped and changes the orientation of the distal end based on the one or more user steering inputs (user may manually control the steering of the colonoscope, which would change the orientation of the distal end of the colonoscope [0199]).

Regarding claim 14, Yeung discloses the system of claim 11, further disclosing wherein the endoscope controller automatically steers the distal end only when an automatic steering mode is activated (steering control system may be fully-automatic, thus configured to determine the correct navigation direction and provide a set of steering control instructions to one or more actuators of the robotic colonoscope with little to no input from a surgeon [0095]; various systems such as fully-automated, semi-automated, user-supervised, or manually controlled [0199]).

Regarding claim 15, Yeung discloses the system of claim 11, further disclosing wherein the endoscope controller automatically selects the steering target by identifying features of the image signal characteristic of a passage and selecting a center of the passage as the steering target (performed automated feature extraction on the series of two or more images to determine a center position of the colon lumen [0037]; determine a navigation direction that directs the robotic colonoscope towards the center position of the colon lumen using a machine learning architecture-based analysis of the center positions and locations [0037]).

Regarding claim 16, Yeung discloses the system of claim 11, further disclosing the system comprising an automatic steering icon displayed on a display screen (image map data may be displayed on a viewing device [0053]; augmented information, such as a contour of the detected object of interest, analogous to applicant's automatic steering icon, may be overlaid onto real-time images [0205]).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 2, and 4-10 are rejected under 35 U.S.C. 103 as being unpatentable over Yeung in view of Vunjak-Novakovic et al. (US2020/0046943).
Regarding claim 1, Yeung discloses the endoscope automatic steering system, comprising: an endoscope comprising a distal end with a camera producing an image signal (control system may comprise first and second image sensors [0006-0007]; plurality of sensors 209 located at or in close proximity to the distal end 207 [0110]…plurality of sensors may include optical image sensors [0112]); and an endoscope controller coupled to the endoscope (steering control system 210), wherein the endoscope controller: receives the image signal from the endoscope (image sensors configured to acquire image and video data [0013]; steering control system may utilize image processing algorithms to process image data input and provide a predicted steering direction and/or steering control signal as output for guiding the movement of the robotic endoscope [0084]); identifies, via a feature identification model, an anatomical feature in the image signal (performed automated feature extraction on the series of two or more images to determine a center position of the colon lumen [0037]), wherein the feature identification model identifies the anatomical feature by detecting an object in the image signal (object of interest, which may be a polyp, lesion, adenoma or cancerous growth [0047]); selects a center of the detected object as a steering target based on the identified anatomical feature (Fig. 5B: center of mass, which is the center of the lumen or center of the object, or a geographical center may be used to define a center location [0133]); and automatically steers the distal end of the endoscope towards the steering target during distal motion of the distal end of the endoscope (control system tracks an object of interest by providing an adaptive steering control output signal to the robotic endoscope [0040]; steering control signal aligns the steering direction with the motion direction vector of the object of interest [0044]; provide a predicted steering direction and/or steering control signal as output for guiding the movement of the robotic endoscope [0084]).

Yeung fails to disclose wherein the feature identification model generates a bounding box around the object and selects a center of the bounding box around the detected object as a steering target. In the same field of endeavor, Vunjak-Novakovic teaches a substantially similar autonomous steerable catheter device and method comprising: an endoscope comprising a distal end with a camera producing an image signal (catheter or bronchoscope [0092]; catheter with imaging device to help steer [0093]); and an endoscope controller (controller and control system [0109]) coupled to the endoscope, wherein the endoscope controller: receives the image signal from the endoscope (Fig. 22: images are obtained [0136]); identifies, via a feature identification model, an anatomical feature in the image signal (region of interest (ROI), an airway branch, can be determined from computer-assisted airway pattern recognition [0136]), wherein the feature identification model identifies the anatomical feature by detecting an object in the image signal and generates a bounding box around the object (rectangle encloses the dotted regions and the center of each rectangle is identical to the center of the corresponding airway [0140]); selects a center of a bounding box around the detected object as a steering target based on the identified anatomical feature (Fig. 26A-26D: center of rectangles marked with a hollow circle or "o" [0140]); and automatically steers the distal end of the endoscope towards the steering target during distal motion of the distal end of the endoscope (Fig. 26A-D, 27: center of rectangles, which are airway centers, depicted by hollow circles, becomes the steering target as the distal tip of the catheter is steered toward the center of the target airway [0141]).

In view of Vunjak-Novakovic, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have included the bounding box around the object, wherein the center of the bounding box is selected as a steering target, as it is known in the endoscope art that the bounding box around the object is an alternative method of determining the center of an object.

Regarding claim 2, Yeung, modified by Vunjak-Novakovic, discloses the system of claim 1. Yeung further discloses wherein the feature identification model identifies the anatomical feature by classifying a subset of pixels in the image signal and selects a center of the subset of pixels as the steering target (artificial intelligence algorithm used to process image data as input and provide predicted steering direction and control signals [0084]; object of interest is recognized using an automated image feature extraction comprising various methods such as contour mapping [0041]; determination of the location of the center of the lumen comprises (i) generating a plurality of vectors, wherein each vector is aligned in a direction that is normal to a local segment of a contour identified in an image, and (ii) counting the number of vectors that intersect a given pixel in the image [0007]).

Regarding claim 4, Yeung, modified by Vunjak-Novakovic, discloses the system of claim 1. Yeung further discloses wherein the endoscope controller identifies multiple anatomical features and selects an individual anatomical feature of the identified multiple anatomical features to determine the steering target (control system can track objects of interest within a field of view of the first image sensor [0040]; objects of interest may be a polyp, lesion, adenoma or a cancerous growth [0047]; object of interest may be auto-tracked [0200-0201], or center of colon lumen may be selected as target [0037]).

Regarding claim 5, Yeung, modified by Vunjak-Novakovic, discloses the system of claim 1. Yeung further discloses wherein the endoscope controller: receives an updated image signal from the endoscope (first and second image sensors configured to provide first and second input data streams comprising data relating to the position of the steerable distal portion relative to the center of a colon, walls of the colon lumen, other obstacles and any combination thereof [0020-0021]; "data stream" may refer to the continuous sequence of analog or digital signals used to transmit or receive information [0089]; "real-time" update rate for the prediction of steering direction/control in response to changes in one or more input data streams [0092 & 0142]); identifies the anatomical feature based on the updated image signal; selects an updated steering target within the anatomical feature (performed automated feature extraction on the series of two or more images to determine a center position of the colon lumen [0037]); determines that the distal end is oriented away from the updated steering target (automated steering determines current orientation of the endoscope and reorients endoscope toward target [0107]); and generates updated steering instructions to cause the endoscope to automatically steer the distal end towards the updated steering target (steering control signal is adjusted to align the steering direction of the robotic endoscope with the motion direction vector of the object of interest [0044]; update rate for steering control signal may be various frequencies [0142]).

Regarding claim 6, Yeung, modified by Vunjak-Novakovic, discloses the system of claim 1. Yeung further discloses wherein an indicator representing the steering target is overlaid on a displayed image based on the image signal (contour of the detected object of interest may be overlaid onto real-time images [0205]).

Regarding claim 7, Yeung, modified by Vunjak-Novakovic, discloses the system of claim 1. Yeung further discloses wherein the steering target is a center or centroid of the identified anatomical feature, wherein the identified anatomical feature comprises a passage (determine a navigation direction that directs the robotic colonoscope towards the center position of the colon lumen using a machine learning architecture-based analysis of the center positions and locations [0037]).

Regarding claim 8, Yeung, modified by Vunjak-Novakovic, discloses the system of claim 1. Yeung further discloses wherein the steering target is selected without user steering input (steering control system may enable automated control of the colonoscope [0107]).

Regarding claim 9, Yeung, modified by Vunjak-Novakovic, discloses the system of claim 1. Yeung further discloses wherein the endoscope controller identifies multiple anatomical features and selects an individual anatomical feature comprising a passage from the multiple anatomical features based on user steering input or motion of the endoscope towards the passage (control system can track objects of interest within a field of view of the first image sensor [0040]; objects of interest may be a polyp, lesion, adenoma or a cancerous growth [0047]; object of interest may be auto-tracked [0200-0201], or center of colon lumen may be selected as target [0037]; tracking of object of interest may be manually selected by surgeon [0087]).
Regarding claim 10, Yeung, modified by Vunjak-Novakovic, discloses the system of claim 1. Yeung further discloses wherein the endoscope controller: determines that the endoscope is inside a subject based on the image signal (analysis of data from the first and second input data streams determines whether the endoscope is inside a subject or not; the analysis comprises a determination of the location of the center of the lumen, location of a wall of the lumen or obstacle [0007]); and activates automatic steering to identify the anatomical feature based on the determination (steering control output signal is based on an analysis of data derived from the first and second data input streams, and the steering control output signal adapts to changes in the data derived from the first or at least second input data streams in real time [0007]).

Claims 21-23 and 25 are rejected under 35 U.S.C. 103 as being unpatentable over Yeung in view of Kotoda (JP2011244884).

Regarding claim 21, Yeung discloses an endoscope automatic steering system, comprising: an endoscope comprising a distal end with a camera producing an image signal (control system may comprise first and second image sensors [0006-0007]; plurality of sensors 209 located at or in close proximity to the distal end 207 [0110]…plurality of sensors may include optical image sensors [0112]); and an endoscope controller coupled to the endoscope (steering control system 210), wherein the endoscope controller: receives the image signal from the endoscope (image sensors configured to acquire image and video data [0013]; steering control system may utilize image processing algorithms to process image data input and provide a predicted steering direction and/or steering control signal as output for guiding the movement of the robotic endoscope [0084]); identifies, via a feature identification model, an anatomical feature in the image signal (performed automated feature extraction on the series of two or more images to determine a center position of the colon lumen [0037]); selects a steering target based on the identified anatomical feature (center of object of interest, which may be a polyp, lesion, adenoma or cancerous growth [0047]; steering control signal aligns the steering direction with the motion direction vector of the object of interest [0044]); and automatically steers the distal end of the endoscope towards the steering target during distal motion of the distal end of the endoscope (control system tracks an object of interest by providing an adaptive steering control output signal to the robotic endoscope [0040]; steering control signal aligns the steering direction with the motion direction vector of the object of interest [0044]; provide a predicted steering direction and/or steering control signal as output for guiding the movement of the robotic endoscope [0084]).

Yeung fails to disclose wherein the endoscope controller determines, based on the image signal, that the endoscope is inside a passage of a subject. In the same field of endeavor, Kotoda teaches a substantially similar endoscope system, further comprising wherein a body cavity color determination unit determines whether or not the endoscope is within the body cavity (if the image in the body cavity is dominated by red lineage, the body cavity color determination unit 702 sets the body cavity color determination signal to "inside the body cavity" when the red component is greater than or equal to a predetermined value, and sets the determination signal to "outside the body cavity" when the red component is less than the predetermined value [0052]).

In view of Kotoda, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have included wherein the endoscope controller checks to ensure the endoscope is within a body cavity before initiating automatic steering of the endoscope, as the process prevents unnecessary steering or image processing from occurring while the endoscope is outside the body.

Regarding claim 22, Yeung, modified by Kotoda, discloses the endoscope automatic steering system of claim 21. Kotoda further teaches wherein determining that the endoscope is inside the passage comprises determining that at least one of: the image signal is above a threshold of red percentage (if the image in the body cavity is dominated by red lineage, the body cavity color determination unit 702 sets the body cavity color determination signal to "inside the body cavity" when the red component is greater than or equal to a predetermined value, and sets the determination signal to "outside the body cavity" when the red component is less than the predetermined value [0052]).

Regarding claim 23, Yeung, modified by Kotoda, discloses the endoscope automatic steering system of claim 21. Yeung further discloses wherein an indicator representing the steering target is overlaid on a displayed image based on the image signal (augmented information superimposed on the image frame may comprise a plurality of graphical elements to indicate the location of a detected lumen center or a suggested steering direction [0166-0167]).

Regarding claim 25, Yeung, modified by Kotoda, discloses the endoscope automatic steering system of claim 21. Yeung further discloses wherein the endoscope is a video laryngoscope (robotic endoscope, although mentioned as possibly a colonoscope, may also be a laryngoscope if inserted through the mouth to view the larynx).

Claim 24 is rejected under 35 U.S.C. 103 as being unpatentable over Yeung in view of Kotoda and Motoki (US2017/0374292).

Regarding claim 24, Yeung, modified by Kotoda, discloses the endoscope automatic steering system of claim 21. While Yeung teaches wherein sensor data or augmented data may be displayed on the display screen, Yeung fails to explicitly disclose that an icon, indicative of the mode of steering, particularly the automatic steering mode, is displayed on a display screen. In the same field of endeavor, Motoki teaches a substantially similar endoscope system, comprising an endoscope and an endoscope controller (endoscope device 1 and CPU 47 [0023, 0034]), the controller configured to automatically perform bending control of a bendable endoscope shaft (front end portion 11a of insertion portion 11 [0024]; CPU 47 automatically performs bending control [0058]), further disclosing wherein the endoscope controller causes a display of an automatic steering icon on a display screen of the endoscope controller (Fig. 7: the automatic adjustment mode is a mode where bending control is automatically performed, and when executed, an icon 80 on the CPU display will indicate the automatic adjustment mode is executed [0075-0076]).

In view of Motoki, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have included wherein the display screen displays an icon indicating the steering mode of the endoscope system of Yeung, as it is well-known in the endoscope art that the icon will notify the operator that the endoscope is in automatic mode to prevent the operator from entering manual inputs that would interfere with the steering/bending control [0075].

Response to Arguments

Applicant's arguments with respect to claim 1 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Instead, a new ground of rejection based on Vunjak-Novakovic is presented above to evidence the Office's position.

Applicant has argued that Yeung fails to teach at least "an endoscope controller that: …determines, based on the orientation signal or the image signal or both, that the distal advancement has stopped; and based on the distal advancement stopping, pauses automatic steering the distal end while the distal advancement remains stopped." The examiner respectfully disagrees. The route searching discussed in Yeung is a feature that coexists with the automatic steering control, as automated route searching is conducted to generate a steering control signal. Further, the examiner has cited paragraph [0084], which states that the steering control output signal steers the robotic endoscope in real time, and that in the fully-automated mode the steering control system may be configured to guide the advancing endoscope with little to no input from the operator. This strongly suggests that if steering control is paused, advancement and steering would both pause, and similarly, if steering control is resumed, advancement and steering would both resume. For the reasons above, the applicant's arguments were not persuasive.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to LI-TING SONG whose telephone number is (571) 272-5771. The examiner can normally be reached 8-5. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Anhtuan Nguyen, can be reached at 571-272-4963. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/LI-TING SONG/
Examiner, Art Unit 3795

/ANH TUAN T NGUYEN/
Supervisory Patent Examiner, Art Unit 3795

03/09/26
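The §103 combinations above turn on two small image-processing steps: taking the center of a detected object's bounding box as the steering target (Vunjak-Novakovic) and gating automatic steering on a red-component check that the scope is inside a body cavity (Kotoda). A minimal sketch of both follows; all function names and the 0.5 threshold are hypothetical illustrations, not either reference's actual implementation:

```python
# Hypothetical sketch of the two steps discussed in the rejections:
# bounding-box-center target selection and a red-ratio in-cavity gate.
# Names and the 0.5 threshold are illustrative, not from the references.

def bbox_center(x: float, y: float, w: float, h: float) -> tuple[float, float]:
    """Center of an axis-aligned bounding box (x, y = top-left corner)."""
    return (x + w / 2, y + h / 2)

def inside_body_cavity(mean_red: float, mean_total: float,
                       threshold: float = 0.5) -> bool:
    """Kotoda-style gate: 'inside the body cavity' when red dominates."""
    return mean_total > 0 and (mean_red / mean_total) >= threshold

# Steer toward the detected airway/lumen only when the gate passes.
target = bbox_center(10, 20, 100, 50)
print(target)                        # (60.0, 45.0)
print(inside_body_cavity(180, 300))  # True: red ratio 0.6 >= 0.5
```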

Prosecution Timeline

Oct 26, 2022
Application Filed
Aug 23, 2025
Non-Final Rejection — §102, §103, §112
Dec 01, 2025
Response Filed
Mar 06, 2026
Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12588924 — MINIMALLY INVASIVE DISSECTOR FOR INTER-LAYER PROCEDURES
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12575721 — DEVICES, SYSTEMS, AND METHODS FOR TREATING KIDNEY STONES
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12575714 — A TIP PART FOR FORMING A TIP OF A DISPOSABLE INSERTION ENDOSCOPE
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12575904 — ENDOSCOPE CONTROL METHOD AND SURGICAL ROBOT SYSTEM
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12544517 — CANNULA LOCATOR DEVICE
Granted Feb 10, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 66%
With Interview: 99% (+35.1%)
Median Time to Grant: 3y 2m
PTA Risk: Moderate

Based on 79 resolved cases by this examiner. Grant probability derived from career allow rate.
