Prosecution Insights
Last updated: April 19, 2026
Application No. 17/890,635

SURGERY ASSISTANCE SYSTEM, OPERATING METHOD FOR SURGERY ASSISTANCE SYSTEM, AND CONTROL DEVICE OF SURGERY ASSISTANCE SYSTEM

Final Rejection §103
Filed: Aug 18, 2022
Examiner: MONAHAN, MEGAN ELIZABETH
Art Unit: 3795
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Olympus Corporation
OA Round: 2 (Final)
Grant Probability: 58% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 3y 11m
With Interview: 80%

Examiner Intelligence

Career Allow Rate: 58% (grants 58% of resolved cases; 62 granted / 106 resolved; -11.5% vs TC avg)
Interview Lift: +21.7% (strong lift in resolved cases with interview)
Avg Prosecution: 3y 11m (typical timeline; 43 currently pending)
Total Applications: 149 across all art units (career history)

Statute-Specific Performance

§101: 0.7% (-39.3% vs TC avg)
§103: 41.7% (+1.7% vs TC avg)
§102: 29.5% (-10.5% vs TC avg)
§112: 26.3% (-13.7% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 106 resolved cases
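The statute-specific panel pairs each per-statute figure with a delta against the Tech Center average, so the implied baseline can be recovered by simple subtraction. A minimal sketch in Python, assuming (this is an inference from the panel labels, not documented semantics) that the displayed delta is the examiner's rate minus the Tech Center average:

```python
# Recover the implied Tech Center baseline from the panel above.
# Assumption: delta = examiner rate - TC average, so tc_avg = rate - delta.
# Values are the percentages shown in the panel.
rates = {
    "101": (0.7, -39.3),
    "103": (41.7, 1.7),
    "102": (29.5, -10.5),
    "112": (26.3, -13.7),
}
for statute, (rate, delta) in rates.items():
    tc_avg = round(rate - delta, 1)  # implied Tech Center average
    print(f"§{statute}: examiner {rate}% vs TC avg {tc_avg}%")
```

Under that assumption, all four deltas imply the same 40.0% baseline, which is consistent with a single "black line" Tech Center average estimate rather than a per-statute one.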

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Election/Restrictions

Applicant’s election without traverse of Invention I, which is drawn to a surgery assistance system and directed to claims 1-6, in the reply filed 05/12/2025 is acknowledged. Claims 7-16 are withdrawn because they are directed to a non-elected invention.

Response to Amendment

The amendment filed 09/04/2025 has been entered. In the present application, claims 1-16 are currently pending, with claims 7-16 withdrawn because they are directed to a non-elected invention. Claims 1, 6, 7, and 12 are currently amended. Claims 1-6 are examined below.

Response to Arguments

Applicant’s arguments filed 09/04/2025, with respect to the pending claims, have been fully considered. Applicant has amended independent claim 1 to add the limitation stating, “wherein the processor is configured to associate a model coordinate system of a model of a target organ with a display coordinate system of a display space in which the image is displayed based on a plurality of feature points and a plurality of corresponding points….” The newly added limitation changes the scope of the claims and renders the previous rejections moot. Therefore, the previous rejections identified in the non-final office action dated 06/05/2025 have been withdrawn. However, upon further consideration, a new ground of rejection is made below. Please see the rejection under 35 U.S.C. §103 below for further explanation.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA ) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claims 1-4 are rejected under 35 U.S.C. 103 as being unpatentable over Larkin et al. (US20080004603) hereinafter Larkin in view of DiMaio et al (US2018/0042680) hereinafter DiMaio in view of Assaf Govari (US2019/0059833) hereinafter Govari. Regarding Claim 1, Larkin discloses a surgery assistance system (Fig. 1), comprising: an endoscope (Fig. 1 endoscope 140); a display (Fig. 1 3-D monitor 104) configured to display an image ([0028] “The Console includes a 3-D monitor 104 for displaying an image of a surgical site to the Surgeon”) from the endoscope (Fig. 1 endoscope 140); a treatment tool (Figs. 1-7 surgical instruments 138, 139) that includes an end effector (Figs. 1-7 end effector of surgical instruments 138, 139) at a distal end (Figs. 1-7 near distal end of surgical instruments 138, 139); an input device (Fig. 
1 control devices 108, 109) configured to input an instruction ([0029] “The Surgeon performs a minimally invasive surgical procedure by manipulating the control devices 108 and 109 so that the processor 102 causes their respectively associated slave manipulators 128 and 129 (also referred to herein as “robotic arms” and “patient-side manipulators”) to manipulate their respective removably coupled surgical instruments 138 and 139 (also referred to herein as “tools”) accordingly, while the Surgeon views the surgical site in 3-D, as it is captured by a stereoscopic endoscope 140 (having left and right cameras for capturing left and right stereo views) and displayed on the Console 3-D monitor 104.”) to the end effector (Figs. 1-7 end effector of surgical instruments 138, 139); and a processor (Fig. 1 processor 102) connected to the endoscope (Fig. 1 endoscope 140), the display (Fig. 1 3-D monitor 104), the treatment tool (Figs. 1-7 surgical instruments 138, 139), and the input device (Fig. 1 control devices 108, 109), wherein the processor (Fig. 1 processor 102) is configured to detect a distal end position of the end effector (Figs. 1-7 end effector of surgical instruments 138, 139) based on the instruction ([0032-0033, 0036]), and estimate a first treatment surface (Fig. 7 surface of object 700 illustrated as ghost image 711). Larkin discloses capturing an image of the position of the distal end of the end effector but fails to explicitly disclose record the distal end position that has been detected, and estimate a first treatment surface from a plurality of recorded distal end positions, and wherein the processor is also configured to associate a model coordinate system of a model of a target organ with a display coordinate system of a display space in which the image is displayed based on a plurality of feature points and a plurality of corresponding points.

However, DiMaio, in the same field of endeavor, teaches record the detected distal end position (DiMaio – Fig.
4 step 403 and [0077-0078] and Fig. 5 step 504 [0083]), and estimate a first treatment surface from a plurality of recorded distal end positions (DiMaio – Fig. 6 [0084] “FIG. 6 illustrates, as an example, a flow diagram of a method for automatically moving the LUS probe 150 to a position and orientation associated with a clickable thumbnail upon command to do so by a surgeon while performing a minimally invasive surgical procedure using tool 139. In process 601, the clicking of a thumbnail generated by the method described in reference to FIG. 5 is detected by, for example, a conventional interrupt handling process. Upon such detection, in process 602, the auxiliary controller 242 is instructed by, for example, stored instructions corresponding to the interrupt handling process, to retrieve the position and orientation stored in memory 240 which is associated with the thumbnail. The auxiliary controller 242 then causes the LUS probe 150 to move to that position and orientation by appropriately controlling slave arm 124 in process 603. Thus, the surgeon is able to move the LUS probe 150 to a desired position without having to change modes of the control switch mechanism 231 and halt operation of the tool 139 until the LUS probe 150 is moved.”).

It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify Larkin to record the detected distal end position as taught by DiMaio for the benefit of retracing the positions and orientations later on (DiMaio – [0077-0078]). It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify Larkin to include an estimate of a first treatment surface from a plurality of recorded distal end positions as taught by DiMaio for the benefit of “…the surgeon is able to move the …probe … to a desired position” (DiMaio – [0084]).
Larkin in view of DiMaio alone or in combination fails to explicitly teach wherein the processor is further configured to associate a model coordinate system of a model of a target organ with a display coordinate system of a display space in which the image is displayed based on a plurality of feature points and a plurality of corresponding points. However, Govari, in the same field of endeavor, teaches wherein the processor (Govari – [0009, 0027-0044] Fig. 1 processor 34) is further configured to associate a model coordinate system of a model of a target organ with a display coordinate system of a display space in which the image is displayed based on a plurality of feature points and a plurality of corresponding points (Govari – Fig. 2 and Fig. 3 steps 100-112, [0003]). It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify the teachings of Larkin in view of DiMaio with the teachings of Govari to include the processor configured to associate a model coordinate system of a model of a target organ with a display coordinate system of a display space in which the image is displayed based on a plurality of feature points and a plurality of corresponding points for the benefit of “…navigating a medical device, such as an endoscope, to a patient head, without inserting the endoscope to restricted sections of the head, typically for patient safety reasons” (Govari – [0014]).

Regarding Claim 2, Larkin in view of DiMaio in view of Govari teach the surgery assistance system according to claim 1, wherein the processor is configured to make a correspondence between anatomical information (Larkin – Fig. 7 ghost 711) of a target organ (Larkin – Fig. 7 object 700) and the image (Larkin – [0028] “The Console includes a 3-D monitor 104 for displaying an image of a surgical site to the Surgeon”), and display on the display (Larkin – Fig. 1 3-D monitor 104) the anatomical information (Larkin – Fig.
7 ghost 711) related to the first treatment surface (Larkin – Fig. 7 surface of object 700 illustrated as ghost image 711) that has been estimated (Larkin – [0048] “a GUI generated screen that is displayed on the monitor 104, wherein both tools 138 and 139 are positioned so as to be within the viewing area 300, but the end effector of the tool 138 is occluded by an object 700. In this case, since each of the tools is in the viewing area 300, their respective symbols 710 and 420 are at maximum size. Although the end effector of the tool 138 is occluded by the object 700, a ghost image 711 (e.g., a computer model) of the end effector is shown at the proper position and orientation over the object 700. If the ghost image 711 is too distracting, then an outline of the end effector may be used instead, as either a programmed or surgeon selected option.”).

Regarding Claim 3, Larkin in view of DiMaio in view of Govari teach the surgery assistance system according to claim 2, wherein the correspondence is a correspondence between a first coordinate system (Larkin – Fig. 9 P1) defined by the anatomical information (Larkin – Fig. 7 ghost 711) and a second coordinate system (Larkin – Fig. 9 P2) defined by the image.

Regarding Claim 4, Larkin in view of DiMaio in view of Govari teach the surgery assistance system according to Claim 2, wherein the anatomical information (Larkin – Fig. 7 ghost 711) is vessel information near the first treatment surface (Larkin – Fig. 7 surface of object 700 illustrated as ghost image 711).

Claims 1, 2, 5, and 6 are rejected under 35 U.S.C. 103 as being unpatentable over Satoshi Tanaka (US2016/0038004), hereinafter Tanaka, in view of DiMaio in view of Govari.

Regarding Claim 1, Tanaka discloses a surgery assistance system (abstract), comprising: an endoscope (imaging/scope section 200, [0046] “The endoscope apparatus includes an imaging section 200 (scope section)….”); a display (Figs.
2,10 display section 400) configured to display an image from the endoscope ([0053] “An image displayed on the display section 400 includes an image display area 10 in which the captured image that includes a blood vessel 11, a treatment tool 12, and the like is displayed, and a degree-of-relation display area 20 in which the degree of relation is displayed.”); a treatment tool (Figs. 2, 10 treatment tool 210) that includes an end effector ([0047] “a treatment tool 210 (e.g., forceps or knife)”) at a distal end (see Figs. 4, 7, 11); an input device ([0047] “…an operation section (e.g., an operation system for operating the end of the scope, an operation system for operating the treatment tool, and a mode setting button) (not illustrated in FIG. 2).”) configured to input an instruction to the end effector ([0047] “a treatment tool 210 (e.g., forceps or knife)”); and a processor (Figs. 2,10 processor 300) connected to the endoscope (imaging/scope section 200, [0046]), the display (Figs. 2,10 display section 400), the treatment tool (Fig. 2,10 treatment tool 210), and the input device ([0047]), wherein the processor (Figs. 2,10 processor 300) is configured to detect a distal end position of the end effector (Figs. 2,10 distance information acquisition section 120, [0051] “The distance information acquisition section 120 detects a treatment tool and a blood vessel from the captured image, and calculates the two-dimensional distance between the detected treatment tool and the detected blood vessel.” [0083-0085] “Note that the position of the tip of the knife may be estimated from the treatment tool information. In this case, the distance information acquisition section 120 acquires physical information (e.g., length and width) about the end (tip) of the treatment tool from the treatment tool information acquired by the treatment tool information acquisition section 170.”) based on the instruction, and estimate a first treatment surface (Fig. 
7 surface/blood vessel 11) from a plurality of recorded distal end positions (Fig. 10 memory 230, [0080] “…the treatment tool information acquisition section 170 acquires the characteristic information about the knife (treatment tool 210) as the treatment tool information. The characteristic information about the knife is stored in the memory 230 included in the processor section 300, for example. The processor section 300 determines the treatment tool, and acquires the treatment tool information from the memory 230. Information about the application, the target part, the size, the shape, and the like of the knife is stored in the memory 230 as the characteristic information. The characteristic information about the knife may be input as the operation information about the user. For example, when the knife is an electrosurgical knife, the user sets the current that is caused to flow through the knife, the voltage that is applied to the knife, and the like. Alternatively, the user sets a knife procedure mode (e.g., an incision mode in which only an incision operation is performed, or a bleeding arrest mode in which a bleeding arrest operation is performed at the same time as an incision operation).”). Tanaka discloses capturing an image of the position of the distal end of the end effector but fails to explicitly disclose record the distal end position that has been detected, estimate a first treatment surface from a plurality of recorded distal end positions, and wherein the processor is further configured to associate a model coordinate system of a model of a target organ with a display coordinate system of a display space in which the image is displayed based on a plurality of feature points and a plurality of corresponding points.

However, DiMaio, in the same field of endeavor, teaches record the detected distal end position (DiMaio – Fig. 4 step 403 and Fig.
5 step 504 [0083], [0084], and [0077-0078] “After the start of training indication is detected, in process 403, the training module records or stores the current LUS probe 150 position and orientation, and periodically (or upon surgeon command) continues to do so by looping around processes 403 and 404 until a stop training indication is detected or received. The stop training indication in this case may also be initiated by the surgeon in the same manner as the start of training indication, or it may be initiated in a different, but other conventional manner. After the stop training indication is detected or received, a last position and orientation of the LUS probe 150 is recorded or stored. Between the start and stop of training, the surgeon moves the LUS probe 150 and the processor 102 stores its trajectory of points and orientations so that they may be retraced later upon command.”). It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify Tanaka to record the detected distal end position as taught by DiMaio for the benefit of retracing the positions and orientations later upon command (DiMaio – [0077-0078]). DiMaio also teaches estimate a first treatment surface from a plurality of recorded distal end positions (DiMaio – Fig. 6 [0084] “FIG. 6 illustrates, as an example, a flow diagram of a method for automatically moving the LUS probe 150 to a position and orientation associated with a clickable thumbnail upon command to do so by a surgeon while performing a minimally invasive surgical procedure using tool 139. In process 601, the clicking of a thumbnail generated by the method described in reference to FIG. 5 is detected by, for example, a conventional interrupt handling process.
Upon such detection, in process 602, the auxiliary controller 242 is instructed by, for example, stored instructions corresponding to the interrupt handling process, to retrieve the position and orientation stored in memory 240 which is associated with the thumbnail. The auxiliary controller 242 then causes the LUS probe 150 to move to that position and orientation by appropriately controlling slave arm 124 in process 603. Thus, the surgeon is able to move the LUS probe 150 to a desired position without having to change modes of the control switch mechanism 231 and halt operation of the tool 139 until the LUS probe 150 is moved.”). It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify Tanaka to include an estimate of a first treatment surface from a plurality of recorded distal end positions as taught by DiMaio for the benefit of “…the surgeon is able to move the …probe … to a desired position” (DiMaio – [0084]).

Tanaka in view of DiMaio alone or in combination fails to explicitly teach wherein the processor is further configured to associate a model coordinate system of a model of a target organ with a display coordinate system of a display space in which the image is displayed based on a plurality of feature points and a plurality of corresponding points. However, Govari, in the same field of endeavor, teaches wherein the processor (Govari – [0009, 0027-0044] Fig. 1 processor 34) is further configured to associate a model coordinate system of a model of a target organ with a display coordinate system of a display space in which the image is displayed based on a plurality of feature points and a plurality of corresponding points (Govari – Fig. 2 and Fig. 3 steps 100-112, [0003]).
It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify the teachings of Tanaka in view of DiMaio with the teachings of Govari to include the processor configured to associate a model coordinate system of a model of a target organ with a display coordinate system of a display space in which the image is displayed based on a plurality of feature points and a plurality of corresponding points for the benefit of “…navigating a medical device, such as an endoscope, in a patient head, without inserting the endoscope to restricted sections of the head, typically for patient safety reasons” (Govari – [0014]).

Regarding Claim 2, Tanaka in view of DiMaio in view of Govari teach the surgery assistance system according to Claim 1, wherein the processor (Tanaka – Fig. 2 processor 300) is configured to make a correspondence (Fig. 2 notification processing section 140) between anatomical information (Tanaka – degree of relation [0053]) of a target organ (Tanaka – Fig. 11 blood vessel 11) and the image (Tanaka – image on display area 10), and display on the display (Tanaka – display area 10) the anatomical information related to the first treatment surface (Tanaka – blood vessel 11) that has been estimated (Tanaka – [0043] “An image displayed on the display section 400 includes an image display area 10 in which the captured image that includes a blood vessel 11, a treatment tool 12, and the like is displayed, and a degree-of-relation display area 20 in which the degree of relation is displayed.
For example, the notification processing section 140 displays a red image (that represents that the degree of relation is high) in the degree-of-relation display area 20 when the degree of relation is equal to or higher than a first threshold value, displays a yellow image (that represents that the degree of relation is medium) in the degree-of-relation display area 20 when the degree of relation is lower than the first threshold value and equal to or higher than a second threshold value, and displays a black image (that represents that the degree of relation is low) in the degree-of-relation display area 20 when the degree of relation is lower than the second threshold value.”).

Regarding Claim 5, Tanaka in view of DiMaio in view of Govari teach the surgery assistance system according to Claim 1, wherein the treatment tool is an energy device, and the processor is configured to detect the distal end position at a time of the instruction to apply energy to the end effector (Tanaka – [0080] “…the treatment tool information acquisition section 170 acquires the characteristic information about the knife (treatment tool 210) as the treatment tool information. The characteristic information about the knife is stored in the memory 230 included in the processor section 300, for example. The processor section 300 determines the treatment tool, and acquires the treatment tool information from the memory 230. Information about the application, the target part, the size, the shape, and the like of the knife is stored in the memory 230 as the characteristic information. The characteristic information about the knife may be input as the operation information about the user. For example, when the knife is an electrosurgical knife, the user sets the current that is caused to flow through the knife, the voltage that is applied to the knife, and the like.
Alternatively, the user sets a knife procedure mode (e.g., an incision mode in which only an incision operation is performed, or a bleeding arrest mode in which a bleeding arrest operation is performed at the same time as an incision operation).”).

Regarding Claim 6, Tanaka in view of DiMaio in view of Govari teach the surgery assistance system according to Claim 2, wherein the anatomical information (Tanaka – degree of relation [0053]) includes a second treatment surface (Tanaka – Fig. 7 deep area 12, 14) that has been preoperatively planned, and when a distance between the first treatment surface (Tanaka – Fig. 7 blood vessel 11) and the second treatment surface (Tanaka – Fig. 7 deep area 12, 14) exceeds a predetermined threshold value (Tanaka – [0075-0076]), the processor modifies (Tanaka – the processor changing the image color based on the distance, [0053]) the correspondence (Fig. 2 notification processing section 140) between the anatomical information (Tanaka – degree of relation [0053]) and the image (Tanaka – image on display area 10).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action.
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MEGAN E MONAHAN whose telephone number is (571) 272-7330. The examiner can normally be reached Monday - Friday, 8am - 5pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Michael Carey, can be reached at (571) 270-7235. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MEGAN ELIZABETH MONAHAN/
Examiner, Art Unit 3795

/MICHAEL J CAREY/
Supervisory Patent Examiner, Art Unit 3795
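The pivotal limitation in this round (associating a model coordinate system of a target organ with a display coordinate system based on feature points and corresponding points) is, in generic terms, a paired-point registration problem. The Python sketch below shows a standard Kabsch-style rigid alignment, offered only to illustrate the class of technique the claim language describes; it is not the applicant's or any cited reference's actual algorithm, and the function name and frame conventions are hypothetical.

```python
# Generic paired-point rigid registration (Kabsch), illustrating how a
# model coordinate system can be associated with a display coordinate
# system from matched feature points / corresponding points.
# NOT the claimed method; a textbook sketch under assumed conventions.
import numpy as np

def associate_coordinate_systems(model_pts, display_pts):
    """Estimate rotation R and translation t mapping model -> display.

    model_pts, display_pts: (N, 3) arrays of paired points (feature
    points in the model frame, corresponding points in the display
    frame). Returns (R, t) such that display ≈ model @ R.T + t.
    """
    mu_m = model_pts.mean(axis=0)
    mu_d = display_pts.mean(axis=0)
    # Cross-covariance of the centered point sets (3x3)
    H = (model_pts - mu_m).T @ (display_pts - mu_d)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_m
    return R, t
```

With three or more non-collinear pairs and noiseless data this recovers the transform exactly; with noisy intraoperative points it gives the least-squares rigid fit, which is why registration schemes of this family typically collect more pairs than the minimum.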

Prosecution Timeline

Aug 18, 2022
Application Filed
May 31, 2025
Non-Final Rejection — §103
Sep 04, 2025
Response Filed
Dec 11, 2025
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593960
Endoscope provided with a device for closing a fluid flow circuit, for improved sterilisation
2y 5m to grant • Granted Apr 07, 2026
Patent 12588799
IMAGE DIAGNOSIS ASSISTANCE APPARATUS, ENDOSCOPE SYSTEM, IMAGE DIAGNOSIS ASSISTANCE METHOD, AND IMAGE DIAGNOSIS ASSISTANCE PROGRAM
2y 5m to grant • Granted Mar 31, 2026
Patent 12582291
ENDOSCOPE AND ENDOSCOPE ILLUMINATION SUBSTRATE
2y 5m to grant • Granted Mar 24, 2026
Patent 12582299
ENDOSCOPE COMPRISING A BENDING SECTION HAVING INDIVIDUAL SEGMENTS
2y 5m to grant • Granted Mar 24, 2026
Patent 12543932
WIRE-DRIVEN MANIPULATOR
2y 5m to grant • Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 58%
With Interview: 80% (+21.7%)
Median Time to Grant: 3y 11m
PTA Risk: Moderate
Based on 106 resolved cases by this examiner. Grant probability derived from career allow rate.
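Per the footnote, the headline projections are derived rather than independently measured: the grant probability is the examiner's career allow rate, and the with-interview figure adds the displayed lift. A quick Python reproduction of that arithmetic; the additive-lift model and simple rounding are inferences from the panel, not documented methodology.

```python
# Reproduce the projection arithmetic from the examiner's career data.
# Assumptions (inferred, not documented): grant probability is the career
# allow rate (62 granted of 106 resolved), and the with-interview figure
# adds the displayed +21.7-point lift before rounding.
granted, resolved = 62, 106
grant_probability = round(100 * granted / resolved)   # percent
interview_lift = 21.7                                  # percentage points
with_interview = round(grant_probability + interview_lift)
print(f"{grant_probability}% base, {with_interview}% with interview")
```

This reproduces the displayed 58% and 80% figures (62/106 ≈ 58.5% rounds to 58; 58 + 21.7 = 79.7 rounds to 80), so the two headline numbers carry no information beyond the allow rate and the lift.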
