Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
This Office action is in response to the amendments filed December 9, 2025. Claims 1-14 are amended. Claims 1-14 are pending and addressed below.
Response to Arguments
Applicant’s amendments to claim 8 have overcome the claim objection. The claim objection is withdrawn.
Applicant’s amendments to claim 7 have overcome the rejection under 35 USC 112(b). The rejection under 35 USC 112(b) is withdrawn.
Applicant’s arguments regarding the rejection of claim 1 under 35 USC 103 have been fully considered but are not persuasive.
First, applicant argues that Beelen does not disclose “a processor subsystem configured to: determine a displacement distance for the surgical instrument to advance towards the surgical target based on the sensor data and the target position; output, via the user interface, a sensory-perceptible representation of the determined displacement distance.” However, examiner notes that Beelen specifically indicates displacement of the surgical instrument to be in a direction to approach the surgical target (see Abstract and [0145]). Furthermore, the acquired sensor data does indicate a distance from the surgical instrument to the surgical target ([0164], “based on sensor data indicating a distance to the surgical target”) and is further presented as a sensory-perceptible representation via the user interface ([0147]).
Second, applicant argues that Beelen does not disclose “a processor subsystem configured to: … output, via the user interface, a sensory-perceptible representation of the determined displacement distance; require, via the user interface, the user to confirm the determined displacement distance.” However, examiner notes that Beelen discloses that the surgical instrument may be moved on responsibility of the human operator ([0147]). This is also further shown in Fig. 1 of Beelen, where the figure contains a surgical robotic system 100 controlled by the flowchart starting from a human machine interface 020 receiving positioning commands 022 from a human operator to be processed ([0135]). The user confirms the determined displacement distance through the act of inputting commands 022.
Third, applicant argues that Beelen does not disclose “a processor subsystem configured to: … upon receiving the confirmation signal, control the actuator to actuate the movable arm part to effect a single movement of the surgical instrument along the longitudinal axis over the determined displacement distance;”. However, examiner notes that, in light of the confirmation provided from the human operator, the flowchart process proceeds to actuator 060 to actuate the movable arm.
For at least these reasons, examiner maintains the rejection of claim 1 under 35 USC 103. By extension, examiner maintains the rejections of dependent claims 2-12 and independent claims 13 and 14 under 35 USC 103 as well.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-4 and 10-14 are rejected under 35 U.S.C. 103 as being unpatentable over US20170252113A1 (Beelen et al.).
Regarding claims 1 and 13, Beelen et al. disclose a surgical robot system for use in an ocular surgical procedure and a computer-implemented method for controlling a surgical robotic system during use in an ocular surgical procedure, comprising:
a surgical arm comprising a movable arm part,
[0010] of Beelen et al., “A first aspect of the invention provides a surgical robotic system for use in a surgical procedure, comprising: a surgical arm comprising a movable arm part,”
the movable arm part comprising an instrument connector for mounting of a surgical instrument having a longitudinal axis,
[0010] of Beelen et al., “the movable arm part comprising an instrument connector for mounting of a surgical instrument, the surgical instrument having a longitudinal axis,”
the movable arm part having at least one degree of freedom to enable longitudinal movement of the surgical instrument along the longitudinal axis of the surgical instrument towards or away from an ocular surgical target;
[0010] of Beelen et al., “the movable arm part having at least one degree-of-freedom to enable longitudinal movement of the surgical instrument along the longitudinal axis of the surgical instrument towards a surgical target;”
a sensor configured to obtain sensor data indicating a distance between the surgical instrument and the surgical target;
[0164] of Beelen et al., “However, the surgical robotic system may comprise a sensor inside or outside the eye which measures the longitudinal distance 171 between the instrument tip 137 and the surgical target.”
a user interface configured to receive user input from a user and to output sensory perceptible output to the user;
Beelen et al. disclose a user interface configured to receive user input from a user ([0004], “A human machine interface may be provided for receiving positioning commands from a human operator for controlling the movement of the surgical instrument.”) and to send output to the user ([0147], “On responsibility of the human operator, the surgical instrument may be moved, namely by providing suitable positioning commands to the human machine interface.”).
One of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, for the output to be sensory perceptible to a user, since acting in response to the output is on the responsibility of the human operator.
an actuator configured and arranged for actuating the movable arm part to effect the longitudinal movement of the surgical instrument;
[0010] of Beelen et al., “an actuator configured and arranged for actuating the movable arm part to effect the longitudinal movement of the surgical instrument;”
a processor subsystem configured to:
at an initial position of the surgical instrument, obtain sensor data from the sensor and determine an initial distance between the surgical instrument and the surgical target based on the sensor data;
[0164] of Beelen et al., “FIG. 21 shows the virtual bound being repositioned, expanded or deformed, based on sensor data indicating a distance to the surgical target. Here, the virtual bound 133 may be established by a previous furthest positioning of the instrument tip at time t1 136.” Time t1 being the time where an instrument tip is positioned at an initial position.
obtain data indicative of a target position of the surgical instrument relative to the surgical target;
[0164] of Beelen et al., “However, the surgical robotic system may comprise a sensor inside or outside the eye which measures the longitudinal distance 171 between the instrument tip 137 and the surgical target. The sensor may obtain measurement data points, similar to those described with reference to FIGS. 11 and 12.”
determine a displacement distance to advance towards the surgical target based on the sensor data and the target position;
[0147] of Beelen et al., “On responsibility of the human operator, the surgical instrument may be moved, namely by providing suitable positioning commands to the human machine interface. The position of the instrument tip at time t 1 136 is at the virtual bound 133, which has radius R1 at t1 162, and is moved in positive longitudinal direction z 109. By providing positioning commands of a selected type, the spherical virtual bound 133 may be enlarged 127 such that the instrument tip remains inside the spherical virtual bound.”
output, via the user interface, a sensory-perceptible representation of the determined displacement distance;
See the rationale of the previous limitations, where “the surgical instrument may be moved, namely by providing suitable positioning commands to the human machine interface.”
require, via the user interface, the user to confirm the determined displacement distance;
See Fig. 1 of Beelen. The flowchart of the figure starting from a human machine interface 020 requires positioning commands 022 from a human operator controlling the longitudinal movement of the surgical instrument ([0135]).
receive, via the user interface, a confirmation signal from the user indicating a confirmation of the determined displacement distance; and
See the rationale of the limitation regarding “a user interface configured to receive user input…”. Given that the human machine interface is capable of both receiving output from the surgical instrument and sending input from an operator, one of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, that the user interface is capable of sending a confirmation signal from the user to the processor of the surgical instrument. Because the surgical instrument may be moved “on responsibility of the human operator”, all motion must be confirmed by the operator, as undesired motion of the end-effector may result in harm to a patient.
upon receiving the confirmation signal, control the actuator to actuate the movable arm part to effect a single movement of the surgical instrument along the longitudinal axis over the determined displacement distance.
In light of the rationale regarding “receive, via the user interface, a confirmation signal…”, [0147] of Beelen et al. discloses, “The position of the instrument tip at time t 1 136 is at the virtual bound 133, which has radius R1 at t1 162, and is moved in positive longitudinal direction z 109. By providing positioning commands of a selected type, the spherical virtual bound 133 may be enlarged 127 such that the instrument tip remains inside the spherical virtual bound. At time t2, the human operator may visually confirm, e.g., using a microscope 131, that the instrument tip 137 is in (close) contact with the tissue on the inside of the eye.” The eye is referred to as surgical target 123 (see Figs. 2 and 10 for an exemplary geometric reference of the surgical target and virtual bounds).
Regarding claim 2, with all of the limitations of claim 1, the system further discloses:
wherein the processor subsystem is further configured to:
upon completing the single movement of the surgical instrument, obtain updated sensor data from the sensor and determine a remaining distance between the surgical instrument and the surgical target based on the updated sensor data;
[0166] of Beelen et al., “The coordinates of the instrument tip may now be a measure of the surgical target position. Having obtained these sensor data, the processor may update or replace the virtual bound 132.”
In light of the rationales of claim 1, one of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, that the system is capable of determining a remaining distance between the surgical instrument and the surgical target based on updated sensor data.
determine whether the target position is reached based on the determined remaining distance; and if not,
[0166] of Beelen et al., “Additionally or alternatively, a proximity switch sensor may be used. Here, the distance 171 is not measured over a range, but the presence of the surgical target within a position threshold is detected, e.g., at a distance of 0.1mm. If the threshold is 0 mm, the sensor acts as a contact/no-contact sensor.” Here, if the threshold is 0 mm or if the target is in contact, the sensor would act as a contact sensor.
control the actuator to correct a depth of the surgical instrument based on the determined remaining distance.
[0163] of Beelen et al., “Effectively, by suitably retracting the surgical instrument, the instrument tip position at time t 2 137 may be corrected in z direction over a distance 170, such that movement of the surgical instrument into zone B, towards, or further into the surgical target 123, is avoided.”
Regarding claim 3, with all of the limitations of claim 2, the system further discloses:
wherein the processor subsystem is configured to control the actuator in a displacement-based control mode to correct the depth of the surgical instrument by:
determining a correction displacement distance to reach the target position from the determined remaining distance to the surgical target;
[0165] of Beelen et al., “The longitudinal distance 171 may be measured indirectly, by a sensor outside the eye such as a camera or an optical coherence tomography device mounted on a microscope 131, with a view on the surgical target and the surgical instrument tip 137, through the pupil. Indirect measurement may also be done by a sensor in the eye, mounted on another instrument or positioned in the eye by other means. The longitudinal distance 171 may also be measured directly, by a sensor added to the surgical instrument and measuring along the z-direction 109. The sensor may be a non-contact distance sensor providing optical coherence tomography through an optical fiber, integrated in or attached to the surgical axis.”
outputting, via the user interface, a sensory-perceptible representation of the correction displacement distance;
See rationale of claim 1 regarding the limitation “output, via the user interface, a sensory-perceptible representation of the determined displacement distance”.
requiring, via the user interface, the user to confirm the correction displacement distance;
See rationale of claim 1 regarding the limitation “require, via the user interface, the user to confirm …”.
receiving, via the user interface, a further confirmation signal from the user; and upon receiving the further confirmation signal, controlling the actuator to actuate the movable arm part to effect a further single movement of the surgical instrument along the longitudinal axis over the correction displacement distance.
See rationales of claim 1 regarding the limitation “receive, via the user interface, a confirmation signal from the user” and “upon receiving the confirmation signal, control the actuator to actuate the movable arm part to effect a single movement of the surgical instrument along the longitudinal axis over the determined displacement distance.”
Regarding claim 4, with all of the limitations of claim 2, the system further discloses:
wherein the processor subsystem is configured to control the actuator in a limited-closed-loop control mode in which updated sensor data is used to correct the depth of the surgical instrument,
Beelen et al. disclose in Fig. 20 that the processor subsystem responds to an undesired situation by controlling the actuator to move the end-effector ([0162], “the surgical instrument”) to retract from the incorrect depth ([0162], “the processor may be configured such that zone B 140 is regarded as a blocked zone. As such, longitudinal instrument penetration 190 into zone B may be disallowed”). However, Beelen et al. do not explicitly disclose that updated sensor data was used to make this judgement.
As Beelen et al. disclose that “the surgical robotic system may comprise a sensor inside or outside the eye which measures the longitudinal distance 171 between the instrument tip 137 and the surgical target” ([0164]), one of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, that the disclosed sensor is capable of updating the sensor data to correct the depth of the surgical instrument.
wherein the limited-closed-loop control mode is limited by at least one of:
a predefined maximum correction displacement distance;
In light of the rationale regarding “wherein the processor subsystem…” above, one of ordinary skill in the art would find it obvious that the virtual bound 132 would serve as the predefined maximum correction displacement distance, as any further retraction from that point would be outside of zone B 140, where said zone B 140 is a blocked zone.
Regarding claim 10, with all of the limitations of claim 1, the system further discloses:
wherein the sensor comprises at least one of:
an optical coherence tomography sensor configured to optically couple to an optical fiber attached to or integrated in the surgical instrument;
[0165] of Beelen et al., “The longitudinal distance 171 may also be measured directly, by a sensor added to the surgical instrument and measuring along the z-direction 109. The sensor may be a non-contact distance sensor providing optical coherence tomography through an optical fiber, integrated in or attached to the surgical axis.”
Regarding claim 11, with all of the limitations of claim 1, the system further discloses:
wherein the processor subsystem is further configured to:
obtain sensor data indicative of a distance between a first layer of tissue of an eye and a second layer of tissue of the eye, wherein the surgical target is between the first layer and the second layer; and
See Fig. 10 of Beelen et al. Paragraph [0148] of Beelen et al. discloses, “In general, the virtual bound may be established under responsibility and visual observation of the human operator. The processor may be configured for processing the positioning commands based on (the distance to) this bound. For example, zone A may be treated as a safe region, or a high-performance region within the eye, whereas zone B may be treated as a low-speed, high-precision region near delicate tissue.” As shown in Fig. 10, the instrument tip 137 is between two layers of tissue: the layer penetrated by the instrument tip and the delicate tissue covered by zone B 143. Beelen et al. also further disclose sensors to measure distance indirectly or directly ([0165], “The longitudinal distance 171 may be measured indirectly, by a sensor outside the eye such as a camera or an optical coherence tomography device mounted on a microscope 131, with a view on the surgical target and the surgical instrument tip 137, through the pupil. Indirect measurement may also be done by a sensor in the eye, mounted on another instrument or positioned in the eye by other means. The longitudinal distance 171 may also be measured directly, by a sensor added to the surgical instrument and measuring along the z-direction 109. The sensor may be a non-contact distance sensor providing optical coherence tomography through an optical fiber, integrated in or attached to the surgical axis. However, this is not a limitation, in that also other non-contact distance sensors may be used, e.g., based on other optical principles, or sensors based on acoustical or electrical principles. In general, such types of distance sensor may provide an output, i.e., sensor data, proportional to the distance over a certain range of positions.”).
correct the position of the surgical instrument such that the surgical instrument is between the first layer and the second layer.
[0147] of Beelen et al., “The human operator may not want to damage this tissue 123, and therefore may not advance or penetrate any further.” See Fig. 10. One of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, to correct the position of the surgical instrument such that the penetration of the surgical instrument does not go past the first layer penetrated as the second layer may be sensitive tissue (as described above).
Regarding claim 12, with all of the limitations of claim 1, the system further discloses:
wherein the processor subsystem is further configured to
obtain sensor data indicative of a distance between a first layer of tissue of an eye and a second layer of tissue of the eye, wherein the surgical target is between the first layer and the second layer; and
See the rationale of claim 11 regarding “obtain sensor data indicative of a distance between a first layer of tissue of an eye…”.
wherein the data indicative of a target position of the surgical instrument relative to the surgical target comprises data indicative of the distance between the first layer and the second layer.
[0149] of Beelen et al., “The surgical instrument may enter the eye at the RCM 125, therefore the coordinates (0,0,0) may be available as a data point. Using these data points, a virtual bound may be constructed, e.g., based on algebraic geometry or a numerical model. In the former case, the data points 120 may be used to fit a higher order algebraic geometry for the virtual bound 121, in 3D space 101-103. The algorithm used for fitting may minimize the volume of the geometry while enclosing all data points. As such, in the example of FIG. 11, an ellipsoid geometry may be chosen, since it may describe the eye's inner surface better than a sphere.” See Fig. 11. From the figure, an example of the geometry applied is shown, where the geometry is composed of data points 120.
One of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, that the data points used to define the geometry of the eye’s inner surface would then be able to indicate the distance between the first and second layers of the eye.
Regarding claim 14, with all of the limitations of claim 13, the method further discloses:
a non-transitory computer-readable storage medium comprising representing a computer program, the computer program comprising instructions for causing a processor system to carry out the method of claim 13.
[0169] of Beelen et al., “FIG. 23 shows a computer program product in the form of an computer readable medium 260 which comprises non-transitory program code 250 for causing a processor to perform a method according to the invention when said program code is executed the processor.”
Claims 5-6 and 8-9 are rejected under 35 U.S.C. 103 as being unpatentable over US20170252113A1 (Beelen et al.) in further view of US20150287422A1 (Short et al.).
Regarding claim 5, with all of the limitations of claim 1, the system further discloses:
wherein the processor subsystem is further configured to:
calculate a certainty score representing a certainty of a determined distance between the surgical instrument and the surgical target; and
While Beelen et al. do not explicitly state the calculation of a certainty score, Beelen et al. disclose the use of a Kalman filter to account for uncertainty from the sensor data ([0167], “It is noted that, in updating or replacing the virtual bound based on sensor data, more samples may be used than only the current sample of the sensor data. Namely, due to possible noise and uncertainty in this data, also previous samples of sensor data may be used, and/or previous positions of the virtual bound position. For example, the virtual bound may be updated using a Kalman filter to account for noise and uncertainty in the sensor data.”).
From a related field of endeavor, Short et al. disclose methods and systems for processing signals using Signal Separation (SS) to form components in a plurality of fields such as “sound, audio, video, photographic, imaging (including medical), communications, optical/light, radio, RADAR, sonar, sensor and seismic sources” (Short et al., [0009]), where the Signal Separation (SS) can also be applied in both diagnostic and surgical applications (Short et al., [0354]). Specifically, Short et al. disclose the use of an ambiguity/certainty measure in a Kalman filter whereby the certainty/ambiguity measure is used to determine which tracklets should be extracted or enhanced (Short et al., [0302]).
As the instrument tip contains a sensor for measuring distance ([0165], “The longitudinal distance 171 may also be measured directly, by a sensor added to the surgical instrument and measuring along the z-direction 109.”) to a surgical target ([0164], “However, the surgical robotic system may comprise a sensor inside or outside the eye which measures the longitudinal distance 171 between the instrument tip 137 and the surgical target.”), and as Beelen et al. disclose the use of a Kalman filter to create an uncertainty estimate model, one of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, to use the certainty measure of Short et al. with the sensor system of Beelen et al. The scoring function can help determine a threshold at which a measurement’s certainty is accepted, or at which the measurement is retaken in response to an insufficient score from the uncertainty model established by the Kalman filter algorithm.
adapt the control of the actuator according to the calculated certainty score.
One of ordinary skill in the art would find it obvious that the control of the actuator in response to a calculated certainty score would differ depending on whether the calculated certainty is high or low.
Regarding claim 6, with all of the limitations of claim 5, the system further discloses:
wherein the processor subsystem is further configured to calculate the certainty score based on at least one of:
an estimate of measurement noise of the sensor,
In light of claim 5, one of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, that the Kalman filter comprises an uncertainty model that updates the predicted state and measurement error of the system with additional measurements.
Regarding claim 8, with all of the limitations of claim 5, the system further discloses:
wherein the processor subsystem is further configured to select a control mode from a set of control modes based on the calculated certainty score,
Beelen et al. disclose the use of a plurality of control modes and distinguishing safer modes from the plurality ([0026], “For example, positioning commands provided using a particular input modality or input mode may be considered as ‘safe’ or ‘safer’ than positioning commands provided using other input modalities or input modes.”) but do not explicitly state the selection of a control mode from the set of control modes based on a calculated certainty score which represents a certainty of the displacement distance between the instrument and the target.
Beelen et al. further disclose in [0157], “The velocity mode may be more suitable to cover larger distances. The surgical instrument may move at a constant speed while the motion controller is kept stationary at the velocity mode boundary 154. Therefore, advantages of the velocity mode may include decreased user fatigue and faster task completion. However, the positioning mode may be safer than the velocity mode, because the human operator has to purposely move the motion controller to move the surgical instrument.” Said positioning mode is both scalable and controllable by a human operator ([0154]) and gives more deliberate motion in light of both the operable motion controller and the tuning factor, versus a velocity control mode that may also be scaled ([0155]) but does not offer motion coupled to a controller operated by a human operator.
In light of the rationale of claim 5, one of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, to adapt the choice of control modality of Beelen et al. with the certainty scoring of Short et al., as the positioning mode would show a higher precision versus the velocity mode, which creates an uncertainty in displacement that would need to be accounted for when deciding the best course of action.
the set of control modes comprising at least two of:
a displacement-based control mode and
a limited-closed-loop control mode, wherein:
See the rationales of claims 3 and 4 regarding a displacement-based control mode and a limited-closed-loop control mode, respectively.
in the displacement-based control mode, the processor subsystem is configured to:
determine a correction displacement distance to reach the target position from the determined remaining distance to the surgical target;
See the rationale of claim 3 regarding “determining a correction displacement distance to reach the target position from the determined remaining distance to the surgical target”.
output, via the user interface, a sensory-perceptible representation of the correction displacement distance;
See the rationale of claim 3 regarding “outputting, via the user interface, a sensory-perceptible representation of the correction displacement distance”.
require, via the user interface, the user to confirm the correction displacement distance;
See the rationale of claim 3 regarding “requiring, via the user interface, the user to confirm the correction displacement distance”.
receive, via the user interface, a further confirmation signal from the user indicating a confirmation of the correction displacement distance; and
See the rationale of claim 3 regarding “receiving, via the user interface, a further confirmation signal from the user”.
upon receiving the further confirmation signal, control the actuator to actuate the movable arm part to effect a further single movement of the surgical instrument along the longitudinal axis over the correction displacement distance; and
See the rationale of claim 3 regarding “upon receiving the further confirmation signal, controlling the actuator to actuate the movable arm part to effect…”.
in the limited-closed-loop control mode, the processor subsystem is configured to be limited by at least one of:
a predefined maximum correction displacement distance;
See the rationale of claim 4.
Regarding claim 9, with all of the limitations of claim 5, the system further discloses:
wherein the processor subsystem is further configured to output, via the user interface, a sensory-perceptible representation of the calculated certainty score, prior to receiving the confirmation signal from the user.
In light of the rationale of claim 1 regarding “output, via the user interface, a sensory-perceptible representation of the determined displacement distance”, one of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, to output a sensory-perceptible representation of the calculated certainty score to a user, as the certainty score represents the tracked displacement of a system for which the user is responsible ([0147], “On responsibility of the human operator, the surgical instrument may be moved, namely by providing suitable positioning commands to the human machine interface.”).
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over US20170252113A1 (Beelen et al.) in further view of US20150287422A1 (Short et al.) and US20180078332A1 (Mozes et al.).
Regarding claim 7, with all of the limitations of claim 5, the system further discloses:
wherein the processor subsystem is configured to calculate the certainty score repeatedly, and to perform an abort procedure when the calculated certainty score falls below a certainty score threshold,
In light of claim 5, while Beelen et al. in view of Short et al. disclose a system that calculates a certainty score repeatedly, an abort procedure in response to a low calculated certainty score is not explicitly disclosed.
From a similar field of endeavor, Mozes et al. disclose a surgical robot featuring a controller device configured to receive data from a detector regarding its intersection with a fiducial marker to determine a spatial relation between the instrument and the marker (Abstract). Specifically, Mozes et al. disclose a system for adjusting the position of the arm to maintain a desired position where, in response to a deviation exceeding a threshold limit, the surgical plan is aborted and the surgical instrument is retracted to a safe position ([0037], “In other instances, for example, if the deviation between the instrument 140 and the fiducial marker 170 is over a threshold, indicating excessive movement of the patient or other issue, the controller device 190 may direct that an alarm be emitted, or even that the virtual surgical plan be aborted and the instrument 140 retracted to a safe position.”).
One of ordinary skill in the art would find it obvious, prior to the applicant’s effective filing date, to combine the controller features described by Mozes et al. with the system of Beelen et al. in view of Short et al. As a high certainty score indicates that the instrument is in a desired location, a low certainty score would indicate ambiguity in the location of the instrument tip. Because moving the instrument from an unknown location could cause harm, retracting the instrument when the certainty score falls below a threshold would reduce potential risks.
wherein the abort procedure comprises at least one of:
retracting the surgical instrument;
See the above rationale of claim 7, where Mozes et al. disclose that aborting the surgical plan also retracts the instrument 140 to a safe position.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAEWOOK JUNG whose telephone number is (571)272-5470. The examiner can normally be reached Monday - Friday, 9:00 AM - 5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Wade Miles, can be reached at (571) 270-7777. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/J.J./Examiner, Art Unit 3656
/WADE MILES/Supervisory Patent Examiner, Art Unit 3656