DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments filed on 12/22/2025 regarding claims 1, 3-9, and 11-20 have been fully considered, but they are either not persuasive or moot.
Applicant argues on page 9 of the Applicant’s remarks that “However, Quaid does not disclose
"receive temporal force/torque data, wherein the temporal force/torque data represents the forces at the instrument interface over time, sensed by the at least one force/torque sensor during the collaborative procedure with the user, analyze the temporal force/torque data to determine at least one of a current intention of the user or a predefined state of the collaborative procedure, determine a control mode which is predefined for the determined at least one of the current intention of the user or the state of the collaborative procedure, wherein the control mode determines the at least one robot control parameter; and cause the robot controller to control the robotic arm in accordance with the control mode; and wherein the forces comprise at least one of: (i) forces applied indirectly to the tool guide during user manipulation of the tool; (ii) forces applied directly to the tool guide by the user; (iii) forces from an environment of the robotic arm; and (iv) forces generated by the tool," as recited by Applicant's independent claim.” The Examiner respectfully disagrees. 
Quaid teaches receiving force/torque data at the instrument interface over time (See at least Para [0051] “… The haptic device will likely require a real-time operating system or special motion control hardware to generate high-frequency updates for the haptic control system…”, which discloses high-frequency updates for the haptic control system and is construed as receiving force/torque data at the instrument interface over time, and Para [0102] “… The criteria may be proximity of surgical tool 112 coupled to haptic device 113 to haptic object 20, penetration of haptic object 20 by surgical tool 112, gestural motions of surgical tool 112, gestural or other motion of surgical tool 112 relative to the position of haptic object 20, a fixed or variable time period…”) and forces applied directly to the tool guide by the user (See at least Para [0056], which discloses that a portion of the surgical tool, for example the tip of the surgical tool coupled with a haptic device, is used to sense properties of the local anatomy, including force, to position the surgical tool or to verify its proper positioning; because the force is received from the user, sensing it to take an action means performing according to the desire of the user). Additionally, the Examiner relies on Briquet-Kerestedjian for the teaching of analyzing temporal force/torque data to determine at least one of a current intention of the user or a predefined state of the collaborative procedure (See at least Page 3279 Col 1 Para 2 “… the choice of a robot reaction will depend on the type of impact (desired as part of the task or unintended) and its localization on the robot arm (see Fig. 1)…”, Page 3279 “Fig. 1. ABB YuMi, a dual-arm collaborative manipulator with 7DOF each arm used for the experiments - A neural network with extraction of features for each joint aims to characterize the contact situation: contact type (intended interaction on the left, undesired collision on the right)”).
Applicant further argues on page 10 of the Applicant’s remarks that “This filtering technique of torque induced by external loads is not the same as "determine[ing] a control mode which is predefined for the determined at least one of the current intention of the user or the state of the collaborative procedure... wherein the forces comprise at least one of (i) forces applied indirectly to the tool guide during user manipulation of the tool; (ii) forces applied directly to the tool guide by the user; (iii) forces from an environment of the robotic arm; and (iv) forces generated by the tool" as recited in Applicant's independent claim.” The Examiner respectfully disagrees. Briquet-Kerestedjian classifies human-robot contact situations as either intended contacts or collisions (See at least Page 3279 Col 1 Para 2 “I. INTRODUCTION - Collaborative manipulators are specifically designed for human-robot interaction by both their lightweight mechanical structure and the embedded safety algorithms that enable to reliably detect external contacts [1], [2], [3], [4]. Although several reaction strategies are proposed in literature [5], [6], the choice of a robot reaction will depend on the type of impact (desired as part of the task or unintended) and its localization on the robot arm (see Fig. 1)…”, Page 3280 Col 2 Para 4 “B. Input data for classification - The classification of contact situations is intuitively mainly based on the temporal evolution of joint load torques.”, Page 3285 Col 1 Para 4 “VI. Conclusion”). Briquet-Kerestedjian also teaches that the forces may be received from an environment of the robotic arm (See Fig. 1).
Therefore, after the type of impact (desired or unintended) is determined, the corresponding robot reaction, i.e., control of the robot, is performed.
The same reasoning as applied to the independent claims above also applies to their corresponding dependent claims.
Claim Objections
Claims 9 and 17 are objected to because of the following informalities:
Claim 9: “a force/torque sensor to sense forces at ht einstrument interface” should read “a force/torque sensor to sense forces at the instrument interface”.
Claim 17: “the instrument interface fomprising a tool guide” should read “the instrument interface comprising a tool guide”.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim(s) 1, 3-9, 11-20 are rejected under 35 U.S.C. 103 as being unpatentable over Quaid et al. (US 2004/0106916 A1) (Hereinafter Quaid) in view of Briquet-Kerestedjian et al. (N. Briquet-Kerestedjian, A. Wahrburg, M. Grossard, M. Makarov and P. Rodriguez-Ayerbe, "Using Neural Networks for Classifying Human-Robot Contact Situations," 2019 18th European Control Conference (ECC), Naples, Italy, 2019, pp. 3279-3285) (Hereinafter Briquet-Kerestedjian).
Regarding Claim 1, Quaid teaches a system, comprising:
a robotic arm having one or more degrees of freedom of control, wherein the robotic arm includes an instrument interface comprising a tool guide which is configured to interface with a tool which can be manipulated during a collaborative procedure with a user (See at least Fig 1, Fig 2, Fig 3A, Fig 3B, Para [0045] “An example of the illustrated robotic arm is a robotic arm manufactured by Barrett Technology, and referred to as the "Whole-Arm Manipulator" or "WAM"…The WAM robotic arm has a four degrees of freedom of movement. However, it is augmented by a 1-DOF direct-drive wrist for trajectory-based medical applications. If desired, degrees of freedom may be added or removed without affecting the scope of the illustrated invention.”, Para [0056] “A portion of surgical tool 112 coupled with a haptic device, for example the tip of surgical tool 112, may be used to sense properties of the local anatomy. The properties of the local anatomy may be used to position surgical tool 112 or to verify the proper positioning of surgical tool 112. The properties that may be sensed or monitored by the tool include electrical properties of the anatomy, force, pressure, stiffness, conductivity, etc…”);
at least one force/torque sensor configured to sense forces applied to or at the instrument interface by the user while interfacing with the instrument (See at least Para [0028] “A haptic device is a mechanical or electro-mechanical device that interacts and communicates with a user, such as a surgeon, using sensory information such as touch, force, velocity, position, and/or torque…”, Para [0056] “A portion of surgical tool 112 coupled with a haptic device, for example the tip of surgical tool 112, may be used to sense properties of the local anatomy. The properties of the local anatomy may be used to position surgical tool 112 or to verify the proper positioning of surgical tool 112. The properties that may be sensed or monitored by the tool include electrical properties of the anatomy, force, pressure, stiffness, conductivity, etc…”);
a robot controller configured to control the robotic arm to move the instrument interface to a determined position and to control at least one robot control parameter (See at least Para [0052] “…When haptic device 113 comes within a predefined distance of haptic object 20, a stiffness parameter may be changed to make it more difficult to move haptic device 113…”, Para [0036] “Haptic device 113 is, in the illustrated example, a robotic device. Haptic device 113 may be controlled by a processor based system, for example a computer 10. Computer 20 may also include power amplification and input/output hardware. Haptic device 113 may communicate with computer-assisted surgery system 11 by any communication mechanism now known or later developed, whether wired or wireless.”); and
a system controller configured to:
receive temporal force/torque data, wherein the temporal force/torque data represents
the forces at the instrument interface over time, sensed by the at least one force/torque sensor during the collaborative procedure with the user (See at least Para [0028] “A haptic device is a mechanical or electro-mechanical device that interacts and communicates with a user, such as a surgeon, using sensory information such as touch, force, velocity, position, and/or torque…”, Para [0056] “A portion of surgical tool 112 coupled with a haptic device, for example the tip of surgical tool 112, may be used to sense properties of the local anatomy. The properties of the local anatomy may be used to position surgical tool 112 or to verify the proper positioning of surgical tool 112. The properties that may be sensed or monitored by the tool include electrical properties of the anatomy, force, pressure, stiffness, conductivity, etc…”), …
wherein the forces comprise at least one of:
forces applied indirectly to the tool guide during user manipulation of the tool;
forces applied directly to the tool guide by the user (See at least Para [0056] “A portion of surgical tool 112 coupled with a haptic device, for example the tip of surgical tool 112, may be used to sense properties of the local anatomy. The properties of the local anatomy may be used to position surgical tool 112 or to verify the proper positioning of surgical tool 112. The properties that may be sensed or monitored by the tool include electrical properties of the anatomy, force, pressure, stiffness, conductivity, etc…”);
forces from an environment of the robotic arm; and
forces generated by the tool.
However, Quaid does not explicitly disclose …
analyze the temporal force/torque data to determine at least one of a current intention
of the user or a predefined state of the collaborative procedure,
determine a control mode which is predefined for the determined at least one of the
current intention of the user or the state of the collaborative procedure, wherein the control mode determines the at least one robot control parameter; and
cause the robot controller to control the robotic arm in accordance with the control
mode; and …
Briquet-Kerestedjian teaches …
analyze the temporal force/torque data to determine at least one of a current intention
of the user or a predefined state of the collaborative procedure (See at least Page 3279 “Fig. 1. ABB YuMi, a dual-arm collaborative manipulator with 7DOF each arm used for the experiments - A neural network with extraction of features for each joint aims to characterize the contact situation: contact type (intended interaction on the left, undesired collision on the right)”),
determine a control mode which is predefined for the determined at least one of the
current intention of the user or the state of the collaborative procedure, wherein the control mode determines the at least one robot control parameter (See at least Page 3279 Col 2 Para 1 “Another approach developed in [9] proposes to detect both interactions and collisions by low-pass and high-pass filtering the joint torques induced by external loads and applying adaptive thresholds tuned using experimental data. In [8], [10], the distinction is conducted in the frequency domain by calculating the Fast Fourier Transform (FFT) of the joint load torques and the derivative of the FFT with respect to time on a sliding window. Fixed frequency thresholds are determined experimentally in order to discriminate high-frequency collisions on the one hand from low-frequency interactions and on the other hand from noise under normal conditions. However, all of these frequency distinction-based methods require considerable parameter adjustment efforts that depend on the experimental data collected and that greatly affect the success of these methods.”); and
cause the robot controller to control the robotic arm in accordance with the control
mode (See at least Page 3283 Col 1 “IV. EXPERIMENTAL VALIDATION - A. Evaluation of the results - For this reason, it reflects more realistic operational conditions to evaluate the results of the classification online. To this end, the trained neural network has been implemented directly inside the controller of an ABB YuMi robot.”).
Therefore, it would have been obvious to one of ordinary skill in the art before the
effective filing date of the claimed invention to combine the system of Quaid with the teachings of Briquet-Kerestedjian and include the features of analyzing the temporal force/torque data to determine at least one of a current intention of the user or a predefined state of the collaborative procedure, determining a control mode which is predefined for the determined at least one of the current intention of the user or the state of the collaborative procedure, wherein the control mode determines the at least one robot control parameter, and causing the robot controller to control the robotic arm in accordance with the control mode, thereby deciding how to control the robotic arm according to the user’s intention and increasing data efficiency (See at least Page 3285 “VI. CONCLUSION - In this paper, we have presented a novel approach for classifying human-robot contact situations. The method applies supervised learning techniques by training a neural network, where the structure of the network is inspired by the physics of contact in order to increase data efficiency”).
Regarding Claim 3, modified Quaid teaches all the elements of claim 2.
Although Quaid teaches that haptic interaction forces and/or torques may be calculated using
a machine learning algorithm (See at least Para [0105] “… If desired, the haptic interaction forces and/or torques may be calculated using a mathematical, control theory, or machine learning algorithm.”), Quaid does not explicitly disclose the system of claim 2, wherein the system controller is
configured to apply the temporal force/torque data to a neural network to determine the at least one of the current intention of the user or the state of the collaborative procedure.
Briquet-Kerestedjian teaches the system of claim 2, wherein the system controller is
configured to apply the temporal force/torque data to a neural network to determine the at least one of the current intention of the user or the state of the collaborative procedure (See at least Page 3279 “Fig. 1. ABB YuMi, a dual-arm collaborative manipulator with 7DOF each arm used for the experiments - A neural network with extraction of features for each joint aims to characterize the contact situation: contact type (intended interaction on the left, undesired collision on the right)”).
Therefore, it would have been obvious to one of ordinary skill in the art before the
effective filing date of the claimed invention to combine the system of Quaid with the teachings of Briquet-Kerestedjian and include the feature of the system controller being configured to apply the temporal force/torque data to a neural network to determine the current intention of the user or the state of the collaborative procedure, thereby using a neural network to obtain data in order to decide how to control the robotic arm according to the user’s intention and increase data efficiency (See at least Page 3285 “VI. CONCLUSION - In this paper, we have presented a novel approach for classifying human-robot contact situations. The method applies supervised learning techniques by training a neural network, where the structure of the network is inspired by the physics of contact in order to increase data efficiency”).
Regarding Claim 4, modified Quaid teaches all the elements of claim 3. Although Quaid teaches that a reference pose may be associated with the desired trajectory of a drill guide attached to the haptic device (See at least Para [0132] “For example, in one embodiment, the reference pose may be associated with the desired trajectory of a drill guide attached to haptic device 113. In such an embodiment, updating the reference pose in step 202 comprises changing the desired trajectory of the drill guide. When the user moves haptic device 113 from the reference pose for a prolonged period of time, the reference pose will be updated to move in the direction of the user's deflection. If, in step 210, an appropriate haptic feedback wrench is applied, then upon release of haptic device 113 by the user, haptic device 113 will assume the new reference pose. When the user is satisfied with the reference pose and the input mode is terminated in step 212, haptic device 113 will be in a pose such that the drill guide is aligned with the desired trajectory.”), Quaid does not explicitly disclose the system of claim 3, wherein the neural network is configured to determine from the temporal force/torque data when the user is drilling with the tool, and is further configured to determine from the temporal force/torque data when the user is hammering with the tool.
Briquet-Kerestedjian further teaches the system of claim 3, wherein the neural network is
configured to determine from the temporal force/torque data when the user is drilling with the tool, and is further configured to determine from the temporal force/torque data when the user is hammering with the tool (See at least Page 3279 “Fig. 1. ABB YuMi, a dual-arm collaborative manipulator with 7DOF each arm used for the experiments - A neural network with extraction of features for each joint aims to characterize the contact situation: contact type (intended interaction on the left, undesired collision on the right)”, which discloses a neural network used to analyze the force to determine the user’s intention).
Therefore, it would have been obvious to one of ordinary skill in the art before the
effective filing date of the claimed invention to combine the system of Quaid with the teachings of Briquet-Kerestedjian and include the feature of the neural network being configured to determine from the temporal force/torque data when the user is drilling with the tool, and being further configured to determine from the temporal force/torque data when the user is hammering with the tool, thereby using a neural network to obtain data in order to decide how to control the robotic arm according to the user’s intention and increase data efficiency (See at least Page 3285 “VI. CONCLUSION - In this paper, we have presented a novel approach for classifying human-robot contact situations. The method applies supervised learning techniques by training a neural network, where the structure of the network is inspired by the physics of contact in order to increase data efficiency”).
Regarding Claim 5, modified Quaid teaches all the elements of claim 4. Quaid further teaches the system of claim 4, wherein the at least one robot control parameter controls a rendered stiffness of the tool guide against the forces applied in at least one direction (See at least Para [0051] “…For example, the computer-aided surgery system may send a command to the haptic device requesting it to enter into a joystick-like input mode with certain stiffness parameters…”).
Regarding Claim 6, modified Quaid teaches all the elements of claim 5. Quaid further teaches the system of claim 5, wherein when the neural network determines from the temporal force/torque data that the user is hammering with the tool, the neural network further determines whether the tool is hammering through bone or is hammering through tissue, wherein when the tool is determined to be hammering through tissue, the control mode is a first stiffness mode wherein the robot controller controls the tool guide to have a first stiffness, and wherein when the tool is determined to be hammering through bone, the control mode is a second stiffness mode wherein the robot controller controls the tool guide to have a second stiffness, wherein the second stiffness is less than the first stiffness (See at least Para [0052] “…An algorithm which computes the current position of haptic device 113 relative to haptic object 20 may be used to provide information to the surgeon about the location of haptic device 113 relative to haptic object 20. When haptic device 113 comes within a predefined distance of haptic object 20, a stiffness parameter may be changed to make it more difficult to move haptic device 113…”, Para [0117] “The stiffness or damping of the control algorithm may vary in different directions to indicate preferential directions of motion which may be aligned with any direction as described in the previous paragraph. This stiffness variation may include zero stiffness along certain directions or may lock the user to the preferred directions once the deviation from the reference position exceeds some threshold value. 
This stiffness variation assists with simplifying the planning process by allowing the user to focus their attention on a limited number of degrees of freedom at a time…”, Para [0118] “The stiffness and damping variations can occur automatically depending on the physical interaction of the user with the haptic device…”, Para [0121] “…Thus, haptic device 113 may be used to differentiate between hard and soft bones, healthy and diseases tissues, different types of healthy tissues, boundaries of anatomical structures, etc…”).
Regarding Claim 7, modified Quaid teaches all the elements of claim 1. Quaid further teaches the system of claim 1, wherein the system provides an alert to the user when the system changes the control mode (See at least Para [0051] “The haptic arm system checks if the parameters are safe and otherwise acceptable, and then enters into such a mode or responds with an appropriate error message.”, Para [0061]).
Regarding Claim 8, modified Quaid teaches all the elements of claim 1. Quaid further teaches the system of claim 1, wherein the system controller is further configured to receive auxiliary data comprising at least one of video data, image data, audio data, surgical plan data, diagnostic plan data and robot vibration data, and is still further configured to determine the current intention of the user or the state of the collaborative procedure based on the temporal force/torque data and the auxiliary data (See at least Para [0050] “The image dataset(s), coupled with definitions of features to be avoided, can be used to create haptic "cues" that indicate to the surgeon that a violation of sensitive anatomy is taking place. A general function of these types of cues is to apply forces and/or torques that tend to repulse the haptic device from poses where an instrument attached to the device would, for example, impact the defined critical features. Similarly, the image dataset(s), coupled with the definitions of features to be targeted can also used to create haptic cues that indicate to the surgeon that the desired target region would be reached by the surgical instrument appropriately attached to the haptic arm. A general function of these types of cues is to attract the haptic device to such poses or lock the haptic device into these poses once they are attained.”, Para [0010] “The local distance to a surface of interest, such as the surface of the defined object, or to a desired position, the local penetration distance of the surface of interest, or haptic repulsion force, often provides the most useful information for augmenting the interaction of the user with the image guided surgery system. The scalar value of the local distance may be conveyed to the user by visual, audio, tactile, haptic, or other means.”).
Regarding Claim 9, Quaid teaches a method of operating a robotic arm having one or more degrees of freedom of control, wherein the robotic arm includes an instrument interface and a force/torque sensor to sense forces at the instrument interface, the instrument interface comprising a tool guide configured to interface with a tool which can be manipulated during a collaborative procedure with a user (See at least Fig 1, Fig 2, Fig 3A, Fig 3B, Para [0045] “An example of the illustrated robotic arm is a robotic arm manufactured by Barrett Technology, and referred to as the "Whole-Arm Manipulator" or "WAM"…The WAM robotic arm has a four degrees of freedom of movement. However, it is augmented by a 1-DOF direct-drive wrist for trajectory-based medical applications. If desired, degrees of freedom may be added or removed without affecting the scope of the illustrated invention.”, Para [0056] “A portion of surgical tool 112 coupled with a haptic device, for example the tip of surgical tool 112, may be used to sense properties of the local anatomy. The properties of the local anatomy may be used to position surgical tool 112 or to verify the proper positioning of surgical tool 112. The properties that may be sensed or monitored by the tool include electrical properties of the anatomy, force, pressure, stiffness, conductivity, etc…”), the method comprising:
receiving temporal force/torque data, wherein the temporal force/torque data represents forces applied to or at the instrument interface over time by the user while interfacing with the instrument, sensed by at least one force/torque sensor during the collaborative procedure with the user (See at least Para [0028] “A haptic device is a mechanical or electro-mechanical device that interacts and communicates with a user, such as a surgeon, using sensory information such as touch, force, velocity, position, and/or torque…”, Para [0056] “A portion of surgical tool 112 coupled with a haptic device, for example the tip of surgical tool 112, may be used to sense properties of the local anatomy. The properties of the local anatomy may be used to position surgical tool 112 or to verify the proper positioning of surgical tool 112. The properties that may be sensed or monitored by the tool include electrical properties of the anatomy, force, pressure, stiffness, conductivity, etc…”);
wherein the forces comprise at least one of:
(i) forces applied indirectly to the tool guide during user manipulation of the tool;
(ii) forces applied directly to the tool guide by the user (See at least Para [0056] “A portion of surgical tool 112 coupled with a haptic device, for example the tip of surgical tool 112, may be used to sense properties of the local anatomy. The properties of the local anatomy may be used to position surgical tool 112 or to verify the proper positioning of surgical tool 112. The properties that may be sensed or monitored by the tool include electrical properties of the anatomy, force, pressure, stiffness, conductivity, etc…”);
(iii) forces from an environment of the robotic arm; and
(iv) forces generated by the tool…
However, Quaid does not explicitly disclose …
analyzing the temporal force/torque data to determine at least one of a current
intention of the user and a predefined state of the collaborative procedure,
determining a control mode which is predefined for the determined at least one of the
current intention of the user or the state of the collaborative procedure, wherein the control mode determines the at least one robot control parameter; and
controlling the robotic arm in accordance with the control mode.
Briquet-Kerestedjian teaches …
analyzing the temporal force/torque data to determine at least one of a current
intention of the user or a predefined state of the collaborative procedure (See at least Page 3279 “Fig. 1. ABB YuMi, a dual-arm collaborative manipulator with 7DOF each arm used for the experiments - A neural network with extraction of features for each joint aims to characterize the contact situation: contact type (intended interaction on the left, undesired collision on the right)”),
determining a control mode which is predefined for the determined at least one of the
current intention of the user or the state of the collaborative procedure, wherein the control mode determines the at least one robot control parameter (See at least Page 3279 Col 2 Para 1 “Another approach developed in [9] proposes to detect both interactions and collisions by low-pass and high-pass filtering the joint torques induced by external loads and applying adaptive thresholds tuned using experimental data. In [8], [10], the distinction is conducted in the frequency domain by calculating the Fast Fourier Transform (FFT) of the joint load torques and the derivative of the FFT with respect to time on a sliding window. Fixed frequency thresholds are determined experimentally in order to discriminate high-frequency collisions on the one hand from low-frequency interactions and on the other hand from noise under normal conditions. However, all of these frequency distinction-based methods require considerable parameter adjustment efforts that depend on the experimental data collected and that greatly affect the success of these methods.”); and
controlling the robotic arm in accordance with the control mode (See at least Page 3283
Col 1 “IV. EXPERIMENTAL VALIDATION - A. Evaluation of the results - For this reason, it reflects more realistic operational conditions to evaluate the results of the classification online. To this end, the trained neural network has been implemented directly inside the controller of an ABB YuMi robot.”).
Therefore, it would have been obvious to one of ordinary skill in the art before the
effective filing date of the claimed invention to combine the method of Quaid with the teachings of Briquet-Kerestedjian and include the features of analyzing the temporal force/torque data to determine at least one of a current intention of the user or a predefined state of the collaborative procedure, determining a control mode which is predefined for the determined at least one of the current intention of the user or the state of the collaborative procedure, wherein the control mode determines the at least one robot control parameter, and controlling the robotic arm in accordance with the control mode, thereby deciding how to control the robotic arm according to the user’s intention and increasing data efficiency (See at least Page 3285 “VI. CONCLUSION - In this paper, we have presented a novel approach for classifying human-robot contact situations. The method applies supervised learning techniques by training a neural network, where the structure of the network is inspired by the physics of contact in order to increase data efficiency”).
Regarding Claim 11, modified Quaid teaches all the elements of claim 10.
Although Quaid teaches that haptic interaction forces and/or torques may be calculated using a machine learning algorithm (See at least Para [0105] "… If desired, the haptic interaction forces and/or torques may be calculated using a mathematical, control theory, or machine learning algorithm."), Quaid does not explicitly disclose the method of claim 10, wherein analyzing the temporal force/torque data to determine the at least one of the current intention of the user or the state of the collaborative procedure comprises applying the temporal force/torque data to a neural network to determine the current intention of the user or the state of the collaborative procedure.
Briquet-Kerestedjian teaches the method of claim 10, wherein analyzing the temporal
force/torque data to determine the at least one of the current intention of the user or the state of the collaborative procedure comprises applying the temporal force/torque data to a neural network to determine the current intention of the user or the state of the collaborative procedure (See at least Page 3276 “Fig. 1. ABB YuMi, a dual-arm collaborative manipulator with 7DOF each arm used for the experiments - A neural network with extraction of features for each joint aims to characterize the contact situation: contact type (intended interaction on the left, undesired collision on the right)”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the system of Quaid with the teachings of Briquet-Kerestedjian and include the feature wherein analyzing the temporal force/torque data to determine the at least one of the current intention of the user or the state of the collaborative procedure comprises applying the temporal force/torque data to a neural network to determine the current intention of the user or the state of the collaborative procedure, thereby using a neural network to decide how to control the robotic arm according to the user's intention and increase data efficiency (See at least Page 3285 "VI. CONCLUSION - In this paper, we have presented a novel approach for classifying human-robot contact situations. The method applies supervised learning techniques by training a neural network, where the structure of the network is inspired by the physics of contact in order to increase data efficiency").
Regarding Claim 12, modified Quaid teaches all the elements of claim 11.
Although Quaid teaches that a reference pose may be associated with the desired trajectory of a drill guide attached to haptic device 113 (See at least Para [0132] "For example, in one embodiment, the reference pose may be associated with the desired trajectory of a drill guide attached to haptic device 113. In such an embodiment, updating the reference pose in step 202 comprises changing the desired trajectory of the drill guide. When the user moves haptic device 113 from the reference pose for a prolonged period of time, the reference pose will be updated to move in the direction of the user's deflection. If, in step 210, an appropriate haptic feedback wrench is applied, then upon release of haptic device 113 by the user, haptic device 113 will assume the new reference pose. When the user is satisfied with the reference pose and the input mode is terminated in step 212, haptic device 113 will be in a pose such that the drill guide is aligned with the desired trajectory."), Quaid does not explicitly disclose the system of claim 3, wherein the neural network is configured to determine from the temporal force/torque data when the user is drilling with the tool, and is further configured to determine from the temporal force/torque data when the user is hammering with the tool.
Briquet-Kerestedjian further teaches the system of claim 3, wherein the neural network is configured to determine from the temporal force/torque data when the user is drilling with the tool, and is further configured to determine from the temporal force/torque data when the user is hammering with the tool (See at least Page 3276 "Fig. 1. ABB YuMi, a dual-arm collaborative manipulator with 7DOF each arm used for the experiments - A neural network with extraction of features for each joint aims to characterize the contact situation: contact type (intended interaction on the left, undesired collision on the right)", discloses a neural network used to analyze the force to determine the user's intention).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the system of Quaid with the teachings of Briquet-Kerestedjian and include the feature of the neural network being configured to determine from the temporal force/torque data when the user is drilling with the tool, and being further configured to determine from the temporal force/torque data when the user is hammering with the tool, thereby using a neural network to decide how to control the robotic arm according to the user's intention and increase data efficiency (See at least Page 3285 "VI. CONCLUSION - In this paper, we have presented a novel approach for classifying human-robot contact situations. The method applies supervised learning techniques by training a neural network, where the structure of the network is inspired by the physics of contact in order to increase data efficiency").
Regarding Claim 13, modified Quaid teaches all the elements of claim 12. Quaid further teaches the method of claim 12, wherein the at least one robot control parameter controls a rendered stiffness of the tool guide against the forces applied in at least one direction (See at least Para [0051] “…For example, the computer-aided surgery system may send a command to the haptic device requesting it to enter into a joystick-like input mode with certain stiffness parameters…”).
Regarding Claim 14, modified Quaid teaches all the elements of claim 13. Quaid further teaches the method of claim 13, wherein when the neural network determines from the temporal force/torque data that the user is hammering with the tool, the neural network further determines whether the tool is hammering through bone or is hammering through tissue, wherein when the tool is determined to be hammering through tissue, the control mode is a first stiffness mode wherein the tool guide has a first stiffness, and wherein when the tool is determined to be hammering through bone, the control mode is a second stiffness mode wherein the tool guide has a second stiffness, wherein the second stiffness is less than the first stiffness (See at least Para [0052] “…An algorithm which computes the current position of haptic device 113 relative to haptic object 20 may be used to provide information to the surgeon about the location of haptic device 113 relative to haptic object 20. When haptic device 113 comes within a predefined distance of haptic object 20, a stiffness parameter may be changed to make it more difficult to move haptic device 113…”, Para [0058] “…If desired, the collected information may be logged for use in machine learning techniques. “, Para [0117] “The stiffness or damping of the control algorithm may vary in different directions to indicate preferential directions of motion which may be aligned with any direction as described in the previous paragraph. This stiffness variation may include zero stiffness along certain directions or may lock the user to the preferred directions once the deviation from the reference position exceeds some threshold value. 
This stiffness variation assists with simplifying the planning process by allowing the user to focus their attention on a limited number of degrees of freedom at a time…”, Para [0118] “The stiffness and damping variations can occur automatically depending on the physical interaction of the user with the haptic device…”, Para [0121] “…Thus, haptic device 113 may be used to differentiate between hard and soft bones, healthy and diseases tissues, different types of healthy tissues, boundaries of anatomical structures, etc…”).
Regarding Claim 15, modified Quaid teaches all the elements of claim 9. Quaid further teaches the method of claim 9, further comprising providing an alert to the user when the control mode is changed (See at least Para [0051] "The haptic arm system checks if the parameters are safe and otherwise acceptable, and then enters into such a mode or responds with an appropriate error message.", Para [0061]).
Regarding Claim 16, modified Quaid teaches all the elements of claim 9. Quaid further teaches the method of claim 9, further comprising:
receiving auxiliary data comprising at least one of video data, image data, audio data, surgical plan data, diagnostic plan data and robot vibration data (See at least Para [0050] "The image dataset(s), coupled with definitions of features to be avoided, can be used to create haptic "cues" that indicate to the surgeon that a violation of sensitive anatomy is taking place. A general function of these types of cues is to apply forces and/or torques that tend to repulse the haptic device from poses where an instrument attached to the device would, for example, impact the defined critical features. Similarly, the image dataset(s), coupled with the definitions of features to be targeted can also used to create haptic cues that indicate to the surgeon that the desired target region would be reached by the surgical instrument appropriately attached to the haptic arm. A general function of these types of cues is to attract the haptic device to such poses or lock the haptic device into these poses once they are attained."); and
determining the current intention of the user or the state of the collaborative procedure based on the temporal force/torque data and the auxiliary data (See at least Para [0050] "The image dataset(s), coupled with definitions of features to be avoided, can be used to create haptic "cues" that indicate to the surgeon that a violation of sensitive anatomy is taking place. A general function of these types of cues is to apply forces and/or torques that tend to repulse the haptic device from poses where an instrument attached to the device would, for example, impact the defined critical features. Similarly, the image dataset(s), coupled with the definitions of features to be targeted can also used to create haptic cues that indicate to the surgeon that the desired target region would be reached by the surgical instrument appropriately attached to the haptic arm. A general function of these types of cues is to attract the haptic device to such poses or lock the haptic device into these poses once they are attained.").
Regarding Claim 17, Quaid teaches a processing system for controlling a robotic arm having one or more degrees of freedom of control, wherein the robotic arm includes an instrument interface (See at least Fig 1, Fig 2, Fig 3A, Fig 3B, Para [0045] “An example of the illustrated robotic arm is a robotic arm manufactured by Barrett Technology, and referred to as the "Whole-Arm Manipulator" or "WAM"…The WAM robotic arm has a four degrees of freedom of movement. However, it is augmented by a 1-DOF direct-drive wrist for trajectory-based medical applications. If desired, degrees of freedom may be added or removed without affecting the scope of the illustrated invention.”, Para [0056] “A portion of surgical tool 112 coupled with a haptic device, for example the tip of surgical tool 112, may be used to sense properties of the local anatomy. The properties of the local anatomy may be used to position surgical tool 112 or to verify the proper positioning of surgical tool 112. The properties that may be sensed or monitored by the tool include electrical properties of the anatomy, force, pressure, stiffness, conductivity, etc…”) and a force/torque sensor to sense forces at the instrument interface, the instrument interface comprising a tool guide configured to interface with a tool during a collaborative procedure with a user, the processing system comprising:
a processor (See at least Para [0035] “FIG. 1 is a diagrammatic illustration of an exemplary operating room in which a haptic device 113 is used with a computer-assisted surgery system 11. Computer-assisted surgery system 11 comprises a display device 30, an input device 34, and a processor based system 36, for example a computer.”, [0036] “Haptic device 113 is, in the illustrated example, a robotic device. Haptic device 113 may be controlled by a processor based system, for example a computer 10. Computer 20 may also include power amplification and input/output hardware. Haptic device 113 may communicate with computer-assisted surgery system 11 by any communication mechanism now known or later developed, whether wired or wireless.”);
and memory having stored therein instructions (See at least Para [0037] “Also shown in FIG. 1 is a storage medium 12 coupled to processor based system 36. Storage medium 12 may accept a digital medium which stores software and/or other data…”) which, when executed by the processor, cause the processor to:
receive temporal force/torque data, wherein the temporal force/torque data represents
forces applied to or at the instrument interface over time by the user while interfacing with the instrument during the collaborative procedure with the user (See at least Para [0028] “A haptic device is a mechanical or electro-mechanical device that interacts and communicates with a user, such as a surgeon, using sensory information such as touch, force, velocity, position, and/or torque…”, Para [0056] “A portion of surgical tool 112 coupled with a haptic device, for example the tip of surgical tool 112, may be used to sense properties of the local anatomy. The properties of the local anatomy may be used to position surgical tool 112 or to verify the proper positioning of surgical tool 112. The properties that may be sensed or monitored by the tool include electrical properties of the anatomy, force, pressure, stiffness, conductivity, etc…”),
wherein the forces comprise at least one of:
(i) forces applied indirectly to the tool guide during user manipulation of the tool;
(ii) forces applied directly to the tool guide by the user (See at least Para [0056] “A portion of surgical tool 112 coupled with a haptic device, for example the tip of surgical tool 112, may be used to sense properties of the local anatomy. The properties of the local anatomy may be used to position surgical tool 112 or to verify the proper positioning of surgical tool 112. The properties that may be sensed or monitored by the tool include electrical properties of the anatomy, force, pressure, stiffness, conductivity, etc…”);
(iii) forces from an environment of the robotic arm; and
(iv) forces generated by the tool…
However, Quaid does not explicitly disclose …
analyze the temporal force/torque data to determine at least one of a current intention
of the user or a predefined state of the collaborative procedure,
determine a control mode which is predefined for the determined at least one of the
current intention of the user or the state of the collaborative procedure, wherein the control mode sets at least one robot control parameter; and
cause the robot controller to control the robotic arm in accordance with the control
mode.
Briquet-Kerestedjian teaches …
analyze the temporal force/torque data to determine at least one of a current intention
of the user or a predefined state of the collaborative procedure (See at least Page 3276 “Fig. 1. ABB YuMi, a dual-arm collaborative manipulator with 7DOF each arm used for the experiments - A neural network with extraction of features for each joint aims to characterize the contact situation: contact
type (intended interaction on the left, undesired collision on the right)”),
determine a control mode which is predefined for the determined at least one of the
current intention of the user or the state of the collaborative procedure, wherein the control mode sets at least one robot control parameter (See at least Page 3279 Col 2 Para 1 “Another approach developed in [9] proposes to detect both interactions and collisions by low-pass and high-pass filtering the joint torques induced by external loads and applying adaptive thresholds tuned using experimental data. In [8], [10], the distinction is conducted in the frequency domain by calculating the Fast Fourier Transform (FFT) of the joint load torques and the derivative of the FFT with respect to time on a sliding window. Fixed frequency thresholds are determined experimentally in order to discriminate high-frequency collisions on the one hand from low-frequency interactions and on the other hand from noise under normal conditions. However, all of these frequency distinction-based methods require considerable parameter adjustment efforts that depend on the experimental data collected and that greatly affect the success of these methods.”); and
cause the robot controller to control the robotic arm in accordance with the control
mode (See at least Page 3283 Col 1 "IV. EXPERIMENTAL VALIDATION - A. Evaluation of the results - For this reason, it reflects more realistic operational conditions to evaluate the results of the classification online. To this end, the trained neural network has been implemented directly inside the controller of an ABB YuMi robot.").
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the system of Quaid with the teachings of Briquet-Kerestedjian and include the features of analyzing the temporal force/torque data to determine at least one of a current intention of the user or a predefined state of the collaborative procedure, determining a control mode which is predefined for the determined at least one of the current intention of the user or the state of the collaborative procedure, wherein the control mode sets the at least one robot control parameter, and causing the robot controller to control the robotic arm in accordance with the control mode, thereby deciding how to control the robotic arm according to the user's intention and increasing data efficiency (See at least Page 3285 "VI. CONCLUSION - In this paper, we have presented a novel approach for classifying human-robot contact situations. The method applies supervised learning techniques by training a neural network, where the structure of the network is inspired by the physics of contact in order to increase data efficiency").
Regarding Claim 18, modified Quaid teaches all the elements of claim 17. Quaid further teaches the system of claim 17, wherein the instrument interface comprises a tool guide which is configured to be interfaced with a tool which can be manipulated by the user during the collaborative procedure, and wherein the forces comprise at least one of: (1) forces exerted indirectly on the tool guide by the user during user manipulation of the tool; (2) forces applied directly to the tool guide by the user (See at least [0056] “A portion of surgical tool 112 coupled with a haptic device, for example the tip of surgical tool 112, may be used to sense properties of the local anatomy. The properties of the local anatomy may be used to position surgical tool 112 or to verify the proper positioning of surgical tool 112. The properties that may be sensed or monitored by the tool include electrical properties of the anatomy, force, pressure, stiffness, conductivity, etc…”); (3) forces from an environment of the robot; and (4) forces generated by the tool.
Regarding Claim 19, modified Quaid teaches all the elements of claim 18. Quaid further teaches the system of claim 18, wherein the instructions further cause the processor to analyze the temporal force/torque data to identify a command provided by the user to the system to instruct the system to switch the control mode to a predefined mode (See at least Para [0122] “FIG. 9 is a flowchart of a representative method 190 for using haptic device 113 as an input device. In step 192, the input mode is initiated. The user may initiate the input mode by any mechanism now known or later developed. For example, the user may use a graphical user interface, a footswitch, a keyboard, a button, and/or the like, to indicate that the user desires to use haptic device 113 as an input device…”).
Regarding Claim 20, modified Quaid teaches all the elements of claim 18. Quaid further teaches the system of claim 18, wherein the at least one robot control parameter controls a rendered stiffness of the tool guide against the forces applied in at least one direction (See at least Para [0051] “…For example, the computer-aided surgery system may send a command to the haptic device requesting it to enter into a joystick-like input mode with certain stiffness parameters…”).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Gulhar et al. (US 2016/0144510 A1) teaches a method for operating a robotic device, such as a medical-robotic device, e.g., a medical-surgical device with a kinematic chain of mobile components, in a medical-surgical and/or medical-diagnostic procedure.
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHAHEDA HOQUE whose telephone number is (571)270-5310. The examiner can normally be reached Monday-Friday 8:00 am- 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramon Mercado can be reached on 571-270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SHAHEDA HOQUE/Examiner, Art Unit 3658
/Ramon A. Mercado/Supervisory Patent Examiner, Art Unit 3658