DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments filed on 01/05/2026 with respect to claims 1-6, 9-16, and 18-20 have been fully considered but are moot in view of the new grounds of rejection provided below, which were necessitated by Applicant’s amendments to the claims. The new grounds of rejection of the independent claims are based on Vonach, in view of Satler, and further in view of Chizeck. Satler teaches determining an estimated pose for the user based on the sensor data, comparing the estimated pose to the position constraint information, and generating robotic arm control commands to apply at least one of force or torque to the user via the end effector in a case where the estimated pose approaches, abuts, and/or intersects with all or part of the position constraints, wherein the robotic arm control commands are generated based on the force sensor data and the torque sensor data (See at least Page 416 Col 2 Para 5, Page 419 Col 1 Para 4, Fig 5, Fig 6). Therefore, the combination of these references renders obvious the limitations of the independent claims.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim(s) 1-5, 9-11, 14-16, 18, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Vonach et al. (E. Vonach, C. Gatterer and H. Kaufmann, "VRRobot: Robot actuated props in an infinite virtual environment," 2017 IEEE Virtual Reality (VR), Los Angeles, CA, USA, 2017, pp. 74-83) (Hereinafter Vonach) in view of Satler et al. (M. Satler, C. A. Avizzano and E. Ruffaldi, "Control of a desktop mobile haptic interface," 2011 IEEE World Haptics Conference, Istanbul, Turkey, 2011, pp. 415-420) (Hereinafter Satler), and further in view of Chizeck et al. (US 10226869 B2) (Hereinafter Chizeck).
Regarding Claim 1, Vonach teaches a haptic feedback system (See at least Page 75 Col 1 “Introduction - …In this work we present a fully immersive VR system, incorporating aspects of encounter-type devices and prop-based haptic feedback…”) comprising:
at least one robotic arm, the robotic arm comprising at least one joint and an end effector … (See at least Fig 1, Fig 3 discloses robotic arm comprising at least one joint and an end effector, wherein the end effector is physically touching the user, Page 78 “4.2 Software 4.2.1 Robot Control …On that basis we implemented the controller with a wide variety of functions: It is possible to move the robot arm in the work envelope via xyz-coordinates and define the orientation of the end effector…”) ;
an external data connection to a computing device associated with an environment (See at least Page 78 Col 1 “4.2 Software - 4.2.1 Robot Control - The individual motor controllers on the actuators have to be coordinated over a main controller, which we implemented in C++. This controller runs on a PC and communicates with the individual actuators over the asynchronous serial Communication Protocol 1.0 of the Dynamixel SDK…”); and
a controller configured to:
determine an estimated pose for the user based on the sensor data (See at least Page 77 “4.1.2 Human-Robot Interaction - In order to control a virtual avatar as well as for safety reasons (see section 4.3), the user is equipped with a Perception Neuron inertial motion suit (Figure 2c). 17 inertial sensors report the pose of every part of the body with 120 Hz. ”, Page 77 Col 2 Para 2 “…Finally, the key component is the employed 7-axis Crusterawler Pro-Series robotic arm (Figure 2e), which enables our VR system to actuate physical props for haptic feedback. As soon as the user is in range of a virtual object he or she might want to interact with, the robot arm can select an appropriate physical prop and present it, dynamically matching pose and location of the object in respect to the user's current position in the virtual world…”); and
control the at least one robotic arm based on the estimated pose to apply haptic feedback to the user via the end effector (See at least Page 77 Col 2 Para 2 “…Finally, the key component is the employed 7-axis Crusterawler Pro-Series robotic arm (Figure 2e), which enables our VR system to actuate physical props for haptic feedback. As soon as the user is in range of a virtual object he or she might want to interact with, the robot arm can select an appropriate physical prop and present it, dynamically matching pose and location of the object in respect to the user's current position in the virtual world…”).
However, Vonach does not explicitly spell out … wherein the end effector is configured to be physically coupled to a user, and wherein the end effector comprises at least one end effector sensor;…
receive sensor data comprising torque sensor data and force sensor data from the at least one robotic arm …
receive position constraint information comprising one or more position constraints from the computing device and compare the estimated pose to the position constraint information; and
generate robotic arm control commands to apply at least one of force or torque to the user via the end effector in a case where the estimated pose approaches, abuts and/or intersects with all or part of the position constraints;
wherein the position constraint information corresponds with at least one position in the virtual environment where user movement is restricted; and
wherein the robotic arm control commands are generated based on the force sensor data and the torque sensor data received from the at least one robotic arm.
Satler teaches … wherein the end effector is configured to be physically coupled to a user, and wherein the end effector comprises at least one end effector sensor (See at least Page 416 Col 2 Para 5 “As mentioned above a two axes force sensor was embedded in the robot handle to detect and measure the interaction forces with the user’s hand…”, Figure 5 shows that the robot is physically coupled to the user);…
receive sensor data comprising torque sensor data and force sensor data from the at least one robotic arm (See at least Page 416 Col 2 Para 5 “As mentioned above a two axes force sensor was embedded in the robot handle to detect and measure the interaction forces with the user’s hand…”) …
receive position constraint information comprising one or more position constraints from the computing device and compare the estimated pose to the position constraint information (See at least Page 419 Col 1 Para 4 “… Figure 6. In particular the user had to follow the desired trajectory (the red one in the picture), while the robot tried to compensate the wrong movements. It has to be noted the excellent repeatability of the user’s trajectory (the blue one in the picture) and the lack of drift in the robot position estimation… The trajectory deviation near the point [-0.07;-0.02] occurred when the user tried to move the his/her arm in the wrong direction. The system, according to the level of compensation set, exerted the force feedback to guide the user toward the reference path. During the experimentation we observed that the robot is able to provide the force feedback up to 30N.”, discloses the system, according to the level of compensation set, exerted the force feedback to guide the user toward the reference path which is construed as the pose of the user being compared with the position constraint information, Page 419 Col 2 “Figure 6: Trajectory tracking example. The user try to move along the reference path (the red line) while to robot compensates the position error with an amount of force feedback.”); and …
generate robotic arm control commands to apply at least one of force or torque to the user via the end effector in a case where the estimated pose approaches, abuts and/or intersects with all or part of the position constraints (See at least Page 419 Col 1 Para 4 “… Figure 6. In particular the user had to follow the desired trajectory (the red one in the picture), while the robot tried to compensate the wrong movements. It has to be noted the excellent repeatability of the user’s trajectory (the blue one in the picture) and the lack of drift in the robot position estimation… The trajectory deviation near the point [-0.07;-0.02] occurred when the user tried to move the his/her arm in the wrong direction. The system, according to the level of compensation set, exerted the force feedback to guide the user toward the reference path. During the experimentation we observed that the robot is able to provide the force feedback up to 30N.”, discloses that the robot tried to compensate the wrong movements, and according to the level of compensation set, exerted the force feedback to guide the user toward the reference path which is construed as generating robotic arm control commands to apply at least one of force or torque to the user via the end effector in a case where the estimated pose approaches, abuts and/or intersects with all or part of the position constraints, Page 419 Col 2 “Figure 6: Trajectory tracking example. The user try to move along the reference path (the red line) while to robot compensates the position error with an amount of force feedback.”); …
wherein the robotic arm control commands are generated based on the force sensor data and the torque sensor data received from the at least one robotic arm (See at least Page 416 Col 2 Para 5 “As mentioned above a two axes force sensor was embedded in the robot handle to detect and measure the interaction forces with the user’s hand…”, Page 418 Col 2 Para 3 “The embedded control unit runs both the low level control (sensors measurement and motors control)”, Page 419 Col 1 Para 4 “… Figure 6. In particular the user had to follow the desired trajectory (the red one in the picture), while the robot tried to compensate the wrong movements. It has to be noted the excellent repeatability of the user’s trajectory (the blue one in the picture) and the lack of drift in the robot position estimation… The trajectory deviation near the point [-0.07;-0.02] occurred when the user tried to move the his/her arm in the wrong direction. The system, according to the level of compensation set, exerted the force feedback to guide the user toward the reference path. During the experimentation we observed that the robot is able to provide the force feedback up to 30N.”, Page 419 Col 2 “Figure 6: Trajectory tracking example. The user try to move along the reference path (the red line) while to robot compensates the position error with an amount of force feedback.”).
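For illustration only, and not attributed to any cited reference, the claimed control flow (comparing an estimated user pose to position constraints and generating force/torque commands that account for the measured force and torque sensor data) could be sketched as follows; the function name, gain, and margin are all hypothetical:

```python
import math

def generate_arm_commands(pose, constraints, measured_force, measured_torque,
                          gain=50.0, margin=0.05):
    """Illustrative sketch: when the estimated pose approaches, abuts, or
    intersects a position constraint (comes within `margin`), emit a
    force/torque command pushing the user away, corrected by the measured
    end-effector force and torque. All numeric values are hypothetical."""
    commands = []
    for c in constraints:  # each constraint: a restricted position (x, y, z)
        offset = [p - q for p, q in zip(pose, c)]
        dist = math.sqrt(sum(o * o for o in offset))
        if dist < margin:  # pose approaches/abuts/intersects the constraint
            direction = [o / (dist + 1e-9) for o in offset]  # away from constraint
            desired = [gain * (margin - dist) * d for d in direction]
            # Close the loop on the measured interaction force and torque
            force_cmd = [df - mf for df, mf in zip(desired, measured_force)]
            torque_cmd = [-mt for mt in measured_torque]
            commands.append({"force": force_cmd, "torque": torque_cmd})
    return commands
```

A pose well clear of every constraint yields no commands; only a pose inside the margin produces a restoring force.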
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Vonach with the teachings of Satler and include the features of receiving one or more position constraints from the computing device, comparing the estimated pose to the position constraint information, and generating robotic arm control commands based on the force sensor data and the torque sensor data received from the at least one robotic arm, thereby providing safety and movement accuracy of the end effector (See at least Page 415 “Abstract - … The sensor fusion algorithm has been optimized to provide users with a quality force feedback while ensuring accurate position tracking…”, Page 420 Col 1 Para 1 “Our system has been oriented toward robotics rehabilitation thus during the design phase we paid particular attention in order to provide a low cost, safe and easy-to-use, robotic-device that assists the patient and therapist in order to achieve more systematic therapy…”, Page 418 Col 2 Para 2 “…. Because we are interested in a quantitative evaluation of task performance, the implemented algorithm has to provide accurate position and force measurements…”).
Chizeck teaches …
wherein the position constraint information corresponds with at least one position in the virtual environment where user movement is restricted (See at least Fig 3A item 320, Col 11 Lines 31-40 “as shown at lower-center of FIG. 3A, a virtual environment is shown with VRobot 340 in position to move toward window 354 of door 350. In this example, forbidden region fixture 322 is covering window 354, as window 354 is a restricted object. That is, if VRobot 340 gets within a forbidden region about window 354 provided by forbidden region fixture 354, then forbidden region fixture 322 can provide force and torque feedback to slow or stop motion of the manipulator controlling VRobot toward the restricted object; e.g., window 354…”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Vonach with the teachings of Chizeck and include the feature of position constraint information to apply haptic feedback to the user via the end effector, thereby providing safety and movement accuracy of the end effector (See at least Col 4 Lines 46-49 “Haptic navigation can improve speed, safety, and accuracy of a remotely operated vehicle (ROV) and/or other robotic tool(s) manipulation by incorporating intuitive and responsive force feedback into operator controls. ”).
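As an illustrative sketch only (hypothetical values and names, not Chizeck's actual implementation), a forbidden-region fixture of the kind quoted above can be modeled as a damping force that opposes only motion directed into the restricted region, slowing or stopping the manipulator:

```python
import math

def forbidden_region_feedback(tool_pos, region_center, region_radius,
                              velocity, damping=80.0):
    """Hypothetical sketch of a forbidden-region fixture: inside the
    region's buffer, oppose the velocity component directed toward the
    restricted object so the operator is slowed or stopped; outside the
    region, or when moving away, apply no feedback."""
    offset = [t - r for t, r in zip(tool_pos, region_center)]
    dist = math.sqrt(sum(o * o for o in offset)) or 1e-9
    if dist >= region_radius:
        return [0.0, 0.0, 0.0]  # outside the fixture: no feedback
    inward = [-o / dist for o in offset]  # unit vector toward region center
    v_toward = sum(v * i for v, i in zip(velocity, inward))
    if v_toward <= 0.0:
        return [0.0, 0.0, 0.0]  # moving away: do not resist
    # Oppose only the velocity component directed into the region
    return [-damping * v_toward * i for i in inward]
```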
Regarding Claim 2, modified Vonach teaches all the elements of claim 1. Vonach further teaches the haptic feedback system of claim 1, wherein the controller is further configured to control a proxy of the user within the virtual environment based on the estimated position (See at least Page 77 “4.1.2 Human-Robot Interaction - In order to control a virtual avatar as well as for safety reasons (see section 4.3), the user is equipped with a Perception Neuron inertial motion suit (Figure 2c). 17 inertial sensors report the pose of every part of the body with 120 Hz. ”, discloses controlling a virtual avatar which is construed as controlling a proxy of the user, Page 78 “4.2.2 Game Engine Integration - For general and convenient use we provided a C# wrapper for most of the controller functions and integrated it into a package for the game development engine Unity3D. There is a template for a virtual scene that includes models of the physical setup. The user's avatar is fully customizable to different body dimensions. Its position and orientation in the virtual world is determined by the analog signal from the Virtualizer, describing the movement of the user in the physical world. The Perception Neuron Unity SDK is integrated to apply the motions captured by the motion suit to the limbs and posture of the avatar. The models in the Unity scene are always in synchronization with their counterparts in OpenRAVE, which are used for path planning of the robotic arm. The scene is configured to render the virtual world on the Oculus Rift HMD and incorporates the Leap Motion Orion SDK to provide finger tracking. The developed Unity package also provides a template that contains all required components to fit any object in the virtual world with a haptic representation. As soon as the user's avatar approaches a specified distance to such an object, the robot arm is assigned exclusively to it. 
If necessary the arm can pick up the appropriate physical prop, and then takes a standby position outside reach, in a pose that is most likely ideal depending on the approach vector of the user. When the avatar closes to the user's reach, the robot arm presents the physical prop attached to the gripper, matching the correct position relative to the user to the corresponding object in the virtual world. If the path the robot arm takes to this position would last longer than a certain timespan (typically 2 s), we display a loading icon at the location of the virtual object to signal that the physical representation is not ready yet. For shorter paths this is not necessary. If a specific path is desired or if a required position is known in advance, it is possible to calculate a path in advance, which speeds up the response time of the arm since the IKs don't have to be calculated at runtime. When the avatar moves out of range, the robot arm moves to an idle position and access is released to all objects again.”).
Regarding Claim 3, modified Vonach teaches all the elements of claim 1. Vonach further teaches … control the robotic arm to limit at least one of a force applied to the user and a torque applied
to the user (See at least Page 78 “4.2 Software - 4.2.1 Robot Control - The individual motor controllers on the actuators have to be coordinated over a main controller, which we implemented in C++. This controller runs on a PC and communicates with the individual actuators over the asynchronous serial Communication Protocol 1.0 of the Dynamixel SDK. In that way all available parameters can be read or set, for example the desired joint angle as well as limits for speed, angle or torque…”, Page 79 Col 1 Para 3 “Torque Limits - If despite all safety measures a collision should occur or the user is somehow caught between joints, it would cause the measured torque in one of the actuators to exceed the designated safety limits.”).
However, Vonach does not explicitly spell out the haptic feedback system of claim 1, wherein the controller is further configured to:
receive force and torque information via the external data connection; and …
Satler teaches the haptic feedback system of claim 1, wherein the controller is further configured to:
receive force and torque information via the external data connection (See at least Page 416 Col
2 Para 5 “As mentioned above a two axes force sensor was embedded in the robot handle to detect and measure the interaction forces with the user’s hand…”, Page 418 Col 2 Para 3 “The embedded control unit runs both the low level control (sensors measurement and motors control)”, Page 419 Col 1 Para 4 “… Figure 6. In particular the user had to follow the desired trajectory (the red one in the picture), while the robot tried to compensate the wrong movements. It has to be noted the excellent repeatability of the user’s trajectory (the blue one in the picture) and the lack of drift in the robot position estimation… The trajectory deviation near the point [-0.07;-0.02] occurred when the user tried to move the his/her arm in the wrong direction. The system, according to the level of compensation set, exerted the force feedback to guide the user toward the reference path. During the experimentation we observed that the robot is able to provide the force feedback up to 30N.”, Page 419 Col 2 “Figure 6: Trajectory tracking example. The user try to move along the reference path (the red line) while to robot compensates the position error with an amount of force feedback.”); and …
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Vonach with the teachings of Satler and include robotic arm control that limits the force or torque applied to the user using force and torque information, thereby providing precise control of the robotic arm for haptic feedback (See at least Page 415 “Abstract - … The sensor fusion algorithm has been optimized to provide users with a quality force feedback while ensuring accurate position tracking…”, Page 418 Col 2 Para 2 “…. Because we are interested in a quantitative evaluation of task performance, the implemented algorithm has to provide accurate position and force measurements…”).
Regarding Claim 4, modified Vonach teaches all the elements of claim 3.
However, Vonach does not explicitly spell out the haptic feedback system of claim 3, wherein the robotic arm further comprises sensors, and where controlling the robotic arm based on the force and torque information comprises measuring at least one of a force and a torque applied to the user at the end effector using the sensors.
Satler teaches the haptic feedback system of claim 3, wherein the robotic arm further
comprises sensors (See at least Page 416 Col 2 Para 5 “As mentioned above a two axes force sensor was embedded in the robot handle to detect and measure the interaction forces with the user’s hand…”), and where controlling the robotic arm based on the force and torque information comprises measuring at least one of a force and a torque applied to the user at the end effector using the sensors (See at least Page 416 Col 2 Para 5 “As mentioned above a two axes force sensor was embedded in the robot handle to detect and measure the interaction forces with the user’s hand…”, Page 418 Col 2 Para 3 “The embedded control unit runs both the low level control (sensors measurement and motors control)”, Page 419 Col 1 Para 4 “… Figure 6. In particular the user had to follow the desired trajectory (the red one in the picture), while the robot tried to compensate the wrong movements. It has to be noted the excellent repeatability of the user’s trajectory (the blue one in the picture) and the lack of drift in the robot position estimation… The trajectory deviation near the point [-0.07;-0.02] occurred when the user tried to move the his/her arm in the wrong direction. The system, according to the level of compensation set, exerted the force feedback to guide the user toward the reference path. During the experimentation we observed that the robot is able to provide the force feedback up to 30N.”, Page 419 Col 2 “Figure 6: Trajectory tracking example. The user try to move along the reference path (the red line) while to robot compensates the position error with an amount of force feedback.”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Vonach with the teachings of Satler and include robotic arm control using force and/or torque sensor information, thereby providing precise control of the robotic arm for haptic feedback (See at least Page 415 “Abstract - … The sensor fusion algorithm has been optimized to provide users with a quality force feedback while ensuring accurate position tracking…”, Page 418 Col 2 Para 2 “…. Because we are interested in a quantitative evaluation of task performance, the implemented algorithm has to provide accurate position and force measurements…”).
Regarding Claim 5, modified Vonach teaches all the elements of claim 4.
However, Vonach does not explicitly spell out the haptic feedback system of claim 4, wherein
the controller is a closed-loop controller and receives data measured by the sensors.
Satler teaches the haptic feedback system of claim 4, wherein the controller is a closed-loop
controller and receives data measured by the sensors (See at least Page 418 Col 2 Para 7 “3.3 Control loops – From the faster operation frequency to the slower one the closed loops are the current controller…”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Vonach with the teachings of Satler and include a closed-loop controller that receives data measured by the sensors, thereby providing precise control of the robotic arm for haptic feedback (See at least Page 415 “Abstract - … The sensor fusion algorithm has been optimized to provide users with a quality force feedback while ensuring accurate position tracking…”, Page 418 Col 2 Para 2 “…. Because we are interested in a quantitative evaluation of task performance, the implemented algorithm has to provide accurate position and force measurements…”).
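Purely for illustration of the closed-loop concept discussed above (the 30 N ceiling echoes Satler's reported maximum force feedback; the proportional gain and function name are hypothetical, not from any cited reference), a force limiter using measured sensor data might look like:

```python
import math

def limited_force_command(desired, measured, max_force=30.0, kp=0.5):
    """Sketch of a closed-loop limiter: apply a proportional correction
    toward the desired end-effector force based on the measured force,
    then clamp the command magnitude so the force applied to the user
    never exceeds max_force. All values are illustrative."""
    corrected = [d + kp * (d - m) for d, m in zip(desired, measured)]
    mag = math.sqrt(sum(c * c for c in corrected))
    if mag > max_force:
        scale = max_force / mag  # rescale, preserving direction
        corrected = [c * scale for c in corrected]
    return corrected
```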
Regarding Claim 9, modified Vonach teaches all the elements of claim 1. Vonach further teaches the haptic feedback system of claim 1, wherein the end effector is physically connected to an end of a limb of the user (See at least Fig 1, Fig 2 disclose the end effector is physically connected to an end of a limb of the user).
Regarding Claim 10, modified Vonach teaches all the elements of claim 1. Vonach further teaches the haptic feedback system of claim 1, wherein the end effector is physically connected to a garment worn by the user (See at least Fig 1, Fig 2 disclose the end effector is physically connected to a garment worn by the user which is a glove).
Regarding Claim 11, modified Vonach teaches all the elements of claim 10. Vonach further teaches the haptic feedback system of claim 10, wherein the garment comprises at least one selected from a group consisting of a glove, a shoe, a sleeve, a belt, a vest, a helmet, and a neck wrap (See at least Fig 1, Fig 2 disclose the end effector is physically connected to a garment worn by the user which is a glove).
Regarding Claim 14, modified Vonach teaches all the elements of claim 1. Vonach further teaches the haptic feedback system of claim 1, wherein the at least one robotic arm is further physically coupled to a fixed structure surrounding the user (See at least Fig 1, Fig 2 disclose a physical prop coupled with the robotic arm, which is construed as the robotic arm being physically coupled to a fixed structure surrounding the user).
Regarding Claim 15, Vonach teaches a method of using a haptic feedback system comprising at least one robotic arm, the robotic arm comprising at least one joint and an end effector, wherein the end effector is configured to be physically coupled to a user, and wherein the end effector comprises at least one end effector sensor, an external data connection to a computing device associated with a virtual environment, and a controller, the method comprising:
determining an estimated pose for the user based on the sensor data received by the controller (See at least Page 77 “4.1.2 Human-Robot Interaction - In order to control a virtual avatar as well as for safety reasons (see section 4.3), the user is equipped with a Perception Neuron inertial motion suit (Figure 2c). 17 inertial sensors report the pose of every part of the body with 120 Hz. ”); and
controlling the at least one robotic arm by the controller based on the estimated pose to apply haptic feedback to the user (See at least Page 77 “4.1.2 Human-Robot Interaction - In order to control a virtual avatar as well as for safety reasons (see section 4.3), the user is equipped with a Perception Neuron inertial motion suit (Figure 2c). 17 inertial sensors report the pose of every part of the body with 120 Hz. ”, Page 77 Col 2 Para 2 “…Finally, the key component is the employed 7-axis Crusterawler Pro-Series robotic arm (Figure 2e), which enables our VR system to actuate physical props for haptic feedback. As soon as the user is in range of a virtual object he or she might want to interact with, the robot arm can select an appropriate physical prop and present it, dynamically matching pose and location of the object in respect to the user's current position in the virtual world…”).
However, Vonach does not explicitly spell out …
receiving sensor data comprising torque sensor data and force sensor data from the at least one robotic arm by the controller…
receiving position constraint information comprising one or multiple position constraints from the computing device by the controller and comparing the estimated pose to the position constraint information; and …
generating robotic arm control commands to apply at least one of force or torque to the user via the end effector in a case where the estimated pose approaches, abuts and/or intersects with all or part of the position constraints;
wherein the position constraint information corresponds with at least one position in the virtual environment where user movement is restricted;
wherein the robotic arm control commands are generated based on the force sensor data and the torque sensor data received from the at least one robotic arm.
Satler teaches …
receiving sensor data comprising torque sensor data and force sensor data from the at least one robotic arm by the controller (See at least Page 416 Col 2 Para 5 “As mentioned above a two axes force sensor was embedded in the robot handle to detect and measure the interaction forces with the user’s hand…”) …
receiving position constraint information comprising one or multiple position constraints from the computing device by the controller and comparing the estimated pose to the position constraint information (See at least Page 419 Col 1 Para 4 “… Figure 6. In particular the user had to follow the desired trajectory (the red one in the picture), while the robot tried to compensate the wrong movements. It has to be noted the excellent repeatability of the user’s trajectory (the blue one in the picture) and the lack of drift in the robot position estimation… The trajectory deviation near the point [-0.07;-0.02] occurred when the user tried to move the his/her arm in the wrong direction. The system, according to the level of compensation set, exerted the force feedback to guide the user toward the reference path. During the experimentation we observed that the robot is able to provide the force feedback up to 30N.”, discloses the system, according to the level of compensation set, exerted the force feedback to guide the user toward the reference path which is construed as the pose of the user being compared with the position constraint information, Page 419 Col 2 “Figure 6: Trajectory tracking example. The user try to move along the reference path (the red line) while to robot compensates the position error with an amount of force feedback.”); and …
generating robotic arm control commands to apply at least one of force or torque to the user via the end effector in a case where the estimated pose approaches, abuts and/or intersects with all or part of the position constraints (See at least Page 419 Col 1 Para 4 “… Figure 6. In particular the user had to follow the desired trajectory (the red one in the picture), while the robot tried to compensate the wrong movements. It has to be noted the excellent repeatability of the user’s trajectory (the blue one in the picture) and the lack of drift in the robot position estimation… The trajectory deviation near the point [-0.07;-0.02] occurred when the user tried to move the his/her arm in the wrong direction. The system, according to the level of compensation set, exerted the force feedback to guide the user toward the reference path. During the experimentation we observed that the robot is able to provide the force feedback up to 30N.”, discloses that the robot tried to compensate the wrong movements, and according to the level of compensation set, exerted the force feedback to guide the user toward the reference path which is construed as generating robotic arm control commands to apply at least one of force or torque to the user via the end effector in a case where the estimated pose approaches, abuts and/or intersects with all or part of the position constraints, Page 419 Col 2 “Figure 6: Trajectory tracking example. The user try to move along the reference path (the red line) while to robot compensates the position error with an amount of force feedback.”); …
wherein the robotic arm control commands are generated based on the force sensor data and the torque sensor data received from the at least one robotic arm (See at least Page 416 Col 2 Para 5 “As mentioned above a two axes force sensor was embedded in the robot handle to detect and measure the interaction forces with the user’s hand…”, Page 418 Col 2 Para 3 “The embedded control unit runs both the low level control (sensors measurement and motors control)”, Page 419 Col 1 Para 4 “… Figure 6. In particular the user had to follow the desired trajectory (the red one in the picture), while the robot tried to compensate the wrong movements. It has to be noted the excellent repeatability of the user’s trajectory (the blue one in the picture) and the lack of drift in the robot position estimation… The trajectory deviation near the point [-0.07;-0.02] occurred when the user tried to move the his/her arm in the wrong direction. The system, according to the level of compensation set, exerted the force feedback to guide the user toward the reference path. During the experimentation we observed that the robot is able to provide the force feedback up to 30N.”, Page 419 Col 2 “Figure 6: Trajectory tracking example. The user try to move along the reference path (the red line) while to robot compensates the position error with an amount of force feedback.”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Vonach with the teachings of Satler to include receiving position constraint information comprising one or more position constraints from the computing device, comparing the estimated pose to the position constraint information, and generating the robotic arm control commands based on the force sensor data and the torque sensor data received from the at least one robotic arm, thereby improving the safety and movement accuracy of the end effector (See at least Page 415 “Abstract - … The sensor fusion algorithm has been optimized to provide users with a quality force feedback while ensuring accurate position tracking…”, Page 420 Col 1 Para 1 “Our system has been oriented toward robotics rehabilitation thus during the design phase we paid particular attention in order to provide a low cost, safe and easy-to-use, robotic-device that assists the patient and therapist in order to achieve more systematic therapy…”, Page 418 Col 2 Para 2 “…. Because we are interested in a quantitative evaluation of task performance, the implemented algorithm has to provide accurate position and force measurements…”).
Chizeck teaches …
wherein the position constraint information corresponds with at least one position in the virtual environment where user movement is restricted (See at least Fig 3A item 320, Col 11 Lines 31-40 “as shown at lower-center of FIG. 3A, a virtual environment is shown with VRobot 340 in position to move toward window 354 of door 350. In this example, forbidden region fixture 322 is covering window 354, as window 354 is a restricted object. That is, if VRobot 340 gets within a forbidden region about window 354 provided by forbidden region fixture 354, then forbidden region fixture 322 can provide force and torque feedback to slow or stop motion of the manipulator controlling VRobot toward the restricted object; e.g., window 354…”);
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Vonach with the teachings of Chizeck to include position constraint information in the virtual environment for applying haptic feedback to the user via the end effector, thereby improving the safety and movement accuracy of the end effector (See at least Col 4 Lines 46-49 “Haptic navigation can improve speed, safety, and accuracy of a remotely operated vehicle (ROV) and/or other robotic tool(s) manipulation by incorporating intuitive and responsive force feedback into operator controls. ”).
Regarding Claim 16, modified Vonach teaches all the elements of claim 15. Vonach further teaches the method of claim 15, wherein the controller is further configured to control a proxy of the user within the virtual environment based on the estimated position (See at least Page 77 “4.1.2 Human-Robot Interaction - In order to control a virtual avatar as well as for safety reasons (see section 4.3), the user is equipped with a Perception Neuron inertial motion suit (Figure 2c). 17 inertial sensors report the pose of every part of the body with 120 Hz. ”, discloses controlling a virtual avatar which is construed as controlling a proxy of the user, Page 78 “4.2.2 Game Engine Integration - For general and convenient use we provided a C# wrapper for most of the controller functions and integrated it into a package for the game development engine Unity3D. There is a template for a virtual scene that includes models of the physical setup. The user's avatar is fully customizable to different body dimensions. Its position and orientation in the virtual world is determined by the analog signal from the Virtualizer, describing the movement of the user in the physical world. The Perception Neuron Unity SDK is integrated to apply the motions captured by the motion suit to the limbs and posture of the avatar. The models in the Unity scene are always in synchronization with their counterparts in OpenRAVE, which are used for path planning of the robotic arm. The scene is configured to render the virtual world on the Oculus Rift HMD and incorporates the Leap Motion Orion SDK to provide finger tracking. The developed Unity package also provides a template that contains all required components to fit any object in the virtual world with a haptic representation. As soon as the user's avatar approaches a specified distance to such an object, the robot arm is assigned exclusively to it. 
If necessary the arm can pick up the appropriate physical prop, and then takes a standby position outside reach, in a pose that is most likely ideal depending on the approach vector of the user. When the avatar closes to the user's reach, the robot arm presents the physical prop attached to the gripper, matching the correct position relative to the user to the corresponding object in the virtual world. If the path the robot arm takes to this position would last longer than a certain timespan (typically 2 s), we display a loading icon at the location of the virtual object to signal that the physical representation is not ready yet. For shorter paths this is not necessary. If a specific path is desired or if a required position is known in advance, it is possible to calculate a path in advance, which speeds up the response time of the arm since the IKs don't have to be calculated at runtime. When the avatar moves out of range, the robot arm moves to an idle position and access is released to all objects again.”).
Regarding Claim 18, modified Vonach teaches all the elements of claim 15.
However, Vonach does not explicitly disclose the method of claim 15, wherein the robotic arm comprises sensors, and where controlling the robotic arm comprises measuring at least one of a force and a torque applied to the user using the sensors.
Satler teaches the method of claim 15, wherein the robotic arm comprises sensors (See at least
Page 416 Col 2 Para 5 “As mentioned above a two axes force sensor was embedded in the robot handle to detect and measure the interaction forces with the user’s hand…”), and where controlling the robotic arm comprises measuring at least one of a force and a torque applied to the user using the sensors (See at least Page 416 Col 2 Para 5 “As mentioned above a two axes force sensor was embedded in the robot handle to detect and measure the interaction forces with the user’s hand…”, Page 418 Col 2 Para 3 “The embedded control unit runs both the low level control (sensors measurement and motors control)”, Page 419 Col 1 Para 4 “… Figure 6. In particular the user had to follow the desired trajectory (the red one in the picture), while the robot tried to compensate the wrong movements. It has to be noted the excellent repeatability of the user’s trajectory (the blue one in the picture) and the lack of drift in the robot position estimation… The trajectory deviation near the point [-0.07;-0.02] occurred when the user tried to move the his/her arm in the wrong direction. The system, according to the level of compensation set, exerted the force feedback to guide the user toward the reference path. During the experimentation we observed that the robot is able to provide the force feedback up to 30N.”, Page 419 Col 2 “Figure 6: Trajectory tracking example. The user try to move along the reference path (the red line) while to robot compensates the position error with an amount of force feedback.”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Vonach with the teachings of Satler to include controlling the robotic arm using force and/or torque sensor information, thereby providing precise control of the robotic arm for haptic feedback (See at least Page 415 “Abstract - … The sensor fusion algorithm has been optimized to provide users with a quality force feedback while ensuring accurate position tracking…”, Page 418 Col 2 Para 2 “…. Because we are interested in a quantitative evaluation of task performance, the implemented algorithm has to provide accurate position and force measurements…”).
Regarding Claim 19, modified Vonach teaches all the elements of claim 18.
However, Vonach does not explicitly disclose the method of claim 18, wherein the controller is a closed-loop controller and receives data measured by the sensors.
Satler teaches the method of claim 18, wherein the controller is a closed-loop controller and
receives data measured by the sensors (See at least Page 418 Col 2 Para 7 “3.3 Control loops – From the faster operation frequency to the slower one the closed loops are the current controller…”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Vonach with the teachings of Satler to include a closed-loop controller that receives data measured by the sensors, thereby providing precise control of the robotic arm for haptic feedback (See at least Page 415 “Abstract - … The sensor fusion algorithm has been optimized to provide users with a quality force feedback while ensuring accurate position tracking…”, Page 418 Col 2 Para 2 “…. Because we are interested in a quantitative evaluation of task performance, the implemented algorithm has to provide accurate position and force measurements…”).
Claim 6 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Vonach et al. (E. Vonach, C. Gatterer and H. Kaufmann, "VRRobot: Robot actuated props in an infinite virtual environment," 2017 IEEE Virtual Reality (VR), Los Angeles, CA, USA, 2017, pp. 74-83) (Hereinafter Vonach) in view of Satler et al. (M. Satler, C. A. Avizzano and E. Ruffaldi, "Control of a desktop mobile haptic interface," 2011 IEEE World Haptics Conference, Istanbul, Turkey, 2011, pp. 415-420) (Hereinafter Satler), Chizeck et al. (US 10226869 B2) (Hereinafter Chizeck), and further in view of Gao et al. (CN113478507A) (Hereinafter Gao).
Regarding Claim 6, modified Vonach teaches all the elements of claim 4.
However, Vonach does not explicitly disclose the haptic feedback system of claim 4, wherein the sensors are arranged in a series orientation.
Gao teaches the haptic feedback system of claim 4, wherein the sensors are arranged in a
series orientation (See at least Page 2 Para 8, “(1) Construct a robot operation experiment platform; it is mainly composed of a robot experiment platform support, a series six-dimensional force sensor, a depth camera (or binocular camera), a robot and a supporting device for the robot operation experiment platform. The series six-dimensional force sensor mainly includes a traction force sensor and a contact force sensor, which are mounted to the end of the robot through a flange, the traction force sensor is on the top, and the contact force sensor is on the bottom.”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Vonach with the teachings of Gao and arrange the sensors in a series orientation, thereby reducing downtime and increasing reliability.
Regarding Claim 20, modified Vonach teaches all the elements of claim 18.
However, Vonach does not explicitly disclose the method of claim 18, wherein the sensors are arranged in a series orientation.
Gao teaches the method of claim 18, wherein the sensors are arranged in a series orientation
(See at least Page 2 Para 8, “(1) Construct a robot operation experiment platform; it is mainly composed of a robot experiment platform support, a series six-dimensional force sensor, a depth camera (or binocular camera), a robot and a supporting device for the robot operation experiment platform. The series six-dimensional force sensor mainly includes a traction force sensor and a contact force sensor, which are mounted to the end of the robot through a flange, the traction force sensor is on the top, and the contact force sensor is on the bottom.”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Vonach with the teachings of Gao and arrange the sensors in a series orientation, thereby reducing downtime and increasing reliability.
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Vonach et al. (E. Vonach, C. Gatterer and H. Kaufmann, "VRRobot: Robot actuated props in an infinite virtual environment," 2017 IEEE Virtual Reality (VR), Los Angeles, CA, USA, 2017, pp. 74-83) (Hereinafter Vonach) in view of Satler et al. (M. Satler, C. A. Avizzano and E. Ruffaldi, "Control of a desktop mobile haptic interface," 2011 IEEE World Haptics Conference, Istanbul, Turkey, 2011, pp. 415-420) (Hereinafter Satler), Chizeck et al. (US 10226869 B2) (Hereinafter Chizeck), and further in view of Kobayashi et al. (US 2018/0059777 A1) (Hereinafter Kobayashi).
Regarding Claim 12, modified Vonach teaches all the elements of claim 1.
However, Vonach does not explicitly disclose the haptic feedback system of claim 1, further comprising multiple robotic arms that are physically coupled to the user at multiple points.
Kobayashi teaches the haptic feedback system of claim 1, further comprising multiple robotic
arms that are physically coupled to the user at multiple points (See at least Para [0017] “In a system and method, in accordance with implementations described herein, movement of multiple controllers, and in particular, rotation of multiple controllers providing user inputs in a virtual reality environment, may be resolved to a single coordinate system to determine the intended for implementation with respect to a particular virtual object.”, Para [0024] “When interacting in a virtual environment using multiple 6DOF controllers, such as, for example, the first controller A and the second controller B as described above, a way to translate, uniformly scale, and rotate virtual objects using two 6DOF controllers may allow the user to interact with virtual objects in a natural manner, with the first controller A in one hand, and the second controller B in the other hand. In a system and method, in accordance with implementations described herein, a single, consistent, common coordinate system may be defined for multiple 6DOF controllers operating in the same virtual environment (and the same real world environment), with each of the controllers moving independently, within their own respective coordinate systems.”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Vonach with the teachings of Kobayashi and include multiple robotic arms that are physically coupled to the user at multiple points, thereby providing haptic feedback at multiple points.
Claim(s) 13 is rejected under 35 U.S.C. 103 as being unpatentable over Vonach et al. (E. Vonach, C. Gatterer and H. Kaufmann, "VRRobot: Robot actuated props in an infinite virtual environment," 2017 IEEE Virtual Reality (VR), Los Angeles, CA, USA, 2017, pp. 74-83) (Hereinafter Vonach) in view of Satler et al. (M. Satler, C. A. Avizzano and E. Ruffaldi, "Control of a desktop mobile haptic interface," 2011 IEEE World Haptics Conference, Istanbul, Turkey, 2011, pp. 415-420) (Hereinafter Satler), Chizeck et al. (US 10226869 B2) (Hereinafter Chizeck), and further in view of Bimbo et al. (J. Bimbo, C. Pacchierotti, M. Aggravi, N. Tsagarakis and D. Prattichizzo, "Teleoperation in cluttered environments using wearable haptic feedback," 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 2017, pp. 3401-3408) (Hereinafter Bimbo).
Regarding Claim 13, modified Vonach teaches all the elements of claim 1.
Although Vonach teaches the haptic feedback system of claim 1, wherein the end effector is a 7-axis Crusterawler Pro-Series robotic arm (See at least Page 77 Col 2 Para 2 “Finally, the key component is the employed 7-axis Crusterawler Pro-Series robotic arm (Figure 2e)”, Fig 3), Vonach does not explicitly disclose that the end effector is capable of movement with 6 degrees of freedom.
Bimbo teaches the haptic feedback system of claim 1, wherein the end effector is capable of movement with 6 degrees of freedom (See at least Page 3401 Col1 “I. INTRODUCTION … It is composed of a mobile robot equipped with a 6-degrees-of-freedom (6-DoF) manipulator.”, Page 3402 “Fig 1 Haptic-enabled teleoperation system. The slave system is composed of an anthropomorphic IIT/PISA softhand, attached to an ATI gamma force-torque sensor, which is in turn fixed to a 6-DOF universal robot arm.”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Vonach with the teachings of Bimbo and include an end effector capable of movement with 6 degrees of freedom, thereby allowing a greater range of motion and more complex movements (See at least Page 3401 Col 2 Para 2 “Finally, ungrounded haptic devices have also been proven to guarantee the stability and safety of haptic-enabled teleoperation loops”, Page 3407 Para 3 “Moreover, using a torque controlled robot would also improve the quality of feedback signals.”, Page 3401 Col 2 Para 2 “Nonetheless, the use of wearable haptic interfaces in telemanipulation can significantly improve the comfort, workspace, and range of application of these systems”).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Quaid et al. (US 2006/0142657 A1) teaches a haptic guidance system and method for a surgical device.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHAHEDA HOQUE whose telephone number is (571)270-5310. The examiner can normally be reached Monday-Friday 8:00 am- 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramon Mercado can be reached on 571-270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SHAHEDA HOQUE/Examiner, Art Unit 3658
/Ramon A. Mercado/Supervisory Patent Examiner, Art Unit 3658