Haptic controller
Revision as of 18:28, 18 March 2013
This page describes the revised haptic MPC control system architecture, what each module does and where the associated parameters reside, and how to run the system.
- 1 Getting started
- 2 Running the MPC
- 3 Control System Architecture
- 4 Tactile Skin Client
- 5 Haptic State Publisher
- 6 Waypoint Generator
- 7 Haptic MPC
- 8 Teleop interface
- 9 Adding a new robot to the controller
- 10 PR2 notes
Getting started
To run the haptic MPC you need the following:
- Python (2.7)
- OpenOpt Python package
- ROS Fuerte. Should also work fine on Electric (has been tested intermittently, but not extensively regression tested).
- ROS PR2 desktop package
- ROS PR2 simulation package (for testing everything before you try the robot)
NB: The required source checkouts can be installed using the rosinstall file in the haptic mpc directory. See 'hrl_haptic_mpc.rosinstall' for details of the git repositories.
Testing your controller works
With all the above dependencies checked out/installed, do the following to ensure everything works.
- Build the MPC and dependencies. It's mostly Python, but the messages may not be built:
- rosmake hrl_haptic_mpc
- Run the PR2 simulator
- roslaunch pr2_gazebo pr2_empty_world.launch
- Run a script to change the PID gains on the PR2 arm joint controllers. The standard ones are stiffer than ideal for the MPC.
- roscd hrl_haptic_mpc
- Run the controller, specifying which arm you want to use.
- roslaunch hrl_haptic_mpc start_pr2_mpc.launch arm:=l
- Run Rviz
- rosrun rviz rviz
- In Rviz, add an interactive marker and point it at the mpc teleop topic.
- Move the marker around to some meaningful point in space.
- Right click the marker, click "Go". The controller should move the end effector to the specified position.
- Right click the marker again, click "Orient". The controller should orient the gripper to match the marker.
If these steps worked, the controller is set up correctly.
Running the MPC
Using a launch file
The simplest way to run the controller is to roslaunch one of the provided launch files, which handle most of the parameter initialisation and simplify things.
- Start the skin:
- roslaunch hrl_fabric_based_tactile_sensor pr2_tactile_sleeve_combined.launch arm_to_use:=<l or r>
- Start the controller for the PR2:
- roslaunch hrl_haptic_mpc start_pr2_mpc.launch arm:=<l or r>
- Start the skin and robot (Cody):
- Meka: roslaunch cody cody.launch use_forearm_skin_sensor:=1 use_netft:=1 arm_to_use:=[r or b]
- Fabric: roslaunch hrl_fabric_based_tactile_sensor cody_tactile_sleeve.launch
- Start the controller for Cody. NB: There are two launch files, depending on the skin sensor; the Meka skin is ONLY on the right arm, and this is enforced by the launch file.
- roslaunch hrl_haptic_mpc start_cody_mpc_meka.launch
- roslaunch hrl_haptic_mpc start_cody_mpc_fabric.launch
- Start the simulation (includes the skin simulation):
- roslaunch hrl_software_simulation_darpa_m3 gen_simulator random:=1
- Start the controller for the sim:
- roslaunch hrl_haptic_mpc start_sim_mpc.launch
Running scripts directly
Sometimes it may be preferable to run the controller scripts directly, as the launch files swallow much of the debug output (only WARN level or above will print by default). Each of the controller nodes must be started manually. The controller will not act until it receives valid messages, so the start order is not critical. The scripts take the following arguments:
- ROBOT: Which robot parameters to use. Options: pr2, cody, sim3
- ARM: Which arm to use on the multi-armed robots (pr2, cody). Options: l, r
- SENSOR: Which sensor type to listen to. Options: fabric, hil
Sequence to execute:
- roscd hrl_haptic_mpc/src/hrl_haptic_mpc
- ./robot_haptic_state_node.py -r <ROBOT> -s <SENSOR> -a <ARM>
- ./mpc_teleop_server.py -r <ROBOT>
- ./arm_trajectory_generator.py -r <ROBOT> -a <ARM>
- ./haptic_mpc.py -r <ROBOT> -a <ARM>
The controller requires two parameter files to be loaded onto the ROS parameter server - one for the robot specific interfaces (eg, what skin topics to listen to and joint limits), and a second for controller parameters (weights associated with position/orientation, etc). For different robots you may want to copy one of the base ones and modify these to tune the controller's performance.
- pr2_config_params.yaml - robot parameters
- mpc_params_pr2.yaml - controller parameters
To load these onto the parameter server, do the following. Make sure your ROS master is set correctly!
- rosparam load <filename>
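For orientation, the fragment below shows the rough shape such files might take. Every key and value here is hypothetical and invented for illustration; consult pr2_config_params.yaml and mpc_params_pr2.yaml for the real parameters. The only key grounded in this page is /haptic_mpc/joint_limits/[min, max], which the haptic state node expects on the parameter server.

```yaml
# HYPOTHETICAL sketch only -- keys and values are invented to show the shape
# of the two files; see pr2_config_params.yaml (robot) and mpc_params_pr2.yaml
# (controller) for the real parameters.
haptic_mpc:
  # Robot-specific interface parameters:
  skin_topics:                   # TaxelArray topics the skin client monitors
    - /skin/forearm_taxels       # hypothetical topic name
  joint_limits:
    min: [-2.1, -0.4, -3.9]      # illustrative values, not a real robot's
    max: [ 2.1,  1.4,  0.8]
  # Controller parameters (weights on the MPC cost terms):
  position_weight: 5.0           # hypothetical weight on position error
  orientation_weight: 4.0        # hypothetical weight on orientation error
```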
Control System Architecture
Tactile Skin Client
- Filename: skin_client.py, plus robot specific versions (pr2_skin_client, sim_skin_client)
- Arguments: None (don't run this directly; it's instantiated by the haptic state node).
The skin client class monitors a specified list of topics for TaxelArray messages. A TaxelArray message contains arbitrary-length lists of taxel data: the taxel centres (x, y, z), the force vectors (x, y, z), and the scalar force magnitudes.
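As a rough illustration, a TaxelArray-style message could be laid out as below. This is a sketch only: the field names are guesses based on the description above, not the actual TaxelArray message definition.

```
Header header                # timestamp and frame of the sensor array
float64[] centers_x          # taxel centre positions
float64[] centers_y
float64[] centers_z
float64[] forces_x           # force vectors at each taxel
float64[] forces_y
float64[] forces_z
float64[] force_magnitudes   # scalar force magnitude per taxel
```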
The skin client subscribes to all provided taxel array topics and buffers the data. It also provides an interface to add or remove subscriptions to taxel array topics on the fly. This allows the skin client to selectively ignore data based on some external variable. For example, if a joint is in a configuration which will fire a taxel due to skin bunching, it would be sensible to ignore this taxel data. This configuration dependent taxel logic will (soon) be added by Survy.
The functionality in skin_client.py is agnostic of the robot it runs on. However, due to constraints of the original controller, one function call needs to know about robot specific joints. Currently this is implemented as a class which inherits from skin_client.py and overloads the function (which is defined in skin_client.py but throws an exception if not overloaded). This functionality will eventually move into robot_haptic_state to separate skin_client more cleanly.
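The buffering and on-the-fly subscription management described above can be sketched in plain Python. ROS subscriptions are replaced with a simple callback, and the class and method names are illustrative, not the real skin_client.py API:

```python
from collections import namedtuple

# A TaxelArray-like record: taxel centres, force vectors, force magnitudes.
TaxelArray = namedtuple("TaxelArray", ["centers", "forces", "magnitudes"])

class SkinClientSketch:
    """Illustrative stand-in for the skin client's topic buffering."""

    def __init__(self, topics):
        # Latest message buffered per topic.
        self.buffers = {topic: None for topic in topics}

    def add_topic(self, topic):
        # On the real robot this would create a rospy.Subscriber.
        self.buffers.setdefault(topic, None)

    def remove_topic(self, topic):
        # Selectively ignore a skin patch, e.g. when skin bunching in a
        # particular joint configuration makes its taxels fire spuriously.
        self.buffers.pop(topic, None)

    def callback(self, topic, msg):
        # Buffer only the most recent TaxelArray seen on each known topic.
        if topic in self.buffers:
            self.buffers[topic] = msg

    def latest(self):
        # Return all buffered, non-empty taxel arrays.
        return {t: m for t, m in self.buffers.items() if m is not None}
```

Removing a topic drops both the subscription and its buffered data, which is what lets the client ignore configuration-dependent false positives.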
Haptic State Publisher
- Filename: robot_haptic_state_node.py
- Arguments: -r <robot eg pr2, sim3, cody> -a <arm eg l, r> -s <sensor eg fabric, meka>
The haptic state publisher exists to centralise all the relevant haptic state data used by the controller, and publish it in one single message which the controller can use. This abstracts the robot specific parameters from the controller, simplifying changing robots, sensors, etc.
The robot haptic state node reads the current arm state and current skin state, then performs a number of calculations to derive the various Jacobian matrices. These are packaged in a simple haptic state message which is then published.
- Subscribed topics: robot specific; accessed through the robot client class and the skin client class.
- Published topics:
- Haptic state: /haptic_mpc/robot_state
Waypoint Generator
- Filename: waypoint_generator.py
- Arguments: -r <robot eg pr2, sim3, cody> -a <arm eg l, r>
The waypoint generator exists as an interface between the teleoperation/motion planning modules and the controller. As inputs, it listens to both end goal poses and trajectories. If the provided trajectory is in joint space, the node will perform forward kinematics on the trajectory to convert it to cartesian space. The waypoint generator will attempt to interpolate any given goal and provide the controller with a local goal 50mm from the current end effector pose. These parameters are configurable, and the waypoint generator can be bypassed altogether if required (the controller will internally interpolate single goals). On hearing an updated goal of any sort, the internal buffers are flushed and the latest goal is used.
- Subscribed topics:
- Single goal location: /haptic_mpc/goal_pose
- Cartesian trajectory:
- Joint trajectory:
- Published topics:
- Controller waypoint: /haptic_mpc/traj_pose
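The interpolation step described above can be sketched as follows: given the current end effector position and a goal, return a local goal at most 50 mm away along the straight line between them. The function name and the position-only treatment are illustrative simplifications of waypoint_generator.py:

```python
import math

def local_goal(current, goal, step=0.05):
    """Return a goal at most `step` metres (50 mm default) from `current`."""
    # Vector from the current end effector position to the goal.
    dx = [g - c for g, c in zip(goal, current)]
    dist = math.sqrt(sum(d * d for d in dx))
    if dist <= step:
        # Close enough: pass the goal straight through to the controller.
        return list(goal)
    # Otherwise step `step` metres along the line toward the goal.
    scale = step / dist
    return [c + scale * d for c, d in zip(current, dx)]
```

The real node also handles orientation, trajectory buffering, and forward kinematics on joint-space trajectories; this covers only the position interpolation.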
Haptic MPC
- Filename: haptic_mpc.py
- Arguments: -r <robot eg pr2, sim3> -a <arm eg l,r> [NB: soon to be deprecated; the robot parameter was only required to get joint limits from a kinematics class]
The core haptic MPC node. This takes the packaged robot haptic state data and computes deltas for the desired joint angles. These are then sent back to the robot haptic state node to be sent to the robot as appropriate. The python class mostly listens for messages, unpacks the data and forms it appropriately for the maths functions found in epc_skin_math.py. This script formulates the problem for the QP solver.
- Subscribed topics:
- Waypoint pose: /haptic_mpc/traj_pose
- Robot haptic state: /haptic_mpc/robot_state
- Published topics:
- Delta Q_des: /haptic_mpc/delta_q_des
- Q_des: /haptic_mpc/q_des (used only to reset the equilibrium angles when the controller starts up)
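As rough intuition for what the node outputs, the sketch below computes small, clamped joint-angle deltas that move a toy 2-link planar arm toward a waypoint using a Jacobian-transpose step. The real controller solves a QP over the full haptic state (contact forces, skin Jacobians) in epc_skin_math.py; this stand-in only illustrates the "small delta toward the goal" behaviour, not the actual optimisation:

```python
import math

# Toy 2-link planar arm used only for illustration (link lengths arbitrary).
L1, L2 = 0.4, 0.4

def fk(q):
    """Forward kinematics: end effector (x, y) for joint angles q = [q1, q2]."""
    x = L1 * math.cos(q[0]) + L2 * math.cos(q[0] + q[1])
    y = L1 * math.sin(q[0]) + L2 * math.sin(q[0] + q[1])
    return x, y

def delta_q(q, goal, gain=0.5, max_step=0.02):
    """Small joint-angle deltas moving the end effector toward `goal`."""
    # Task-space error between the waypoint and the current end effector.
    x, y = fk(q)
    ex, ey = goal[0] - x, goal[1] - y
    # Jacobian of the planar arm (rows: x, y; columns: q1, q2).
    s1, c1 = math.sin(q[0]), math.cos(q[0])
    s12, c12 = math.sin(q[0] + q[1]), math.cos(q[0] + q[1])
    j = [[-L1 * s1 - L2 * s12, -L2 * s12],
         [ L1 * c1 + L2 * c12,  L2 * c12]]
    # Jacobian-transpose step, clamped to small per-step joint deltas,
    # mirroring the MPC's small per-timestep changes to the desired angles.
    dq = [gain * (j[0][i] * ex + j[1][i] * ey) for i in range(2)]
    return [max(-max_step, min(max_step, d)) for d in dq]
```

Iterating delta_q and accumulating the result into the desired joint angles drives the toy end effector to the waypoint, which is the role /haptic_mpc/delta_q_des plays for the state node.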
Teleop interface
- Filename: mpc_teleop_rviz.py
- Arguments: -r <robot eg pr2, sim3> -a <arm eg l,r>
The RViz teleop interface uses interactive markers to make it easy to send goals to the controller. It will display an appropriate marker for the gripper based on the robot input argument, allowing the user to drag the marker in space to the desired goal. To send the goal to the controller, right click the marker and select either "Go" or "Orient". "Go" will send the goal pose to the controller and enable it in position-only mode. Using "Orient" will do much the same, but will make the controller try to achieve the specified orientation as well as position.
The teleop interface menu also allows the gripper to be opened/closed and the skin re-zeroed.
- Subscribed topics: N/A (some interactive markers, but these are not important to the user).
- Published topics:
- Goal pose: /haptic_mpc/goal_pose
Adding a new robot to the controller
Adding a new robot type to the controller is simple. Most of the changes need to be made to the robot_haptic_state_node module.
- Create a robot client class. This class must provide basic kinematics/jacobian functions and also provide handlers to interface with the joint controllers on the arm. See pr2_arm_darpa_m3.py as an example.
- Create a skin client class which inherits from skin_client. See pr2_skin_client.py as an example.
- Modify the option parsing script in haptic_mpc_util.py to add your new robot type as a valid input argument. If necessary, add constraints to enforce certain sensor types or a left/right arm.
- Modify robot_haptic_state_node.py.
- Add an initialisation routine for your robot. This should set up all relevant parameters for your robot and instantiate the robot and skin client classes.
- Ensure you are correctly setting up the joint limits used by the controller. These are stored on the parameter server at /haptic_mpc/joint_limits/[min, max]. The haptic state node should copy the appropriate joint limits to this location before the controller starts.
- Add a selective taxel handling routine for your robot. This functionality is not fully implemented yet for the PR2 (but is close). If taxels will always fire in certain configurations (eg, the wrist taxel on a PR2 when the wrist is bent), ignore them based on the joint positions.
- Create a YAML file with your robot configuration parameters (see pr2_params.yaml for an example).
- Create a YAML file with the appropriate controller parameters, if the standard ones don't suit (see mpc_params.yaml for an example).
- Test the controller. The nodes can be started in any order, but it's a good idea to start the MPC last.
- If all goes well, create a launch file for your modified controller. See launch/start_pr2_mpc.launch for an example.
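A minimal skeleton of the client classes from steps 1 and 2 might look like the following. The class and method names are illustrative guesses at the required interface (compare pr2_arm_darpa_m3.py and pr2_skin_client.py), not the actual signatures:

```python
class MyRobotClient:
    """Kinematics and joint-controller interface for a new robot.

    Hypothetical interface: method names are illustrative guesses at what
    the haptic state node needs, per the description above.
    """

    def get_joint_angles(self):
        raise NotImplementedError  # read the current arm state

    def get_jacobian(self, q):
        raise NotImplementedError  # end-effector Jacobian at joint angles q

    def set_ep(self, q_des):
        raise NotImplementedError  # send equilibrium angles to the arm

class MyRobotSkinClient:
    """Robot-specific skin client; would inherit from skin_client in practice."""

    def __init__(self, topics):
        self.topics = list(topics)  # TaxelArray topics to monitor
```

Remember to also copy the robot's joint limits to /haptic_mpc/joint_limits/[min, max] on the parameter server before the controller starts, as noted in step 4.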
PR2 notes
NB: If the arm is running slowly on the PR2, it probably means you need to restart the arm controllers with lower gains. The arm should be relatively compliant if you push it away from its holding position.
To change these, perform the following steps:
- roscd hrl_pr2_gains
The controllers will shut down (the arms will fall), and will restart when you press a key.