User:Haidai

From Healthcare Robotics Wiki
Revision as of 22:10, 13 July 2012 by Haidai (Talk | contribs)


ROBHUM

cbsdemo

mkdir cbsdemo 
cd cbsdemo
wget http://gt-ros-pkg.googlecode.com/git/cbsdemo/cbsdemo.rosinstall
rosinstall . cbsdemo.rosinstall
ln -s /u/cbsdemo/vcs/hrl-demo/cbsdemo/robot_behaviors/ /u/cbsdemo/robot_behaviors

sshfs

sshfs -o idmap=user haidai@c1:/u/haidai/robot_behaviors/Behaviors pr2_behaviors
fusermount -u pr2_behaviors
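Before reading from the mounted directory it can help to confirm the mount is actually live (a sketch; `pr2_behaviors` is the mount point used above):

```shell
# Check whether the sshfs mount point appears in the mount table.
# Prints "mounted" or "not mounted"; unmount with fusermount -u as above.
if mount | grep -q "pr2_behaviors"; then
    status=mounted
else
    status="not mounted"
fi
echo "$status"
```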

Rosinstall

https://code.ros.org/svn/wg-ros-pkg/branches/trunk_electric/rosinstall_files

Need to update to the latest rosinstall for the fuerte install files.

Installing face_detector (old):

cd ~
svn co https://code.ros.org/svn/wg-ros-pkg/stacks/people/branches/electric_trunk face_detector
sudo mv face_detector /opt/ros/electric/stacks
rosmake face_detector
cd /opt/ros/electric/stacks
sudo chown -R root face_detector
sudo chgrp -R root face_detector
rosmake face_detector

Installing

rosinstall . robhum.rosinstall
rosmake pr2_interactive_manipulation
rosmake rcommander_pr2
rosrun rcommander_pr2_gui make_folder.sh

Starting Simulation

Starting just the simulator...

roslaunch pr2_gazebo pr2.launch

We don't normally want just the simulator, so on the first computer run (in separate terminals):

roslaunch henry_gazebo pr2_henry_home.launch 
export ROBOT=sim && roslaunch henry_manipulation henry_manipulation_robot.launch

On the second computer, run (in the same terminal):

export ROS_MASTER_URI=http://montybase.hsi.gatech.edu:11311
roslaunch henry_manipulation henry_manipulation_desktop.launch
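A typo in `ROS_MASTER_URI` tends to fail silently until much later, so a quick sanity check of the URI's shape before launching can save time (a sketch; the regex is illustrative):

```shell
# Verify ROS_MASTER_URI looks like http://host:port before launching.
export ROS_MASTER_URI=http://montybase.hsi.gatech.edu:11311
if echo "$ROS_MASTER_URI" | grep -Eq '^http://[^/:]+(\.[^/:]+)*:[0-9]+$'; then
    echo "master URI looks OK"
fi
```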

Starting RCommander

On robot/sim machine:

roslaunch pr2_interactive_manipulation pr2_interactive_manipulation_robot.launch 
roslaunch rcommander_pr2_gui rcommander_only.launch

On desktop:

roslaunch pr2_interactive_manipulation pr2_interactive_manipulation_desktop.launch
roslaunch rcommander_pr2_gui run_rcommander_pr2.launch

Random ROS Things

  • Checking out face detection package (annoyingly not a deb anymore for some reason):
svn co https://code.ros.org/svn/wg-ros-pkg/stacks/people/branches/electric_trunk/ people

Task Relevant Feature Learning

The purple light switch task uses the lower-resolution old Aware Home map.

On robot:

roslaunch trf_learn trf_learn.launch
roslaunch laser_interface laser_detector_office_local.launch 2> /dev/null

On robot:

roslaunch trf_learn execute.launch

On controlling computer:

rosrun pr2_dashboard pr2_dashboard 
rosrun rviz rviz
rosrun laser_interface user_interface_node.py (after setting up X forwarding)
rosrun image_view image_view image:=/active_learn/image


Prior to running:

  • Tuck arm
rosrun pr2_tuckarm tuck_arms.py -r t -l t
  • Localize robot

Plotting scripts:

find . -name "*execute*.pkl" -exec rosrun trf_learn recognize_3d_density_plot.py '{}' \;
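A toy run of the same `find`/`-exec` pattern on scratch files (the filenames are made up) shows which names the `*execute*.pkl` glob actually matches:

```shell
# Create one matching and one non-matching pickle in a scratch dir,
# then list only the files the glob selects.
mkdir -p /tmp/trf_demo
touch /tmp/trf_demo/light_switch_execute_1.pkl \
      /tmp/trf_demo/light_switch_train.pkl
found=$(find /tmp/trf_demo -name "*execute*.pkl")
echo "$found"
```

Swap `-exec rosrun trf_learn recognize_3d_density_plot.py '{}' \;` back in place of the bare listing to reproduce the plotting run above.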

Admin Notes

  • Adding a new user account
sudo adduser leibs 
usermod -a -G admin leibs 
usermod -a -G apt watts
usermod -a -G rosadmin vpradeep
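Note that `usermod -a -G` changes only take effect after the user logs out and back in; `id -nG` shows the groups active in the current session (a sketch, run here on the current user rather than a newly added one):

```shell
# List the groups the current login session actually has.
me=$(id -un)
groups_now=$(id -nG "$me")
echo "$groups_now"
```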

Tricks

Look for parameters buried in YAML files:

find . -name "*.yaml" -exec grep -Hn occdist_scale {} \;
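The same search can be expressed with GNU grep's `--include`, which recurses itself instead of forking one grep per file (a sketch; the demo file and its contents are made up, though `occdist_scale` is a real base_local_planner parameter):

```shell
# Make a scratch tree with one YAML file, then search it recursively.
mkdir -p /tmp/yaml_demo/planner
printf 'occdist_scale: 0.01\n' > /tmp/yaml_demo/planner/base_local_planner.yaml
grep -rHn --include='*.yaml' occdist_scale /tmp/yaml_demo
```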

Processing Recorded Data

Processing raw image frames:

ROS_NAMESPACE=wide_stereo/right rosrun image_proc image_proc

Processing raw laser scans into pointclouds:

roslaunch hai_sandbox laser_filter_assemble.launch


Starting laser_interface

Parameter Tuning!

  1. Launch laser_detector_office_local.launch
  2. In the GUI press d (display), then g (debug)
  3. Pull the intensity b & w image window to the front
rosparam set laser_pointer_detector/intensity_threshold_low 20
rosparam set laser_pointer_detector/intensity_threshold_high 215

Training

  1. Point the PR2's head to a static scene without any moving objects.
  2. With the GUI, set the detector's mode to positive
  3. Light the laser pointer and move it around the static scene...

Running Demo

  • on pr2c1
ssh pr2c1
roslaunch laser_pointer_action follow_pointer.launch 2> /dev/null
  • on skynet
rosrun laser_interface user_interface_node.py

Starting omni_teleop

Prerequisites

  1. Checked out copy of gt-ros-pkg (on all machines involved)
  2. Checked-out copy of the pr2_cockpit stack (insert into ROS_PACKAGE_PATH)
  3. Time synch'ed machines (be wary of chrony).

Preparation

  1. Start the robot (make sure the pr2_cockpit stack is in your package path):
sudo robot start -e
  2. Pull out the omni end-effectors from their calibration wells.
  3. Run-stop the PR2.

On machine 1 (laptop). Be sure to set ROS_IP if connecting to gtpr2LAN before executing the command below:

rosrun phantom_omni omni2.sh

Now place omni2's tip back into its calibration well.

On machine 2 (some HSI computer):

rosrun omni_teleop stereo_omni_pr2.sh

Place omni1's tip back into its calibration well. Open rosconsole on pr2_dashboard.

Tips/Things to watch out for

  • To restart omni drivers, just unplug and replug the associated omni's firewire cable.
  • ROS_IP might not be set correctly if you're not connected to gtpr2LAN.

Features/TODO

  • Make the PR2's head track one of the grippers (tutorial).
  • Change stereo-anaglyph's separation distance based on gripper's distance from cameras.
  • Force feedback

Recording Teleoperation

step 1

rosrun pr2_omni_telop capture_experiment.py initial_state_on
rosbag record /tf /prosilica/image_rect_color /narrow_stereo/points /wide_stereo/points 

step 2

Records from wide stereo, forearm cameras, cartesian controllers, accelerometers, base, grippers, torso, joint states, joystick, base laser, tilting laser, and tilting controller.

rosbag record  /wide_stereo/left/image_rect_color /wide_stereo/left/camera_info /wide_stereo/right/image_rect_color /wide_stereo/right/camera_info /narrow_stereo_textured/points /l_forearm_cam/image_rect_color /r_forearm_cam/image_rect_color /l_forearm_cam/camera_info /r_forearm_cam/camera_info /accelerometer/l_gripper_motor /accelerometer/r_gripper_motor /torso_lift_imu/data /l_cart/command_pose /l_cart/command_posture /l_cart/state /r_cart/command_pose /r_cart/command_posture /r_cart/state /head_traj_controller/command /base_controller/command /l_gripper_controller/command /r_gripper_controller/command /torso_controller/command /torso_controller/state /joint_states /joy /base_scan /tf /tilt_scan /laser_tilt_controller/laser_scanner_signal /amcl_pose /pressure/l_gripper_motor /pressure/r_gripper_motor /omni1_button /omni2_button
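One way to keep a record command this long readable is to store the topic list in a file and feed it to rosbag with xargs (a sketch; `topics.txt` and its three entries are illustrative, the full list is the one above):

```shell
# Write a (shortened) topic list to a file, one topic per line.
cat > /tmp/topics.txt <<'EOF'
/tf
/joint_states
/wide_stereo/left/image_rect_color
EOF
# On the robot you would run:  xargs rosbag record < /tmp/topics.txt
# Here we just echo the command that would be executed.
cmd=$(xargs echo rosbag record < /tmp/topics.txt)
echo "$cmd"
```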

misc notes

/omni1_button
/omni2_button
/pressure/l_gripper_motor
/pressure/r_gripper_motor
/wide_stereo/left/image_rect
/wide_stereo/left/camera_info
/wide_stereo/right/image_rect
/wide_stereo/right/camera_info
/wide_stereo/points
/narrow_stereo_textured/points
/l_forearm_cam/image_rect_color
/r_forearm_cam/image_rect_color
/accelerometer/l_gripper_motor
/accelerometer/r_gripper_motor
/torso_lift_imu/data
/l_cart/command_pose
/l_cart/command_posture
/l_cart/state
/r_cart/command_pose
/r_cart/command_posture
/r_cart/state
/head_traj_controller/command
/base_controller/command
/l_gripper_controller/command
/r_gripper_controller/command
/torso_controller/command
/torso_controller/state
#Questionable..?
/narrow_stereo_textured/points2
/wide_stereo/points2
/laser_tilt_controller/laser_scanner_signal
/laser_tilt_controller/set_periodic_cmd
/laser_tilt_controller/set_traj_cmd

Moving the Laser

Lifted from training slides

rostopic list laser_tilt_controller
rosservice list laser_tilt_controller
rosservice type /laser_tilt_controller/set_periodic_cmd
rossrv show pr2_msgs/SetPeriodicCmd
rosrun pr2_mechanism_controllers send_periodic_cmd_srv.py
rosrun pr2_mechanism_controllers send_periodic_cmd_srv.py laser_tilt_controller linear 3 0.4 0.0
rosrun pr2_controller_manager pr2_controller_manager list-joints
rostopic info joint_states
rosmsg show sensor_msgs/JointState
rxplot -b 60 /joint_states/position[15]

Or

rosservice call laser_tilt_controller/set_periodic_cmd '{ command: { header: { stamp: 0 }, profile: "linear" , period: 3 , amplitude: 1 , offset: 0 }}'

To stop tilting, send the same command with the amplitude set to 0:

rosservice call laser_tilt_controller/set_periodic_cmd '{ command: { header: { stamp: 0 }, profile: "linear" , period: 3 , amplitude: 0 , offset: 0 }}'