
Note: This tutorial assumes that you have completed the previous tutorials: Camera Frames, Calibration Patterns.
(!) Please ask about problems and questions regarding this tutorial on answers.ros.org. Don't forget to include in your question the link to this page, the versions of your OS & ROS, and also add appropriate tags.

Performing a Hand-Eye Calibration

Description: This tutorial shows how a hand-eye calibration can be performed with the Ensenso camera node.

Tutorial Level: ADVANCED

With a hand-eye calibration, you can determine the position of the camera relative to a robot. The NxLib supports two different types of setup. The camera can either be fixed with respect to the robot's base or it can be mounted on the robot's hand and move with it.

When the camera is fixed, the calibration can find out the position of the camera relative to the robot's base. In order to perform a calibration, you need to grab a calibration pattern with the robot's hand and move it to different positions where the camera can see it.

When the camera is mounted on the robot's hand, the calibration can find out the exact position of the camera relative to the robot's hand. In order to perform a calibration, you need to place a calibration pattern in the robot's working space and capture images of it from different robot poses.

You can choose whether you have a fixed or a moving camera by specifying the fixed parameter of the camera node.

Steps to Perform a Hand-Eye Calibration

To perform a hand-eye calibration, you need to specify the robot_frame and wrist_frame parameters of the camera node. These frames will be used for getting the current robot pose from TF. For a moving camera, the wrist frame defaults to the camera frame and the base frame needs to be given. For a fixed camera, the base frame defaults to the camera frame and the wrist frame needs to be given.

All steps of the calibration are performed with the calibrate_hand_eye action (ensenso_camera_msgs/CalibrateHandEye).

After the calibration is done, the camera's link is updated, so that the transformation between the camera and the robot wrist (or base respectively) is taken into account for all future data.

Precalibration

For a hand eye calibration, it is sometimes useful to know whether the pattern will be visible from a certain robot position before actually moving there. You can do this with a precalibration.

To get an initial guess for the camera's position relative to the robot wrist, you have to capture at least five patterns. Then you can start the hand eye calibration. After it is done, the collected patterns are still stored, so you can continue to collect more patterns and start the calibration again once you have enough of them.

This time, though, you can use the project_pattern action (ensenso_camera_msgs/ProjectPattern) to estimate whether patterns would be visible from a hypothetical robot position. You already know the pattern pose in the camera's target frame from the initial calibration. Additionally, you can give the project_pattern action an arbitrary transformation between the camera and the target frame. It will then project the pattern into the camera as if the camera was at the given position and the action result contains the pattern_is_visible flag that indicates whether the pattern would be fully visible or not.
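The visibility estimate can be illustrated with a simplified pinhole-camera sketch in plain Python. Note that this is only a conceptual illustration under assumed intrinsics and without lens distortion; the function names and the check itself are not the actual NxLib implementation:

```python
def project_point(point, fx, fy, cx, cy):
    """Project a 3D point given in the camera frame onto the image
    plane with a simple pinhole model (no lens distortion)."""
    x, y, z = point
    if z <= 0.0:
        return None  # the point lies behind the camera
    return (fx * x / z + cx, fy * y / z + cy)


def pattern_is_visible(corners, fx, fy, cx, cy, width, height):
    """Return True if every pattern corner projects inside the image."""
    for corner in corners:
        pixel = project_point(corner, fx, fy, cx, cy)
        if pixel is None:
            return False
        u, v = pixel
        if not (0.0 <= u < width and 0.0 <= v < height):
            return False
    return True


# A small pattern one meter in front of an assumed VGA camera.
corners = [(-0.05, -0.05, 1.0), (0.05, -0.05, 1.0),
           (-0.05, 0.05, 1.0), (0.05, 0.05, 1.0)]
print(pattern_is_visible(corners, 600, 600, 320, 240, 640, 480))  # True

# The same pattern shifted one meter sideways leaves the field of view.
shifted = [(x + 1.0, y, z) for (x, y, z) in corners]
print(pattern_is_visible(shifted, 600, 600, 320, 240, 640, 480))  # False
```

The real project_pattern action performs this check with the camera's calibrated intrinsics and the transformation you supply, and reports the result in the pattern_is_visible flag.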

Examples

Below are examples for a fixed and moving hand eye calibration.

Moving Hand Eye Calibration

In a moving hand eye calibration setup, the stereo camera is usually mounted on the robot wrist or on another moving part of the robot. The pattern is usually placed at a static location in the robot's workspace.

Below is a launch file that sets up a stereo camera for a hand eye calibration, followed by a Python script that performs the corresponding action calls (ensenso_camera_msgs/CalibrateHandEye). Together they perform a moving hand eye calibration and store the resulting transformation at the end.

   1 <launch>
   2   <arg name="camera_serial" doc="Serial of the stereo camera to calibrate with." />
   3   <arg name="robot_frame" doc="The robot's base frame - base_link in most cases." />
   4   <arg name="wrist_frame" doc="The frame on which the camera is mounted - defaults to the camera frame." />
   5   <arg name="is_fixed" doc="Whether the hand eye calibration setup is fixed or moving." />
   6 
   7   <node pkg="nodelet" type="nodelet" name="manager_"  args="manager" output="screen" />
   8   <node pkg="nodelet" type="nodelet" name="Ensenso_$(arg camera_serial)" args="load ensenso_camera/nodelet /manager_" output="screen">
   9     <param name="serial" type="string" value="$(arg camera_serial)" />
  10     <param name="robot_frame" type="string" value="$(arg robot_frame)" />
  11     <param name="wrist_frame" type="string" value="$(arg wrist_frame)" />
  12     <param name="fixed" type="bool" value="$(arg is_fixed)" />
  13   </node>
  14 </launch>

To set up the stereo camera with the launch file above, you have to define camera_serial, wrist_frame, robot_frame and is_fixed. The wrist_frame should be the TF frame the camera is mounted on. The robot_frame should be the robot's base TF frame. The is_fixed parameter defines whether the hand eye calibration is a fixed one or not; in this case it should be false. You can call the launch file like this:

roslaunch ensenso_camera calbrate_hand_eye.launch camera_serial:=<serial of camera> robot_frame:=<robot tf frame> wrist_frame:=<tf frame the camera is mounted on> is_fixed:=false

Now the camera should be set up correctly and you can make the corresponding action calls, which are done in the python script below:

   1 #!/usr/bin/env python
   2 import rospy
   3 import sys
   4 import actionlib
   5 from actionlib_msgs.msg import GoalStatus
   6 from ensenso_camera_msgs.msg import CalibrateHandEyeAction, CalibrateHandEyeGoal
   7 
   8 
   9 def reset_patterns(hand_eye_client):
  10     rospy.loginfo("Resetting pattern buffer...")
  11     goal = CalibrateHandEyeGoal()
  12     goal.command = CalibrateHandEyeGoal.RESET
  13     hand_eye_client.send_goal(goal)
  14     wait_for_result(hand_eye_client)
  15 
  16 
  17 def wait_for_result(hand_eye_client):
  18     hand_eye_client.wait_for_result()
  19     if hand_eye_client.get_state() != GoalStatus.SUCCEEDED:
  20         rospy.logerr("Action was not successful.")
  21         return
  22 
  23 
  24 def capture_pattern(hand_eye_client):
  25     rospy.loginfo("Capturing a pattern.")
  26     goal = CalibrateHandEyeGoal()
  27     goal.command = CalibrateHandEyeGoal.CAPTURE_PATTERN
  28     hand_eye_client.send_goal(goal)
  29     wait_for_result(hand_eye_client)
  30     # Check whether a stereo pattern has been found. The result can
  31     # be None if the action itself did not succeed.
  32     result = hand_eye_client.get_result()
  33     if result is None:
  34         rospy.logerr("Did not receive an action result.")
  35     elif not result.found_pattern:
  36         rospy.logerr("Did not find any patterns.")
  37 
  38 
  39 def calibrate_hand_eye(hand_eye_client, store_calibration=True):
  40     rospy.loginfo("Running the calibration. Will also store calibration: {}.".format(store_calibration))
  41     goal = CalibrateHandEyeGoal()
  42     goal.command = CalibrateHandEyeGoal.START_CALIBRATION
  43     goal.write_calibration_to_eeprom = store_calibration
  44     hand_eye_client.send_goal(goal)
  45     wait_for_result(hand_eye_client)
  46 
  47 
  48 def main():
  49     capture_wait = rospy.get_param("~capture_wait", 10.0)
  50     count_poses = rospy.get_param("~count_poses", 6)
  51     timeout = rospy.get_param("~timeout", 60)
  52 
  53     hand_eye_calib_client = actionlib.SimpleActionClient("calibrate_hand_eye", CalibrateHandEyeAction)
  54     if not hand_eye_calib_client.wait_for_server(rospy.Duration(timeout)):
  55         rospy.logerr("Could not reach the Action Server. The camera node is not running!")
  56         sys.exit()
  57 
  58     reset_patterns(hand_eye_calib_client)
  59     for i in range(count_poses):
  60         #
  61         # ----The Robot should stand still here.
  62         # 
  63         rospy.loginfo("Pattern number: {}".format(i + 1))
  64         now = rospy.get_time()
  65         capture_pattern(hand_eye_calib_client)
  66 
  67         if i == count_poses - 1:
  68             rospy.loginfo("Finished capturing patterns")
  69             break
  70 
  71         #
  72         # ---- Move the robot to the next position.
  73         #
  74         time_left_to_move_robot = capture_wait - (rospy.get_time() - now)
  75         while time_left_to_move_robot > 0.8:
  76             start_loop = rospy.get_time()
  77             rospy.loginfo("Capturing the next pattern in: {} seconds.".format(time_left_to_move_robot))
  78             rospy.sleep(0.75)
  79             end_loop = rospy.get_time()
  80             time_left_to_move_robot -= (end_loop - start_loop)
  81         rospy.sleep(time_left_to_move_robot)
  82 
  83     calibrate_hand_eye(hand_eye_calib_client, True)
  84 
  85 
  86 if __name__ == "__main__":
  87     try:
  88         rospy.init_node("ensenso_camera_hand_eye_calibration")
  89         main()
  90     except rospy.ROSInterruptException:
  91         pass

The Python script consists of the following essential parts:

The action calls are wrapped in the functions reset_patterns(), capture_pattern() and calibrate_hand_eye() within this script.

First of all, the script waits for the hand eye calibration action server to come online. If the server is not reachable within the timeout, the script exits.

  53     hand_eye_calib_client = actionlib.SimpleActionClient("calibrate_hand_eye", CalibrateHandEyeAction)
  54     if not hand_eye_calib_client.wait_for_server(rospy.Duration(timeout)):
  55         rospy.logerr("Could not reach the Action Server. The camera node is not running!")
  56         sys.exit()

If the connection to the server has been successful, all previously captured patterns are deleted from the buffer. The action call is done in the following code line:

  58     reset_patterns(hand_eye_calib_client)

Here an action goal (ensenso_camera_msgs/CalibrateHandEye) is sent with the command RESET, which tells the camera to clear its pattern buffer.

The following loop captures images of a pattern. The loop count is defined through the parameter count_poses, which has to be set by you. A minimum of 5 poses is required.

  59     for i in range(count_poses):
  60         #
  61         # ----The Robot should stand still here.
  62         # 
  63         rospy.loginfo("Pattern number: {}".format(i + 1))
  64         now = rospy.get_time()
  65         capture_pattern(hand_eye_calib_client)
  66 
  67         if i == count_poses - 1:
  68             rospy.loginfo("Finished capturing patterns")
  69             break
  70 
  71         #
  72         # ---- Move the robot to the next position.
  73         #
  74         time_left_to_move_robot = capture_wait - (rospy.get_time() - now)
  75         while time_left_to_move_robot > 0.8:
  76             start_loop = rospy.get_time()
  77             rospy.loginfo("Capturing the next pattern in: {} seconds.".format(time_left_to_move_robot))
  78             rospy.sleep(0.75)
  79             end_loop = rospy.get_time()
  80             time_left_to_move_robot -= (end_loop - start_loop)
  81         rospy.sleep(time_left_to_move_robot)

In this example the robot moves to a position and then a pattern is captured. The following line

  65         capture_pattern(hand_eye_calib_client)

triggers a capture with the action goal command CAPTURE_PATTERN. The pattern is stored in the camera's pattern buffer. It is very important that the robot stands still while capturing images, as the calibration will otherwise be inaccurate. The commented code lines 60-62 show where the robot should stand still. Optionally, wait a bit longer to be sure the robot is not moving.
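One way to ensure the robot has come to rest is to sample the wrist pose a few times and only capture once the samples agree within a small tolerance. The helper below is a plain-Python sketch of this idea (it is not part of the tutorial script; in practice the sampled positions would come from TF lookups of the wrist_frame):

```python
def robot_is_stationary(recent_positions, tolerance=0.001):
    """Return True if all recently sampled wrist positions (x, y, z
    tuples in meters) agree within `tolerance` on every axis."""
    if len(recent_positions) < 2:
        return False  # not enough samples to judge
    for axis in range(3):
        values = [position[axis] for position in recent_positions]
        if max(values) - min(values) > tolerance:
            return False
    return True


# Three nearly identical samples: the robot is considered stationary.
print(robot_is_stationary([(0.5, 0.2, 0.8), (0.5, 0.2, 0.8), (0.5001, 0.2, 0.8)]))  # True

# One sample 5 cm away: the robot is still moving.
print(robot_is_stationary([(0.5, 0.2, 0.8), (0.55, 0.2, 0.8)]))  # False
```

A loop around such a check, before each capture_pattern() call, replaces the fixed waiting time with an actual stillness criterion.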

After the capture_pattern() call the robot can be moved to the next position. Lines 74 to 81 implement a countdown that displays the remaining time the robot has to reach the next capture position.

After the loop, when all patterns are captured, the hand eye calibration is initiated.

  83     calibrate_hand_eye(hand_eye_calib_client, True)

With the action goal command START_CALIBRATION, the calibration is computed from the patterns captured beforehand. With the optional parameter write_calibration_to_eeprom = True, the calibration is also stored in the camera's EEPROM and becomes available as an internal transformation link from the camera_frame to the wrist_frame of the robot.

Launching the hand eye calibrated (moving) camera

After a successful calibration, the transformation from camera_frame to wrist_frame is stored in the camera's EEPROM and persists across camera restarts. You can now shut down the camera by killing the launch file (calbrate_hand_eye.launch) you started before.

With the new launch file below, you can use the calibration stored in the camera's EEPROM. The calibrated transformation between the camera_frame and the link_frame (which is the same as the wrist_frame before) will be published in TF.

If you want to receive the 3D data relative to the robot base you used in the calibration before (robot_frame), you have to set the target_frame to the robot's base frame. You can, however, use any frame you want. The camera will fetch the transformation between the link_frame and the target_frame from TF and transform the 3D data accordingly.
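Conceptually, this is a rigid-body transform applied to every point of the 3D data. The following plain-Python sketch illustrates the idea with a rotation about the z-axis and a translation (the function and variable names are illustrative only and are not part of the camera node's API):

```python
import math


def transform_point(point, rotation_z, translation):
    """Apply a rigid transform (rotation about the z-axis followed by
    a translation) to a single 3D point."""
    x, y, z = point
    c, s = math.cos(rotation_z), math.sin(rotation_z)
    tx, ty, tz = translation
    return (c * x - s * y + tx, s * x + c * y + ty, z + tz)


# A point one meter in front of the camera (link frame), expressed in
# a target frame that is rotated by 90 degrees and shifted by 0.5 m.
point_in_link_frame = (0.0, 0.0, 1.0)
print(transform_point(point_in_link_frame, math.pi / 2, (0.5, 0.0, 0.0)))  # (0.5, 0.0, 1.0)
```

The camera node does the same with the full 3D rotation and translation it fetches from TF, applied to every point of the point cloud.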

   1 <launch>
   2   <arg name="camera_serial" doc="Serial of the stereo camera to calibrate with." />
   3   <arg name="target_frame" doc="The frame you want to receive 3D data in." />
   4   <arg name="link_frame" doc="The frame on which the camera is mounted - defaults to the camera frame for a fixed hand eye calibration."/>
   5 
   6   <node pkg="nodelet" type="nodelet" name="manager_"  args="manager" output="screen" />
   7   <node pkg="nodelet" type="nodelet" name="Ensenso_$(arg camera_serial)" args="load ensenso_camera/nodelet /manager_" output="screen">
   8     <param name="serial" type="string" value="$(arg camera_serial)" />
   9     <param name="target_frame" type="string" value="$(arg target_frame)" />
  10     <param name="link_frame" type="string" value="$(arg link_frame)" />
  11   </node>
  12 </launch>

The launch file above is included in the ensenso_camera package and can be run with:

roslaunch ensenso_camera hand_eye_moving_calibrated.launch camera_serial:=<camera serial> target_frame:=<frame you want to receive the data in> link_frame:=<the wrist_frame from the hand eye calibration>

WIP: Fixed Hand Eye Calibration

This section is still under development.

For a fixed hand eye calibration, the camera is mounted somewhere static in the scene, whereas the pattern is usually mounted on the robot wrist or even held in the robot's gripper. You can therefore use the same frames (robot_frame and wrist_frame) as in the moving hand eye calibration. The only difference is that the is_fixed parameter has to be true. The corresponding launch file call is given below:

roslaunch ensenso_camera calbrate_hand_eye.launch camera_serial:=<serial of camera> robot_frame:=<robot tf frame> wrist_frame:=<tf frame the camera is mounted on> is_fixed:=true

In this example, however, we will not use live TF data to get the wrist pose for every captured pattern. Instead, we will use poses and images captured beforehand and overwrite the camera's pattern buffer with our own captured patterns, as well as the pose buffer, which stores the corresponding poses of the required coordinate frames.


2020-01-25 12:37