####################################
##FILL ME IN
####################################
## for a custom note with links:
## note = This tutorial assumes that you have completed the previous tutorial: [[ROS/Tutorials]]. For more information about the laser drivers, see the [[laser_drivers]] stack.
## descriptive title for the tutorial
## title = Introduction to Working With Laser Scanner Data
## multi-line description to be displayed in search
## description = This tutorial guides you through the basics of working with the data produced by a planar laser scanner (such as a Hokuyo URG or SICK laser). To learn how to actually produce or change data from laser scanners, please see the [[laser_drivers]] stack.
## the next tutorial description (optional)
## next =
## links to next tutorial (optional)
## next.0.link= [[http://www.ros.org/wiki/laser_filters/Tutorials/Laser filtering using the filter nodes|Applying filters to laser scans]]
## next.1.link= [[laser_assembler/Tutorials/HowToAssembleLaserScans|Assembling (aggregating) single scan lines into a composite cloud]]
## what level user is this tutorial for
## level= IntermediateCategory
## keywords = lasers
####################################
<>
<>

When you're done with this tutorial you should be able to:
 1. Visualize the laser scan in rviz.
 1. Understand the data that the laser scan (`sensor_msgs/LaserScan` message) includes.
 1. Understand how to convert the laser scan into a more intuitive and useful point cloud (`sensor_msgs/PointCloud` message).

=== Generating Laser Scan Data ===

To try the tools described in this tutorial, you'll need a source of laser scanner data. If you have a laser scanner with a driver available, you can go ahead and use it. Otherwise, we've provided a source of laser data for you.

ROS can record sensor data (actually, any ROS system data) into a bag file. You can download a bag file containing laser data [[http://vault.willowgarage.com/wgdata1/vol1/wiki_docs/laser_pipeline/laser.bag|here]] (save link as). That file is no longer accessible, but http://classes.engineering.wustl.edu/cse550/a02.php links to http://classes.engineering.wustl.edu/cse550/data/Mapping1.bag, which contains a LaserScan and may be a good substitute.

To play back the bag, first run a roscore:
{{{
$ roscore
}}}

Then play back the bag:
{{{
$ rosbag play laser.bag
}}}

The bag contains four topics, two of which you'll need for this tutorial:
 * `/tf [tf/tfMessage]`
 * `/tilt_scan [sensor_msgs/LaserScan]`

The `/tilt_scan` topic was recorded from a planar laser mounted on a tilting platform on the PR2 robot. The `/tf` topic contains the robot transforms.

=== Visualizing Your Laser Scan ===

Your laser scan will be published as a `sensor_msgs/LaserScan` message on a hopefully well-named topic (like `/tilt_scan` above). You can use [[rostopic]] to find which topic your laser is publishing on:
{{{
rostopic list -v | grep sensor_msgs/LaserScan
}}}

In `laser.bag`, there is one scanner data stream, so the output is:
{{{
 * /tilt_scan [sensor_msgs/LaserScan] 1 publisher
}}}

To visualize a scan, start `rviz` and add a new display panel (follow the rviz instructions [[rviz|here]]) subscribed to your scanner topic, with the message type `sensor_msgs/LaserScan`. If you visualize the `/tilt_scan` topic above, the display should look something like this:

{{attachment:rviz_base_small.png}}

=== The Laser Scan Message ===

Each laser scan is a single scan line.
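If you'd like to peek at the raw data before reading the message definition, you can print a single scan on the command line. This is just a quick sanity check; it assumes the bag from above is still playing and that your scan topic is `/tilt_scan`:
{{{
rostopic echo -n 1 /tilt_scan
}}}
Each field you see in that output is described by the message definition below.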
The sensor_msgs/LaserScan message contains the following information:
{{{
# Single scan from a planar laser range-finder

Header header
# stamp: The acquisition time of the first ray in the scan.
# frame_id: The laser is assumed to spin around the positive Z axis
# (counterclockwise, if Z is up) with the zero angle forward along the x axis

float32 angle_min        # start angle of the scan [rad]
float32 angle_max        # end angle of the scan [rad]
float32 angle_increment  # angular distance between measurements [rad]

float32 time_increment   # time between measurements [seconds] - if your scanner
                         # is moving, this will be used in interpolating position
                         # of 3d points
float32 scan_time        # time between scans [seconds]

float32 range_min        # minimum range value [m]
float32 range_max        # maximum range value [m]

float32[] ranges         # range data [m] (Note: values < range_min or > range_max should be discarded)
float32[] intensities    # intensity data [device-specific units]. If your
                         # device does not provide intensities, please leave
                         # the array empty.
}}}

The above message tells you everything you need to know about a scan. Most importantly, you have the angle of each hit and its distance (range) from the scanner.

If you want to work with raw range data, the above message is all you need. However, it is often more convenient to work with points in 3D Cartesian (x, y, z) format instead. Luckily, there are simple ways to convert a laser scan to a point cloud.

=== Converting a Laser Scan to a Point Cloud ===

To get your laser scan as a set of 3D Cartesian (x, y, z) points, you'll need to convert it to a point cloud message. The [[sensor_msgs|sensor_msgs/PointCloud]] message looks like this:
{{{
Header header
geometry_msgs/Point32[] points  # Vector of 3d points (x,y,z)
ChannelFloat32[] channels       # Space for extra information like color.
}}}

The `points` array contains the point cloud in the format we want. To convert the laser scan to a point cloud (in a different frame), we'll use the [[http://pr.willowgarage.com/pr-docs/ros-packages/laser_scan/html/classlaser__geometry_1_1LaserProjection.html|laser_geometry::LaserProjection]] class (from the [[laser_geometry]] package).

The `laser_geometry` package provides two functions to convert a scan into a point cloud: `projectLaser` and `transformLaserScanToPointCloud`.

`projectLaser` does a straight projection from range-angle to 3D (x, y, z), without using `tf`. This means two things: 1) your point cloud will be in the same frame as the scan, and 2) your point cloud will look very strange if the laser or robot was moving while the scan was being taken. The advantage of `projectLaser`, however, is its speed.

The function you'll want to use the majority of the time is the handy `transformLaserScanToPointCloud`, which uses `tf` to transform your laser scan into a point cloud in another (preferably fixed) frame. `transformLaserScanToPointCloud` uses `tf` to get transforms for the first and last hits in the laser scan and then interpolates to get all the transforms in between. This is much more accurate (although not absolutely perfect) than using `projectLaser` and `tf::transformPointCloud()` if the laser was moving in the world.

You'll likely want the frame for the point cloud to be a fixed frame, that is, a frame that's not moving over time. If your robot is stationary, something like `base_link` might be a good choice. If your robot is moving, choose a stationary frame such as `map`. That way, point clouds from one time are comparable to point clouds at another time (or to any other data in your system).
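For reference, here is a minimal sketch of the simpler `projectLaser` path. The callback wiring and names are illustrative; only the projection call itself is the `laser_geometry` API described above. Because no `tf` lookup is involved, the resulting cloud stays in the laser's own frame:
{{{#!cplusplus
#include "sensor_msgs/LaserScan.h"
#include "sensor_msgs/PointCloud.h"
#include "laser_geometry/laser_geometry.h"

laser_geometry::LaserProjection projector;

void scanCallback(const sensor_msgs::LaserScan::ConstPtr& scan_in)
{
  sensor_msgs::PointCloud cloud;
  // Straight range/angle -> (x, y, z) projection; no tf is used,
  // so the cloud is expressed in scan_in->header.frame_id.
  projector.projectLaser(*scan_in, cloud);

  // Do something with cloud.
}
}}}
For the rest of this tutorial, though, we'll use `transformLaserScanToPointCloud` so that the cloud ends up in a frame of our choosing.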
Here's a code snippet that uses `transformLaserScanToPointCloud` to convert a laser scan into a point cloud in the `base_link` frame:
{{{#!cplusplus
laser_geometry::LaserProjection projector;
tf::TransformListener listener;

void scanCallback (const sensor_msgs::LaserScan::ConstPtr& scan_in)
{
  sensor_msgs::PointCloud cloud;
  projector.transformLaserScanToPointCloud("base_link", *scan_in, cloud, listener);

  // Do something with cloud.
}
}}}

Notice that you need to use `tf` to transform the scan into a point cloud in another frame. As you learned in the [[TFMessageFiltersTutorial|tf message filters tutorial]], you should always use a `tf::MessageFilter` when using `tf` with sensor data. Why? Even though your node has received the laser scan message, `tf` might not yet have received all of the information it needs to transform it. By using a `tf::MessageFilter`, you are assured that the laser scan can be converted into the desired frame when the callback starts.

Despite using the `tf::MessageFilter`, it is recommended practice to wrap your usage of `tf` in a try-catch block. Although the transform between the laser frame and the desired point cloud frame existed at the beginning of the callback, there is no guarantee that it still exists when you call `transformLaserScanToPointCloud`.

Here is a more complete version of the code snippet above that uses a message filter and a try-catch block, and publishes the resulting cloud:
{{{#!cplusplus
#include "ros/ros.h"
#include "tf/transform_listener.h"
#include "sensor_msgs/PointCloud.h"
#include "tf/message_filter.h"
#include "message_filters/subscriber.h"
#include "laser_geometry/laser_geometry.h"

class LaserScanToPointCloud{
public:

  ros::NodeHandle n_;
  laser_geometry::LaserProjection projector_;
  tf::TransformListener listener_;
  message_filters::Subscriber<sensor_msgs::LaserScan> laser_sub_;
  tf::MessageFilter<sensor_msgs::LaserScan> laser_notifier_;
  ros::Publisher scan_pub_;

  LaserScanToPointCloud(ros::NodeHandle n) :
    n_(n),
    laser_sub_(n_, "base_scan", 10),
    laser_notifier_(laser_sub_, listener_, "base_link", 10)
  {
    laser_notifier_.registerCallback(
      boost::bind(&LaserScanToPointCloud::scanCallback, this, _1));
    laser_notifier_.setTolerance(ros::Duration(0.01));
    scan_pub_ = n_.advertise<sensor_msgs::PointCloud>("/my_cloud", 1);
  }

  void scanCallback (const sensor_msgs::LaserScan::ConstPtr& scan_in)
  {
    sensor_msgs::PointCloud cloud;
    try
    {
      projector_.transformLaserScanToPointCloud(
        "base_link", *scan_in, cloud, listener_);
    }
    catch (tf::TransformException& e)
    {
      std::cout << e.what();
      return;
    }

    // Do something with cloud.

    scan_pub_.publish(cloud);
  }
};

int main(int argc, char** argv)
{
  ros::init(argc, argv, "my_scan_to_cloud");
  ros::NodeHandle n;
  LaserScanToPointCloud lstopc(n);

  ros::spin();

  return 0;
}
}}}

Try creating the node above, publishing the point cloud, and visualizing the point cloud in rviz. In rviz, this new cloud will look exactly the same as the original laser scan. Note that the node subscribes to a topic named `base_scan`; if you're playing back the `laser.bag` file from earlier, change that to `tilt_scan` (or remap the topic) so the node actually receives the recorded scans.

You might notice that you're catching a lot of `tf` errors. This reinforces the fact that, despite the use of message filters, you still need to use a try-catch block. Don't give up on message filters, though; without them you would never successfully transform the cloud! You can increase the odds of correctly transforming the cloud by giving the message filter a tolerance:
{{{
laser_notifier_.setTolerance(ros::Duration(0.01));
}}}
This way, the transform will still exist 0.01 seconds after the callback starts.
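To confirm that your node is publishing, you can also check the output topic from the command line. This assumes the node above is running and uses the `/my_cloud` topic name from the snippet:
{{{
rostopic hz /my_cloud
}}}
You should see clouds being published at roughly the same rate as the incoming scans; you can then add a point cloud display in rviz subscribed to `/my_cloud` to view the converted points.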
=== Conclusions ===

You now know how to listen to data produced by a laser scanner, visualize the data, and transform that data into a point cloud of 3D Cartesian points (x, y, z).

=== Video Tutorials ===

The following video presents a small tutorial explaining some background information about laser scanners in ROS.

<>

<>

----
## TutorialCategory
## FILL IN THE STACK TUTORIAL CATEGORY HERE