This tutorial explores how to add a variety of sensors that allow a robot to perceive its environment. These sensors provide the information used to build and maintain a map of the environment, localize the robot on that map, and track obstacles in the environment.

Commonly used sensors include lidar, radar, RGB cameras, depth cameras, IMUs, and GPS. To standardize the message formats of these sensors and allow easier interoperation between vendors, ROS provides the sensor_msgs package, which defines common sensor interfaces. Let’s take a look at some common sensor messages.

Common Sensor Messages

In this section, we discuss some of the common sensor_msgs types you might encounter when setting up Nav2.

sensor_msgs/LaserScan

This message represents a single scan from a planar laser range-finder. This message is used in slam_toolbox and nav2_amcl for localization and mapping, or in nav2_costmap_2d for perception.

[Image: sensor_laserscan.png]
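A LaserScan message stores polar readings: a `ranges` array plus `angle_min`, `angle_increment`, `range_min`, and `range_max` fields. The sketch below (plain Python, no ROS required; the function name is ours) shows how consumers such as a costmap turn those fields into Cartesian points in the scanner frame:

```python
import math

def laserscan_to_points(angle_min, angle_increment, ranges,
                        range_min=0.0, range_max=float("inf")):
    """Convert LaserScan-style polar readings to (x, y) points in the
    scanner frame, dropping readings outside [range_min, range_max]."""
    points = []
    for i, r in enumerate(ranges):
        if not (range_min <= r <= range_max):
            continue  # inf/NaN returns mean "no valid detection"
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# A 4-beam scan sweeping from 0 in pi/2 steps; the inf beam is discarded.
pts = laserscan_to_points(0.0, math.pi / 2,
                          [1.0, 2.0, float("inf"), 0.5],
                          range_min=0.1, range_max=3.5)
```

The field names match the sensor_msgs/LaserScan definition; real drivers also populate `scan_time`, `time_increment`, and `intensities`, which are omitted here for brevity.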

sensor_msgs/PointCloud2

This message holds a collection of 3D points, plus optional additional information about each point. This can be from a 3D lidar, a 2D lidar, a depth camera or more.

[Image: sensor_pointcloud2.png]
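Unlike LaserScan, PointCloud2 stores its points as a flat byte buffer: each point occupies `point_step` bytes, and each field (e.g. x, y, z as float32) sits at a fixed byte offset within that step. The following plain-Python sketch (our own helper names, assuming an unpadded x/y/z-only layout) illustrates that binary layout:

```python
import struct

POINT_STEP = 12  # 3 float32 fields (x, y, z) * 4 bytes each

def pack_xyz(points):
    """Serialize (x, y, z) tuples into a PointCloud2-style data buffer
    (little-endian float32, no padding between fields or points)."""
    return b"".join(struct.pack("<fff", *p) for p in points)

def unpack_xyz(data, point_step=POINT_STEP):
    """Recover (x, y, z) tuples from the flat buffer, one per step."""
    return [struct.unpack_from("<fff", data, offset)
            for offset in range(0, len(data), point_step)]

cloud = pack_xyz([(1.0, 2.0, 3.0), (-0.5, 0.0, 4.25)])
points = unpack_xyz(cloud)
```

In practice you would read the message's `fields` array for the actual offsets and datatypes rather than hard-coding them; ROS 2 also ships `sensor_msgs_py.point_cloud2` helpers for this.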

sensor_msgs/Range

This is a single range reading from an active ranger that emits energy and reports one range reading that is valid along an arc at the measured distance. Sonars, IR sensors, and 1D range finders are examples of sensors that use this message.

[Image: sensor_range.png]
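Per the message's documented semantics, a reading outside the sensor's `[min_range, max_range]` interval should be discarded (it means "no detection" or "object too close to resolve"). A minimal sketch of that check, with hypothetical sonar limits:

```python
def is_valid_range(reading, min_range, max_range):
    """Return True only when a Range reading falls inside the sensor's
    documented [min_range, max_range] detection window."""
    return min_range <= reading <= max_range

# Hypothetical sonar with a 2 cm .. 4 m detection window.
close_ok = is_valid_range(1.5, min_range=0.02, max_range=4.0)  # valid
too_far = is_valid_range(5.0, min_range=0.02, max_range=4.0)   # discard
```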

sensor_msgs/Image

This represents the sensor readings from an RGB or depth camera, corresponding to RGB or range values, respectively.

[Image: sensor_image.png]
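An Image message also stores its pixels as a flat byte buffer: rows are `step` bytes apart (`step` may exceed `width * bytes_per_pixel` to allow row padding), with channels interleaved within each pixel. A small sketch of how a pixel is located, using a hypothetical 4x2 `rgb8` image with no padding:

```python
def pixel_offset(row, col, step, channels=3):
    """Byte offset of a pixel in an Image-style flat buffer: rows are
    `step` bytes apart, channels interleaved within each pixel."""
    return row * step + col * channels

# Hypothetical 4x2 rgb8 image, no row padding: step = 4 pixels * 3 bytes.
data = bytes(range(2 * 12))          # 2 rows of 12 bytes of dummy pixels
off = pixel_offset(1, 2, step=12)    # pixel at row 1, column 2
r, g, b = data[off], data[off + 1], data[off + 2]
```

The `encoding` field (e.g. `rgb8`, `bgr8`, `16UC1` for depth) tells consumers how many channels and bytes per channel to expect.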

Adding Gazebo Plugins to a URDF

Next, we’re going to show how to attach sensors to our simulated robot. To do this, we must add links for the sensors and Gazebo references that attach the plugins to them.

Use the following examples as templates for adding these sensors to your URDF:

LIDAR

Visual Camera

Depth Camera
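As an illustration of the pattern, here is a sketch of a lidar attachment for classic Gazebo. The link name, geometry, inertia, mount origin, scan parameters, and topic remapping are all assumptions to adapt to your robot; the plugin shown is the `gazebo_ros_ray_sensor` plugin from the ROS 2 Gazebo integration:

```xml
<!-- Hypothetical lidar link; adjust size, mass, and origin to your robot. -->
<link name="lidar_link">
  <visual>
    <geometry><cylinder radius="0.05" length="0.04"/></geometry>
  </visual>
  <collision>
    <geometry><cylinder radius="0.05" length="0.04"/></geometry>
  </collision>
  <inertial>
    <mass value="0.1"/>
    <inertia ixx="0.0001" ixy="0" ixz="0"
             iyy="0.0001" iyz="0" izz="0.0001"/>
  </inertial>
</link>

<joint name="lidar_joint" type="fixed">
  <parent link="base_link"/>
  <child link="lidar_link"/>
  <origin xyz="0 0 0.1" rpy="0 0 0"/>
</joint>

<!-- The <gazebo reference> block attaches the simulated sensor to the link. -->
<gazebo reference="lidar_link">
  <sensor name="lidar" type="ray">
    <always_on>true</always_on>
    <visualize>true</visualize>
    <update_rate>5</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>360</samples>
          <resolution>1.0</resolution>
          <min_angle>0.0</min_angle>
          <max_angle>6.28</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.12</min>
        <max>3.5</max>
      </range>
    </ray>
    <!-- Publishes sensor_msgs/LaserScan on /scan in the lidar_link frame. -->
    <plugin name="scan" filename="libgazebo_ros_ray_sensor.so">
      <ros>
        <remapping>~/out:=scan</remapping>
      </ros>
      <output_type>sensor_msgs/LaserScan</output_type>
      <frame_name>lidar_link</frame_name>
    </plugin>
  </sensor>
</gazebo>
```

The camera and depth camera follow the same link-joint-plugin pattern with their own Gazebo sensor types and plugins.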