Reading from AR-Drone
Drone Update Frequencies: The drone's data transmission frequency depends on the navdata_demo parameter. When it is set to 1, the transmission frequency is 15 Hz; otherwise it is 200 Hz. (navdata_demo is a numeric parameter, not a Boolean, so use 1 and 0, not True/False, to set and unset it.)
Driver Update Frequencies: The driver can operate in two modes: real-time or fixed rate. When the realtime_navdata parameter is set to True, the driver publishes any received information instantly. When it is set to False, the driver caches the received data first, then sends it at a fixed rate, configured via the looprate parameter. The default configuration is:
Please note that if looprate is smaller than the drone's transmission frequency, some data will be lost. The driver's start-up output shows the current configuration. You can also use the rostopic hz command to check the driver's actual publish rate.
# Default setting - 50 Hz non-realtime update; the drone transmission rate is 200 Hz
$ rosrun ardrone_autonomy ardrone_driver _realtime_navdata:=False _navdata_demo:=0

# 200 Hz real-time update
$ rosrun ardrone_autonomy ardrone_driver _realtime_navdata:=True _navdata_demo:=0

# 15 Hz real-time update
$ rosrun ardrone_autonomy ardrone_driver _realtime_navdata:=True _navdata_demo:=1
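To see why looprate matters relative to the drone's transmission frequency, here is a toy Python simulation (illustrative only, not driver code) of a 200 Hz sensor whose latest sample is cached and republished by a 50 Hz fixed-rate loop:

```python
# Toy simulation (not part of the driver): a sensor produces samples at
# 200 Hz while a fixed-rate loop republishes only the most recently
# cached sample at 50 Hz, so 3 of every 4 samples are silently dropped.

def published_samples(sensor_hz, loop_hz, duration_s):
    """Return the indices of sensor samples a fixed-rate loop would publish."""
    total = sensor_hz * duration_s
    step = sensor_hz // loop_hz  # samples produced per loop tick
    # Each loop tick publishes only the newest sample in the cache.
    return [i for i in range(total) if (i + 1) % step == 0]

published = published_samples(sensor_hz=200, loop_hz=50, duration_s=1)
print(len(published))  # 50 of the 200 samples survive; 75% are lost
```

This is the loss the note above warns about; running with realtime_navdata:=True avoids it entirely.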
Linear acceleration, angular velocity, and orientation of the drone are published as a standard ROS sensor_msgs/Imu message. The units are all metric, and the TF reference frame is set to the drone's base frame. The covariance values are specified through the cov/imu_la, cov/imu_av and cov/imu_or parameters. For more information, please check the Parameters section.
The normalized magnetometer readings are published to the ardrone/mag topic as a standard ROS geometry_msgs/Vector3Stamped message.
New in version 1.4.
The driver calculates and publishes odometry data by integrating the velocity estimates reported by the drone (which are based on optical flow). The data is published as nav_msgs/Odometry messages on the ardrone/odometry topic. The corresponding TF transform is also published as the odom -> base transformation.
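The integration step described above can be sketched in plain Python (a minimal dead-reckoning illustration assuming planar motion and body-frame velocities; the function name and signature are illustrative, not the driver's API):

```python
import math

def integrate_odometry(pose, vx, vy, wz, dt):
    """One planar dead-reckoning step: advance (x, y, yaw) using
    body-frame velocities vx, vy (m/s) and yaw rate wz (rad/s)."""
    x, y, yaw = pose
    # Rotate the body-frame velocity into the odom frame, then integrate.
    x += (vx * math.cos(yaw) - vy * math.sin(yaw)) * dt
    y += (vx * math.sin(yaw) + vy * math.cos(yaw)) * dt
    yaw += wz * dt
    return (x, y, yaw)

pose = (0.0, 0.0, 0.0)
# One second of flying straight ahead at 0.5 m/s, fed in 200 Hz increments:
for _ in range(200):
    pose = integrate_odometry(pose, vx=0.5, vy=0.0, wz=0.0, dt=1.0 / 200)
print(pose)  # approximately (0.5, 0.0, 0.0)
```

Because the pose is obtained purely by integrating velocity estimates, errors accumulate over time; the odometry drifts and should not be treated as an absolute position.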
Both AR-Drone 1.0 and 2.0 are equipped with two cameras: one frontal camera pointing forward and one vertical camera pointing downward. The driver creates three topics for each drone: ardrone/image_raw, ardrone/front/image_raw and ardrone/bottom/image_raw. Each of these is a standard ROS camera interface that publishes messages via image_transport. The driver is also a standard ROS camera driver; therefore, if camera calibration information is provided, either as a set of ROS parameters or through ardrone_front.yaml and ardrone_bottom.yaml files, it will also be published via the corresponding camera_info topics. Please check the FAQ section for more information.
- The ardrone/* topics will always contain the selected camera's video stream and information.
The way the other two streams work depends on the type of drone.
AR-Drone 1 supports four video stream modes: front camera only, bottom camera only, front camera with the bottom camera inset (picture-in-picture), and bottom camera with the front camera inset (picture-in-picture). According to the active configuration, the driver decomposes the PIP stream and publishes pure front/bottom streams to the corresponding topics. The camera_info topic will include the correct image size.
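Conceptually, decomposing a PIP stream amounts to cropping the inset region out of the combined frame. The sketch below illustrates this with a toy 2D-list "frame" and an inset assumed to sit in the top-left corner; the actual PIP geometry is defined by the AR-Drone SDK, not by this example:

```python
def split_pip_frame(frame, inset_w, inset_h):
    """Split a picture-in-picture frame (a 2D list of pixels) into the
    main image and the inset image, assuming the inset occupies the
    top-left corner (an illustrative layout, not the SDK's exact one)."""
    inset = [row[:inset_w] for row in frame[:inset_h]]
    return frame, inset

# A toy 8x8 "frame" whose pixels record their own (row, col) position,
# with a 4x2 inset region:
frame = [[(r, c) for c in range(8)] for r in range(8)]
main, inset = split_pip_frame(frame, inset_w=4, inset_h=2)
print(len(inset), len(inset[0]))  # 2 4
```

In the real driver the two resulting streams go to their respective front/bottom topics, which is why the camera_info image size differs between the modes.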
AR-Drone 2 no longer supports the PIP feature, so only one of the ardrone/front or ardrone/bottom topics is updated at any time, based on which camera is currently selected.
The Navdata message also contains information about special tags detected by the drone's on-board vision processing system. To learn more about this system and how it works, please consult the AR-Drone SDK 2.0 developer's guide. These tags are detected on both on-board video cameras at 30 fps. To configure (or disable) this feature, check the Parameters section.
Information about the detected tags is published through the following fields of the Legacy Navigation data message:
tags_count: The number of detected tags.
tags_type: Vector of types of detected tags (details below)
tags_xc, tags_yc, tags_width, tags_height: Vectors of position components and size components for each tag. These values are expressed in the range [0, 1000]; you need to convert them back to pixel units using the corresponding camera's resolution (which can be obtained from the camera_info topic).
tags_orientation: For tags that support orientation, this vector contains each tag's orientation, expressed in degrees over [0, 360).
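For example, converting the normalized [0, 1000] position and size back to pixels is a simple scaling by the camera resolution. A minimal helper (illustrative, not part of the driver; the 640x360 resolution is an assumption, the real value comes from the camera_info topic):

```python
def tag_to_pixels(xc, yc, width, height, img_w, img_h):
    """Convert a tag's position and size from the drone's [0, 1000]
    normalized range into pixel units for an img_w x img_h image."""
    scale_x = img_w / 1000.0
    scale_y = img_h / 1000.0
    return (xc * scale_x, yc * scale_y, width * scale_x, height * scale_y)

# A tag reported at the image centre with normalized size 100x100,
# converted for an assumed 640x360 image:
print(tag_to_pixels(500, 500, 100, 100, img_w=640, img_h=360))
# approximately (320.0, 180.0, 64.0, 36.0)
```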
By default, the driver configures the drone to look for oriented roundels using the bottom camera and for 2D tags v2 on indoor shells (orange-yellow) using the front camera. For information on how to extract information from the tags_type field, check the FAQ section at the end.